
The Agenda
The Agenda brings together all our audio content into a single place. Our lawyers and special guests help you navigate the challenges facing businesses today, looking from a big picture perspective as well as a legal one. Listen, subscribe and leave us a comment. More on us and what we do at www.lewissilkin.com.
LS@SXSW AI and creativity - Hero or Villain?
Recorded live at SXSW UK, this podcast brings together leading voices from the creative, legal, and technology
sectors to debate whether AI is a hero or villain in the world of creativity. The panel delves into the profound questions
raised by generative AI, including issues of authorship, originality, and the economic impact on creative professionals.
With perspectives from photographers, tech innovators, and legal experts, the conversation examines the current legal
frameworks in the UK, US, and EU, the challenges of fair compensation and licensing, and the urgent need for
transparency and regulatory reform. The discussion also highlights the rapid displacement of creative jobs, the
opportunities for new forms of artistic expression, and the importance of developing ethical, creator-focused AI models.
Listeners gain a nuanced understanding of the risks and opportunities AI presents for the future of the creative
industries.
Isabelle Doran:
...billions of creative works are being ingested, and the consequence, of course, is that we've got these generative AI programs beginning to compete with creators who've spent lifetimes building up their skill set: whether that's a student trying to learn their craft in a short space of time, those emerging and trying to establish their businesses, or those who are established and trying to continue their livelihood. For context, there are about 50,000 professional photographers in the UK. We don't represent that many, we'd love to, but when we conducted a survey with our members back in September 2024, about 30% reported that they had already lost work to generative AI. So these are photographers being commissioned: they'd put in their treatment, several of them would be pitching for a job, and the client has come back and basically said, well, do you know what, we're going to use generative AI. Suddenly they're not working; suddenly they won't be bringing in a team of people to work with. So of course it's not just the photographer in isolation, it's the team that works with them; you're losing that whole economic infrastructure. When we ran the survey again for the consultation, we found this had jumped to 58.1%. That's enormous, and that was in five months, which gives you some sense of the scale. To put that in the context of economic loss, from our members in that particular cohort it was about £43 million. Forty-three million pounds that has essentially gone. It's just vanished.
Of course, on the flip side, you could imagine that those who are making those decisions are thinking about the benefits, the costs they're saving. But we need to look at the whole ecosystem. The creative industries are an ecosystem of 2.5 million people. I'd love it to be billion, but it's million. And we've contributed £126 billion to the UK economy. If we start displacing those people, what happens? How is that going to affect the second largest industry in the UK? That's one of the reasons why we've been pushing the government to say we need an economic report. We need to actually look at the stats, look at the figures, and see what's happening. And of course, at the same time, we know that there are, as I was saying, people experimenting, and creators are trying to decide: should I be embracing this, should I be taking it into my business practice, how can I incorporate it? But I think one of the reasons I wanted to come on the panel was to explain that, at the same time, there's this significant displacement, and it's affecting authors and illustrators too.
Phil Hughes:
It is interesting that it's hitting the creator and creative economy earlier than other industries, and being much more impactful than it is, for example, and will be, on the health industry. It's affecting us lawyers and our industry too, but the creative industry, we see, has been particularly impacted. That's a really insightful viewpoint, thank you. I want to follow up on a point with Guy, actually, if that's all right, Guy? On that question of job losses and loss of economic return, what's your view? Do you think we'll see shorter-term market and job-loss problems for creators but ultimately see more jobs being created by AI in the longer term, or do you think an entirely different ecosystem is just going to establish itself? What's your viewpoint on that?
Guy Gadney:
So we're in a transition phase, without a shadow of a doubt. Technology has always impacted creativity, whether it's the paintbrush, the printing press, photography or digital photography, the word processor, and so forth. So there's always been that. The difference we're sitting in here is the scale of it, the speed of it, and the fact that AI, in its broadest umbrella sense, is not just a new trend, like for those of you who may remember the metaverse or NFTs, God forbid. This is much broader. It is a proper tsunami in the sense of the breadth and volume of what is changing. Probably the seats that you're sitting on have had some form of AI involved in their design, and everything that we have. So it's a broader piece. The way I look at it generally is that it's a two-sided coin at the moment. We're going through this transitional phase where you're seeing a lot of words like productivity, efficiency and those sorts of things. Those carry baggage: financial baggage and emotional baggage. The financial baggage is that stuff's being done cheaper, and often, by the way, in the advertising industry it's the clients who are pushing the agencies, saying, as Isabelle said, that they want that particular reduction; that's what I'm hearing from those agencies. The emotional side of it is that within that come job losses, and I don't think we can get around that. I think there is a transition that's going to happen on that.
That's short term. I'm an optimist generally, and I think the other side of the coin is more interesting, which is: what can we do now that we couldn't do before? There is an insatiable desire for storytelling, which is a space I work in a lot. AI, to me, is a sort of incredible catalyst at the moment; it's like lightning that's hit and is sparking. But there have been macro trends around the creator economy coming through over the last 10 or 20 years, whether that's smartphones with photography and Instagram, or YouTube with video, and so forth. That's been rising and has fundamentally shifted the economic model of the entire media ecosystem, and it's done it almost invisibly. Then comes AI on top of it, which is suddenly this sort of flashpoint. So we've got this enormous thing happening incredibly fast in front of us, and it feels like it's been born fully grown; we haven't had time to evolve. We run workshops with young filmmakers from diverse backgrounds, a lot of it just around here as well, trying to educate people. Part of the problem we find is that in the UK we don't get access to these tools until about three or four months after the US, which sets us back in terms of adoption rates, which sets us back in terms of education. So we're trying to push that, to enable the new forms of creativity to come out. That's the bit that interests me. What stories can we tell now that we couldn't tell before? What creativity can we have that we couldn't have before? How do we push the forms? Because that's how it's always been. We've always invented new things. We are a creative species. So that is a good thing. And then, as both Ben and Isabelle were saying, and we've talked about before, you need a new economic model that underpins it, that fairly recompenses it. Isabelle and I are both contributors to this thing called ACT, the 3Cs. It's much easier to say "act". But it's a search term, so if you Google it, the 3Cs. Anyway, the principle, the short answer, is blockchain, because we're in a very, very fluid environment where copyright, where things are moving around very fast. There's a fluidity in the space. Blockchain is not a bad model for it, and this is a proposal that's gone up to government, to look at a methodology to do that.
The whole point is, having seen a number of digital transformations in my background, going way back to when I was running digital for Penguin Books, back in almost analogue days: these transformations are hectic, they are stressful. There is a sense, I think, that we go through the five stages of grief psychologically as we do it. We feel very angry about it, we deny it, then we're sad about it, and then we come through the other side. So there is hopefully a happy ending at the end. But in terms of hero or villain, there is a responsibility that we all have now, and that I think we're all working on, to actually take control of that debate, not to cast it as either a hero or a villain. It's not a binary; it's much more nuanced than that. It's a six-dimensional cube. Sorry, I'm answering the question before we get to the end of the panel, but I think that's how we've got to look at it. Things can change, but we need to change as well. We can't just block, and blocking now would be terrible for the UK economy. Absolutely the wrong move, in my view.
Isabelle Doran:
Actually, if I can comment off the back of that as well: I think the key thing we need to be thinking about, because we're in the creative environment here, is where these generative AI programs are coming from. Most of them, the largest ones, are coming out of the US. It's been really interesting, in terms of the journey I've been on to learn and understand what these programs do, how they function, how they operate; but when you look at the broader geopolitical issue, it's the fact that most of these programs are from the US.
So essentially, what we're also struggling with at the moment, and it's something that came up in a conversation through Baroness Kidron's bringing together of creatives with tech experts, is that we're not really looking at AI sovereignty. There isn't the capacity at this stage, which there should be, to bring creatives together with tech experts to develop our own ethical AI models. At the moment we're relying on these US models, such as ChatGPT, such as Midjourney, whose real interest is in getting bigger and getting us all to consume them, rather than us developing our own nascent models.
Guy Gadney:
Do you think there's space for us to develop our own models and catch up?
Isabelle Doran:
If we maintain our copyright laws, if we maintain our gold standard...
Phil Hughes:
My view, and you probably won't agree with it, is that there's an inevitable delay that comes from over-focusing on copyright. I know, I'm a lawyer, I know how important it is, and how important trying to perfect the approach is. But I think what we've seen happen, and will continue to happen for a little bit longer, is a delay in implementing an appropriate copyright law, while you're looking at the US and the UK and how to balance them, with Donald Trump randomly doing what he wants, the relationship he has with Keir Starmer, and his legislation. While all that happens, people just lose more and more money. So I want to talk a little bit about compensation for creators, therefore, and licensing, and whether we think that's the answer. My question, I suppose, is this: in the age of AI, as brands use models and likenesses and the content you've talked about, and we'll talk a little later about whether there should be a moratorium for big tech companies to sort this out, what should fair compensation look like for talent, and how can we ensure that creators and performers aren't left behind in this new landscape? Ben, do you want to go first? And when you answer, if you could also talk a little about CMOs and collective bodies, that might be quite interesting. What do you think, Ben?
Benjamin Woolams:
Fair compensation is a tall order right now, I think, because where do you start and where do you stop? Any sort of compensation is probably where we should begin. And that requires transparency from these tech industries, which we're not yet getting, unfortunately. I've seen some of the licensing agreements: you see these big publishers licensing a lot of data to some of these tools, with great PR and headlines around being ethical, so to speak. But even then, when you really dig into what those licensing agreements look like, they're agreements for three years with no continuance, and the cost per data point, or per volume, isn't that significant. If you compare that to creators individually, it doesn't stack up. So licensing is a framework where I'm not sure to what extent it will help us backdate, because it requires a lot of transparency. And like I said, we've had this conversation many a time: well, what are they going to be transparent about? What can they share? What will they argue they can't share, and everything else? But the one thing we need to get right is what it looks like to license going forward as well. I think we're almost waiting to resolve the backdated issues before we move on to the licensing framework of the future. But we should be trying to do them in tandem.
Phil Hughes:
Do you think we can do them in tandem? Or, and people talk about this, I was on a panel last week where the speaker raised it: should we explore this idea of a moratorium, where everything that's been backdated is let go, in order to be able to solve it going forward? If we were to do that, the only people it's really going to suit is big tech, and I think it's the startup market, the UK startup market, that will suffer from that, as well as obviously the creators. But do you have a view on the moratorium?
Isabelle Doran:
I don't think we should stop, particularly where we're at with regards to legislation. The point is, and I don't know if many of you would know what's been happening in the House of Lords and the House of Commons at the moment; there are a couple of colleagues of mine in the audience who would, because we've all been following this with bated breath. One of the key things, which Ben, you brought up, was transparency. Transparency is essential. We need to know what has gone into these models. Effectively, we've been told everything's been scraped. But what's really important with generative AI models is that they're weighted on the different types of content. So, for example, all our social media content, everything we've uploaded, will carry the biggest weight. Then you've got these more fine-tuned, specialist creative works that really inform how we see all these fantastic images. And of course we're all uploading constantly, so it's constantly refining, constantly fine-tuning. The challenge we've got is that if we don't know what's gone into those models, we can't start the process of really pushing for licensing and for compensation. Now, I definitely believe we need to seek compensation. If we go back to the Napster moment: on the one hand, you had everybody going, wow, great, I can listen to music without having to buy a record or whatever. But as soon as that challenge and threat came from the music industry, what we saw was suddenly the rise of Apple Music and the iPod, and then the rise of Spotify. They still exist. There are, of course, arguments about the fact that not a huge amount of money comes back to the musician; we're all solidly behind the fact that they need to get more money. But at least they're getting something. And that point was being made in Parliament by the House of Lords, because we're asking for transparency. The creators' champion, if you like, Baroness Beeban Kidron, who is a very well-known BAFTA filmmaker, is pushing for this amendment to be added to the Data (Use and Access) Bill. And it's been ping-ponging back and forth; it's a rarity for a bill to go beyond three rounds, and we're at five as of only this week. In fact, yesterday the House of Lords heard really impassioned speeches, including from the lovely Baroness Floella Benjamin, whom many of you might remember from your childhood, if you're old enough. But essentially, they were speaking so passionately about the fact that we need transparency in order to be able to push for compensation or licensing, or both. I mean, I'm a "both" kind of person, obviously from this point onwards. It's important to establish that money needs to be coming in, and it needs to go to creators. I suppose the best way of looking at it is this: you've gone to sleep at night, you've woken up in the morning, and you've found your car was stolen. You think, Christ almighty, that was valuable, and the police just can't do anything about it. That's where we're at. And what we want to be able to do is enforce properly.
Phil Hughes:
It's actually an interesting analogy, isn't it? The law of theft is there, but there's a lack of resource in the police to enforce it, and it's just a really big societal problem. So it's an interesting analogy beyond just the idea of intellectual property theft.
Benjamin Woolams:
We're talking about input and training again in these instances, though, and I think there's a massive opportunity cost on the generation side. Copyright protects expression; it doesn't protect style. So if you wanted to create content in the style of someone, there's an opportunity cost there for creators as well. Hence why I'm talking about a licensing framework, backdated and moving forward. Because this talent has amassed that following, that skill set, everything they've done in terms of community building, and they want to be scalable to an extent. AI should be a solution they can use safely to be more present and more relevant in more communities, but they need the framework to do so. So it is both a training issue and a generation issue.
Guy Gadney:
And by the way, the other myth to dispel is that this is not new. This debate has been around for a while, as we were saying. The first time it crossed my path was back in 2016, I think: Google, which back in 2006 or '08 had been sued by the Authors Guild in America over Google Books. Google was ingesting, arguably, tens of millions of books into their system, and it went up through the courts in the US; it took, I think, ten years or so to go through. And the court in America came down on Google's side on the basis that it was transformative: in other words, what was going in and, to Ben's point, what was coming out were so different that you couldn't equate the two, which is a logic. So this has been around for a while, and I was writing about this in The Bookseller going, hang on a second, I think this is theft. The trouble is that it's taken this long for the creative industries to wake up. And the only reason I think it came through was through visuals. We knew a little bit about audio, but when you see something, when you see a Midjourney image and it's got little bits of the Getty Images logo on it, it's visible that something's happened. In all that time the tech has grown, and we've seen the evolution and the speed accelerate. It's taken that long. So secondly, when we see these bills going between the Lords and the House of Commons, and I've read some commentary today going, yay, it's been rejected: that's a disaster. It's a disaster that these things are ping-ponging. This is not a game of tennis where six is better than five. This is slowing down the entire country, during which time the creators are not getting compensated, and the creative AI industry cannot move, because while there's uncertainty there's going to be no investment. It's like we've been blocked from getting into the stadium while the runners are already lapping the track. We have to move faster on this. It has to happen fast. We all want roughly the same thing; it's the nuance. And if we get caught in the minutiae of it, it's going to damage the entire creative sector, which is struggling at the moment, is my view. But we've known about this for a while. If you want the background, do please look up that case, because it's a fascinating read, even for non-lawyers.
Phil Hughes:
And I'll come back to you on the role of government and industry in a moment. I just wanted to dwell a little while on incentivisation, where we've obviously seen the issues around the use of James Earl Jones' voice in the Star Wars content on Fortnite, which my son is a big fan of, which is how I know about Fortnite, and which has now been referred to SAG-AFTRA. But practically speaking, what kind of licensing regimes or forms of remuneration are you seeing in industries like the gaming industry, which I know you do a lot of work in, or the content industry? What are you seeing in real life?
Laura Harper:
Yeah, so when we're looking at licensing, it's moved relatively quickly in a short period of time. A few years ago we started seeing, maybe in talent contracts, reference to the right to use your likeness, "actual or digital", or "actual or synthetic". And these two tiny words were just creeping into agreements, but they had profound effects on the rights to use that input, if you like, that content, whether it's an image or a piece of creative work. Then they started to become a little more sophisticated: whether that's just saying, look, no AI use until and unless terms are agreed, or, and this goes to your point about finite periods of time, is that workable? Potentially. And then whole schedules of terms and conditions which split out the rights, and the remuneration for those rights, as well. So there is a vast spectrum at the moment, and there's no settled approach, for the reasons we've been talking about here.
Phil Hughes:
All right, thank you.
Benjamin Woolams:
There's never been any standardisation when it comes to influencers, talent, creators and usage rights. I come from the influencer space, and I did eight years in that industry, brokering deals for brands and talent, and you saw this evolution, particularly over the last three or four years, where brands became savvy to how they can squeeze this content into marketing plans. How do we take it and reuse it? One issue is poor education, or a gap in education, between these big brands talking to creators, who don't understand all the nuance behind these usage clauses; but also, there's never been any standardisation around how you price. You can almost combat licensing on the front foot: where are you using it, in which market, on which channels, in which formats? What does that look like? How do we price against that? We're actually bringing out a calculator to help talent educate themselves on what it looks like to license. That's away from the AI game. But then you bring in a proliferation of AI content, and what happens there? There's way more content, way more usage, way more versions. How does the talent keep track of it?
Laura Harper:
Yeah, absolutely right.
Isabelle Doran:
And this is where things like collective rights management come in. Because obviously if you've got large organisations, a large company effectively, and we've seen this with news publishing, they've been able to negotiate with large generative AI companies like OpenAI. Because they've got a repository of content, whether that's writing or images, they've been able to agree these huge licensing terms, going from newspaper to newspaper, and some book publishers too, as we know. But when you're talking about individual creators, how in the world do you start thinking, well, is it even possible? And that's where that seed of doubt starts creeping in. It is certainly possible, and it's certainly something that we see. The challenge we have is that because creators' works have already been taken, they're on the back foot, saying, I don't know if I really want my works licensed in that way; I've spent a huge amount of time and investment in them. For example, our president, Tim Flach, who is both an incredible animal photographer and a conservationist, brings animals into his studio, or sets them up in a studio-like setting, and he will invest ten grand to bring, for example, a tiger into a studio. Now, AI can generate, in less than a second, a picture that's near identical: you put in "tiger by Tim Flach" and you've got it. The challenge is obviously trying to find that balance. And this is where collective licensing has an opportunity for individual creators. We've got collective rights management organisations that already exist; they already license works that are copied in lots of different scenarios. And they have that opportunity because they've been given, I suppose, a licence from rights holders to say: go on our behalf, talk to these companies, bring back a licence, and then divide the income between individual creators. So it does exist. They are monopolies to a certain extent. But the key point is about consent, and about getting that remuneration.
Phil Hughes:
We've seen broad alignment that there should be some form of consent, that transparency is important, and that licensing is going to play a role. So practically speaking, let's go to Guy and see what you think about this. We've talked about the Commons and the Lords and it going back and forth, and that's not a good thing; it's been positioned as quite combative, with people being diametrically opposed. But what's your view on how government and industry can work together to foster innovation in AI whilst protecting the interests of both sides? It's a big, big question, but I don't know if you've got a viewpoint on what should happen next and how government could maybe be doing better.
Guy Gadney:
So I think we can already see what the future looks like, and to Isabelle's point, a lot of it's happening in the US. And the issue is not the US in any way, shape or form; the fact is it's happening outside of the UK, and that's the problem from my perspective. If we look to those models, we've got an idea of what the future might look like, so we can see the future already. How we then take control of that future and build it for ourselves is the way to do it, which is what we're doing with Charismatic. Just over a year ago we started the project with Channel 4 and Aardman Animations, funded by Innovate UK, to look, as you said, at storytelling. Storytelling is almost, no disrespect to photography, the bit that we're most concerned about as humans, because it's the start of everything; I would include photography as one of the outputs of storytelling. We could see this was going to be one of the things the tech companies were going to be interested in, because of YouTube, because of TikTok and everything else. So we thought it would be a really good idea to start looking at this from within the creative industries, look at the bridge between AI and storytelling, and come up with a solution. Because that's the key: geopolitically, wherever a vacuum is created, bad things go into the vacuum, generally speaking. If we can come in and not create a vacuum but create a solution, for creative people who want to evolve the idea of the story they want to create, then we can do that. Having Channel 4 on board gave us access to the independent production sector to build in and co-design the project with us. As for having Aardman on board, I just said very bluntly: if I can keep Aardman Animations happy, I'm okay, because they're so craft-led. So it was very much consultative, in an area no one was looking at. Now, as we go into launch, everyone's starting to look at it. And by the way, on the ethics side, which is another thing, like AI, that's really hard to define, we had a full-time digital ethicist that I funded into the project. She took this lovely approach, a strategy I would recommend to everyone, called consequence scanning. She was at all the exec meetings with me twice a week, consequence scanning: looking for intended and unintended consequences of what we were doing and raising them up. That didn't mean she had a power of veto; it meant she would raise them and then we could make the decision. So as we go into this space, the problem we've got in the debate is that it's very heated, it's very emotional. I don't see enough solutions being put forward by either side on what this could look like. The ACT is a good model that we were starting to look at. And it's not about frameworks, it's about what the end result is and how creators are going to get more money. Because at the end of the day, this is not about rights, it's not about copyright, it's not about jobs; it's about money, because all three of those things lead to money. And as the UK, we're really bad at acknowledging that. In the US it's easy, you can say "money"; here, it's all "money, that's not so good". It's money: we need to get more money into the hands of the creators. And to your point about collectives, granted, you're right, it's sometimes a middleman. Record labels, my God, that's a whole other debate; we can have it offline. But back into the musicians, back into the photographers, back into the artists and writers: that's the goal.
Phil Hughes:
If we're trying to make AI a hero, what one thing would you ask for? Who wants to go first?
Isabelle Doran:
I mean, it's interesting, because obviously the title of this was Hero or Villain, so you're trying to do that. It isn't a binary situation, as we've said; it's very nuanced. Very, very nuanced. I think for us the key issue, the key challenge, would be trying to instil personality rights. They sit outside copyright. We've got the copyright framework, but personality rights haven't really been explored in the UK at all. They would sit under trade marks; effectively, we would know it as passing off, but passing off is fairly weak and works on a case-by-case basis. Personality rights would help protect the talent of the creator, meaning they can protect their investment and their work, and it would give them much more opportunity to continue in their work confidently whilst we see the rise of generative AI.
Phil Hughes:
I entirely agree. I'd love to see that. Guy, do you want to go? One thing.
Guy Gadney:
Hero or villain? I'm with Isabelle. I think we've just invented gunpowder again. And we can make fireworks, we can
make mining, all sorts of stuff, or we can make weaponry. And it's going to be a little bit of both.
Phil Hughes:
Ben, what one thing would you like to see happen?
Benjamin Woolams:
I don't want to just copy, but it would be personality rights. It would also be transparency in tech; I think there need to be some mandates around the communication between sectors. And I need to be really careful, because we keep saying "intellectual personality"; I actually took that from someone else, so I will drop a name: Terry Crews is the guy who came up with it. So I need to be a little bit careful. But the interesting thing with personality rights is that these generative AI tools, like we're saying, we were looking back, and they're now looking here: what can we start to do more with personality, with image, with likeness? And it's not just your image; it's your height, your weight, the way you walk, the way you talk, your mannerisms, your ethnicity, your accent, how you are as a person, whether you're joyful or sad. These are all now things that are starting to be copied and licensed into content. So we do need to see an advancement in personality rights, and we also need to see an advancement in tools for ownership, because that needs to be consolidated in a way that allows talent to have ownership of, and compensation for, it.
Phil Hughes:
Thank you. I agree. Laura.
Laura Harper:
Yeah, strengthening that raft of rights, if you like; personality rights are the obvious one. It's not about diluting the integrity of the system at the moment; it's about protecting our world-leading creative industries, whilst allowing for the support and growth of the AI sector.
Phil Hughes:
Thank you. I appreciate you being succinct as well. So thank you, everyone. I think we might actually end it there and
give everyone a minute back. And all that really remains for me is to say thank you, everyone, for coming and for
listening. Your time is appreciated and valuable, so thank you. But thank you also to Isabelle, Guy, Ben, and Laura for
your time and your unique insights. Again, your time is valuable, so I really do appreciate it. If we could give a
quick round of applause to the panel, that would be great.