"AI" Means Throwing Creatives Under the Bus
UK Government plans will tear up copyright law, legalise theft, and risk the livelihoods of thousands of creatives
The government is throwing creatives, big and small, under a shiny new bus.
A new consultation on “AI”, so-called Artificial Intelligence, proposes to tear up UK copyright law, which is what protects all creatives from theft.
(Most Creative Industry bodies have combined to fight this.)
Resources are at the bottom of the page.
AI is basically promising us magic beans.
Jack and the Beanstalk by Louis Glackens (via artvee)
What’s the problem?
AI is not ‘intelligent’ in any meaningful sense.
LLMs – large language models – are trained on data. For creatives, this is not ‘data’ – it is our art. Years of sweat and skill writing a book or a play, painting or photographing, developing skills as an actor, editor or translator, etc.
The tech companies typically trained their models on copyrighted material without asking, knowing that this probably broke copyright protections in the US, UK and EU.
Under the government’s proposals, it would be impossible for creatives to sue the AI companies or to receive any other form of compensation for past wrongs.
The government does propose that you can ‘opt out’ of your work being used in future. Opt-out is wrong in principle – UK copyright law is usually ‘opt in’. It is also cumbersome and impractical. Personally, I have dozens of projects across 25 platforms I would need to keep track of.
The proposal of some compensation for future use is meaningless without opt-in, and without payment for past theft.
This is a capitulation to large, unscrupulous and untrustworthy companies driven by greed.
It attacks the UK creative industries, which, at over £100bn in value and 5% of GDP, are vastly bigger than industries the government sweats to protect.
There are many moral, aesthetic, and philosophical reasons to tame AI. But copyright is one a government can legitimately act on, for good or ill.
Don’t give in without a fight. This can be fought.
(PS: not all LLMs are terrible. It’s possible, though not proven, that specific technical analysis of data under professional control might help, for example in identifying tumours. This sort of work can be done ethically.)
Creative Industries statement against AI training (online petition).
A long-winded and biased government consultation concludes on 25 February. The Society of Authors has a guide on filling it in.
Please do more than just sign the letter. We have a short window to put a brake on this.
Normal service will be resumed.
I don’t propose to make this an anti-AI newsletter, and I have cheery news on works in progress, the rise in SFF sales, and other bookish stuff. I’ll write again soon.