I would like to make one thing absolutely clear from the start. If you are looking at these words and, er, reading them, that’s OK. Really, it is. In fact, it’s kind of the point.
I mention this because, for more than a year, the Authors’ Licensing and Collecting Society (ALCS) has been sending me messages seeking my views – and expressing theirs – on the rights of Artificial Intelligence (AI) to read an author’s published work. I seem to be in a minority on this topic – possibly even a minority of one – so let me just say that reading my work is still OK even if you are a machine. In fact, dear automaton, I say: “Go for it” (or “Have at it” if you are an appliance resident in America). Chances are, if you are a machine, you’ve already gone for it and had at it.
Authors are not the only people engaging in a discussion about AI. The creative industries generally have been campaigning against the use of their work to train machines. They have called their campaign “Make It Fair”. In a recent letter to The Times, a range of creatives from Ed Sheeran to Sir Simon Rattle and from Helen Fielding to Sir Tom Stoppard railed against AI “stealing our copyright”.
Now, I’m not trying to dismantle copyright. In fact, with a new book out last month (peaking at No 1 in Amazon’s Mass Media sales, thank you for asking), I’m all in favour of earning royalties. But let’s go back to first principles. Very few writers have ever picked up a pen and written without first reading the work of others and learning from what they have read.
Copyright is a protection for writers against having their work copied; it is not a protection against other people learning from their work. That is why Oasis could safely release She’s Electric, Up in the Sky and a load of other Beatles-esque songs, but George Harrison had to pay up for singing My Sweet Lord.
If the AI’s output is a copy of an author’s work, that is a breach of copyright. I don’t know of anyone who is trying to change the law so that it isn’t. But if the AI simply learns from earlier writers and writes something derivative, but new, I don’t see any real justification for claims of unfairness or moral outrage on the basis of copyright theft.
I’m not suggesting that writers have nothing to fear from AI. Once trained, these machines can write thousands of words in just minutes (and maybe even faster than that). They can also write whole pieces of music in much the same period of time. But let’s not forget that Elton John wrote Your Song in 20 minutes, after Bernie Taupin had written the lyrics in 10 minutes whilst eating his breakfast, and they were honoured with a knighthood and a CBE, not vilified.
I recently asked a lawyer specialising in AI why creatives thought they had grounds for complaint. He pointed to the fact that AI devices have to make an electronic copy of the work in order to learn from it, which he seemed to think made it an open-and-shut case of copyright breach.
But we have been here before with copyright and tech. Internet browsers create a copy of the web pages that the user has viewed (known as “caching”). The Supreme Court decided that was not a breach of copyright. If the court had decided the other way, Parliament would undoubtedly have stepped in with a new law, rather than allowing the courts to delay the progress of internet surfing.
The legal argument that earned caching a clean bill of health doesn’t necessarily apply to machine learning. It varies from one jurisdiction to another. That has pushed the debate into the hands of the legislature. The EU passed a law which created a copyright exemption for text and data mining. In the UK, the government recently consulted on a proposal that would allow AI machines to be trained on copyrighted works without seeking permission unless the copyright owner expressly opts out. That is what triggered the Make It Fair campaign.
I remain of the view that it is the output from AI that is the issue (and the input if it was stolen), not the use of material for training. It remains to be seen (and heard) whether AI can produce material good enough to engage the interest of human beings. If it can’t, problem solved. If it can, human creators are going to have to be even more inventive. After all, the whole basis on which AI works would suggest that it can do no more than generate derivative material based on an analysis of what went before. Can AI create a new genre or subvert an existing one in ways not yet seen?
