MAKE IT FAIR: UK publishers unite against government's AI copyright madness
Once again, those in power fail to understand where the value sits in the content creation chain and have allowed themselves to become bewitched by AI promises.
Being able to unite separate entities in a common cause is never easy, even more so when those entities enjoy a long and at times bitter history of enmity, both ideological and commercial.
Step forward then the government of the United Kingdom. In an event without precedent, every single major commercial news publisher in the country has this week united to denounce government plans to weaken the country's copyright laws for the purposes of making it easier for AI companies to train their systems on creative content, including songs, books and movies, as well as news media.
The "Make It Fair" campaign featured on every single front page, online and print. Such a powerful unified response, matched by direct lobbying of the government by the News Media Association, the industry umbrella grouping that has achieved such a moment of remarkable common purpose, has actually made politicians sit up and notice.
The news organisations were joined by leading UK musicians, including such luminaries as Annie Lennox and Kate Bush, to release a completely silent album, the track listing of which spells out the message: "The British government must not legalise music theft to benefit AI companies."
A government spokesman responded that the UK's "current regime for copyright and AI is holding back the creative industries, media and AI sector from realising their full potential - and that cannot continue".
I'd like to "realise the full potential" of the spokesman's lunch by eating all of it, as it amounts to the same thing.
How, then, did we come to a situation in which AI, a sector awash with cash, is about to be given commercial precedence over those who produce the original content it must feed on, and who are generally not awash with cash? What thinking takes us in this commercially illiterate direction?
To understand, it is necessary to take a step back and look at the current state of the relationship between political power and tech. While we'll use the UK as the example here, the lessons surely apply to many other nations where similar forces are contending for influence.
For politicians in a bind, AI is a panacea. For them, and more widely, the very use of the term evokes a wondrous technologically-enabled future. It enables them to sell a dream, and when political reality is anything but a dream, it's not surprising they grasp at such notions.
We have a Prime Minister who recently dangled the prospect of AI being used to locate the nation's potholes. Given that the average Brit now has pet names for the most spectacular of such holes in their neighbourhood, such is the time they've had to become familiar with them, locating the hazards that need to be fixed isn't really an issue. It's really quite insulting to think that inserting the magic "AI" term would fool us into thinking otherwise.
To illustrate this, it's worth noting that the UK Prime Minister's Adviser on AI Opportunities, Matt Clifford, also had a role in the previous administration. While it's not unknown for an adviser to serve governments of a different political hue, it's not usual, given that different governments have different views on how to do things, and use people broadly in line with those views. Normally, the Downing Street cat is the main survivor from one administration to another.
Yet AI, it seems, is above this political fray. Uniquely, it escapes normal considerations and is accorded an almost theological status within the current body politic. The fact that an AI investor occupies such a position may explain how we ended up where we are on copyright protections.
I'm not offering an opinion on Mr Clifford's abilities, which do seem quite considerable. However, as Beeban Kidron, Lady Kidron, has pointed out, there are potential conflicts of interest in his position: "It is obvious, that if you only listen to those who stand to benefit from a policy then you will hear that it is a great idea … This is a shameful policy based on lobbyist numbers and takes no account of the national interest."
We, as publishers, can be confident that our voice is more distant from the centres of power than that of gentlemen such as Mr Clifford. We are not saying the things the government wants to hear, all wrapped up in the promise of technology. At least the tech bro on the other side of the Atlantic conducts his troublesome business in public.
Speaking of which, it's no secret that the UK government is seeking to attract US AI investment. Hence the UK's recent decision to sit out the AI safety agreement put together by the EU, adopting the US position instead. The copyright move is no doubt part of that effort too. By opening the nation's content production to AI data harvesting, the government has grasped at a solution that uses the primary creative capital we have as a lure for AI players to consider the UK as a base.
Yet the problem with that, beyond the fact that it's stupid, is that it's a one-time-only deal. Once done, it's hard to undo, and the longer-term consequences of making every producer of content in the UK work to improve the bottom line of businesses they have no association with, and no gain from, remain unknown; it's hard to imagine how they will be good.
Selecting the right CMS is about more than having the right technology.
Glide delivers not only a powerful headless CMS but the proactive support, collaborative mindset, and strategic expertise needed to move your business forward.
Request a demo to experience the Glide difference for yourself.
Noosphere launched
Jane Ferguson, a veteran war reporter with a career at CNN, Al Jazeera, and PBS NewsHour, has launched Noosphere, a news app that will feature content from independent journalists. With a focus on high-quality reporting and operating on an invite-only basis, Noosphere will share revenue with journalists based on the level of engagement their content brings to the platform.
Crawl issues
An observant SEO noticed that some websites are seeing a decline in Google’s crawl rate alongside longer server response times. If you’re encountering this issue, the number one recommended action is to check whether your CDN provider has updated their IP ranges for Googlebot.
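If you want to check that the requests your servers treat as Googlebot actually come from Google's published address ranges (a quick way to spot a stale CDN filtering rule), a minimal sketch in Python using only the standard library — the CIDR blocks below are an illustrative subset; the current, complete list is published by Google as `googlebot.json`:

```python
import ipaddress

# Illustrative subset of CIDR blocks from Google's published Googlebot
# list (developers.google.com/static/search/apis/ipranges/googlebot.json).
# Fetch that JSON for the full, up-to-date set before relying on this.
GOOGLEBOT_RANGES = [
    "66.249.64.0/27",
    "66.249.64.32/27",
    "66.249.66.0/27",
]

def is_googlebot_ip(ip: str) -> bool:
    """Return True if `ip` falls inside any of the known Googlebot blocks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(cidr) for cidr in GOOGLEBOT_RANGES)
```

Comparing hits that fail this check against your CDN's allow-list is one way to confirm whether the CDN's idea of Googlebot has drifted from Google's.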
Knowledge graph and entity SEO tips
We love entities and knowledge graphs at Glide, and they often become major talking points at our SEO events for the power they can bring to content and SEO. Not sure how they can benefit your sites? Read this great guide by Paul DeMott, Chief Technology Officer at Helium SEO, to learn how they can help search engines better understand your content.
Success with fewer zeros
The ultra-wealthy buying media outlets is nothing new - from Jeff Bezos to Marc Benioff (and even Musk recently joking about buying MSNBC), yet there's a unique appeal when those without massive fortunes find success. The Reuters Institute for the Study of Journalism highlights success stories of five employee-owned newspapers, sharing their journey, how they made it work, and the lessons they've learned along the way.
"I think I'm stuck"
xAI announced Grok 3, branding it "a lightning-fast AI agent" designed to "relentlessly seek the truth". Those bold claims are already being challenged: one Grok conversation raised eyebrows for filtering out sources that label Musk or Trump as misinformation spreaders.
Channeling readers
UK publishers are experimenting with Facebook Channels, Meta's one-way messaging tool for publishers and creators, as a new distribution channel to better engage their readers and drive traffic. Here’s what Reach and News UK titles are doing, and lessons to be learned.
Another AI lawsuit
Keeping track of AI lawsuits? Add the latest entry, Chegg vs. Google: Chegg, an education tech company, is suing the search giant, claiming that its AI Overviews have hit Chegg's traffic and caused a dip in revenue.
Google kills off data void warnings
A recent study revealed that Google stopped showing its so-called "data void" warnings (banners flagging searches where reliable results are scarce) just ahead of the 2024 US presidential election, despite no significant improvement in result quality. These banners, introduced in 2022, were designed to warn users about unreliable information, but were phased out after being shown only rarely.
Deep Research
The theory of "model collapse" holds that AI systems trained on their own outputs will eventually lose track of true data distributions and degrade. A recent deep dive into OpenAI, Google, and Perplexity's deep research tools suggests this is already happening: they increasingly rely on AI-generated content, turning the web into an all-you-can-eat buffet of AI feeding on itself.
Security concerns
Worried about AI tools retaining the info you enter into them? A cybersecurity company has uncovered that more than 20,000 previously public GitHub repositories are still visible in Copilot because of Bing's caching, despite this data no longer being available on the web.
Rise in misinformation?
Starting this spring, Meta plans to shut down the third-party fact-checking program it launched in 2016 and replace it with a Community Notes model, claiming this will allow "more speech". But, as ProPublica reports, the move coincides with a new incentive program that pays creators for high engagement, which could invite exploitation of the loosened guardrails.
Instagram's moderation struggles
A surge of NSFW content recently flooded Instagram users' Reels feeds, following Meta's changes to moderation policies. While the issue appears to have been addressed, it's far from the first time Meta's moderation has caused problems, raising further questions about whether the company can manage it effectively.