The UK government has halted plans to expand copyright exceptions for AI training, citing strong industry opposition and the need for further evidence to balance creator rights with innovation.

The UK government has stepped back from a planned expansion of copyright exceptions that would have allowed text and data mining of protected works for AI training unless rights holders opted out, citing strong opposition from creators and industry groups. According to the government report published on 18 March 2026, the proposal "is no longer the government’s preferred way forward", and ministers say they will not press ahead with legislative change until they are satisfied that any reform would serve the economy and UK citizens.

The consultation, carried out under the Data (Use and Access) Act 2025, drew more than 11,500 responses, the government notes, many reflecting coordinated submissions from rights holders and the creative sectors. Industry bodies and collecting societies were prominent among those urging caution; the Design and Artists Copyright Society (DACS) reported that roughly 3% of respondents supported the broad opt-out exception previously favoured by officials.

Officials and lawyers framed the decision as a recalibration rather than a retreat, stressing the need for further evidence and analysis. The report and accompanying impact assessment, published alongside the progress statement, emphasise a desire to strike a balance between protecting creators' control and enabling AI developers to access high-quality data for innovation, but conclude that current evidence does not justify immediate reform to existing copyright law.

Although the government has paused the opt-out route, it proposes at least one substantive change: removing copyright protection for works that are wholly computer-generated, on the basis that copyright should "incentivise and protect human creativity". The report also flags continued monitoring of international moves on transparency for training data and suggests industry-led work on labelling of AI-generated content.

Legal specialists warned that commercial users of AI should prepare for a period of uncertainty and more stringent controls. Luke Galloway of EIP described the white paper as signalling "a slower, more deliberate approach" and said firms will likely need to build consent mechanisms, transparency and provenance into their models. Alastair Shaw of Hogan Lovells said that no concrete legislative changes on the major copyright options, or on related issues such as disclosure of training materials and labelling, should be expected before the end of 2026 at the earliest.

Representative organisations urged the government to keep creators' interests central as policy development continues. The Law Society’s chief executive Ian Jeffery argued that solicitors and other professionals who produce specialist digital content need clarity and protection, saying the government "must prioritise transparency in how AI developers use copyrighted material safeguarding the rights of creators regardless of the mechanism used (opt in or out)". Government papers indicate that officials will continue to engage with stakeholders and review developments overseas as they refine options for supporting innovation while protecting intellectual property.

Source Reference Map

Inspired by headline at: [1]

Sources by paragraph:
- Paragraph 1: [5], [2]
- Paragraph 2: [5], [2]
- Paragraph 3: [5], [7]
- Paragraph 4: [1], [5]
- Paragraph 5: [1], [7]
- Paragraph 6: [1], [3]

Source: Noah Wire Services

Verification / Sources

Noah Fact Check Pro

The draft above was created using the information available at the time the story first emerged. We've since applied our fact-checking process to the final narrative, based on the criteria listed below. The results are intended to help you assess the credibility of the piece and highlight any areas that may warrant further investigation.

Freshness check

Score: 8

Notes: The article reports on the UK government's recent decision to halt the expansion of copyright exceptions for AI training, citing strong opposition from creators and industry groups. This development aligns with reports from mid-March 2026, indicating that the narrative is current and not recycled. However, the article's publication date of 31 March 2026 falls more than a week after the government's announcement on 18 March 2026, and this delay may affect the freshness score.

Quotes check

Score: 7

Notes: The article includes direct quotes from government reports and industry bodies. However, the earliest known usage of these quotes could not be independently verified, raising concerns about their originality and, without such verification, their reliability.

Source reliability

Score: 6

Notes: The article is published by the Law Society Gazette, a reputable source within the legal community. However, the article relies heavily on government reports and statements from industry bodies, which may have inherent biases. The lack of independent verification of some claims raises concerns about the overall reliability of the information presented.

Plausibility check

Score: 8

Notes: The article's claims about the UK government's decision to pause the opt-out route for AI training and the proposed removal of copyright protection for wholly computer-generated works are plausible and align with recent developments in AI and copyright law. However, the article's reliance on unverified quotes and the lack of independent verification of some claims slightly diminish its overall plausibility.

Overall assessment

Verdict (FAIL, OPEN, PASS): PASS

Confidence (LOW, MEDIUM, HIGH): MEDIUM

Summary: The article provides current information on the UK government's decision to pause the expansion of copyright exceptions for AI training, aligning with recent developments. However, concerns about the originality and verification of quotes, as well as the reliance on potentially biased sources, slightly diminish the overall confidence in the content's accuracy and reliability. Further independent verification of the claims made would enhance the credibility of the article.