UK Data Bill facing possible delay in Parliamentary ‘ping pong’ between Commons and Lords

The UK Data (Use and Access) Bill has proceeded to Report Stage in the House of Commons. We previously commented on key provisions in the Bill, and in particular one which poses a threat to legal professional privilege.

Among the features of the Bill that did not survive Committee Stage in the Commons were amendments added in the House of Lords to address concerns about the use of copyright material in the training of AI models. Those amendments, tabled by Baroness Kidron, were intended to require transparency from technology firms that scrape copyright materials for commercial text and data mining purposes, such as the training of AI models. Their adoption by the House of Lords was a notable defeat for the Labour Government on the issue.

With those amendments now stripped from the Bill, its progress may be delayed by what is known as the ‘ping pong’ procedure between the two Houses: the Lords may reinsert the amendments, and the Commons could in turn seek to remove them again.

In addition to those legislative developments in Parliament, the government is set to consider the more than 11,500 responses submitted to its consultation on copyright and AI. The UK High Court is also expected in the coming months to hear a landmark claim by Getty Images against Stability AI in relation to the use of copyright images to train an AI model.

Ofcom fines OnlyFans for failure to provide accurate information on age verification measures

Ofcom, the regulator responsible for enforcing compliance with the Online Safety Act (OSA), has issued a £1.05M fine against Fenix, the operator of the OnlyFans website. The fine resulted from an investigation launched after Fenix provided inaccurate information about the age verification practices in place for the OnlyFans website.

Ofcom sought information from Fenix in order to confirm whether it was meeting the requirements of the Communications Act 2003 in relation to protecting minors from restricted content such as pornography. Whilst the investigation was conducted, and the fine issued, under the 2003 Act, the more recent Online Safety Act 2023 further strengthens the age verification measures that online platforms must have in place by July 2025.

Fenix used a third party’s facial age estimation tool, which evaluated a user’s uploaded ‘selfie’ to confirm that the user met a minimum age threshold. Fenix initially stated that the ‘challenge’ age for the tool was set at 23 years, but later confirmed to Ofcom that it had in fact been set at only 20 years; it appears that Fenix itself was unaware of the threshold setting until it enquired of its technology supplier. Fenix is then reported to have required the challenge age to be raised to 23 years, although it was subsequently lowered again, to 21 years.

In setting the fine at £1.05M, Ofcom noted that, despite being a large and well-resourced company, Fenix took over 16 months to correct the inaccurate information it had provided to Ofcom about its age verification practices. The fine reflected a 30% reduction in recognition of the cost savings to Ofcom resulting from Fenix’s acceptance of its conclusions and of the issuance of a fine.

Meta settles lawsuit over targeted advertising, avoiding judgment on validity of legal basis

Meta has agreed to stop processing a human rights campaigner’s personal data for the purpose of delivering personalised advertising to her, shortly before her claim was due to proceed to a court hearing.

The claimant had objected to Meta’s use of her personal data to deliver tailored ads to her, relying on the right under the GDPR to object to processing for direct marketing purposes. Meta was set to argue that personalised advertising did not constitute direct marketing, and so the GDPR right invoked by the claimant, along with the corresponding requirement for the data controller to stop using personal data for that purpose, did not apply. The UK Information Commissioner’s Office had filed submissions supporting the claimant’s position that Meta’s personalised advertising qualified as direct marketing.

It appears that Meta intends to treat future objections from users in a similar way, ceasing personalised advertising based on a user’s data where that user objects. Another consequence of the settlement, though, may be that Meta moves toward a ‘pay or consent’ model for the use of Facebook and Instagram in the UK, whereby users are given the option of subscribing for a fee in order to avoid targeted advertising as part of their service.