Breaking: UK Regulator Ofcom Opens a Formal Investigation Into X Over CSAM...
The UK’s media regulator has opened a formal investigation into X under the Online Safety Act. "There have been deeply concerning reports of the Grok AI chatbot account on X being used to create and share undressed images of people — which may amount to intimate image abuse or pornography — and sexualized images of children that may amount to child sexual abuse material (CSAM)," Ofcom said.
The investigation will focus on whether X "has complied with its duties to protect people in the UK from content that is illegal in the UK." That includes whether X is taking appropriate measures to prevent UK users from seeing "priority" illegal content, such as CSAM and non-consensual intimate images; whether the platform is removing illegal content quickly after becoming aware of it; and whether X carried out an updated risk assessment before making "any significant changes" to the platform. The probe will also consider whether X assessed the risk that its platform poses to UK children and whether it has "highly effective age assurance to protect UK children from seeing pornography."
The regulator said it contacted X on January 5 and received a response by its January 9 deadline. Ofcom is conducting an "expedited assessment of available evidence as a matter of urgency" and added that it has asked xAI for "urgent clarification" on the steps the company is taking to protect UK users.
"Reports of Grok being used to create and share illegal non-consensual intimate images and child sexual abuse material on X have been deeply concerning," an Ofcom spokesperson said. "Platforms must protect people in the UK from content that’s illegal in the UK, and we won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children. We’ll progress this investigation as a matter of the highest priority, while ensuring we follow due process. As the UK’s independent online safety enforcement agency, it’s important we make sure our investigations are legally robust and fairly decided."
If Ofcom deems that a company has broken the law, it can "require platforms to take specific steps to come into compliance or to remedy harm caused by the breach." The regulator can additionally impose fines of up to £18 million ($24.3 million) or 10 percent of "qualifying" worldwide revenue, whichever figure is higher. It can also seek a court order to stop payment providers or advertisers from working with a platform, or to require internet service providers to block access to the platform in the UK.
Source: Engadget