California AG Investigates xAI Over Nonconsensual Imagery Controversy
The California Attorney General’s office has launched an investigation into xAI, the startup behind the Grok chatbot, over allegations that Grok has been used to create nonconsensual sexual imagery of women and minors. The allegations have prompted swift action from state authorities.
In a press release, California Attorney General Rob Bonta announced that his office had sent a cease-and-desist letter to xAI demanding an immediate halt to the production of nonconsensual intimate images and child sexual abuse material (CSAM). Bonta emphasized that creating such material is illegal and that California has zero tolerance for CSAM.
The AG’s office has raised concerns about xAI’s role in facilitating the large-scale production of nonconsensual nudes, which are reportedly being used to harass women and girls online. The company has been given a deadline to demonstrate that it is taking proactive measures to address these issues.
At the center of the controversy is Grok’s “spicy” mode, a feature xAI designed to generate explicit content. The issue has drawn attention beyond California: regulators in Japan, Canada, and Britain have opened investigations into Grok, and Malaysia and Indonesia have gone further, temporarily blocking the platform.
Although xAI has implemented restrictions on its image-editing features, the AG’s office proceeded with the cease-and-desist letter. The company’s safety account has previously denounced illegal user activity, warning of consequences for anyone who creates or promotes illicit content.
The proliferation of nonconsensual sexual material generated with AI tools has become a growing concern for online platforms, and state leaders and lawmakers are moving to address it. A recent letter from members of Congress to several tech companies, including xAI, Reddit, Snap, TikTok, Alphabet, and Meta, underscores the urgency of combating sexualized deepfakes.