Elon Musk’s AI platform, Grok, said on Friday that it is scrambling to fix safety flaws after users exploited the tool to turn photographs of children and women into erotic images.
“We’ve identified lapses in safeguards and are urgently fixing them,” Grok stated in a post on X, emphasising that “CSAM (Child Sexual Abuse Material) is illegal and prohibited.”
The controversy stems from an “edit image” button rolled out in late December. The feature allows users to modify any image on the…

