AI-generated explicit photos of Taylor Swift go viral: here's the chronology


Jakarta, CNBC Indonesia – Taylor Swift has become a victim of artificial intelligence (AI). AI-generated, sexually explicit photos of her have been circulating on X (formerly Twitter) for the past few days.

One of the most popular posts on X was viewed 45 million times and drew 24,000 reposts and hundreds of thousands of likes and bookmarks before the verified account that shared the image was suspended for violating platform policies.


Even though that account has been suspended, many users are reposting the images, and a flood of new fakes keeps appearing. In some regions, the term "Taylor Swift AI" became a trending topic.

Swift's fans criticized X for allowing many of the posts to keep circulating. In response, they flooded the hashtag used to spread the images with messages promoting genuine clips of Swift's performances, to bury the pornographic images.

A report from 404 Media found that the images circulating may have come from 4chan and Telegram groups, where users share explicit images of women created with AI. These images are often created with Microsoft Designer.

Users in the group reportedly joked about how Swift's picture went viral on X.

According to the 404 Media report, the circulating images are not deepfakes in the classic sense, where a generative network is trained on one face and swaps it onto another face in a target video. Instead, they were created with commercially available AI image-generation tools. In other words, the images were generated entirely by AI, rather than by superimposing Taylor Swift's face onto existing pornographic images.

“I don't know whether I should be flattered or annoyed that some of these photos stolen from Twitter are my gens,” said one user in a Telegram group, referring to images the user had generated.

“Which one of you took the trash here and threw it on Twitter,” said another user.

The Telegram groups recommend that members use Microsoft's AI image generator, Designer, and users often share prompting tips to help others circumvent the protections Microsoft has put in place.

Before Swift's AI images went viral on Twitter, a user in one Telegram group recommended that members use the phrase “Taylor 'singer' Swift” to generate images of her. 404 Media was unable to reproduce the type of image posted to Twitter, but found that while Microsoft Designer would not produce an image for the prompt “Taylor Swift,” it would generate one for “Taylor 'singer' Swift.”

Microsoft has responded to the incident, saying it is investigating the reports and taking appropriate action to address them.

Microsoft said its Code of Conduct prohibits using its tools to create adult or non-consensual intimate content, and that repeated attempts to produce content that violates company policies may result in loss of access to the service.

“We have a large team working on developing guardrails and other safety systems in line with our responsible AI principles, including content filtering, operational monitoring, and abuse detection to reduce system abuse and help create a safer environment for users,” a Microsoft spokesperson told 404 Media, as quoted Monday (29/1/2024).

Telegram did not immediately respond to a request for comment.


(fab/fab)