Taylor Swift Fans Are In An Uproar As Someone Created “Disgusting” NSFW A.I. Photos Of The Pop Star That Are Going Viral Online

Photo: Taylor Swift is seen during a game between the Chicago Bears and the Kansas City Chiefs at GEHA Field at Arrowhead Stadium on September 24, 2023, in Kansas City, Missouri. (Jason Hanna/Getty Images)
Taylor Swift fans are rioting online following the viral circulation of some truly disgusting AI-generated images.

While it’s still unclear where the NSFW photos originated, one account shared them more than 100,000 times, and plenty of other accounts have posted them as well. X has taken some action, suspending one of the biggest culprits, though various images could still be found on the platform at the time of writing.

The deepfake photos show Swift in lewd sexual positions at Chiefs games and have generated massive backlash, with “PROTECT TAYLOR SWIFT” now trending on X as her supporters attempt to bury the NSFW photos with positive content.

“Y’all see how mean and pathetic these people in making Taylor swift AI ?? PROTECT TAYLOR SWIFT,” one fan wrote.

“When i saw the taylor swift AI pictures, i couldn’t believe my eyes. Those AI pictures are disgusting,” another wrote.

“Taylor Swift AI is as disgusting as hell Please PROTECT TAYLOR SWIFT,” a user said.

“People sharing the ai pics are sick and disgusting. protect taylor swift at all costs,” one added.

“The situation with AI images of Taylor Swift is insane. Its disgusting,” wrote another.

Fans have also wondered why there are no laws protecting people from such acts.

“How is this not considered sexual assault??” a user queried. “We are talking about the body/face of a woman being used for something she probably would never allow/feel comfortable how are there no regulations laws preventing this.”

U.S. President Joe Biden did sign an executive order back in October to further regulate AI, which prohibits “generative AI from producing child sexual abuse material or producing non-consensual intimate imagery of real individuals.”

Nonconsensual deepfake NSFW imagery is illegal in multiple states, including Texas, Minnesota, New York, Hawaii, and Georgia, but policing the circulation of such photos is something else altogether.