Taylor Swift fans rally to her defense after 'disgusting' AI images circulate on social media

Taylor Swift fans - AKA Swifties - have united to defend the singer against a slew of lewd AI-generated deepfake images circulating on social media.

The controversy erupted on X (formerly Twitter) when explicit deepfake images of the 34-year-old Swift at a Kansas City Chiefs game - a reference to her relationship with Travis Kelce - began making the rounds on the platform.

The disturbing deepfakes have highlighted concerns about the misuse of AI technology and the harm it can cause, and have prompted calls for legal action.

Overnight on Thursday (January 25), the hashtag "TaylorSwiftAI" began trending on X as users shared a number of images showing the 'Bad Blood' singer in provocative positions. It is not clear which accounts were the first to share the images.
However, the phrase "Protect Taylor Swift" quickly began trending soon after, as Swift's fans rallied to voice their support for the singer. Swifties expressed outrage and disbelief at the explicit images and questioned the ethics and legality of creating and sharing such content.

One X user argued: "Those Taylor Swift images need to be classified as a form of sexual assault, I think it is time lawmakers start creating laws to regulate AI before it's too late."
Another user expressed their shock, writing: "When I saw the Taylor Swift AI pictures, I couldn't believe my eyes. Those AI pictures are disgusting."

Fans united in condemning the creators of these deepfake images, labeling them "disgusting" and arguing that incidents like these could tarnish the reputation of AI technology.

Many supporters also argued against the narrative that Swift would be "fine" because she's a "billionaire celebrity".

"Claiming Taylor Swift is a billionaire doesn't excuse sharing inappropriate AI images of her. She's still a human being with feelings. Show respect," one X user tweeted.

Swift's publicist, Tree Paine, did not respond to The Post's request for comment. The Mirror has since reported that, according to a source close to the singer, Taylor is "considering legal action" and she and her loved ones are "furious" over the situation.
Back in October, President Biden signed an executive order aimed at regulating AI, particularly addressing concerns related to generative AI producing inappropriate content. The order also called for enhanced oversight of AI technology used in various applications, the New York Post reports.

Several states, including Texas, Minnesota, New York, Hawaii, and Georgia, have made nonconsensual deepfake pornography illegal. However, these laws have not entirely prevented the circulation of AI-generated explicit content, as evidenced by incidents in high schools in New Jersey and Florida where deepfake images of female students were circulated by their male classmates and triggered a police probe.

In response to growing concerns, US Representatives Joseph Morelle (D-NY) and Tom Kean (R-NJ) recently reintroduced a bill known as the "Preventing Deepfakes of Intimate Images Act." This proposed legislation aims to make the nonconsensual sharing of digitally altered pornographic images a federal crime.

Penalties for offenders could include jail time, fines, or both. The bill has been referred to the House Committee on the Judiciary, which will decide whether to advance it toward a vote.

In addition to criminalizing the sharing of digitally altered intimate images, Morelle and Kean's bill would empower victims to pursue civil lawsuits against offenders, offering a comprehensive approach to address the issue.
