“It can destroy a life”: American teenagers victims of fake nudes created by AI

By manhattantribune.com
24 November 2023 | National

Ellis, a 14-year-old Texas teenager, woke up one October morning to several missed calls and frantic messages on her phone. Fake photographs of her, naked, were circulating on social media.

• Read also: OpenAI announces the return to leadership of Sam Altman

• Read also: Microsoft announces hiring former OpenAI boss Sam Altman

• Read also: American TV presenters, despite themselves stars of AI trickery

“They were photos of one of my best friends and me, taken from Instagram. Naked bodies had been edited onto them,” the schoolgirl tells AFP.

“I remember being very, very scared because these fake nudes were being shared all around me. And I’ve never done anything like that.”

Like several classmates at her middle school on the outskirts of Dallas, Texas, the young girl was the victim of hyperrealistic montages (deepfakes) of a sexual nature, made without their consent by another student and then shared on Snapchat.

With the popularization of artificial intelligence (AI), it has become easier to create these hyperrealistic photo or video montages, paving the way for their use in harassment or humiliation.

“The girls were just crying and crying and crying, they were ashamed,” recalls Anna Berry McAdams, immediately informed by her daughter Ellis.

“She said to me: ‘Mom, it looks like me,’ and I replied, ‘No, my darling, I was with you when the photo was taken and I know that’s not the case,’” continues the fifty-year-old, who says she is “horrified” by the realism of the photos.

A growing phenomenon?

At the end of October, other “deepfakes” of a sexual nature were discovered in a high school in New Jersey, in the northeast of the United States. An investigation was opened to identify all the victims and the perpetrator(s).

“I think that this will happen more and more frequently (…) and that there are victims who do not even know that they are victims and that there are photos of them,” laments Dorota Mani, mother of one of the identified students, who is also 14.

“We are starting to see more and more cases emerging (…) but when it comes to sexual abuse, ‘revenge porn’ (the malicious disclosure of intimate images) or pornographic deepfakes, many people do not come forward and suffer in silence because they are afraid of making the matter public,” explains Renée Cummings, a criminologist and artificial intelligence researcher.

While it is impossible to assess the extent of the phenomenon, “anyone with a smartphone and a few dollars can now create a deepfake,” explains this professor from the University of Virginia.

This is thanks to recent progress in, and the democratization of, generative AI, which can produce text, lines of code, images and sound from a simple request in everyday language.

As a result, hyperrealistic montages, which previously affected the image of celebrities “having piles and piles of photos and videos of themselves online,” now concern everyone, explains Hany Farid, a professor at the University of California, Berkeley.

“If you have a LinkedIn profile with a photo of your head, someone can create a sexual image of you,” continues this specialist in detecting digitally manipulated images, noting that these montages “mainly target women and young girls.”

Absent legal framework

Faced with this growing threat, the school and judicial systems appear overwhelmed.

“Although your face was superimposed on a body, that body isn’t really yours, so it’s not like someone shared a nude of you” in the eyes of the law, says Ms. Cummings.

In the United States, no federal law punishes the making and transmission of false sexual images, and only a handful of states have specific legislation.

At the end of October, Joe Biden urged legislators to establish safeguards, in particular to prevent “generative AI from producing child criminal content or non-consensual intimate images of real people”.

While the responsibility of the creators of these images, who are currently difficult to identify, is central, the companies behind the sites and software used, as well as the social networks that spread the content, must also be held to account, insists Mr. Farid.

Because even if these images are fake, “the trauma is very real,” adds Renée Cummings, describing people “suffering from anxiety, panic attacks, depression or even post-traumatic stress after being victims of pornographic deepfakes.” “It can destroy a life.”

Ellis, the Texas teenager, who describes herself as a “sociable” and athletic teenager, says she is now “constantly afraid,” even though the student behind the deepfakes has been identified and temporarily suspended from school.

“I don’t know how many photos he was able to take or how many people were able to receive them,” she explains, saying she asked to change schools.

Faced with this unknown, her mother is campaigning to have these images recognized as “child pornography” and punished as such.

“It could affect them for the rest of their lives. This is never going to get off the internet. So when they apply to university for example, who knows if (these images) won’t resurface?” worries Anna Berry McAdams.
