Thursday, December 19

Anna McAdams has always kept a close eye on her 15-year-old daughter Elliston Berry’s life online. So it was hard to come to terms with what happened 15 months ago on the Monday morning after Homecoming in Aledo, Texas.

A classmate took a picture from Elliston’s Instagram, ran it through an artificial intelligence program that appeared to remove her dress, and then circulated the digitally altered image on Snapchat.

“She came into our bedroom crying, just going, ‘Mom, you won’t believe what just happened,'” McAdams said.

Last year, there were more than 21,000 deepfake pornographic videos online — up more than 460% over the year prior. The manipulated content is proliferating on the internet as websites make disturbing pitches — like one service that asks, “Have someone to undress?”

“I had PSAT testing and I had volleyball games,” Elliston said. “And the last thing I need to focus and worry about is fake nudes of mine going around the school. Those images were up and floating around Snapchat for nine months.”

In San Francisco, Chief Deputy City Attorney Yvonne Mere was starting to hear stories similar to Elliston’s — which hit home.

“It could have easily been my daughter,” Mere said.

The San Francisco City Attorney’s office is now suing the owners of 16 websites that create “deepfake nudes,” where artificial intelligence is used to turn non-explicit photos of adults and children into pornography. 

“This case is not about tech. It’s not about AI. It’s sexual abuse,” Mere said.

These 16 sites had 200 million visits in just the first six months of the year, according to the lawsuit.

City Attorney David Chiu says the 16 sites in the lawsuit are just the start.

“We’re aware of at least 90 of these websites. So this is a large universe and it needs to be stopped,” Chiu said.

Republican Texas Sen. Ted Cruz is pursuing another angle of attack, co-sponsoring legislation with Democratic Minnesota Sen. Amy Klobuchar. Their bill, the Take It Down Act, would require social media companies and websites to remove non-consensual, pornographic images created with AI.

“It puts a legal obligation on any tech platform — you must take it down and take it down immediately,” Cruz said.

The bill passed the Senate this month and is now attached to a larger government funding bill awaiting a House vote.

In a statement, a spokesperson for Snap told CBS News: “We care deeply about the safety and well-being of our community. Sharing nude images, including of minors, whether real or AI-generated, is a clear violation of our Community Guidelines. We have efficient mechanisms for reporting this kind of content, which is why we’re so disheartened to hear stories from families who felt that their concerns went unattended. We have a zero tolerance policy for such content and, as indicated in our latest transparency report, we act quickly to address it once reported.”

Elliston says she’s now focused on moving forward and is urging Congress to pass the bill.

“I can’t go back and redo what he did, but instead, I can prevent this from happening to other people,” Elliston said.

https://www.cbsnews.com/news/deepfake-pornography-victim-congress/
