AI and Deep Fake: How technology can turn on you | News

AI, or artificial intelligence, is one of the fastest-growing technologies in the world.

It allows volumes of work to be completed in seconds and is reinventing how work gets done. But AI is also doing a lot of damage. Damage to adults and damage to kids in ways many parents are not even aware of.

Beth Jackson is a program manager and therapist at the National Children’s Advocacy Center. She said the risks of this tech are scary, and many of its capabilities make things easy for perpetrators.

“We all put our images out there, right? Innocent pictures of Christmas pageants, dance recitals, football, family pictures, vacations,” Jackson said.

Plenty of people have used silly “You as your favorite fictional character” filters, but silly can quickly become sinister in the wrong hands.

“The folks that are in the exploitation business are gathering those types of things that we all put out in the public domain, and they’re utilizing that to make exploitive images of children,” Jackson said.

“Upload a single image, and AI technology will re-render that image with the person without their clothes,” said Hany Farid, a professor at the University of California, Berkeley.

Simulated and seemingly real sex images created from simple, everyday, fully clothed pictures. Just over a month ago in Spain, police launched an investigation into boys who took innocent photos of their classmates and turned them into anything but innocent, all by using an easily found AI app. The victims found out after seeing fake pictures of themselves – fully nude – spread around their school.

Then there is the Twitch streamer “Sweet Anita,” a star in the gaming community, and now, without her consent, the subject of deep fake porn videos.

“I watched some of one of them. Like a few seconds, and I was like, no, I can’t do this, I can’t watch through all of these – this is too much,” Anita said. “It’s often hardcore pornography. It’s also usually degrading or aggressive sex acts.”

Deep fake videos are a massive technological leap beyond altered still images, yet they demand little extra effort from shameless souls. In a matter of minutes, a deep fake artist can make it seem like someone is saying or doing things they have never even thought about doing or saying.

Jackson said the pace at which this technology is advancing is alarming.

“Just within like a year’s time, just how quickly the neural networks are expanding and how it works and how really as we go how little control they have,” Jackson said.

And those who cannot do the work themselves are hiring others to do it for them. Feeds on Reddit and other social sites are dedicated to deep fake requests; on TikTok, some filters create the images for you. It is all a click away for kids. This means parents have to do the policing themselves.

“We can all be a little more protective of our images and be more choosy about where we put them and who has access to them,” Jackson said.

And what does a person do if they find out they have become a victim?

“Go to law enforcement, don’t share the pictures, don’t do anything, just you know they’re there, just go to law enforcement straight off the bat,” Jackson said.

President Biden issued an executive order on Oct. 30 to work toward regulating AI. Experts say even if the president’s order protects us from foreign actors, there is still a high risk that kids currently online can be exposed or victimized.

Additional Resources:

Software to check whether images or audio are AI-generated can be found HERE.

Department of Homeland Security’s brief on threats of deep fake identities can be found HERE.

Tips on how to avoid deep fake scams can be found HERE.

HERE is the website for the National Center for Missing and Exploited Children, where you can report exploitation and find resources for exploited children.

HERE is the website for the National Children’s Advocacy Center. The office is located at 210 Pratt Ave NE, Huntsville, AL 35801, and its phone number is 256-533-KIDS (5437).

