Are you familiar with undress AI? We share the risks to look out for, and what the law says about this worrying genre of apps - our expert offers insight on what to do if you think your child is at risk.
In a digital world, children are accessing online content at increasingly younger ages. Parents are rightly worried about their children coming across unsafe, age-restricted and illegal content - which can still happen even under close supervision. Hearing the news that violent crime is becoming a normal part of children's lives due to time spent online isn't going to do much to alleviate parental concerns.
When considering how to keep kids safe online, artificial intelligence continues to provide many and varied ways to scupper the best-laid safety plans. Undress AI is now something parents should have on their radar. This is a genre of apps and tools that use AI to remove the clothes of people in an image. The final image doesn't portray the victim's actual body, but it implies it.
This means images shared by youngsters to social media can be put through an undress AI tool and the resulting nude image shared widely. Such pictures can leave victims open to sexual coercion and bullying. Worryingly, AI-altered images of children are making their way to the dark web. Children of celebrities have also been subject to having their images altered in this way.
According to Internet Matters, undress AI tools use suggestive language to draw users in. Such language can pique the curiosity of children, as it can sound 'fun.' While many legitimate generative AI tools require payment or a subscription to create images, free deepnude websites are all the more enticing to children when no payment or email is required. The security of such sites is less likely to be robust, opening up the chance for nude images to be leaked.
The organisation also suggested most undress AI tools are 'trained' to remove clothes from female victims, meaning women and girls are often the primary targets - some even make it clear their technology won't work on male bodies.
Something else for parents to be aware of is the chance children might engage with such tools and share the images of themselves or others with friends because they think it's funny - this could result in them breaking the law. Until recently, it was only an offence to create sexually explicit deepfake images if they were of children. A new law announced this week states that perpetrators sharing explicit deepfake pictures of adults without their consent will face prosecution.
We spoke to Dr Rachel O'Connell, the founder and CEO of TrustElevate, a company specialising in child age verification and parental consent. As a leading expert in digital identity, Rachel told us "Social media platforms, including Instagram, allow harmful AI nude generation app ads that create nude images of people in the absence of their consent. The circulation of these images can have detrimental psychological effects on victims' sense of safety and well-being.
Despite policies against such content, these ads persist, highlighting Meta's (Instagram's parent company) unwillingness or inability to enforce ad buyer restrictions effectively. The ads explicitly advertise "undressing" apps, some even using celebrities like Kim Kardashian. Other ads showcase the app's functionality with split images of clothed and nude versions of a woman. There are increasing concerns about these apps being used to create child abuse materials."
As a preventative measure, Rachel calls for companies to implement, and regulators to enforce, tighter restrictions and more thorough vetting of ad buyers and apps available on digital platforms. As well as involving the police if parents are worried or suspicious about deepfake images, Rachel added the following information:
"The National Center for Missing & Exploited Children has created a powerful tool to help children regain control. Today, the nation’s leading child protection organization is announcing a new tool to combat child sexual exploitation and help kids remove their sexually explicit images from the internet."
She concluded "The Digital Services Act came into force earlier this year, and Article 21 contains provisions for users to get an out-of-court dispute resolution process. These provisions recognize a new right for users of online platform services to select a certified out-of-court dispute settlement body, i.e., an independent third party, in order to resolve disputes relating to decisions taken by the provider of the online platform service."
For more on children and internet matters, we've already noted that girls are more likely to be targeted by undress AI - research shows online harassment of girls is now considered 'standard.' There are life-saving questions you can ask your children if they like gaming that could keep them safe, plus we have more clever internet safety tips, because you can never arm yourself with too much information.