For years, so-called “nudify” apps and websites have enabled people to create nonconsensual, abusive images of women and girls, including child sexual abuse material. New research indicates that, despite some legislators and tech companies taking measures to limit these harmful services, millions of people still visit the websites each month, and their creators may be earning millions of dollars every year.
An analysis of 85 nudify and “undress” websites—which allow people to upload photos and use AI to generate “nude” images of the subjects with just a few clicks—found that most of the sites rely on tech services from Google, Amazon, and Cloudflare to operate and stay online. The findings were revealed by Indicator, a publication that investigates digital deception. It reports that the websites drew a combined average of 18.5 million visitors in each of the past six months and may collectively earn up to $36 million per year.
Alexios Mantzarlis says nudifiers have become a “lucrative business” that “Silicon Valley’s laissez-faire approach to generative AI” has allowed to persist. “They should have ceased providing any and all services to AI nudifiers when it was clear that their only use case was sexual harassment,” Mantzarlis says of the tech companies. Meanwhile, it is increasingly illegal to create or share explicit deepfakes.
The research shows that Amazon and Cloudflare provide hosting or content-delivery services for 62 of the 85 websites, while Google’s sign-in system is used on 54 of them. The nudify websites also rely on a range of other mainstream services, such as payment systems.
Amazon Web Services spokesperson Ryan Walsh says AWS has clear terms of service that require customers to follow “applicable” laws. “When we receive reports of potential violations of our terms, we act quickly to review and take steps to disable prohibited content,” Walsh says, adding that people can report concerns to its safety team.
“Some of these sites violate our terms, and our teams are taking action to address these violations, as well as working on longer-term solutions,” says Google spokesperson Karl Ryan, explaining that Google’s sign-in system requires developers to agree to its policies, which prohibit illegal content and content that harasses others.
Cloudflare had not responded to WIRED’s request for comment at the time of writing. WIRED is not naming the nudifier websites in this article, so as not to give them further exposure.
Nudify and undress websites and bots have flourished since 2019, having emerged from the tools and processes used to create the first explicit “deepfakes.” As Bellingcat has reported, networks of interconnected companies have appeared online, offering the technology and making money from it.
Broadly, the services use AI to transform photos into explicit, nonconsensual imagery; they often make money by selling “credits” or subscriptions that can be used to generate images. They have been supercharged by the wave of generative AI image generators that has appeared in the past few years, and their output can be hugely damaging. Photos have been stolen from social media and used to create abusive images; meanwhile, in a new form of cyberbullying and abuse, teenage boys around the world have created images of their classmates. Such intimate image abuse is harrowing for victims, and the images can be difficult to scrub from the web.