Apps that use artificial intelligence to undress women in photos are gaining popularity: report

In September alone, 24 million people visited the undressing websites, according to social network analysis company Graphika.


Apps and websites that use artificial intelligence to undress women in photos are becoming increasingly popular, according to researchers. Social network analysis company Graphika said 24 million people visited undressing sites in September alone. Many of these undressing, or “nudify,” services use popular social media networks for marketing, according to Graphika. Since the beginning of this year, for example, the number of links advertising undressing apps has increased by more than 2,400% on social networks, including X and Reddit, the researchers said. The services use artificial intelligence to recreate an image so that the person appears nude. Many of the services work only on images of women.


These apps are part of a disturbing trend of non-consensual pornography being developed and distributed thanks to advances in artificial intelligence, a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the person depicted.


The surge in popularity corresponds to the release of several open-source models, Graphika explains: artificial intelligence capable of generating images far superior to those created just a few years ago. Because they are open source, the models that app developers use are available for free. “You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that earlier deepfakes were often blurry. Meanwhile, one app that has paid for sponsored content on Google’s YouTube appears first when searching for the word “nudify.”


Non-consensual pornography of public figures has long been a scourge of the internet, but privacy experts are increasingly concerned that advances in artificial intelligence have made deepfake software easier to use and more effective. “We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “You see it among high school students and people who are in college.”


Many victims never discover the images, but even those who do may struggle to get law enforcement to investigate or to find the funds to pursue legal action, Galperin said. There is currently no federal law banning the creation of deepfake pornography, although the United States government does outlaw the creation of such images of minors.


TikTok has blocked the keyword “undress,” a popular search term associated with the services, warning anyone searching for the word that it “may be associated with behavior or content that violates our guidelines,” according to the app. A TikTok representative declined to provide further details. In response to questions, Meta Platforms Inc. also began blocking keywords associated with searching for undressing apps. A spokesperson declined to comment.


By cdaglobsf