AI Ya Yi.... Bizarre

spaminator

Hall of Fame Member
Oct 26, 2009
35,932
3,052
113
A.I.-generated model with more than 157K followers earns $15,000 a month
Author of the article: Denette Wilford
Published Nov 28, 2023
AI-generated Spanish model Aitana Lopez. PHOTO BY AITANA LOPEZ /Instagram
A sexy Spanish model with wild pink hair and hypnotic eyes is raking in $15,000 a month — and she’s not even real.


Aitana Lopez has amassed a fan base of more than 157,000 followers online, thanks to her gorgeous snaps on social media, where she poses in everything from swimsuits and lingerie to workout wear and low-cut tops.


Not bad for someone who doesn’t actually exist.

Spanish designer Ruben Cruz used artificial intelligence to make the animated model look as real as possible, so real that even the most discerning eyes might miss the hashtag #aimodel.

Cruz, founder of the agency The Clueless, was struggling with a meagre client base due to the logistics of working with real-life influencers.

So they decided to create their own influencer to use as a model for the brands they were working with, he told EuroNews.



Aitana was who they came up with, and the virtual model can earn up to $1,500 for an ad featuring her image.

Cruz said Aitana can earn up to $15,000 a month, averaging about $4,480.

“We did it so that we could make a better living and not be dependent on other people who have egos, who have manias, or who just want to make a lot of money by posing,” Cruz told the publication.

Aitana now has a team that meticulously plans her life from week to week, plots out the places she will visit, and determines which photos will be uploaded to satisfy her followers.



“In the first month, we realized that people follow lives, not images,” Cruz said. “Since she is not alive, we had to give her a bit of reality so that people could relate to her in some way. We had to tell a story.”

So aside from appearing as a fitness enthusiast, her website also describes Aitana as outgoing and caring. She’s also a Scorpio, in case you wondered.

“A lot of thought has gone into Aitana,” he added. “We created her based on what society likes most. We thought about the tastes, hobbies and niches that have been trending in recent years.”

The pink hair and gamer side of Aitana are the result.



Fans can also see more of Aitana on the subscription-based platform Fanvue, an OnlyFans rival that boasts many AI models.

Aitana is so realistic that celebrities have even slid into her DMs.

“One day, a well-known Latin American actor texted to ask her out,” Cruz revealed. “He had no idea Aitana didn’t exist.”

The designers have created a second model, Maia, following Aitana’s success.

Maia, whose name, like Aitana's, contains the acronym for artificial intelligence, is described as "a little more shy."
i want one! ❤️ 😊 ;)
 

spaminator

Apps that use AI to undress women in photos soaring in use
Many of these 'nudify' services use popular social networks for marketing

Author of the article: Margi Murphy, Bloomberg News
Published Dec 08, 2023

Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers.


In September alone, 24 million people visited undressing websites, the social network analysis company Graphika found.


Many of these undressing, or “nudify,” services use popular social networks for marketing, according to Graphika. For instance, since the beginning of this year, the number of links advertising undressing apps increased more than 2,400% on social media, including on X and Reddit, the researchers said. The services use AI to recreate an image so that the person is nude. Many of the services only work on women.

These apps are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence — a type of fabricated media known as deepfake pornography. Its proliferation runs into serious legal and ethical hurdles, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.


The rise in popularity corresponds to the release of several open source diffusion models, or artificial intelligence that can create images that are far superior to those created just a few years ago, Graphika said. Because they are open source, the models that the app developers use are available for free.

“You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that previous deepfakes were often blurry.

One image posted to X advertising an undressing app used language that suggests customers could create nude images and then send them to the person whose image was digitally undressed, inciting harassment. One of the apps, meanwhile, has paid for sponsored content on Google’s YouTube, and appears first when searching with the word “nudify.”


A Google spokesperson said the company doesn’t allow ads “that contain sexually explicit content.”

“We’ve reviewed the ads in question and are removing those that violate our policies,” the company said.

A Reddit spokesperson said the site prohibits any non-consensual sharing of faked sexually explicit material and had banned several domains as a result of the research. X didn’t respond to a request for comment.

In addition to the rise in traffic, the services, some of which charge $9.99 a month, claim on their websites that they are attracting a lot of customers. “They are doing a lot of business,” Lakatos said. Describing one of the undressing apps, he said, “If you take them at their word, their website advertises that it has more than a thousand users per day.”


Non-consensual pornography of public figures has long been a scourge of the internet, but privacy experts are growing concerned that advances in AI technology have made deepfake software easier and more effective.

“We are seeing more and more of this being done by ordinary people with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “You see it among high school children and people who are in college.”

Many victims never find out about the images, but even those who do may struggle to get law enforcement to investigate or to find funds to pursue legal action, Galperin said.


There is currently no federal law banning the creation of deepfake pornography, though the US government does outlaw the generation of these kinds of images of minors. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution of its kind under the law banning deepfake generation of child sexual abuse material.

TikTok has blocked the keyword “undress,” a popular search term associated with the services, warning anyone searching for the word that it “may be associated with behavior or content that violates our guidelines,” according to the app. A TikTok representative declined to elaborate. In response to questions, Meta Platforms Inc. also began blocking key words associated with searching for undressing apps. A spokesperson declined to comment.