WASHINGTON (AP) — When disinformation researcher Wenping Liu investigated China’s efforts to influence Taiwan’s recent elections using fake social media accounts, something about the most successful profiles stood out.
They were women, or at least they appeared to be: the fake profiles posing as women received more engagement, attention and influence than the accounts purporting to be men.
“Pretending to be a woman is the easiest way to gain trust,” said Liu, an investigator with Taiwan’s Ministry of Justice.
Whether for Chinese or Russian propaganda agencies, online scammers or AI chatbots, being female is an advantage, proving that even as technology becomes more sophisticated, the human brain remains surprisingly easy to hack, thanks to age-old gender stereotypes that have migrated from the real world to the virtual.
People have long assigned human characteristics, such as gender, to inanimate objects — ships are one example. So it makes sense that human-like traits would make fake social media profiles and chatbots more appealing. But as voice assistants and AI-enabled chatbots enter the market, questions are gaining attention about how these technologies can reflect and reinforce gender stereotypes, blurring the line between man (and woman) and machine.
“If you want to inject emotion and warmth, choosing a woman’s face and voice is an easy way to do it,” said Sylvie Borau, a marketing professor and online researcher in Toulouse, France, whose work has found that internet users prefer “female” bots over “male” bots and perceive them as more human.
Borau told The Associated Press that women tend to be seen as warmer, less intimidating and more approachable than men, while men are more likely to be seen as competent, but also as intimidating or hostile. That may be why, consciously or unconsciously, people are more willing to engage with fake accounts posing as women.
When OpenAI CEO Sam Altman was looking for a new voice for the ChatGPT AI program, he approached Scarlett Johansson, who said Altman told her that users would find her voice — which played the eponymous voice assistant in the film “Her” — comforting. Johansson declined Altman’s request and threatened to sue when the company chose a voice she called “eerily similar.” OpenAI put the new voice on hold.
Feminine profile pictures — particularly photos of women with perfect skin, full lips and wide eyes in revealing clothing — can be another online draw for many men.
Users treat bots differently depending on their gender: Borau’s research found that “female” chatbots are much more likely to be sexually harassed or threatened than “male” bots.
Women’s social media profiles get, on average, three times as many views as men’s, according to an analysis of more than 40,000 profiles conducted for The Associated Press by Cyabra, an Israeli technology company that specializes in bot detection. Profiles of women who claim to be young get the most views, Cyabra’s research found.
According to the Cyabra report, “creating a fake account and presenting it as a woman increases the account’s reach compared to presenting it as a man.”
Online influence campaigns run by countries such as China and Russia have long used fake women to spread propaganda and disinformation. These campaigns often exploit public perceptions of women: some accounts pose as wise, benevolent grandmothers dispensing simple wisdom, while others impersonate young, conventionally attractive women eager to discuss politics with older men.
Last month, researchers at NewsGuard found hundreds of fake accounts, including ones boasting AI-generated profile photos, were being used to criticize President Joe Biden after some Trump supporters began posting personal photos with the statement, “I’m not voting for Joe Biden.”
While many of the posts were real, more than 700 came from fake accounts. Most of the profiles claimed to be young women from states like Illinois or Florida, including one named PatriotGal480. But many of the accounts shared nearly identical language, and their profile pictures were either AI-generated or stolen from other users. It was unclear who was running the fake accounts, but dozens were found to have ties to countries like Russia and China.
X removed the accounts after NewsGuard contacted the platform.
A U.N. report suggests there’s an even more obvious reason why so many fake accounts and chatbots are female: they were created by men. The report, titled “Are Robots Sexist?”, examined gender gaps in technology and concluded that greater diversity in programming and AI development could result in fewer sexist stereotypes being built into products.
For programmers who want to make their chatbots as human as possible, this creates a dilemma, Borau said: if they choose a female persona, are they reinforcing sexist views of real women?
“It’s a vicious cycle,” Borau said. “Humanizing AI has the potential to dehumanize women.”