Russia, China, and Big Tech use fake females to generate clicks

WASHINGTON – When disinformation researcher Wen Ping Liu examined China’s attempts to influence Taiwan’s recent elections using fake social media profiles, something unusual stood out about the most successful ones.

They looked like women. Fake profiles that appeared to be female received more engagement, more eyeballs and more influence than accounts claiming to be men.

“Pretending to be a female is the easiest way to get credibility,” said Liu, an investigator with Taiwan’s Ministry of Justice.

Whether the source is a Chinese or Russian propaganda operation, an online scammer or an AI chatbot, it pays to be a woman. It is proof that, even as technology grows more sophisticated, the human brain remains easy to hack, thanks in part to gender stereotypes that have carried over from the real world to the virtual one.

People have long assigned human characteristics to inanimate objects, such as ships, so it is not surprising that the habit extends online.

As more AI-enabled voice assistants, chatbots and other technologies that blur the line between human and machine enter the market, the question of how they can reinforce gender stereotypes is getting more attention.

Sylvie Borau, a marketing professor and online researcher in Toulouse, France, has found in her research that Internet users prefer “female” bots and view them as more human than “male” versions.

People tend to perceive women as warmer, less threatening and more agreeable than men, Borau told The Associated Press, while men are perceived as more competent but also more likely to be hostile or threatening. Consciously or not, many people are therefore more willing to engage with a fake account that pretends to be female.

When OpenAI CEO Sam Altman went looking for a voice for ChatGPT, he approached Scarlett Johansson, who said Altman told her that users would find her voice – which was also the voice of the AI assistant in the film Her – “comfortable”. Johansson refused Altman’s request and threatened to sue when the company went ahead with a voice she called “eerily similar” to her own. OpenAI has put the new voice on hold.

Feminine profile pictures can also draw men in, particularly those that show women in revealing outfits, with flawless skin, lips and eyes.

Users also treat bots differently depending on the gender they perceive: Borau’s research found that “female” chatbots are far more likely to be subjected to sexual harassment and threats than “male” bots.

According to an analysis of more than 40,000 profiles by Cyabra, an Israeli tech company that specializes in bot detection, female social media profiles receive on average three times as many views as male ones. Cyabra also found that female profiles claiming to be young get the most views of all.

According to Cyabra, presenting a fake account as a woman allows it to gain more exposure than presenting it as a man.

The long-running online influence campaigns mounted by countries like China and Russia have used fake women to spread propaganda and disinformation, exploiting these perceptions of women. Some pose as wise, nurturing grandmothers dispensing homespun advice, while others mimic young, conventionally attractive women eager to discuss politics with older men.

Last month, researchers at NewsGuard discovered hundreds of fake accounts – some with AI-generated profile pictures – that were used to criticise US President Joe Biden. It happened after some Trump supporters began posting personal photos with the announcement that they would “not be voting for Joe Biden”.

More than 700 of the posts came from fake accounts, most of them profiles of young women claiming to live in Illinois, Florida or other states. One was called PatriotGal480. Many of the profiles used nearly identical language, and their profile photos were AI-generated or stolen from other users. While the researchers could not say with certainty who operated the fake accounts, they found dozens with links to countries such as Russia and China.

After NewsGuard contacted X, the platform removed the accounts.

A UN report suggests an even more obvious explanation for why fake accounts and chatbots tend to be female: they were created by men. The report, titled “Are Robots Sexist?”, examined gender disparities in the tech industry and concluded that greater diversity in programming and AI development could lead to fewer sexist stereotypes embedded in their products.

Borau said this presents a dilemma for programmers who want to make their chatbots seem as human as possible: by selecting a female persona, are they promoting sexist attitudes toward women in the real world?

“It is a vicious circle,” Borau said. “Humanising AI might dehumanise women.” – AP