The Alarming Rise of AI Voice-Cloning Fraud Targeting the Elderly

A distressed mother heard her daughter crying on a phone call while a man claimed to have kidnapped the girl and demanded a ransom.

But the girl's voice was not real: it had been generated with artificial intelligence in an attempt to deceive the mother.

This alarming case illustrates how widespread the technology has become.

Experts indicate that the greatest danger of artificial intelligence lies in its ability to blur the line between truth and fiction, providing criminals with effective and inexpensive tools. 

US authorities have expressed concerns about recent phone scam operations that utilize AI-based voice cloning tools readily available on the internet.

Jennifer DeStefano, a mother living in Arizona, received a call in which she heard a voice saying, "Help me, Mom, please help me." She believed it was her fifteen-year-old daughter, who was away on a ski trip. Speaking to a local television station in April 2023, she said, "The voice matched my daughter's tone and the way she cries. I never doubted for a moment that it might not be her."

The fraudster, calling from an unknown number, demanded one million dollars in exchange for the girl's release.

Authorities are investigating the incident, which was quickly resolved once DeStefano managed to reach her daughter. It nevertheless shed light on the potential for fraud by internet criminals armed with artificial intelligence tools.

Highly convincing impersonation schemes have emerged.

In an interview with Agence France-Presse, Wasim Khaled, CEO of Blackbird.AI, said that "voice cloning through artificial intelligence, now nearly indistinguishable from human speech, enables malicious actors to obtain information and extort money from victims more effectively than ever before."

Numerous free applications available online can replicate a person's voice with an AI-based program using only a short recording, which fraudsters can obtain from content their victims have posted publicly on the internet.

Khaled says, "With a short voice recording, a cloned voice can be created using artificial intelligence to leave messages and voice clips purporting to come from the victim. The cloned voice can also be used as a modified voice during live calls, with fraudsters mimicking different accents and imitating the victim's speech patterns." He confirms that this technology "enables highly convincing impersonation schemes."

A survey conducted among approximately 7,000 individuals in nine countries, including the United States, revealed that one in four people has been targeted by an AI-based voice scam attempt or knows someone who has experienced a similar incident. 

According to the survey published last month, seventy percent of respondents stated that they were unsure of their ability to distinguish between a real voice and a cloned voice.

US authorities recently issued a warning about the increase in "fraud targeting older adults," with the Federal Trade Commission stating in its alert, "You may get a call in which you hear your panicked grandchild's voice, claiming to be in a dire situation after a car accident and being detained by the police, and saying you can help them by sending money."

In the comments posted under the commission's warning, many elderly people said they had been deceived in exactly this way.

Victims of these schemes are often so convinced by what they hear that they begin gathering money, and some even consider mortgaging their homes, before realizing it is all a scam.

Hany Farid, a professor at the School of Information at the University of California, Berkeley, told Agence France-Presse that the ease of voice cloning means that "every internet user is at risk," stressing that "these fraud operations are on the rise."

The startup ElevenLabs was forced to acknowledge that its AI-based voice-cloning tool could be used for malicious purposes after internet users published a fake clip of actress Emma Watson reading excerpts from Adolf Hitler's "Mein Kampf."

Gal Tal-Hochberg, an executive at the technology investment firm Team8, says, "We are rapidly approaching a stage where we can no longer trust content published online, and we will need new technologies to verify that the person we believe we are talking to (over the phone) is indeed the person we are communicating with."
