AI voice cloning is the latest threat to election integrity

Imagine it’s Election Day. You’re getting ready to go vote when you get a call from a poll worker asking you to vote at an early polling place instead of your Election Day polling place. So you go there and find out it’s closed. It turns out the call isn’t from the poll worker but from a replica created by voice cloning technology.

It may sound like something out of a science fiction movie, but many New Hampshire voters had a similar experience two days before the 2024 presidential primary. They received robocalls containing a deepfake simulating President Joe Biden’s voice, discouraging them from participating in the primary.

While there is no indication that the fake Biden robocalls had any discernible impact on the New Hampshire primary, the incident is a stark reminder of the growing threat posed by tactics like this, which are increasingly being used by malicious actors to target elections not only in the United States but also in Slovakia, Argentina and elsewhere.

As AI tools become more accessible and affordable, deepfake attacks (of which voice cloning is just one example) are becoming more common. How can voters protect themselves from such attempts to ensure they make informed decisions for the November general election? Here are some tips:

• Avoid answering calls from unknown numbers: Answering a call from an unknown number increases the risk of falling for a scam. Additionally, if you answer a call from an unknown number and speak, a scammer can record your voice and use it to create fake cloned calls to trick your family and friends.

• Verify the caller’s identity: If you answer a suspicious call, take steps to verify the caller’s identity. Several New Hampshire voters did this after receiving the fake Biden robocall and were able to confirm that the voice was not genuine. Try contacting the person (or their campaign) through another channel to confirm that the call was indeed from the person or organization it claimed to be from.

• Report possible voice cloning: If you’ve received a fraudulent call that used AI, contact the appropriate authorities so they can use their expertise to investigate further. This can help resolve your case, as well as others, and deter similar behavior in the future. After New Hampshire voters alerted law enforcement and the state attorney general about the robocall that used AI to impersonate Biden, the alleged perpetrator was identified and charged with 13 counts of voter suppression, a felony, and 13 counts of candidate impersonation, a misdemeanor. He also faces a proposed $6 million fine from the Federal Communications Commission.

• Educate yourself: Knowledge is your best defense against emerging threats. Take the time to educate yourself and others about the dangers of voice cloning. Be wary of unsolicited calls, especially those that involve urgent requests, contain suspicious information, or attempt to pressure you into behavior that seems “strange” (such as sending gift cards to supposed friends or family).