Imagine it’s Election Day. You’re getting ready to go vote when you get a call from a poll worker asking you to vote at an early voting location instead of your Election Day polling place. So you go there and find out it’s closed. It turns out the call wasn’t from a poll worker at all but from a replica of a poll worker’s voice created with voice cloning technology.
It may sound like something out of a science fiction movie, but many New Hampshire voters had a similar experience two days before the 2024 presidential primary. They received robocalls containing a deepfake simulating President Joe Biden’s voice, discouraging them from participating in the primary.
While there is no indication that the fake Biden robocalls had any discernible impact on the New Hampshire primary, the incident is a stark reminder of the growing threat posed by tactics like this, which malicious actors are increasingly using to target elections not only in the United States but also in Slovakia, Argentina and elsewhere.
As AI tools become more accessible and affordable, deepfake attacks (of which voice cloning is just one example) are becoming more common. How can voters protect themselves from such attempts to ensure they make informed decisions for the November general election? Here are some tips:
• Avoid answering calls from unknown numbers: Answering a call from an unknown number increases the risk of falling for a scam. Additionally, if you do answer and speak, a scammer can record your voice and use it to create cloned calls that trick your family and friends.
• Verify the caller’s identity: If you answer a suspicious call, take steps to verify the caller’s identity. Several New Hampshire voters did this after receiving the Biden robocall and were able to confirm that the voice was fake. Try contacting the person (or their campaign) through another channel to confirm that the call really came from the person or organization it claimed to be from.
• Report possible voice cloning: If you’ve received a fraudulent call using AI, contact the appropriate authorities so they can use their expertise to investigate further. Doing so can help resolve your case, as well as others, and deter similar behavior in the future. After New Hampshire voters alerted law enforcement and the state attorney general about the robocall that used AI to impersonate Biden, the alleged perpetrator was identified and charged with 13 counts of voter suppression, a felony, and 13 counts of candidate impersonation, a misdemeanor. He also faces a proposed $6 million fine from the Federal Communications Commission.
• Educate yourself: Knowledge is your best defense against emerging threats. Take the time to educate yourself and others about the dangers of voice cloning. Be wary of unsolicited calls, especially those that make urgent requests, offer suspicious information or try to pressure you into behavior that seems off (such as sending gift cards to supposed friends or family members).
• Trust reliable sources: Our information ecosystem is awash in falsehoods and inaccurate claims, but at least in the realm of elections, we know whom to turn to for accurate information about election administration: state and local election officials (and those who support their efforts).
• Prepare a voting plan before Election Day: Developing a voting plan lets you confirm when, where and how you can vote. It also allows you to consider alternatives in case your preferred plan falls through due to an unforeseen event such as illness. Finally, planning ahead reduces the risk of being misled by a voice cloning attack, even one that sounds convincingly real.
Voice cloning attacks are part of the “new frontier” of malicious attempts to interfere in U.S. elections. By staying informed, implementing safeguards, and remaining skeptical of unexpected communications, voters can increase their chances of thwarting these threats before they cause real damage.
David Levine is a consultant in electoral integrity and management. ©2024 The Fulcrum. Distributed by Tribune Content Agency.