Fake AI voice scammers are now impersonating government officials
You probably know that it’s easy enough to fake audio and video of someone at this point, so you might think to do a little bit of research if you see, say, Jeff Bezos spouting his love for the newest cryptocurrency on Facebook. But more targeted scam campaigns are sprouting up thanks to “AI” fakery, according to the FBI, and they’re not content to settle for small-scale rug pulls or romance scams.
The US Federal Bureau of Investigation issued a public service announcement yesterday, stating that there’s an “ongoing malicious text and voice messaging campaign” that’s using faked audio to impersonate a senior US official. Exactly who the campaign is impersonating, or who it’s targeting, isn’t made clear. But a little imagination—and perhaps a lack of faith in our elected officials and their appointees—could illustrate some fairly dire scenarios.
“One way the actors gain such access is by sending targeted individuals a malicious link under the guise of transitioning to a separate messaging platform,” warns the FBI. It’s a familiar tactic, with romance scammers often trying to get their victims off dating apps and onto something more anonymous like Telegram before pumping them for cash or blackmail material. And recent stories of federal employees and bosses communicating over Signal, or some less savory alternatives, have given these messaging systems a lot of exposure.
Presumably, the scammers contact a specific target using an unknown number and pretend to be their boss or some other high-ranking official, using an attached voice message to “prove” their identity. These have become trivially easy to fake, as recently demonstrated when billionaires like “Elon Musk” and “Mark Zuckerberg” started confessing to heinous crimes via the speakers at Silicon Valley crosswalks. “Deepfakes” (i.e., impersonating celebrities via animated video and voice) have now become extremely common online.
The FBI recommends the usual protection steps to avoid being hoodwinked: don’t click on sketchy links in texts or emails, don’t send money (or crypto) to anyone without plenty of verification, and use two-factor authentication. One thing I’ve recently done (since my ugly mug is all over TikTok via PCWorld’s short videos) is establish a secret phrase with my family so we can authenticate each other over voice calls.
But with automation tools and hundreds of thousands of potential targets in the US government, it seems inevitable that someone will slip up at some point. Hopefully, federal law enforcement won’t be too busy with other matters to take care of real threats.