
If a call sounds like your boss (asking for bank account numbers) or your family member (begging for help in an emergency), you’re more likely to act. That’s why scammers use voice cloning to make their requests for money or information more believable. And the FTC is fighting back.
When the FTC announced its Voice Cloning Challenge last year, the main goal was to encourage innovative ways to help protect people from the harms of AI-enabled voice cloning. Today, we’re pleased to announce that the FTC has awarded four top prizes to winning submissions that take a wide range of approaches to doing just that:
- a solution that would use algorithms to detect whether voice patterns are human or synthetic
- a technology that would detect voice cloning and deepfakes in incoming phone calls or digital audio in real time, analyzing the audio in two-second chunks and assigning each a “liveness score” (see the sketch after this list)
- a proposal that would watermark audio with distortions that people can’t hear, but that would throw off AI voice cloners so the audio can’t be accurately cloned
- a technology that would authenticate that a voice is human and embed the authentication as a type of watermark
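To make the chunk-scoring idea concrete, here’s a minimal sketch in Python. It is purely illustrative and not any winner’s actual method: the 16 kHz sample rate is an assumption, and the `liveness_score` stand-in uses spectral flatness where a real detector would run a classifier trained on human and synthetic speech.

```python
import numpy as np

SAMPLE_RATE = 16_000   # assumed sample rate (Hz); real systems vary
CHUNK_SECONDS = 2      # the winning entry scores audio in two-second chunks

def liveness_score(chunk: np.ndarray) -> float:
    """Placeholder scorer, for illustration only. A real detector would
    run a trained human-vs-synthetic classifier here; this stand-in just
    maps spectral flatness to a value in [0, 1]."""
    spectrum = np.abs(np.fft.rfft(chunk)) + 1e-10
    flatness = np.exp(np.mean(np.log(spectrum))) / np.mean(spectrum)
    return float(1.0 - flatness)  # NOT a real liveness measure

def score_stream(audio: np.ndarray) -> list[float]:
    """Split audio into consecutive two-second chunks and score each one,
    mimicking the chunk-by-chunk, real-time approach described above."""
    chunk_len = SAMPLE_RATE * CHUNK_SECONDS
    return [
        liveness_score(audio[start:start + chunk_len])
        for start in range(0, len(audio) - chunk_len + 1, chunk_len)
    ]

if __name__ == "__main__":
    # Ten seconds of synthetic test audio (white noise stands in for a call).
    rng = np.random.default_rng(0)
    audio = rng.standard_normal(SAMPLE_RATE * 10)
    for i, score in enumerate(score_stream(audio)):
        print(f"chunk {i}: liveness score {score:.2f}")
```

The chunking loop is what would let a detector keep up with a live call, flagging suspicious audio as it arrives rather than after the conversation ends.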
Learn more about the winning proposals on the Voice Cloning Challenge page.
While these technologies are still being developed, you can protect yourself now. If you get a call like this, call the person who supposedly contacted you at a phone number you know is theirs, and verify the story. If you can’t reach your loved one, try to get in touch with them through another family member or their friends.
If you spot a scam, report it to the FTC at ReportFraud.ftc.gov.