Spencer Municipal Utilities' Website Compass

Scammers have used cloned voices to impersonate:

• Legitimate bank customers, to hijack their accounts and create new ones to run up debt or launder money
• Recently deceased individuals, to collect welfare checks and other funds
• Company executives, to fool the listener into providing a password or other sensitive information
• Young family members supposedly in trouble, such as being in jail and needing bail, to get grandparents to send money
• Famous music artists, to sell fake audio clips online
• Famous actors, to use their “voice” to hawk products in commercials
• Political candidates, to sway public opinion and influence elections

Clearly, this flood of fake content can have real-world consequences for consumers, communities, and countries. Deepfake audio could enable criminals to steal identities and money, foster discord and distrust, generate confusion and violence, and more. In a disinformation landscape where people can’t tell what’s real and what’s fake, there is ample cause for concern.

Steps to Mitigate Risk

What’s being done to address these threats? Some voice-cloning vendors appear to be taking measures to mitigate the risk. ElevenLabs announced it had seen an increasing number of voice-cloning misuse cases among users and is considering additional account checks, such as full ID verification, verification of copyright to the voice, or manual review of each request to clone a voice sample. Facebook parent Meta, which has developed a generative AI tool for speech called Voicebox, has decided to go slow in making the tool generally available, citing concerns over potential misuse.

On October 12, 2023, four U.S. senators announced a discussion draft bill aimed at protecting actors, singers, and others from having their voice and likeness replicated by artificial intelligence. The bipartisan NO FAKES Act (Nurture Originals, Foster Art, and Keep Entertainment Safe Act) would hold people, companies, and platforms liable for producing or hosting such unauthorized digital replicas.

FTC Wants to Help Prevent the Harms of Voice Cloning

The Federal Trade Commission recently announced the Voice Cloning Challenge to promote the development of breakthrough ideas for protecting consumers from the misuse of AI-enabled voice cloning. Samuel Levine, Director of the FTC’s Bureau of Consumer Protection, said, “We want to address harms before they hit the marketplace, and enforce the law when they do.”

Voice cloning technology holds promise for consumers, such as restoring the voices of those who have lost them to accident or illness. At the same time, the FTC is concerned about how the technology could be used to harm consumers, such as by making it easier for scammers to impersonate family and friends or to deceive consumers by appropriating the voices of creative professionals.

Challenge submissions must address at least one of these intervention points:

• Prevention or authentication, to limit the use or application of voice cloning software by unauthorized users
• Real-time detection or monitoring, to detect cloned voices or the use of voice cloning technology
• Post-use evaluation, to check whether an audio clip contains cloned voices

This is good news for consumers and bad news for scammers!
