FCC chair proposes ban on AI-voiced robocalls
Federal Communications Commission Chair Jessica Rosenworcel proposed Wednesday that robocalls using voices generated by artificial intelligence be made illegal under the Telephone Consumer Protection Act.
AI-generated voices used in robocalls and telemarketing have the potential to cause confusion and spread mis- and disinformation by modeling voices of politicians, celebrities and family members. Last month, a bipartisan group of 26 attorneys general urged the FCC to restrict the use of the technology in robocalls and robotexts to protect consumers.
“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” Rosenworcel said in a press release. “That’s why the FCC is taking steps to recognize this emerging technology as illegal under existing law, giving our partners at state attorneys general offices across the country new tools they can use to crack down on these scams and protect consumers.”
In response to the state attorneys general’s letter, the FCC is working to establish partnerships with law enforcement agencies across the country to identify and eliminate illegal robocalls, and it has entered a memorandum of understanding with 48 attorneys general to work together on the issue, according to the release.
The Telephone Consumer Protection Act helps limit junk calls by restricting telemarketing calls, the use of automatic telephone dialing systems, and artificial or prerecorded voice messages. If Rosenworcel’s proposal is adopted, those restrictions would extend to calls that use AI-generated voices.
Last fall, the FCC launched an inquiry to build a record on how the agency can best combat illegal robocalls and how AI might be involved. At the same time, the FCC explored how AI could help the agency with pattern recognition in order to “turn this technology into a force for good that can recognize illegal robocalls before they ever reach consumers on the phone.”