State Officials Find the Source Behind the AI-Generated Fake Biden Robocalls; The Feds Would Like a Word, Too.
Authorities appear to have caught up with the culprit behind the barrage of robocalls placed to New Hampshire voters just two days before the state’s primary election. The Anti-Robocall Multistate Litigation Task Force, consisting of 51 attorneys general, sent a cease-and-desist letter on Tuesday to Life Corporation, the company allegedly behind the robocalls. New Hampshire Attorney General John Formella announced at a press conference that the calls had been traced back to the Arlington, Texas-based telecommunications company.
The calls used artificial intelligence to mimic President Joe Biden’s voice, urging people not to vote in the January primary and instead save their vote for the upcoming general election. A stilted but altogether convincing AI-generated Biden falsely asserted that people could only cast one ballot for the election cycle, and that doing so in the New Hampshire primary guaranteed a second term for former President Donald Trump.
The spoof campaign message — which appeared to come from legitimate caller IDs belonging to New Hampshire political party officials — was sent to upwards of 25,000 voters. Now, authorities are warning Life Corp. and any other would-be bad actors to think twice before placing artificial calls.
Attorney General Formella said the task force chose to release information about its investigation now because it wants to make clear to anyone who would try this: we can find you, and we will.
In the official notice letter from the robocall task force, the attorneys general cautioned Life Corp. to immediately cease originating any illegal call traffic. Further transmission of these calls, they warned, may violate the Telephone Consumer Protection Act, the Truth in Caller ID Act, and state consumer protection statutes.
This doesn’t appear to be the first time Life Corp. has been caught by a federal agency attempting to skirt the rules.
In 2003, the Federal Communications Commission (FCC) issued an official citation to the man behind Life Corp., Walter Monk, for making one or more prerecorded unsolicited advertisements to residential telephone lines in violation of U.S. telecommunications law. The citation noted that Monk could be fined up to $11,000 for each subsequent violation.
In signing on to the letter, the attorneys general expressed concern that their own states could be targets of similar robocalls. But thanks to a unanimous ruling from the FCC, they may soon have a new tool in their arsenal to fight such calls. On Thursday, the FCC outlawed robocalls that contain AI-generated voices. The ruling, which takes effect immediately, makes the voice-cloning technology used in common robocall scams targeting consumers illegal. It also expands the scope of robocall enforcement: using an AI-generated voice without consent is now illegal in itself, rather than only the resulting scam or fraud.
The ruling would give “state attorneys general across the country new tools to go after bad actors behind these nefarious robocalls,” the FCC said in its announcement.
Thursday’s FCC ruling has the power to stop robocalls like those allegedly made by Life Corp. because it targets the use of AI technology itself; regulating the content of the speech will fall to federal election officials.
But taken together, the FCC ruling and the momentum behind the state attorneys general are important steps toward safeguarding the American public from misinformation and fraud this election season.