AI-Generated Robocalls of Victims' Voices Likely Illegal


Artificial intelligence is something of a buzzword these days. What we have still isn't true AI, at least as we science fiction fans would recognize it, but it's close enough for most people's satisfaction.


Thanks to the current technology, we can do some amazing things.

Yet there are problems, as there are with any new technology. Some early adopters are using it for somewhat nefarious purposes. Since I'm not a fan of creating laws without a good reason, when I saw a story illustrating why new laws against using AI for certain things would be a waste of time, I had to check it out.

What caught my interest, though, was this bit:

As AI technology continues to improve and change our day-to-day lives, this type of scam is going to become more common, and the scope will expand beyond calls from AI “Joe Biden.” In fact, something similar already happened when a political action committee affiliated with Florida Gov. Ron DeSantis used AI to replicate Donald Trump’s voice in an attack ad. 

In response to such incidents, many are calling for federal intervention, and the Federal Communications Commission (FCC) is the first agency to answer the call. 

The FCC confirmed in early February that using AI-generated voices in robocalls is illegal. This applies to all forms of robocall scams, including those related to elections and campaigns. While much of the media coverage surrounding the decision focused on the agency taking action to “ban” or “outlaw” the use of AI in robocalls, the actual decision was a simple confirmation that there are already federal regulations in place that apply to AI phone calls. 


Wait...using AI-generated voices in robocalls generally, not just in scams, is illegal? Could that be right?

After all, not even a month ago, I wrote about how Manuel and Patricia Oliver, parents of Parkland victim Joaquin Oliver, were using AI-generated robocalls of their son's voice to push for gun control.

I wanted to see exactly what the FCC said. Was it about just scam calls, or about AI-generated robocalls in general?

Well, it looks like there's no differentiation. 

The rise of these types of calls has escalated during the last few years as this technology now has the potential to confuse consumers with misinformation by imitating the voices of celebrities, political candidates, and close family members. While currently State Attorneys Generals can target the outcome of an unwanted AI-voice generated robocall—such as the scam or fraud they are seeking to perpetrate—this action now makes the act of using AI to generate the voice in these robocalls itself illegal, expanding the legal avenues through which state law enforcement agencies can hold these perpetrators accountable under the law. 

Also note that while it mentions scams and fraud as examples, the prohibition isn't limited to just those calls.

So the very act of using AI to create the voice is the illegal act, meaning the Olivers broke the law when recreating their son's voice for the robocalls.

Further, robocalls are illegal unless a consumer has expressly given written consent to receive them. While lawmakers should expect people to call their offices, I couldn't find any blanket exception exempting calls to them from that prohibition.


At the time I reported on this, I used the word "disgusting" to describe the Olivers' actions. I still think it's gross, using their dead son's voice to try to push people into embracing certain bits of legislation.

Yet I didn't think we'd find out it was illegal.

It should also be noted that the FCC's notice was dated February 8th. I reported on the Olivers' AI-generated robocalls on the 15th of that month, covering an effort that had started the day before, meaning the ruling came before the robocalls even went out.

So yeah, they were breaking the law.

Now, I don't want them prosecuted or anything. I don't think they were intending to deceive anyone, really, because we all know Joaquin was killed and the call acknowledges as much. Yet the fact is that it's an illegal act, and for people pushing for new laws, committing one seems a bit hypocritical.

In other words, they should knock it off and knock it off now.
