Google CEO Sundar Pichai sounded the alarm about the dangers of using AI to create fake videos of public figures, a task he says will soon be “easy.”
Pichai made the comments during a conversation with CBS’ “60 Minutes,” telling interviewer Scott Pelley that Google is placing limits on its Bard AI to prevent misuse.
“It will be possible with AI to create, you know, a video, easily. Where it could be Scott saying something, or me saying something, and we never said that,” Pichai told Pelley. “And it could look accurate. But you know, on a societal scale, you know, it can cause a lot of harm.”
“Is Bard safe for society?” Pelley asked.
“The way we have launched it today, as an experiment in a limited way, I think so. But we all have to be responsible in each step along the way,” Pichai responded.
“You are letting this out slowly so that society can get used to it?” Pelley clarified.
“That’s one part of it. One part is also so that we get the user feedback. And we can develop more robust safety layers before we build, before we deploy more capable models,” Pichai said.
Current AI models can already fabricate images of public figures that are nearly indistinguishable from reality. Video and audio fabrications remain less convincing, so far: AI-generated audio clips can imitate a target voice, but current versions sound slightly robotic and unnatural, and AI video clips are even less polished.
Pichai went on to say that Google does not always fully understand the answers its Bard AI provides, offering an example of the program appearing to teach itself Bengali despite not being trained for it.
“You don’t fully understand how it works. And yet, you’ve turned it loose on society?” Pelley asked.
“Yeah. Let me put it this way. I don’t think we fully understand how a human mind works either,” Pichai responded.