How Google's "Don't be evil" motto has evolved for the AI age

Google CEO: AI impact to be more profound than discovery of fire, electricity

"I've always thought of AI [artificial intelligence] as the most profound technology humanity is working on. More profound than fire or electricity or anything that we've done in the past," said Sundar Pichai, the CEO of Google and its parent company Alphabet.

The 51-year-old Pichai gave 60 Minutes correspondent Scott Pelley rare access to the inner workings of Google's AI development, which includes robots that have acquired skills through machine learning and Project Starline, an AI video conferencing experience Google is developing to allow people to feel as though they are together, despite being in different locations. 

60 Minutes correspondent Scott Pelley is shown a prototype of Google's Project Starline, a videoconferencing experience the company began working on more than five years ago. It's currently in testing. (60 Minutes)

Perhaps Google's most anticipated and noteworthy foray into AI is its chatbot, Bard. The company presently calls it an experiment, in part to do more internal testing. Bard notably made a mistake when Google debuted the program in February. When Bard was first released, it did not look for answers on the internet, and instead it relied on a self-contained and mostly self-taught program. Last month, Google released an advanced version of Bard that can write software and connect to the internet. Google says it's developing even more sophisticated AI models.

"[AI] gets at the essence of what intelligence is, what humanity is," Pichai told Pelley. 

In the video below, Pelley asked Pichai how Bard will affect Google search, which handles 90% of internet queries and is the company's most profitable division.

60 Minutes asked Google CEO: "Have you killed your cash cow?"

When Google filed for its initial public offering in 2004, its founders wrote that the company's guiding principle, "Don't be evil," was meant to help ensure it did good things for the world, even if it had to forgo some short-term gains. The phrase remains in Google's code of conduct.

In April, Pichai told 60 Minutes he was being responsible by not releasing advanced models of Bard, in part so society could become acclimated to the technology and the company could develop further safety layers.

Google CEO shares his concerns about AI

One of the things that keeps Pichai up at night, he told 60 Minutes, is the prospect of Google's AI technology being deployed in harmful ways.

Google's chatbot, Bard, has built-in safety filters to help combat the threat of malevolent users. Pichai said the company will need to constantly update the system's algorithms to combat disinformation campaigns and detect deepfakes, computer-generated images that appear to be real.

Google CEO calls for global AI regulation

As Pichai noted in his 60 Minutes interview, consumer AI technology is in its infancy. He believes now is the right time for governments to get involved.

"There has to be regulation. You're going to need laws…there have to be consequences for creating deep fake videos which cause harm to society," Pichai said. "Anybody who has worked with AI for a while…realize[s] this is something so different and so deep that, we would need societal regulations to think about how to adapt."

That adaptation is already happening around us, driven by technology that Pichai believes "will be more capable than anything we've ever seen before."

Soon it will be up to society to decide how it's used and whether to abide by Alphabet's code of conduct: "Do the right thing."

You can watch Scott Pelley's two-part report on Google, below.

Exploring the human-like side of artificial intelligence at Google | 60 Minutes

The video at the top was originally published on April 16, 2023, and was produced by Keith Zubrow and edited by Sarah Shafer Prediger.

