Lost in the AI Jargon: Google’s Big Event Was Clear as Mud



In almost 13 years of being a technology journalist, I've never felt so overwhelmed by information as I did after sitting through Google's 2024 I/O keynote on Tuesday. Between Gemini, Gemma, Gems, Veo, Astra and LearnLM, Google threw a lot of stuff out there, and I'm sure I can't be the only one whose head is spinning.

And if I — someone who's tried hard to remain at the cutting edge of tech reporting — am struggling to fully comprehend what just happened, then it doesn't bode well for the casual observers among you who are arriving at the announcements fresh-faced and rosy-cheeked, hoping simply to find out how Google's AI is going to help you become a better person.


I'm not going to break the information down here. CNET has a talented team of expert writers who are already in the process of doing just that. But I do think it's something Google needs to address as it moves further toward becoming an AI-first behemoth.


Google I/O is fundamentally an event for developers. It’s for coders, researchers and app creators, most of whom will already be familiar with terms like “tokens” and “large language models.” So in some ways, the cacophony of information and technical terminology broadcast Tuesday is to be expected. 

But Google also needs to court consumers. It needs to get everyday users, like me and you, excited about its products' potential and help us understand precisely how they'll fit into our lives. I might be a technology journalist, but I'm also an everyday tech fan. I'm a phone user, both Android and iOS. I use Gmail, Google Drive, Docs and Maps. I'm also a photographer and a YouTube channel host. Almost all of today's announcements are in some way relevant to one part of my life or another, and yet I'm struggling to even grasp what they are, let alone what they do.


Google ended up saying “AI” 124 times in its keynote. I understood it less often. 

Google/Screenshot by CNET

Is Project Astra the same as Gemini? Or Gemma? Are all of them tied into LearnLM, or is that something else? Does this replace Google Assistant? Right now, I honestly don't know, and a large part of my job is to understand all of this stuff and break it down for you. So I'll be spending many of my upcoming hours reading my colleagues' writing on exactly this, and if you're interested in Google's AI efforts, I encourage you to do the same.

But I don’t like having to do deep reading just to understand the basics. I’m a firm believer in never reading a product manual, and if you have to, then that product has failed in its usability. This is very much how I feel about today’s keynote.


I’d love to tell you why Gemma is an exciting step forward for AI. But I simply cannot.

Google/Screenshot by CNET

Honestly, I can't put all the blame on Google. The arrival and evolution of AI — in particular generative AI — has been so rapid that I already find myself feeling somewhat out of place. Last year, Google was talking about Bard. But, oh no, it's not Bard now, it's Gemini — come on, keep up. It's similar to how I felt when terms like "blockchain" first started being used and, to be totally truthful, I still can't tell you what the blockchain is.

We've got ChatGPT, Samsung's Galaxy AI, Meta AI, plus the arrival of new AI-based devices like the Rabbit R1 and the Humane AI Pin. There's so much AI going on, and there seems to be little consensus on how exactly the term AI is applied. The result is a real feeling of fragmentation and confusion. I'm often asked by friends and family about AI, about which chatbot they should use (if at all) and how to create generative AI images. And beyond pointing them to CNET's helpful AI Atlas, I struggle to give meaningful answers.

Google, like all tech companies, needs us, the consumers, to understand this stuff. It needs us to know what these AI tools are and how transformative they can be if it wants us to be excited by them. And if we're excited by them then, maybe, we'll buy them. Maybe we'll opt for the next Pixel phone because of Gemini and its arguably quite compelling video question service.


Oh wait, it’s Gems now? What are Gems? 

Google/Screenshot by CNET

But a straight two-hour presentation that leaves even experienced tech reporters scratching their heads and struggling to make sense of it isn’t the way to do it. Google spent a long time talking about how great its AI is at summarizing things. Maybe it’s time that it used those tools itself. 

Editors’ note: CNET used an AI engine to help create several dozen stories, which are labeled accordingly. The note you’re reading is attached to articles that deal substantively with the topic of AI but are created entirely by our expert editors and writers. For more, see our AI policy.





