Google hosted its annual developers conference this week, which it calls Google I/O. And for the first time since the start of the pandemic, attendees had the option to show up in person. The company announced software updates and new devices and, of course, details of improvements to the Android operating system, which runs on most of the world’s mobile phones.
The event also sets the tone for the other big tech conferences throughout the year. For this week’s “Quality Assurance” segment, where we take a second look at a big tech story, I spoke with Ian Sherr, an editor at large for CNET who attended the conference virtually. He said one of Google’s biggest reveals was a new wearable device. The following is an edited transcript of our conversation.
Ian Sherr: What they are is glasses that actually use Google’s translation technology in real time. And what happens is, you’re wearing the glasses, and they will somehow identify the language that the other person is speaking and automatically translate it into a kind of text that’s right on the glasses. So in a lot of ways, you’re getting speech to text straight in the real world. And so seeing Google really think this stuff through and show, “OK, well, we had a really nerdy Google Glass many years ago. We didn’t really know what to do with it. But now, here’s an idea that absolutely people would see a use for,” I think that was really neat.
Kimberly Adams: Talk a little bit more about that because, yeah, several years ago Google did try to release augmented-reality glasses, and it was a very famous flop. What’s different now?
Sherr: In a lot of ways, I think that Google has sat there and realized that they had created something looking for a problem before. They created this really cool technology that, yeah, it can have a computer kind of in your vision a little bit, but not fully, and a camera so you can interact with the real world in nifty ways. But they didn’t know what to do with it. They had a couple of ideas, like giving you real-time directions as you’re walking down the street. But then, we all got these phones that do that pretty well, and some of us have those watches on that do that pretty well and [were] like, “Do I really need to wear these nerdy glasses on my head for that?” And so as a result, I think they’ve kind of taken a step back. And they’re not alone, by the way. I think Apple and Microsoft and Meta really have realized this as well. They need to figure out what is it that this thing can actually do that’s going to change my life, and not just hand me a technology and say, “Go spend $1,000 on this and then figure out how you’re going to use it.” And that one demo — they didn’t show a product off, they just showed the real-life subtitles — you can just kind of see what the utility of that is alone. And I think that that is a really powerful moment for them.
Adams: What kind of emphasis did you see on tech and accessibility at the conference this year?
Sherr: So there’s actually a lot of accessibility stuff, and what ends up happening is that these features help everybody. They have taken their computer vision, where a computer brain is able to actually understand what you have a camera pointed at. You go to a store, you point it at one of the shelves, and it’s actually going to start understanding what it sees on all the shelves. And you can actually do a Google search by tapping on it.
Adams: Now, of course, we are still in the pandemic, with around a million people in the United States having died from COVID-19 so far. How did the pandemic factor into what technology Google chose to highlight in this conference?
Sherr: A lot of Google’s technology — and, again, this is not just Google, this is all the tech companies. They’ve been building stuff like videoconferencing and internet collaboration software for a long time. And it caught on with certain people, but it didn’t really catch on until we hit the pandemic. So one of the things they showed off was that their Google Docs, which is their Microsoft Word competitor but it’s on the internet, has a function that’s basically TL;DR — too long; didn’t read. It will take a very long document or maybe notes that you took in a meeting, and somehow using the computer brains at Google will shorten it down into something that’s easily readable within a paragraph. I don’t fully understand how it’ll work, and I’m fascinated to try it out. But it, again, speaks to that whole “We’re interacting remotely, we are using technology a lot more and we’re leaning on it a lot more.” And so that is another example. One other thing I’ll bring up is that in their video meetings, Google Meet — another one of those things that they’ve had for a long time, but they really supercharged during the pandemic — they’re now going to start doing the subtitling and everything as well in there. Something, by the way, Microsoft and others have done, but making this stuff widely available, really, I think makes life a lot easier. Now, does it change the world? I mean, in a lot of ways, it’s definitely going to tip it quite a bit. But that’s what’s interesting about it. A lot of this is evolutionary change.
Adams: Google I/O happens before the Apple and Microsoft developers conferences. Do you think that what we saw at this conference sort of gives us a hint at what to expect from these other big tech companies?
Sherr: In a way, yes, it might set the tone of, “We’re still figuring out what the next big thing is.” And the reality is they don’t know. They don’t know what that next life-changing technology will be. And so they’re all making bets in all these different directions, but clearly, no one’s figured it out yet. And so I think that’s something we’re going to see throughout. Apple will have their things, and they’re cool and they’ll get their nifty “oooh-ahhhs,” but they’re not going to change the world the same way. And if they do, I’ll be impressed. And Microsoft, same deal. I think we’re still at that point where they’re figuring things out.
Related links: More insight from Kimberly Adams
Sherr was live-blogging the conference for CNET along with colleagues, and the live blog includes more of his takes on Google’s various announcements at the conference. He mentioned those augmented-reality glasses that give consumers real-time translations, and, according to him and The Verge, there’s still no information about how much the device might cost or whether the technology will eventually be widely available to the public.
For reference, the versions of Google Glass you can get now run between $1,000 and $2,000 — mostly on the secondary market.
And while we are on the topic of gadgets, this week Apple announced the end of the iPod era. We want to hear your stories about your memories of the increasingly shrinking devices over the years. Did you crave an iPod when you were younger? Do you still have one? Do you still use it? Send us a voice memo to firstname.lastname@example.org.
The future of this podcast starts with you.
Every day, the “Marketplace Tech” team demystifies the digital economy with stories that explore more than just Big Tech. We’re committed to covering topics that matter to you and the world around us, diving deep into how technology intersects with climate change, inequity, and disinformation.
As part of a nonprofit newsroom, we’re counting on listeners like you to keep this public service paywall-free and available to all.