By KIM BELLARD
If I were a smarter person, I’d write something insightful about the collapse of Silicon Valley Bank. If I were a better person, I’d write about the dire new UN report on climate change. But, nope, I’m too intrigued by Google’s announcement that it is (again) killing off Glass.
It’s not that I’ve ever used them, or any AR (augmented reality) device for that matter. It’s just that I’m really interested in what comes after smartphones, and these seemed like a potential path. We all love our smartphones, but 16 years after Steve Jobs introduced the iPhone we should realize that we’re closer to the end of the smartphone era than we are to the beginning.
It’s time to be getting ready for the next big thing.
—————
Google Glass was introduced ten years ago, but after some harsh feedback it soon pivoted from a would-be consumer product to an enterprise product, including in healthcare. Apple, Meta, and Snap, among others, have followed, but none has quite made the concept work. Google is still putting on a brave face, vowing: “We’ll continue to look at ways to bring new, innovative AR experiences across our product portfolio.” Sure, whatever.
It may be that none of the companies has found the right use case, hit the right price point, adequately addressed privacy concerns, or made something that didn’t still seem…dorky. Or it may simply be that, with tech layoffs hitting everywhere, resources devoted to smart glasses were early on the chopping block. They may be a product whose time has not quite come…or never will.
That’s not to say that we aren’t going to use headsets (like Microsoft’s Hololens) to access the metaverse (whatever that turns out to be) or other deeply immersive experiences, but my question is: what’s going to replace the smartphone as our go-to, all-the-time way to access information and interact with others?
We’ve gotten used to lugging around our smartphones – in our hands, our purses, our pants, even in our watches – and it is a marvel how much computing power has been packed into them and how many uses we’ve found for them. But, at the end of the day, we’re still carrying around a device whose presence we have to be mindful of, whose battery level we have to worry about, and whose screen we have to periodically use.
Transistor radios – for any of you old enough to remember them – brought about a similar sense of mobility, but the Walkman (and its descendants) made them obsolete, just as the smartphone in turn rendered the Walkman superfluous. Something will do that to smartphones too.
What we want is all the computing power, all that access to information and transactions, all that mobility, but without, you know, having to carry around the actual device. Google Glass seemed like a potential road, but right now that looks like a road less taken (unless Apple pulls another proverbial rabbit out of its product hat if and when it comes out with its AR glasses).
—————
There are two fields I’m looking to when I think about what comes after the smartphone: virtual displays and ambient computing.
Virtual displays: when I refer to virtual displays, I don’t mean the mundane splitting of your monitor (or multiple monitors) into more screens. I don’t even mean what AR/MR (mixed reality) is trying to accomplish, adding images or content into one’s perception of the real world. I mean an actual, free-standing display equivalent to what one would see on a smartphone screen or computer monitor, fully capable of being interacted with as though it were a physical screen. Science fiction movies are full of these.
I suspect that these will be based on holograms or related technology. The displays they render can appear fully life-like. You’ll use them like you would a physical screen/device, not even thinking about the fact that the displays are virtual. You may interact with them with your hands or maybe even directly from your brain.
They’ve historically required significant computing power, but that may be changing; even if it doesn’t, computing power might not be a constraint, thanks to ambient computing.
Ambient computing: We once thought of computers as humans doing calculations. Then they became big, room-sized machines. Personal computers brought them to a more manageable (and ultimately portable) size, and smartphones made them fit in our hands. Moore’s Law continues to triumph.
Ambient computing (aka ubiquitous computing, aka the Internet of Things, or IoT) will change our conceptions again. Basically, computers, or processors, will be embedded in almost everything. They’ll communicate with each other, and with us. As we move about, the specific processors, and their configuration, may change without missing a beat, much as our smartphones switch between cell towers without our (usually) realizing it. AI will be built in everywhere.
The number of processors used, which processors, and how they’re used will depend on where you are and what task you want done. The ambient computer may just listen to your directions, or it may project a screen for you to use, depending on the task. You won’t have to worry about where any of it is coming from.
In that new world of virtual screens and ambient computing, carrying around a smartphone will seem as antiquated as those 1950s mainframes. Our grandchildren will be as astounded by smartphones as Gen Z is by rotary phones (or landlines in general).
That’s the kind of advance I was hoping Google Glass would help bring about, and that’s why I’m sad Google is calling it quits.
—————
Healthcare is proud of itself because it finally seems to be embracing telehealth, digital medicine, and EHRs. Each is long overdue, none are based on any breakthrough technologies, and all are being poorly integrated into our existing, extremely broken healthcare system.
What healthcare leaders need to be thinking about is what comes next. Healthcare found uses for Google Glass and is finding uses for AR/MR/VR, but it is still a long way from making any of those mainstream. Smartphones are getting closer to mainstream in healthcare, but no one should assume they are anything but the near-term future.
What is possible – and what is required – when there are no physical screens and no discrete computers?
Hey, I’m still waiting for my holographic digital twin as my EHR.
Kim is a former marketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and now regular THCB contributor.