Liz Gannes

The Disappearing Interface

Where static computer screens and smartphones suck in our gaze and extract us from the world around us, many of the most interesting new tech gadgets and ideas move us back out into the open.

A blind beta tester of Le Chal navigates an open space via vibrating insoles.

Instead of all-purpose, full-focus devices, these new tools are migrating outward, on and around our bodies, to our fingers and heads and wrists and ears, and even our feet. From there, they can be ready to help us the moment we need them, in a manner that's less abstracted, and that's hard to talk about without referencing science fiction.

This isn’t a new phenomenon, but it’s becoming more and more accessible and interesting. You might glue together a bunch of these ideas by thinking of them as the disappearance of the interface.

At the TED Conference this past week in Long Beach, Calif., a number of the talks and demos showed off ways for interfaces to melt away so both inputs and outputs can adapt and be more accessible.

For instance, an eye surgeon from India named Anthony Vipin Das has created vibrating insoles that help blind people navigate the world by gently buzzing their feet with directional cues.

The product, called Le Chal ("take me there" in Hindi), will cost about $40 and is set to go on sale in India in about six months. The company manufacturing the shoe, Ducere Technologies, is based in Hyderabad.

So, for instance, 20 meters away from a left turn, a Le Chal wearer would feel a calibrated buzz in his left shoe. Then, 10 meters away, a buzz of longer duration. And at the turning location, a persistent buzz until the turn is successfully completed.

And all this without a cane or a companion, or any indication to others that the sense of direction is coming via haptic signals to his feet.
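
To make the idea concrete, here is a minimal sketch, in Python, of what that cue logic might look like. The function name, distance thresholds, and pattern labels are assumptions drawn from the description above, not Le Chal's actual software.

```python
# A rough sketch of the cue logic described above. The names, thresholds,
# and pattern labels are illustrative assumptions, not Le Chal's actual code.

def haptic_cue(distance_to_turn_m, turn_direction):
    """Pick which insole buzzes, and how, as the wearer approaches a turn."""
    shoe = "left" if turn_direction == "left" else "right"
    if distance_to_turn_m > 20:
        pattern = "none"        # too far out: no cue yet
    elif distance_to_turn_m > 10:
        pattern = "short"       # roughly 20 meters out: a brief, calibrated buzz
    elif distance_to_turn_m > 0:
        pattern = "long"        # roughly 10 meters out: a longer buzz
    else:
        pattern = "persistent"  # at the turn: keep buzzing until it is completed
    return {"shoe": shoe, "pattern": pattern}

# Example: 15 meters before a left turn, the left insole gives a short buzz.
print(haptic_cue(15, "left"))   # {'shoe': 'left', 'pattern': 'short'}
```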

In the coming months, Le Chal expects to add obstacle detection and indoor navigation, Das said. The company is working on a sort of Morse code vibration language to communicate more complex directions and warnings.
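
As a purely hypothetical illustration of what such a vibration vocabulary could look like, here is a small sketch; the codewords and timings are invented for this example and are not Ducere's design.

```python
# A hypothetical "codebook" mapping messages to vibration pulse durations
# (in seconds). Codewords and timings are invented for illustration only.
VIBRATION_CODES = {
    "obstacle_ahead": [0.2, 0.2, 0.6],  # short, short, long
    "turn_left":      [0.6, 0.2],       # long, short
    "turn_right":     [0.2, 0.6],       # short, long
}

def pulses_for(message):
    """Return the pulse durations to play for a known message, or an empty list."""
    return VIBRATION_CODES.get(message, [])

print(pulses_for("obstacle_ahead"))  # [0.2, 0.2, 0.6]
```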

Das noted that it’s not just blind people who could benefit from subtle navigation signals. “There are implications for sighted people, as well,” he said.


Meanwhile, Google’s co-founder/resident mad scientist Sergey Brin took the TED stage to talk about Google Glass, using a strangely gendered choice of words to call smartphones “emasculating” because “you’re standing around and just rubbing this featureless piece of glass.”

But it’s hard to imagine wearing these glasses not being completely strange — the “Segway for the face” metaphor goes a long way. Concerns about distraction and privacy are more than valid.

Still, these ideas do tickle your imagination. In another talk, Mary Lou Jepsen, who works with Brin at Google X, spoke on her personal exploration of high-resolution brain-scanning systems that would allow people to decode brain waves to see the live images playing in their own heads.

Jepsen said she thought it would be possible in the near future to do a sort of live brain simulcast. Imagine remembering your dreams when you wake up in the morning, or helping translate what's going on in the brain of someone with an injury or disease. Studies have already shown that brain scans can find images — albeit very blurry ones — that correspond to photos and videos we are watching or imagining.

“We’re going to be able to dump our ideas directly to digital media,” Jepsen said.

Think that sounds crazy? There are so many directions the disappearance of the interface can go. Another speaker, marine biologist Denise Herzing, presented her work on improving dolphin-to-human communication through wearable computers that decipher dolphin sounds (including the ones humans can't hear) and translate them for the wearer.

The system, called Cetacean Hearing and Telemetry (CHAT), also helps divers generate dolphin calls so they can talk back to the dolphins while swimming.

But wearable smart devices are not that weird. Sensors in and around people's bodies are already part of many geeks' lives. Throughout the week, hundreds of TED attendees were contributing data about their activity levels via complimentary Jawbone Up wristbands. (The data is still pretty blunt, though. The biggest trend at TED: less sleep per night as the week elapsed.)

And the disappearing interface trend is obviously not restricted to any one conference. Elsewhere last week, Leap Motion announced that its Windows-compatible motion-sensor device would start shipping in May. Leap wants to remove a layer of abstraction from computing with its extra-sensitive system where users manipulate virtual objects by literally waving their hands around in front of a screen.

In many ways, these ideas are already in the mainstream through gesture-driven gaming devices like the Wii and the Xbox Kinect — and now the ubiquitous high-quality touchscreen devices like the iPad, where we start to forget we’re sending signals to an impenetrable system and feel that we’re interacting with it directly.

Back in 2010, Microsoft's Craig Mundie described these ideas as the "NUI" — that is, the natural user interface, as opposed to the better-known "GUI," or graphical user interface — one that "embraces gestures, anticipatory computing, expressive response, contextual and environmental awareness, and 3-D or even immersive experiences."

“You won’t necessarily sit down at a computer terminal,” Mundie said. “Computing will be all around you, and you’ll basically converse with that pervasive intelligence.”

Mundie was definitely on to something, and soon enough the rest of us may be, too.
