How Touchscreens Are Forcing the Reinvention of Keyboards
Last week at the Consumer Electronics Show, an Israel-based company called Snapkeys invited showgoers into a booth to test its new keyboard technology. Within a few minutes of using it, the company said, people were already getting the hang of Snapkeys, which consolidates the letters of the alphabet into just four keys.
The idea behind Snapkeys isn’t new; the company says it has been working on it for more than 10 years.
But the more recent emergence of touchscreen devices — and the complaints from even avid users about typing on them — means that Snapkeys’ research and development has been serendipitously well-timed.
“We think the end user is finally ready for an upgrade to the old Qwerty keyboard, after almost 150 years,” said Ryan Ghassabian, a Snapkeys business development manager. “Today, there are just too many new devices — phones, tablets — that are changing everything.”
“And Qwerty is just not meant to be on touchscreen devices,” he added.
Snapkeys is just one of a growing number of devices and applications that aim to change the way users interact with the traditional keyboard.
That doesn’t necessarily mean altering the layout of the Qwerty keyboard. The popular keyboard add-on Swype, recently acquired by Nuance, uses a standard layout, but lets users trace a word with their fingers.
While many companies work on technology for onscreen keyboards, still others are trying to create smart, ultra-portable or “invisible” keyboards.
Korea-based Celluon, which works on portable input applications, has introduced a “Magic Cube” device that connects wirelessly to an iPad or iPhone and projects a laser keyboard image onto an opaque surface for users to “type” on. The idea is that the user would only have to tote the palm-sized, battery-operated cube around, instead of a full keyboard.
Mozilla Labs’s Seabird project uses two Pico projectors to spit out keyboard imagery on either side of a smartphone to establish a full keyboard for typing.
Others believe the answer to typing on touchscreens lies in somehow adding a tactile set of keys — ones that people can actually feel, as they’re accustomed to — to those sleek glass displays.
Part of this stems from the simple fact that many consumers find typing on raised keys easier than typing on touchscreens. A study conducted last year at the University of Washington’s Information School in conjunction with Microsoft Research found that when users typed on a flat surface lacking tactile feedback, they were subject to inadvertent touches, and typing speed was 31 percent slower than it was with a physical keyboard.
And consumers seem to want options beyond just attaching a full keyboard to a mobile phone or tablet. Last fall, two Seattle-based designers received $201,400 in pledges on the crowdfunding site Kickstarter, after having set an initial goal of just $10,000. Their product: a thin, light keyboard overlay called the TouchFire that goes over the iPad’s touchscreen and creates a sense of keys.
But tactile touchscreen tech still hasn’t made its way into the mainstream.
While physical buttons certainly have their advantages, software keyboards, in the meantime, are showing a tremendous amount of potential. For example, keyboards can simply be reconfigured based on context. When in a browser, dedicated keys can be presented for “www” and “.com”. If the entry is for a ZIP code, a screen with only numbers can be offered.
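Context-dependent reconfiguration of the kind described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor’s actual API; the field types and shortcut keys are assumptions chosen to match the browser and ZIP-code examples:

```python
# Hypothetical sketch: a soft keyboard picking its key set from input context.
# Field-type names and layouts are illustrative assumptions only.

QWERTY_ROW = list("qwertyuiop")

def layout_for(field_type: str) -> list[str]:
    """Return the keys a soft keyboard might show for a given input field."""
    if field_type == "url":
        # Browser address bar: add dedicated keys for common URL fragments.
        return QWERTY_ROW + ["www.", ".com", "/"]
    if field_type == "zip":
        # ZIP-code entry: offer a screen with only numbers.
        return list("0123456789")
    # Default text entry: plain letter keys.
    return QWERTY_ROW

print(layout_for("url"))
print(layout_for("zip"))
```

A physical keyboard is stuck with one set of keycaps; a soft keyboard can swap layouts like this on every focus change.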
Soft keyboards can also do interesting things with prediction. Based on the text entered so far, the software can estimate which letter is likely to be pressed next and make those keys easier to hit, either by visibly enlarging them or by quietly expanding their touch targets.
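The prediction idea can be sketched as follows. This is a toy illustration under stated assumptions: the bigram probabilities are made-up values standing in for a real language model, and the widening formula is one simple choice among many:

```python
# Hypothetical sketch of prediction-weighted touch targets: keys that are
# more likely to be typed next get a larger (possibly invisible) hit area.
# The probabilities below are illustrative, not from a real language model.

BASE_WIDTH = 1.0  # nominal key width, arbitrary units

# P(next letter | previous letter) -- toy values for illustration only.
NEXT_LETTER_PROB = {
    "q": {"u": 0.95, "a": 0.02},  # 'q' is almost always followed by 'u'
    "t": {"h": 0.30, "o": 0.15},
}

def target_width(prev: str, key: str, boost: float = 1.5) -> float:
    """Widen a key's touch target in proportion to how likely it is next."""
    p = NEXT_LETTER_PROB.get(prev, {}).get(key, 0.0)
    return BASE_WIDTH * (1.0 + boost * p)

# After typing 'q', the 'u' target grows; an unlikely key stays at base size.
print(target_width("q", "u"))  # wider than the base width of 1.0
print(target_width("q", "z"))  # unlikely key: stays at 1.0
```

The widening can be purely internal, which is roughly how “inaccurate input made usable” systems like the BlindType approach mentioned below work: the drawn keyboard looks normal while the hit regions shift underneath.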
Above all, software keyboards, unlike physical ones, disappear entirely when they are not needed. The trend away from physical keyboards, which began with the iPhone, has continued unabated, with full touchscreen smartphones making up a steadily increasing portion of the market.
Chris Harrison, a Ph.D. candidate in Carnegie Mellon’s Human Computer Interaction Institute, says that while tactile feedback is “kind of the holy grail of input,” we’re still years away from tech that offers true tactility on touchscreens. “Right now, there are ways you can take really inaccurate input and make it usable — look at something like BlindType — so that’s what you’ll see getting pushed out in the next two or three years. Maybe in five years or more, we’ll see the technological breakthrough of ‘shape-shifting’ the keys on touch surfaces, so people can feel them.”
Harrison has spent the past two and a half years working with Microsoft on skin-sensory computing technology, called Skinput. The technology includes specialized sensors that gauge vibrations happening inside of the human body and enable graphical multitouch. The idea, basically, is that by tapping a projected image on your forearm, you can tell your computer — or another electronic device, like your TV — what to do.
More recently, Harrison and Microsoft have retailored the tech, now called OmniTouch, to work on a variety of surfaces — not just the epidermis, but also walls, tables, and notepads.
And while Harrison is laser-focused on changing the way we input information, he expressed a different sentiment than Snapkeys when it comes to the keyboard.
“The physical keyboard is an amazing thing, and the fact that it hasn’t changed much in almost 150 years is a good thing,” he said. “If you brought back an old keyboard, people would still be able to type just as well, and there aren’t many technologies as durable as that.”
Readers, which do you prefer for typing: Touchscreens or tactile keys?
(Magic Cube photo courtesy of Flickr/AsiaClassified)
AllThingsD’s Ina Fried contributed to this report.