Liz Gannes

Captricity Inches Toward the Holy Grail of Handwriting Recognition, One Data Field at a Time

I take handwritten notes about many of the people I interview and events I attend. It’s not the most efficient process, but a small notepad is unobtrusive and never runs out of batteries.

Eventually, my notepads fill up (like my current one just did; yesterday, I wrote on the backs of pages because I didn’t have a spare ready), and they go to live on a shelf in an entirely unhelpful pile of lookalikes.

So one technological breakthrough I’m personally interested in seeing come to fruition is an excellent handwriting-recognition scanner. And, so far, despite some placeholder efforts from folks like Evernote and Moleskine, I haven’t seen it.

A 12-person Berkeley-based startup called Captricity seems to be on its way there. Currently, it analyzes handwritten data that’s entered in structured forms, for the benefit of governments, nonprofits, health-care organizations and researchers. Later this year, Captricity plans to offer a more free-form product, founder and CEO Kuang Chen told me yesterday.

Chen, who raised $4.5 million from investors including the Social+Capital Partnership, the Knight Foundation, Atlas Venture and Mitch Kapor after turning his Ph.D. research in Tanzania and Uganda into a startup, said he has a higher mission of “liberating data for any organization.”

“If I meet someone I don’t want to talk to, and they ask me what I do, I say, ‘data entry,'” Chen said. “It isn’t sexy, but this is a very real problem.”

Captricity founder and CEO Kuang Chen

Here’s how Captricity currently works: Users upload an empty form, virtually draw data fields onto it, then scan stacks of filled-out paper forms using the Captricity iPhone app or a scanner. After about 30 minutes for the first 30 documents, they get back a spreadsheet of answers, with any questionable fields flagged for review.

In the background, Captricity is combining computer vision and Amazon Mechanical Turk crowdsourcers. Once it learns a form, it can go faster.

Each individual form field is sent to three different Mechanical Turk workers, and the results are compared. “Our great trick is that humans are used not for the final answer, but to tune the algorithm,” Chen said. He claims that Captricity has better than 99 percent accuracy.
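
Chen didn’t detail the mechanics, but a simple majority-vote reconciliation captures the general idea. The sketch below is a hypothetical illustration in Python, not Captricity’s actual code; the normalization step and the two-of-three agreement threshold are my assumptions.

```python
from collections import Counter


def reconcile_field(transcriptions):
    """Reconcile several independent transcriptions of one form field.

    Hypothetical sketch only; Captricity's real pipeline is proprietary.
    Returns (value, needs_review).
    """
    # Normalize lightly so trivial differences don't break agreement.
    normalized = [t.strip().lower() for t in transcriptions]
    value, votes = Counter(normalized).most_common(1)[0]

    # A majority (two of three) is accepted; anything else is flagged for review.
    if votes >= 2:
        return value, False
    return None, True


# Example: two workers agree, so the field is accepted without review.
print(reconcile_field(["Oakland", "oakland ", "Okland"]))  # ('oakland', False)
```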

For example, the SEIU in Los Angeles used Captricity at a recent event for 20,000 low-wage workers to survey their demographics and health insurance. “They don’t have 20,000 iPads sitting around,” Chen said. “Fast-food workers don’t just do SurveyMonkey.” So, instead, the organizers brought paper forms and mobile scanners, and got 5,000 forms analyzed from that one event.

(While that may sound like a dis of SurveyMonkey, Chen said the company’s CEO, Dave Goldberg, is a key adviser.)

Captricity charges one cent per data field, with monthly fees starting at $75 for 10,000 fields. Last week, it announced that it is HIPAA-compliant, so doctors can use it to analyze patient data.
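
For back-of-the-envelope budgeting, per-field pricing makes estimates straightforward. This tiny sketch assumes a flat one-cent rate and a made-up count of 20 fields per form; the article doesn’t say how many fields the SEIU survey had or how the monthly plans discount that rate.

```python
PRICE_PER_FIELD = 0.01  # one cent per data field, per the article


def estimated_cost(num_forms, fields_per_form, price_per_field=PRICE_PER_FIELD):
    """Back-of-the-envelope cost, assuming a flat per-field rate with no volume discount."""
    return num_forms * fields_per_form * price_per_field


# Hypothetical example: 5,000 forms with an assumed 20 fields each.
print(f"${estimated_cost(5_000, 20):,.2f}")  # $1,000.00
```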

I can’t say I’m entirely hopeful that workable Captricity handwriting analysis is actually coming this fall, but Chen said the end goal is coming into sight.

“It’s like the Google self-driving car,” he said. “When we have enough data, we’ll solve handwriting.”
