13 September 2011

Feeling It Up


In the movie Star Trek IV: The Voyage Home, the crew of the USS Enterprise goes back in time to 1980s San Francisco to retrieve a pair of humpback whales needed to save their own future world.

There is a lot of sly humour in this film, and one of my favourite scenes takes place when Scotty and Bones McCoy visit a corporate office. Scotty tries in vain to issue commands to a desktop computer by simply talking to it, as he would, of course, be able to do back on his starship. Even the computer mouse bamboozles him.

For a long time in our digital history, everyone assumed that speech recognition would be the ultimate means by which people would interact with computers. A lot of research effort has gone into speech recognition technology over the years, almost from the very beginning of the computer age. But it has been slow to develop and has never really taken off in the consumer space.

We also believed that handwriting recognition would be the way to go for text input. Remember how every personal digital assistant (PDA) of the 1990s (Apple Newton, Palm Pilot, etc.) shipped with a stylus and an application that promised to learn our handwriting, sometimes painfully, and turn our scribbles into neatly rendered type? They came, they went.

It seems that, after more than 130 years, the best text input device we have is still the QWERTY keyboard, now miniaturized and operated by thumb and forefinger.

And that—the finger, the thumb, the gesture—is increasingly becoming the means by which we communicate with computers today. Spearheaded by Apple’s wildly successful consumer products, our interactions with technology have become so direct, so tactile, and so economical in motion that they feel almost inevitable.

We started with the mouse, of course, with the point-and-click. But today the human hand itself is our own built-in interface to digital technology. Using only our fingers we open files, play music, enlarge photos, play and pause movies, search our phonebooks, navigate maps, even present the news.

By just lifting a finger we open locked doors, clock in and out of work, log on to our laptops and ring up our purchases at the supermarket. And with a flick of the wrist we can smash a virtual tennis ball on a Wii or wield a lightsaber with an Xbox Kinect.

Instead of requiring our personal computers to learn a dictionary of dozens or hundreds of spoken commands, today’s gestural computing gets by on an amazingly concise vocabulary, just a half-dozen or so words: Tap. Flick. Shake. Pinch. Stroke.
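To make that vocabulary concrete, here is a minimal sketch, written in modern Swift against Apple’s UIKit, of how a few of those words map onto the gesture-recognizer API behind today’s touch screens. The view controller, handler names, and the particular gestures shown are illustrative assumptions, not anything from the demo discussed below.

```swift
import UIKit

// A minimal sketch: how "tap", "flick" and "pinch" translate into
// UIKit gesture recognizers. Class and handler names are hypothetical.
class GestureDemoViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        // "Tap": a single touch, the gestural descendant of the mouse click.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        view.addGestureRecognizer(tap)

        // "Flick" / "Stroke": a quick swipe across the screen.
        let swipe = UISwipeGestureRecognizer(target: self, action: #selector(handleSwipe(_:)))
        swipe.direction = .left
        view.addGestureRecognizer(swipe)

        // "Pinch": two fingers moving together or apart, typically used to zoom.
        let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinch(_:)))
        view.addGestureRecognizer(pinch)
    }

    @objc private func handleTap(_ recognizer: UITapGestureRecognizer) {
        print("Tapped at \(recognizer.location(in: view))")
    }

    @objc private func handleSwipe(_ recognizer: UISwipeGestureRecognizer) {
        print("Flicked to the left")
    }

    @objc private func handlePinch(_ recognizer: UIPinchGestureRecognizer) {
        // `scale` reports how far the two fingers have spread or closed.
        print("Pinched, scale \(recognizer.scale)")
    }
}
```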

The computer is at our fingertips. And it feels sexy.

Here’s a video that beautifully demonstrates what is possible today. It’s a short demo of the iPad digital book “Our Choice” by Al Gore, given earlier this year at TED. The technology was developed by Push Pop Press (who have just been acquired by Facebook, so we may see it incorporated into future Facebook developments).
