Let’s say you walk into a bar or a coffee shop, order a drink and then sit down to check your Twitter feed, find a location on a map, flick through photos or play a game. All fairly standard activities, but what if you could do all that on the tabletop in front of you without ever taking your smartphone or tablet out of your pocket?
In the future, a tabletop, or any other surface, could become the touchscreen for your mobile device by connecting wirelessly to a tiny camera and projector. Research scientist Vivayak Honkote and his team from Intel Labs demonstrated the experimental technology at the 11th annual Research at Intel event in San Francisco.
“I can work with the tabletop as if I’m working on the touchscreen itself although the device is somewhere else,” said Honkote. “Using my fingertip, I can draw on the table or I can expand and move around a map projected on the table.”
A tiny camera placed in front of Honkote, paired with a projector smaller in diameter than a dime mounted above him, turned the white tabletop into a giant touchscreen. Honkote used his fingers to tap, pinch, stretch and control the Internet and a variety of applications, including a digital book, a photo editor and a paint application.
The camera, explained Honkote, captures and communicates finger movements, which the device interprets as input from its touchscreen. In the near future, he said, these are the kinds of experiences people could beam from a smartphone, tablet, laptop or other computing device.
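The pipeline Honkote describes, in which a camera tracks a fingertip and the device translates its position into touchscreen input, can be sketched roughly as follows. Every name, function and coordinate convention here is a hypothetical illustration for clarity, not Intel's actual implementation:

```python
# Hypothetical sketch: mapping a camera-tracked fingertip to touch events.
# Nothing here reflects Intel's real API; it only illustrates the idea.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TouchEvent:
    x: float   # normalized [0, 1] across the projected surface
    y: float
    kind: str  # "down", "move", or "up"

def camera_to_touch(px: int, py: int, frame_w: int, frame_h: int,
                    touching: bool, was_touching: bool) -> Optional[TouchEvent]:
    """Convert a fingertip position in camera pixels into a touch event.

    The camera reports the fingertip at pixel (px, py) in a frame of
    frame_w x frame_h; the device treats the projected area as a
    normalized touchscreen and synthesizes down/move/up events, just as
    it would for taps and drags on its own physical screen.
    """
    x, y = px / frame_w, py / frame_h
    if touching and not was_touching:
        return TouchEvent(x, y, "down")   # finger just landed on the table
    if touching:
        return TouchEvent(x, y, "move")   # finger dragging across the table
    if was_touching:
        return TouchEvent(x, y, "up")     # finger just lifted off
    return None                           # finger hovering: no event

# Example: a finger lands at pixel (320, 240) in a 640x480 camera frame.
event = camera_to_touch(320, 240, 640, 480, touching=True, was_touching=False)
print(event)  # TouchEvent(x=0.5, y=0.5, kind='down')
```

In a real system the hard part is the `touching` signal itself, deciding from the camera image whether the fingertip is actually contacting the surface or merely hovering above it; the mapping above assumes that detection has already been done.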
Honkote and his team will continue to refine the concept. “We want to make the user experience as rich, yet as simple as possible while still using the same compute power that you have on the devices that you carry around and would like to work on,” he said.