SOFTWARE OUTFIT Microsoft has shown off some of its research work on surfaces and user interfaces.
In a video posted to a Microsoft blog, the firm showed off how it is using Kinect, its add-on for the Xbox 360 games console, to add gesture and speech control to its Natural User Interface (NUI) work.
"Kinect is one technology that pushes the boundaries of how we can build more natural ways to interact with technology - through gestures and speech - but that's only one aspect of our work around NUI," wrote Microsoft's Steve Clayton.
"This shift towards natural user interfaces opens up enormous opportunities - in a wide variety of fields. It's an exciting time for Microsoft, our customers and our partners. As our researchers and developers continue to make advancements in this area, we'll begin to see products and ideas that we never even thought possible."
Another blog post from the firm goes into slightly more detail about the work, describing OmniTouch, a system intended to turn just about any touchable surface into a computing screen.
"We wanted to capitalise on the tremendous surface area the real world provides," said Hrvoje Benko, of Microsoft's Natural Interaction Research group, as he suggested that we are about to be waving our hands towards Minority Report-like interactivity.
"The surface area of one hand alone exceeds that of typical smart phones. Tables are an order of magnitude larger than a tablet computer. If we could appropriate these ad hoc surfaces in an on-demand way, we could deliver all of the benefits of mobility while expanding the user's interactive capability."
OmniTouch uses a wearable device that pairs a short-range camera with a pico projector, casting a display that the user can touch and control. In a video it was shown shining a keypad onto someone's palm, displaying a camera view, and working with pinch and zoom controls.
"This custom camera works on a similar principle to Kinect, but it is modified to work at short range," added Benko. "This camera and projector combination simplified our work because the camera reports depth in world coordinates, which are used when modeling a particular graphical world; the laser-based projector delivers an image that is always in focus, so we didn't need to calibrate for focus."
Benko also discussed PocketTouch, a way of controlling a handset's touchscreen while it is in your pocket. This has less immediate appeal, unless you are the sort of person who likes to be seen stroking your pocket in public, but it could be used to perform basic tasks on a phone, such as answering a call, while it is locked away in the confines of a blazer or a pair of chinos.
This was not the easiest proposition, according to Benko, but it works, even if it does lend itself to some rather cringeworthy quotes.
"We knew we had solved the toughest challenge, which was to figure out a reliable way to detect and segment strokes from the capacitive touch sensor through fabric," he explained.
"If the user is sloppy with strokes - and believe me, when you're doing it through the pocket of a jacket, the results are sloppy." µ