Hands-On With Microsoft's Touchless SDK
snydeq writes "Fatal Exception's Neil McAllister takes Microsoft's recently released Touchless SDK for a test spin, controlling his Asus Eee PC 901 with a Roma tomato. The Touchless SDK is a set of .NET components that can be used to simulate the gestural interfaces of devices like the iPhone in thin air, using an ordinary USB webcam. Although McAllister was able to draw, scroll, and play a rudimentary game with his tomato, the SDK still has some kinks to work out. 'For starters, its marker-location algorithm is very much keyed to color,' he writes. 'That's probably an efficient way to identify contrasting shapes, but color response varies by camera and is heavily influenced by ambient light conditions.' Moreover, the detection routine soaked up 64 percent of McAllister's 1.6GHz Atom CPU, with the video from the webcam soon developing a few seconds' lag that made controlling onscreen cursors challenging. Project developer Mike Wasserman offers a video demo of the technology."
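McAllister's observation that the marker-location algorithm is keyed to color suggests a per-frame color-threshold pass. The sketch below is an assumed illustration of that general idea, not the SDK's actual code: it collects every pixel of an RGB24 frame that falls within a per-channel tolerance of a target color and reports the centroid of those pixels as the marker position.

```c
#include <stddef.h>

/* Illustrative sketch of color-keyed marker location (hypothetical,
 * not the Touchless SDK's implementation). */
typedef struct { int found; int x; int y; } Marker;

/* True if channel value a is within tol of target b. */
static int within_tol(unsigned char a, unsigned char b, int tol) {
    int d = (int)a - (int)b;
    return d >= -tol && d <= tol;
}

/* Scan a w*h RGB24 buffer for pixels near (tr, tg, tb) and return
 * the centroid of the matching pixels. */
Marker find_marker(const unsigned char *rgb, int w, int h,
                   unsigned char tr, unsigned char tg, unsigned char tb,
                   int tol) {
    long sx = 0, sy = 0, n = 0;
    for (int y = 0; y < h; y++) {
        for (int x = 0; x < w; x++) {
            const unsigned char *p = rgb + 3 * ((size_t)y * w + x);
            if (within_tol(p[0], tr, tol) &&
                within_tol(p[1], tg, tol) &&
                within_tol(p[2], tb, tol)) {
                sx += x; sy += y; n++;
            }
        }
    }
    Marker m;
    m.found = n > 0;
    m.x = n ? (int)(sx / n) : 0;
    m.y = n ? (int)(sy / n) : 0;
    return m;
}
```

A fixed tolerance like this is exactly where ambient light becomes a problem: a shift in illumination moves every channel, and the marker drops out of the threshold window.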
Actual numbers (Score:3, Interesting)
Okay, I know it's a little late to post this, but these are the numbers I'm getting from my EEE 900. I'm running a 3-tap FIR filter to average all the pixels in a dummy frame. This doesn't include the time it would take to pull the frame from the CMOS/CCD sensor.
On battery alone:
On AC it's a little better
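A micro-benchmark along the lines described above might look like the following sketch (assumed details: 8-bit grayscale pixels, a simple 3-tap moving-average kernel, and `clock()` for timing; sensor readout is excluded, as in the original test):

```c
#include <stdlib.h>
#include <time.h>

/* 3-tap moving-average FIR: out[i] = (in[i-1] + in[i] + in[i+1]) / 3,
 * with the edges clamped. */
void fir3(const unsigned char *in, unsigned char *out, size_t n) {
    for (size_t i = 0; i < n; i++) {
        unsigned int a = in[i > 0 ? i - 1 : 0];
        unsigned int b = in[i];
        unsigned int c = in[i + 1 < n ? i + 1 : n - 1];
        out[i] = (unsigned char)((a + b + c) / 3);
    }
}

/* Time one full-frame filter pass over a dummy w*h frame, in
 * milliseconds. Frame capture time is deliberately not included. */
double time_fir3_frame(int w, int h) {
    size_t n = (size_t)w * (size_t)h;
    unsigned char *in = malloc(n);
    unsigned char *out = malloc(n);
    for (size_t i = 0; i < n; i++)
        in[i] = (unsigned char)i;  /* dummy pixel data */
    clock_t t0 = clock();
    fir3(in, out, n);
    double ms = 1000.0 * (double)(clock() - t0) / CLOCKS_PER_SEC;
    free(in);
    free(out);
    return ms;
}
```

Calling `time_fir3_frame(1280, 1024)` processes one frame at the full sensor resolution; the result will vary with CPU clock and power state, which is presumably why the battery and AC numbers differ.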
Given that the sensor resolution is 1280 x 1024, it appears their algorithm uses the full resolution. They could probably get much better results if they used 320 x 240. A little speed binning goes a long way.
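The speed binning suggested above can be sketched as a 2x2 box-filter downsample, which cuts the per-frame pixel count by 4x per pass (an illustrative sketch only, not the commenter's code):

```c
/* 2x2 binning: average each 2x2 block of a w*h grayscale frame into
 * one output pixel, producing a (w/2)*(h/2) frame. Assumes w and h
 * are even. Hypothetical sketch of the "speed binning" idea. */
void bin2x2(const unsigned char *in, int w, int h, unsigned char *out) {
    int ow = w / 2;
    for (int y = 0; y < h / 2; y++) {
        for (int x = 0; x < ow; x++) {
            int s = in[(2 * y) * w + 2 * x]
                  + in[(2 * y) * w + 2 * x + 1]
                  + in[(2 * y + 1) * w + 2 * x]
                  + in[(2 * y + 1) * w + 2 * x + 1];
            out[y * ow + x] = (unsigned char)(s / 4);
        }
    }
}
```

Two binning passes take 1280 x 1024 down to 320 x 256, a 16x reduction in pixels for the detection routine to chew through, with the averaging also suppressing some sensor noise.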
Respond to this post if you're interested in the code.