Hands-On With Microsoft's Touchless SDK

snydeq writes "Fatal Exception's Neil McAllister takes Microsoft's recently released Touchless SDK for a test spin, controlling his Asus Eee PC 901 with a Roma tomato. The Touchless SDK is a set of .NET components that can be used to simulate the gestural interfaces of devices like the iPhone in thin air, using an ordinary USB webcam. Although McAllister was able to draw, scroll, and play a rudimentary game with his tomato, the SDK still has some kinks to work out. 'For starters, its marker-location algorithm is very much keyed to color,' he writes. 'That's probably an efficient way to identify contrasting shapes, but color response varies by camera and is heavily influenced by ambient light conditions.' Moreover, the detection routine soaked up 64 percent of McAllister's 1.6 GHz Atom CPU, and the video from the webcam soon developed a few seconds' lag that made controlling onscreen cursors challenging. Project developer Mike Wasserman offers a video demo of the technology."
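The SDK's actual .NET API isn't reproduced here, but as a rough sketch of the color-keyed marker location the summary describes, a naive version might look like the following (the function name, thresholds, and numpy-based approach are illustrative assumptions, not the SDK's implementation):

    import numpy as np

    def locate_marker(frame_rgb, target_rgb, max_dist=60.0, min_pixels=50):
        """Find the centroid of pixels whose color is close to target_rgb.

        frame_rgb  -- H x W x 3 uint8 array grabbed from the webcam
        target_rgb -- the color sampled when the marker (say, the tomato) was registered
        Returns (x, y) in pixel coordinates, or None if the marker isn't visible.
        """
        diff = frame_rgb.astype(np.float32) - np.asarray(target_rgb, dtype=np.float32)
        dist = np.sqrt((diff ** 2).sum(axis=2))    # per-pixel distance from the marker color
        mask = dist < max_dist                     # pixels that "look like" the marker
        if mask.sum() < min_pixels:                # too few matches: marker lost
            return None
        ys, xs = np.nonzero(mask)
        return float(xs.mean()), float(ys.mean())  # centroid drives the on-screen cursor

Because the match is a hard threshold in color space, anything that shifts the camera's color response (auto white balance, ambient light) moves pixels in or out of the mask, which is exactly the sensitivity McAllister describes.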
  • by Sockatume ( 732728 ) on Saturday October 11, 2008 @12:12PM (#25339863)
    While it's very voguish to make comparisons with Apple products lately, Sony's Cambridge studio are the group that spring to mind when it comes to gestural webcam-based interfaces. On a related note, their original EyeToy tech demos were similarly "keyed to color", using large foam props, although the end product worked on skin tones and was therefore heavily dependent on good lighting and contrast. They patented a "wand" with coloured LEDs back in 2005, which provided a reasonable compromise between the two (a month or two before the Wii controller popped up and made it all look passé).
  • by cowlobster ( 1223326 ) on Saturday October 11, 2008 @12:42PM (#25340083)
    You're a bit late; it's been done already, but with a Wii Remote. http://www.cs.cmu.edu/~johnny/projects/wii/ [cmu.edu]
  • Actual numbers (Score:3, Interesting)

    by gillbates ( 106458 ) on Sunday October 12, 2008 @12:29AM (#25343651)

    Okay, I know it's a little late to post this, but these are the numbers I'm getting from my EEE 900. I'm running a 3-tap FIR filter to average all the pixels in a dummy frame. This doesn't include the time it would take to pull the frame from the CMOS/CCD sensor.

    On battery alone:

    Resolution:  160 x 120  : 4223 frames (422.3/sec)
    Resolution:  320 x 240  :  849 frames (84.9/sec)
    Resolution:  640 x 480  :  303 frames (30.3/sec)
    Resolution:  720 x 480  :  269 frames (26.9/sec)
    Resolution:  800 x 600  :  171 frames (17.1/sec)
    Resolution: 1024 x 768  :  118 frames (11.8/sec)
    Resolution: 1280 x 1024 :   71 frames (7.1/sec)
    Resolution: 1600 x 1200 :   30 frames (3.0/sec)

    On AC power it's a little better:

    Resolution:  160 x 120  : 5758 frames (575.8/sec)
    Resolution:  320 x 240  : 1675 frames (167.5/sec)
    Resolution:  640 x 480  :  321 frames (32.1/sec)
    Resolution:  720 x 480  :  353 frames (35.3/sec)
    Resolution:  800 x 600  :  276 frames (27.6/sec)
    Resolution: 1024 x 768  :  169 frames (16.9/sec)
    Resolution: 1280 x 1024 :  101 frames (10.1/sec)
    Resolution: 1600 x 1200 :   60 frames (6.0/sec)

    Given that the sensor's native resolution is 1280 x 1024, and that even a simple filter pass only manages around 7-10 frames per second at that size, it appears their algorithm processes the full frame. They could probably get much better results by downsampling to 320 x 240 first. A little speed binning goes a long way.

    Respond to this post if you're interested in the code. (A rough sketch of this kind of benchmark loop appears below.)
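A minimal sketch of the kind of measurement described in that comment (not gillbates' actual code; the dummy-frame setup, the vectorized 3-tap moving average, and the timing approach are assumptions):

    import time
    import numpy as np

    def benchmark_fir(width, height, seconds=10.0):
        """Repeatedly run a 3-tap averaging FIR over a dummy frame and report frames/sec."""
        frame = np.random.randint(0, 256, (height, width), dtype=np.uint8).astype(np.float32)
        frames = 0
        start = time.perf_counter()
        while time.perf_counter() - start < seconds:
            # 3-tap moving average along each row (taps = [1/3, 1/3, 1/3]); the result is
            # discarded because only the per-frame cost matters here.
            _ = (frame[:, :-2] + frame[:, 1:-1] + frame[:, 2:]) / 3.0
            frames += 1
        elapsed = time.perf_counter() - start
        print(f"Resolution: {width} x {height} : {frames} frames ({frames / elapsed:.1f}/sec)")

    for w, h in [(160, 120), (320, 240), (640, 480), (1280, 1024), (1600, 1200)]:
        benchmark_fir(w, h)

Comparing the 320 x 240 and 1280 x 1024 rows in the numbers above shows the roughly order-of-magnitude gap the comment is pointing at: downsampling or binning the frame before the color search would buy the detection routine a lot of headroom on an Atom.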

"The four building blocks of the universe are fire, water, gravel and vinyl." -- Dave Barry

Working...