Wednesday, October 31, 2012

Easier, More Accurate Camera Calibration

We've had a few weeks to react to feedback and have released six updates since our October 3rd launch. Most of these updates have been bug fixes, but we wanted to draw your attention to the latest update (v0.9.13), which makes camera calibration both easier and more accurate.

We realize camera calibration has been a significant pain point. As a result, we've rewritten it to consist of just two easy steps:

  1. Adjust the cameras so that the checkerboard is centered in each image.
  2. Wave your arms around.
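
(If you're curious what step 1 boils down to under the hood: the software has to find the checkerboard and confirm it sits near the middle of each camera's image. Here's a rough illustrative sketch of that kind of check using OpenCV. It is not our actual calibration code, and the pattern size is a made-up example.)

    # Illustrative sketch only -- not 3Gear's actual calibration code.
    import cv2
    import numpy as np

    PATTERN = (7, 6)  # inner-corner count of the checkerboard (made-up example)

    def checkerboard_centered(image_bgr, tolerance=0.15):
        """Return True if the checkerboard is found and roughly centered."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if not found:
            return False
        # Compare the board's centroid to the image center, as a fraction
        # of the image size.
        centroid = corners.reshape(-1, 2).mean(axis=0)
        h, w = gray.shape
        offset = np.abs(centroid - (w / 2.0, h / 2.0)) / (w, h)
        return bool((offset < tolerance).all())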

Here's a thirty-second video showing the new calibration:

The best part of this new calibration process is that it's also quite a bit more accurate than before. We urge you to give it a shot! Download the latest version (v0.9.13) from our Download page.

Thanks again for trying out our SDK! We'll keep you posted with more updates!


A complete list of changes follows:

v0.9.13: October 30th, 2012

* Added --force-hand option to hand tracking server (see handdriver.bat --help)
* Fixed bug for setting num-threads in hand tracking server
* Fixed synchronization issue leading to crashes in hand calibration (shape and pose)
* Including CHANGELOG.txt in distribution

v0.9.12: October 29th, 2012

* Significantly easier and more accurate camera calibration
** You'll need to overwrite the contents of your data directory with the zip file gestural_user_interface_data.zip
** We recommend re-calibrating your camera by running camerasetup.bat again
* Changed database format in anticipation of support for single camera
** You can rescan your hand calibration as follows. Run: handcalibration.bat sample

v0.9.11: October 22nd, 2012

* Fixed issues with European locales that use "," instead of "." for decimals.
* Better documentation and error reporting for Kinect driver issues.

v0.9.10: October 11th, 2012

* Fixed bug associated with freezes in hand shape calibration and pose calibration (due to zero-length scans)
* Detecting mistaken use of Microsoft Kinect SDK with Xbox 360
* Minor updates / fixed typos in the documentation
* Better error messages from the OpenNI and Kinect for Windows drivers

v0.9.9: October 9th, 2012

* Printing version number in camerasetup.bat, cameracalibrationtest.bat and handcalibration.bat
* Upgraded to Kinect for Windows version 1.6
* Minor updates / fixed typos in the documentation

v0.9.8: October 4th, 2012

* Minor correction in documentation
* Fixed bug associated with working in a path with spaces

Tuesday, October 2, 2012

Hello from 3Gear Systems

Hello from 3Gear Systems!

We're a three-person team based out of San Francisco trying to fundamentally change the way people interact with computers. We're excited to kick things off on our blog by announcing the release of a software development kit (SDK) for adding gestures to your applications.

The story so far

It's easy to forget that the mouse is over 40 years old. (Source: Wikipedia / SRI International)

It's easy to forget that the mouse is over 40 years old. While today's mice are smoother and have more buttons, they haven't changed all that much. The biggest innovation in UI since the mouse has been the touchscreen, which, like the mouse, treats your hand as if it's one big pointing finger (or at most two fingers sliding around pictures under glass).

But your hands can do so much more than point at things! They can grab things, turn things over, assemble things, animate things, etc.

At 3Gear, we're creating technology that uses your entire hand (fingers, thumbs, wrists and all) for user interaction. This is especially useful when you're doing something 3D: say, assembling 3D parts in Computer-Aided Design (CAD), flying through a 3D medical MRI scan, or playing 3D games. With the rise of 3D printing and the Maker community, we're especially interested in making it easier to create in 3D.

Making the Kinect "finger-precise"

We're using 3D camera hardware (e.g., the Microsoft Kinect) to make this possible. However, existing Kinect software only works with large, full-body motions. We've developed software that creates a finger-precise representation of what your hands are doing, capturing tiny motions of your index finger and subtle movements of the wrist. This means your applications can use small, comfortable gestures such as pinching and pointing rather than sweeping arm motions.

To make this work, we had to develop new computer graphics algorithms for reconstructing the precise pose of the user's hands from 3D cameras. A key component of the algorithm is a database of pre-computed 3D images corresponding to each possible hand configuration in the workspace. The database is efficiently sampled and indexed to enable extremely fast searches. At run-time, the images from the 3D cameras are used to "look up" the pose of the hand in the database. This way, the user's hand pose can be determined within milliseconds: fast enough for interactive applications and short enough to avoid any perceptible lag.
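
To make the idea concrete, here's a toy sketch of the lookup scheme. Everything in it is simplified for illustration: the block-average descriptor, the flat list of entries, and the brute-force nearest-neighbor search all stand in for the far more sophisticated sampling and indexing the real system uses.

    # Toy sketch of the database-lookup idea -- not the real system.
    import numpy as np

    class PoseDatabase:
        """Stand-in for the indexed database of pre-computed 3D images."""

        def __init__(self):
            self.descriptors = []  # one coarse descriptor per rendered depth image
            self.poses = []        # the hand configuration that produced each image

        @staticmethod
        def describe(depth, grid=8):
            # Block-average the depth image down to a grid x grid descriptor.
            h, w = depth.shape
            h, w = h - h % grid, w - w % grid
            d = depth[:h, :w].astype(np.float32)
            return d.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3)).ravel()

        def add(self, depth, pose):
            self.descriptors.append(self.describe(depth))
            self.poses.append(pose)

        def lookup(self, depth):
            # Brute-force nearest neighbor; the real system's indexing makes
            # this answer arrive within milliseconds.
            q = self.describe(depth)
            dists = np.linalg.norm(np.asarray(self.descriptors) - q, axis=1)
            return self.poses[int(np.argmin(dists))]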

Gestures you can use all day

Because the cameras are mounted above the desk, it is still possible to use our system even when the hands are very close to the desk surface.

Before founding 3Gear, Rob spent a lot of his PhD working on tracking hands with a color patchwork glove. He eventually abandoned the idea because it was so hard to get people to put on gloves. That experience shaped our priorities: above all, our input system is designed to be practical and comfortable. We're trying to fit into your workflow rather than interrupt it. We've been careful to design an input device that you can use all day. For instance, we mount the cameras above the desk so that the hands can be tracked well even a couple of centimeters above the keyboard, with the forearms resting on the desk, avoiding the so-called "gorilla arm" problem. You don't have to wave your arm around much to interact with the system; it can pick up motions on the order of millimeters.

Try our software development kit (SDK)!

Our input system uses two Kinect cameras and an aluminum frame for mounting the cameras.

Users can grab virtual objects and move them around in 3D with their hands.

Today, we're releasing a "public beta" version of a software development kit (SDK) that allows you to quickly incorporate 3Gear's technology into your applications or invent new uses of gestural user interfaces.

Here's what our software can do:

  • Intuitive 3D manipulation. Our 3D input technology provides 1-to-1 3D control of virtual objects. Users can grab objects and move them around in 3D with their hands (see the sketch after this list).
  • Touchless (aseptic) control. Whether the user is a surgeon in an operating room or a chemist in a pharmaceutical lab, sometimes it's just too inconvenient to touch a computer. Our technology offers precise, touch-free control of computer systems for exactly these jobs.
  • Runs on commodity hardware. Our input system currently uses two Kinect cameras and an aluminum frame for mounting the cameras. All of the components are available off-the-shelf right now.
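
To give a flavor of what that 1-to-1 manipulation looks like from an application's point of view, here's a hypothetical sketch of a pinch-to-grab update loop. The names (HandPose, pinching, and so on) are illustrative; they are not our actual API.

    # Hypothetical sketch of pinch-to-grab manipulation -- not our actual API.
    from dataclasses import dataclass

    @dataclass
    class HandPose:          # per-frame tracking result (illustrative)
        position: tuple      # pinch point in world coordinates (meters)
        pinching: bool       # are the thumb and index finger touching?

    def update_grab(obj_pos, grabbed, offset, pose):
        """Advance one frame of pinch-to-grab, mapping the hand 1-to-1 to the object."""
        if pose.pinching and not grabbed:
            # Grab starts: remember the hand-to-object offset so the object
            # doesn't jump to the fingertips.
            offset = tuple(o - p for o, p in zip(obj_pos, pose.position))
            grabbed = True
        elif not pose.pinching:
            grabbed = False  # pinch released: drop the object
        if grabbed:
            obj_pos = tuple(p + d for p, d in zip(pose.position, offset))
        return obj_pos, grabbed, offset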

And here are the limitations:

  • Our software requires each first-time user to go through a short (five-minute) calibration step, but a user only has to do this once.
  • It's good at recognizing the set of useful gestures covered in the calibration, but it can't track arbitrary hand gestures quite yet.

We're working hard to relax these limitations. We're also actively trying out new camera tech and frame designs — we know it looks a little clunky right now.

Our software is free for both non-commercial and commercial applications until the end of the beta period (November 30th, 2012). After the beta period, we will continue to offer a free version of the software for researchers, hobbyists, and small commercial entities (i.e., annual turnover of US$100,000 or less).


About us

Okay, so since this is our first post, here's a little more about us. We're just three people right now, but we've all been thinking about computer graphics and HCI for a while:

Rob Wang wrote his doctoral dissertation at MIT on tracking colorful things using computer vision. Most notably, he created a color patchwork glove and a set of algorithms for tracking a user's hands in real-time with a webcam.

Chris Twigg figured out how to step back in time while getting his PhD at Carnegie Mellon University. Before founding 3Gear Systems, Chris pushed the envelope of digital visual effects at Industrial Light & Magic R&D.

Kenrick Kin got his Ph.D. from Cal but spent much of his time in grad school at Pixar Animation Studios inventing ways to use multitouch screens to build rich 3D environments for computer-animated films.

We're supported by K9 Ventures, Uj Ventures, and a research grant from the National Science Foundation.

“Kinect” is a registered trademark of Microsoft.