Interesting links about Kinect Hacking Sep 1

Hardware Information: Laser illuminator. The illuminator uses an 830 nm laser diode. There is no modulation; the output level is constant. Output power measured at the illuminator output is around 60 mW (using a Coherent LaserCheck).
Futuristic Push buttons
accuracy / resolution of depth data? - OpenKinect
I Heart Robotics: Limitations of the Kinect, or "Why do we still need other sensors if the Kinect is so awesome?" The Kinect is a great sensor and a game changer for mobile robotics, but it won't make every other sensor obsolete. Though maybe I can use it to convince someone to let me take apart a Swiss Ranger 4000. Field of view: the field of view is an important consideration for sensors because if you can't see enough features, you can't use scan matching/ICP to estimate your change in position.
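The scan-matching point above can be made concrete: the step at the core of ICP fits a rigid transform between two point sets whose correspondences are (for the moment) known. A minimal 2D sketch using the standard SVD (Kabsch) solution follows; the function name and all values are my own illustration, not code from the linked post.

```python
import numpy as np

def align_rigid_2d(src, dst):
    """One ICP alignment step: find R, t minimizing ||R @ p + t - q|| over
    corresponded 2D point sets (Kabsch / orthogonal Procrustes)."""
    src_c = src - src.mean(axis=0)           # center both clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against a reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

A full ICP loop would alternate this step with nearest-neighbor correspondence search; with too few visible features (the narrow field of view the post warns about), those correspondences become ambiguous and the estimate drifts.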
Compiling Kinect Fusion on Kubuntu 11.10: Currently I am writing my master's thesis using the Kinect and ROS. I stumbled across the Kinect Fusion video last year, and recently I found out it is open source and can be downloaded from the Point Cloud Library SVN. To build the Point Cloud Library under Kubuntu yourself, you will need to install a few additional things. System requirements: an NVIDIA graphics card with CUDA cores.
Master Thesis: Online Symbol Recognition Through Data Fusion of a 3D- and a Color Camera for an Autonomous Robot. Recently I finished my master's thesis, which was about using the Kinect for sign recognition (the sign on the right is out of range). The goal was to improve recognition speed by searching for symbols only in suitable surfaces (filtered by arrangement, distance, size, and place). I wrote most of the algorithms myself, apart from some things I used from OpenCV (e.g. warp perspective) and from ROS (gathering the images). The most interesting material begins in chapter 5. The document can be found here.
Grasshopper Canvas with Kinect Interaction: Part 2 | LMNts. 1. Kinect Sensor | 2. Raw Depth Image | 3. Segmented Depth Image | 4. After Lowpass Filter | 5. Connected Component Labeling | 6. Particle Analysis | 7. Gesture Filters | 8. Keyboard/Mouse Events. In our previous post, we talked about our efforts to use the Microsoft Kinect to control a large-scale table-top interface running Grasshopper. In short: we used the Kinect to sense touch and gestures, which we then used to control the canvas via keyboard and mouse events. We've had a lot of fun building Kinect multitouch interactions but, being an architecture firm, we can only spend so much time developing the code. We think we've created a solid foundation and would like to share it with the broader community to use, modify, and extend. Obviously, Grasshopper is only one possible application, and we'd love to see how others will use this interaction.
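The segmentation and connected-component-labeling stages of the pipeline above can be sketched in a few lines. Everything here (the function name, the millimeter thresholds, the blob-size cutoff) is an illustrative assumption, not LMN's actual code: the idea is to keep only pixels hovering just above a known table-surface depth map, then group them into candidate touch blobs.

```python
import numpy as np
from collections import deque

def touch_blobs(depth, surface, touch_mm=(3, 12), min_px=4):
    """Keep pixels a few mm above a per-pixel surface depth map, then run
    4-connected component labeling and return blob centroids (row, col)."""
    height = surface - depth                          # height above table, mm
    mask = (height >= touch_mm[0]) & (height <= touch_mm[1])
    labels = np.zeros(mask.shape, dtype=int)
    blobs, next_id = [], 1
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue                                  # already labeled
        queue, pixels = deque([(y, x)]), []
        labels[y, x] = next_id
        while queue:                                  # BFS flood fill
            cy, cx = queue.popleft()
            pixels.append((cy, cx))
            for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_id
                    queue.append((ny, nx))
        if len(pixels) >= min_px:                     # drop speckle noise
            blobs.append(np.array(pixels, dtype=float).mean(axis=0))
        next_id += 1
    return blobs
```

The min-pixel cutoff plays the role of the lowpass/particle-analysis stages: single-pixel depth noise near the threshold band is discarded before gesture filtering.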
openni_camera/calibration - Intrinsic calibration of the Kinect cameras. Description: This tutorial shows how to calibrate the intrinsic parameters of the IR (depth) and RGB cameras, including focal length and the distortion model. For applications requiring high accuracy, calibration can improve on the default camera models used by openni_camera. Tutorial Level: BEGINNER. Should I calibrate?
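For reference, the model such a calibration estimates is a pinhole projection (focal lengths, principal point) plus the "plumb bob" distortion polynomial (k1, k2, p1, p2, k3). A sketch of the forward model in plain NumPy; the parameter values in the test are illustrative defaults, not measured Kinect intrinsics.

```python
import numpy as np

def project(points_cam, fx, fy, cx, cy, dist):
    """Project 3D points in the camera frame to pixel coordinates through a
    pinhole model with plumb-bob (Brown-Conrady) distortion."""
    x = points_cam[:, 0] / points_cam[:, 2]           # normalized coordinates
    y = points_cam[:, 1] / points_cam[:, 2]
    k1, k2, p1, p2, k3 = dist
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3    # radial distortion
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)  # + tangential
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.stack([fx * x_d + cx, fy * y_d + cy], axis=1)
```

Calibration runs this model in reverse: given many observed checkerboard corners, it solves for the fx, fy, cx, cy, and dist that best predict them, replacing the factory defaults the tutorial mentions.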
openni_kinect/kinect_accuracy - This page discusses both the precision and the accuracy of the Kinect sensor. If you are not sure what the difference between precision and accuracy is, check out this Wikipedia page. Precision of the Kinect sensor: because the Kinect is essentially a stereo camera, the expected error on its depth measurements is proportional to the distance squared.
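The distance-squared behavior falls out of stereo triangulation: with depth z = f·b/d (focal length f, baseline b, disparity d), a fixed disparity quantization step dd produces a depth error |dz| ≈ z²/(f·b)·dd. The default parameter values below are commonly quoted community estimates for the Kinect (IR focal length ≈ 580 px, baseline ≈ 7.5 cm, disparity step ≈ 1/8 px), not official specifications.

```python
def depth_error(z, f_px=580.0, baseline_m=0.075, disp_step_px=0.125):
    """Expected depth quantization error (meters) at range z (meters)
    for a stereo/structured-light sensor: |dz| ~ z^2 / (f*b) * dd."""
    return (z ** 2) / (f_px * baseline_m) * disp_step_px
```

Under these assumptions the model predicts roughly 3 mm of quantization error at 1 m and about 16 times that (~46 mm) at 4 m, which is the quadratic growth the wiki page describes.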
kinect_calibration (not yet in the ROS package index for the electric, fuerte, or groovy distros)
camera_calibration/Tutorials/MonocularCalibration. Description: This tutorial covers using the camera_calibration package's cameracalibrator.py node to calibrate a monocular camera with a raw image over ROS. Keywords: monocular, camera, calibrate. Tutorial Level: BEGINNER. Before starting
Kinect Alternative for Developers (XTION Pro)
I Heart Robotics: Progress with RGB-D Sensors
RSS 2011 Workshop on RGB-D Cameras
Kinect with ROS
Mapping

ROS Contest Projects

Matt's Webcorner - Kinect Sensor Programming. The Kinect is an attachment for the Xbox 360 that combines four microphones, a standard RGB camera, a depth camera, and a motorized tilt. Although none of these is individually new, depth sensors have previously cost over $5000, and the comparatively cheap $150 price tag makes the Kinect highly accessible to hobbyists and academics. This has spurred a lot of work on creating functional drivers for many operating systems so the Kinect can be used outside the Xbox 360. You can find a decent overview of the current state of Kinect work here. I decided to hack around with the Kinect partly because Pat Hanrahan bought us one and partly because I wanted to see whether its resolution was good enough for my scene reconstruction algorithm.
ROS (Robot Operating System) and the Kinect. One of my master's courses is called Autonomous Systems. In it, we work with the Pioneer robot from MobileRobots and ROS (Robot Operating System) from Willow Garage, in groups of 2 to 4 people.
Kinect connector pinout
openni_kinect - electric: documentation generated on March 01, 2013; fuerte: documentation generated on August 19, 2013 (not in the ROS package index for groovy, hydro, or indigo)
Kinect Calibration
Welcome to the OpenKinect project About OpenKinect is an open community of people interested in making use of the amazing Xbox Kinect hardware with our PCs and other devices. We are working on free, open source libraries that will enable the Kinect to be used with Windows, Linux, and Mac. The OpenKinect community consists of over 2000 members contributing their time and code to the Project.

OpenKinect

How To Hack Kinect | TheTechJournal.com

Introducing OpenNI

The OpenNI framework is an open source SDK used for the development of 3D sensing middleware libraries and applications. The OpenNI website provides an active community of developers, tools and support, a network of potential partners, and a distribution platform, addressing the complete development lifecycle. Latest files
OpenNI/OpenNI - GitHub
SDK | OpenNI Downloads
OpenNI discussions - [OpenNI-dev] Passing the output buffer to the generators Thank you very much for the reply. > Shift length in which domain? time?
Kinect teardown - I FixIt Yourself
[V-Sido] Control the Humanoid Robot by Kinect
Microsoft Kinect motion-sensing game device: full disassembly report (Microsoft Xbox) - waybeta
The Kinect Case ("Der Fall Kinect") - Page 1 | DigitalFoundry | Eurogamer.de
Video: Alternative uses for Microsoft Kinect
Microphone Array