
openni/Contests/ROS 3D/Minority Report Interface

Description: A Minority Report-like interface that lets you drag photos around.
Submitted By: Garratt Gallagher
Keywords: Minority Report

Video

How to Reproduce Your Entry

Physical Setup: This works best with a projector, but any monitor will do.


Code to Checkout: First, check out and compile the mit-ros-pkg repository.

Running the Code: This demo uses the finger detector: roslaunch hand_interaction detectfingers.launch

openni/Contests/ROS 3D/RGBD-6D-SLAM

Description: The Kinect is used to generate a colored 3D model of an object or a complete room.


Submitted By: Felix Endres, Juergen Hess, Nikolas Engelhard, Juergen Sturm, Daniel Kuhner, Philipp Ruchti, Wolfram Burgard
Keywords: RGBD-SLAM, 3D-SURF, Feature Matching, RANSAC, Graph SLAM, Model Generation, Real-time

This page describes the software package that we submitted for the ROS 3D challenge. The current RGBD-SLAM package is located here.

Summary: We developed a novel method to quickly acquire colored 3D models of objects and indoor scenes with a hand-held Kinect camera.

Video: In the video we show the generation of a model of our PR2 and a 360° "panorama" of the students' lab.

Data: As a sample for building 3D object models, here is an example of a box and a mug. The scene was captured with several snapshots instead of a continuous run, to reduce the size of the resulting file.
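The keywords above outline the core of the pipeline: visual features are extracted from the RGB image, projected to 3D using the depth data, matched between frames, and a 6D rigid transform is estimated with RANSAC before graph optimization. The RANSAC alignment step can be sketched as follows; this is a minimal illustration assuming matched 3D point pairs are already available, with function names and thresholds invented for the example, not taken from the authors' code:

```python
# Illustrative sketch, not the RGBD-SLAM implementation: given matched 3D
# points from two Kinect frames, estimate the 6D rigid transform with RANSAC.
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1), not a reflection.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

def ransac_transform(src, dst, iters=200, thresh=0.02, rng=None):
    """RANSAC over minimal 3-point samples; refits on the best inlier set."""
    if rng is None:
        rng = np.random.default_rng(0)
    best = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), 3, replace=False)
        R, t = rigid_transform(src[idx], dst[idx])
        err = np.linalg.norm(src @ R.T + t - dst, axis=1)
        inliers = err < thresh            # residual threshold in metres
        if inliers.sum() > best.sum():
            best = inliers
    if not best.any():                    # fallback: no consensus found
        best[:] = True
    return rigid_transform(src[best], dst[best])
```

The minimal sample size is three point pairs, the smallest set that determines a rigid transform; outlier matches from wrong feature associations are voted out by the inlier count.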

The pcd file is viewable with the pcd viewer: rosrun pcl_visualization pcd_viewer -ax 0.1 <file>

openni_kinect/kinect_accuracy

This page discusses both the precision and the accuracy of the Kinect sensor.
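For a quick look at a sample file like the box-and-mug cloud without PCL installed, a minimal reader for the ASCII variant of the PCD format can be sketched as below. This is an illustration, not a full PCD parser; only files with DATA ascii are handled, and binary .pcd files still need the PCL tools.

```python
# Minimal ASCII .pcd reader (sketch). The PCD header is a series of
# "KEY value..." lines ending at the DATA line; point rows follow.
import numpy as np

def read_ascii_pcd(path):
    """Return (header dict, (N, fields) float array) for an ASCII PCD file."""
    header, rows, in_data = {}, [], False
    with open(path) as f:
        for line in f:
            if line.startswith("#") or not line.strip():
                continue                      # skip comments and blank lines
            if in_data:
                rows.append([float(v) for v in line.split()])
                continue
            key, _, value = line.strip().partition(" ")
            header[key] = value
            if key == "DATA":                 # header ends at the DATA line
                assert value == "ascii", "only DATA ascii is handled here"
                in_data = True
    return header, np.array(rows)
```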


If you are not sure what the difference between precision and accuracy is, check out this Wikipedia page. Precision of the Kinect sensor Because the Kinect is essentially a stereo camera, the expected error on its depth measurements is proportional to the distance squared. The experimental data shown in the graph below confirm the expected error model. The graph was obtained by pointing the Kinect at a planar surface, fitting a plane (using RANSAC) through the measured pointcloud, and checking the distance of the points in the pointcloud to that plane. The raw depth images coming from the Kinect sensor are not rectified.
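The distance-squared error model above follows from simple error propagation. A stereo-style sensor measures disparity d, with depth z = f*b/d; pushing a constant disparity noise sigma_d through dz/dd gives sigma_z = z**2 * sigma_d / (f*b). The focal length, baseline, and noise values below are assumptions chosen for illustration, not calibrated Kinect parameters:

```python
# Sketch of the quadratic depth-error model under assumed sensor parameters.
F_PIXELS = 580.0   # focal length in pixels (assumed)
BASELINE = 0.075   # projector-to-camera baseline in metres (assumed)
SIGMA_D = 0.1      # disparity measurement noise in pixels (assumed)

def depth_error(z):
    """Expected 1-sigma depth error in metres at range z metres."""
    return z * z * SIGMA_D / (F_PIXELS * BASELINE)

for z in (1.0, 2.0, 4.0):
    print(f"z = {z:.0f} m -> sigma_z = {depth_error(z) * 1000:.1f} mm")
```

Doubling the range quadruples the expected error, which is the trend the plane-fitting experiment described above measures.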

Accuracy of the Kinect sensor: The accuracy of a calibrated Kinect sensor turns out to be very high, but it is greatly affected by the intrinsic and extrinsic calibration of the Kinect cameras.

Related links:
OctoMap - 3D occupancy mapping
3D Stixels Obtained from Stereo Data in an Urban Environment
3D mapping with Kinect style depth camera
Build a 3D Scanner From A $25 Laser Level - Systm
NI Mate
How to build your own 3-D camera

Features: Whether you need a crumbling building, rain, fire, smoke, fluid, cloth or full-on destruction, Blender delivers great-looking results.


Blender’s simulation tools include:
Fluid – Realistic water and fluid simulations.
Smoke – Billowing smoke with flames and scene interaction.
Hair – Beautiful wafts of hair that blow in the wind and interact with collisions.
Cloth – Amazingly realistic cloth simulations for clothing and environments.
Rigid Body Physics – Makes any object destructible and collidable.
Particles – For creating things like rain, sparks and shrapnel.