
Kinect


Nostarch/HackingTheXbox_Free.pdf. Kinect | Doc-Ok.org. I just read an interesting article, a behind-the-scenes look at the infamous "Milo" demo Peter Molyneux did at E3 2009 to introduce Project Natal, i.e., Kinect. This article is related to VR in two ways. First, the usual progression of overhyping a new technology's capabilities and then falling flat on one's face, because not even one's own developers know what those capabilities actually are, should be very familiar to anyone working in the VR field. But here's the quote that really got my interest (emphasis is mine): Others recall worrying about the presentation not being live, and thinking people might assume it was fake.

Milo worked well, they say, but filming someone playing produced an optical illusion where it looked like Milo was staring at the audience rather than the player. Gee, sounds familiar? With the "Milo" demo, the problem was similar. Importing your (real) world into Minecraft. Tak Tak Tak » Archive » Skeleton Tracking with Kinect & Processing. Gallery. Getting Started with Kinect and Processing. So, you want to use the Kinect in Processing. Great. This page will serve to document the current state of my Processing Kinect library, with some tips and info. The current state of affairs: since the Kinect launched in November 2010, several models have been released. Here's a quick list of what is out there and what is supported in Processing for Mac OS X. Kinect 1414: this is the original Kinect and works with the library documented on this page in Processing 2.1. Kinect 1473: this looks identical to the 1414, but is an updated model. Now, before you proceed, you could also consider using the SimpleOpenNI library and reading Greg Borenstein's Making Things See book.

I'm ready to get started right now. What hardware do I need? First you need a "stand-alone" Kinect (model 1414 only for now!): a standalone Kinect sensor and a Kinect sensor power supply. If you have a previous Kinect that came with an Xbox, it will not include the USB adapter. Um, what is Processing? ofxKinect, Kinect CinderBlock, lots more! Kinect + Arduino | Tanner's Website. With an Arduino Ethernet, Processing, and a Kinect, I was able to easily create this little demo where hand movement can control a servo. This is just a tiny step in my master plan to create a robot clone so that I don't have to leave my chair. The following libraries and drivers made this work and also made it super easy for me to create: OpenKinect; Daniel Shiffman's Processing Kinect library (he knows his stuff and has great examples on his site); Arduino Ethernet UDP send/receive string; servo: EMAX ES08A. How it works: the Arduino Ethernet acquires an IP address and waits for UDP packets on a certain port. The machine with the Kinect sends packets to the Arduino that contain hand coordinate data. The Arduino then takes this data (an integer) and maps it to the range 0 to 180 degrees. The mapped value is sent to the servo.
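To make that loop concrete, here is a minimal sketch of the Arduino side, assuming the hand coordinate arrives as an ASCII integer; the MAC address, port, servo pin, and the 0-640 Kinect x-range are my placeholders, since the post does not list its exact values:

    #include <SPI.h>
    #include <Ethernet.h>
    #include <EthernetUdp.h>
    #include <Servo.h>

    byte mac[] = { 0xDE, 0xAD, 0xBE, 0xEF, 0xFE, 0xED };  // placeholder MAC
    EthernetUDP Udp;
    Servo servo;
    char packetBuffer[16];

    void setup() {
      Ethernet.begin(mac);   // acquire an IP address via DHCP
      Udp.begin(8888);       // wait for UDP packets on this port (assumed)
      servo.attach(9);       // servo signal pin (assumed)
    }

    void loop() {
      int packetSize = Udp.parsePacket();
      if (packetSize > 0) {
        int len = Udp.read(packetBuffer, sizeof(packetBuffer) - 1);
        packetBuffer[len] = '\0';
        int handX = atoi(packetBuffer);          // hand coordinate from the Kinect machine
        int angle = map(handX, 0, 640, 0, 180);  // map the Kinect x-range to degrees
        servo.write(constrain(angle, 0, 180));   // drive the servo
      }
    }

The Processing sketch on the Kinect machine then only has to send the hand x-coordinate as a string to that port.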

Mobile Autonomous Robot using the Kinect. Given a priori knowledge of the environment and the goal position, mobile robot navigation refers to the robot's ability to move safely towards the goal using that knowledge and sensory information about the surrounding environment. In practice, for a mobile robot operating in an unstructured environment, knowledge of the environment is usually absent or partial.

Therefore, obstacle detection and avoidance are essential to mobile robot missions. The Kinect is not only a normal camera sensor but also a special device that can provide a depth map. The depth map is acquired through the OpenNI library and then processed by the Point Cloud Library to extract accurate information about the environment. Here is the link to the full project: (code and references in English, the rest in Vietnamese, but still easy to understand from the source code). Some fun stuff using the Kinect is available on my channel.
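As a rough sketch of that OpenNI-to-PCL handoff (the point type and the callback body are illustrative assumptions; pcl::OpenNIGrabber is PCL's standard wrapper around the OpenNI Kinect driver):

    #include <pcl/io/openni_grabber.h>
    #include <pcl/point_cloud.h>
    #include <pcl/point_types.h>
    #include <boost/bind.hpp>
    #include <boost/thread.hpp>
    #include <iostream>

    // Called once per captured frame; a real robot would run obstacle
    // detection on the cloud here instead of printing its size.
    void cloudCallback(const pcl::PointCloud<pcl::PointXYZ>::ConstPtr& cloud) {
      std::cout << "cloud: " << cloud->width << " x " << cloud->height << std::endl;
    }

    int main() {
      pcl::OpenNIGrabber grabber;  // connects to the Kinect through OpenNI
      boost::function<void (const pcl::PointCloud<pcl::PointXYZ>::ConstPtr&)> f =
          boost::bind(&cloudCallback, _1);
      grabber.registerCallback(f);
      grabber.start();             // depth frames now stream to the callback
      while (true)                 // run until the process is killed
        boost::this_thread::sleep(boost::posix_time::seconds(1));
      return 0;
    }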

Check it out for more. :D Rosnodejs - Program robots with JavaScript. Rosnodejs is currently deprecated. Most of my effort on JavaScript and robotics has shifted to the Robot Web Tools project. I highly recommend taking a look at Robot Web Tools if you are interested in putting your robot on the web. Features include: a JavaScript interface to ROS functionality; 2D tools for mapping and more; 3D tools for robot visualization in a 3D environment. I still feel a Node.js interface into ROS is important. Rosnodejs is a Node.js module that lets you use JavaScript to interact with the Robot Operating System, an open-source robot framework used by many of the top universities and research programs around the world.

Perform a range of robotic tasks, from controlling the motors on an Arduino to processing Kinect sensor data, using JavaScript and Node.js. The goal is to make the field of robotics more accessible to the countless intelligent web developers out there. One of the top frameworks to program robots with today is the Robot Operating System (ROS). OpenGL Video Tutorial - Home. OpenGl - Tutorial 09 : Blending. Introduction. Blending is commonly used to make objects translucent. Viewing and understanding blending effects requires some knowledge of how OpenGL computes blending. That material is a little longer, so I put it in Lesson 3; reading it is highly recommended for an accurate understanding of this tutorial. Sample uses of blending: in this tutorial we will see some blending applications; the technical part is written up in Lesson 3.

Make an object translucent, mix pictures, create filter effects: many other effects can be created with blending. Translucent objects. Translucency is the most common use of blending. Without blending, when an object is rendered, all pixels drawn replace the existing pixels in the frame buffer. The blending formula defined with glBlendFunc(srcFactor, destFactor) is: srcColor * srcFactor + destColor * destFactor. In the case of multiple translucent objects, disable writing to the depth buffer (Lesson 3).
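Here is a minimal self-contained sketch of the translucent-object case, using GLUT for window setup (my assumption; the tutorial does not mandate a framework) and the 0.75 alpha value quoted below:

    // Build on Linux with: g++ blend.cpp -o blend -lglut -lGLU -lGL
    #include <GL/glut.h>

    void display() {
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

      // Opaque red quad, drawn first so it is already in the framebuffer.
      glDisable(GL_BLEND);
      glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
      glRectf(-0.6f, -0.5f, 0.2f, 0.5f);

      // Translucent blue quad on top: srcColor*srcFactor + destColor*destFactor
      // mixes it with the red pixels underneath.
      glEnable(GL_BLEND);
      glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
      glColor4f(0.0f, 0.0f, 1.0f, 0.75f);  // alpha = 0.75
      glRectf(-0.2f, -0.5f, 0.6f, 0.5f);

      glutSwapBuffers();
    }

    int main(int argc, char** argv) {
      glutInit(&argc, argv);
      glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
      glutCreateWindow("blending");
      glutDisplayFunc(display);
      glutMainLoop();
      return 0;
    }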

You can control how translucent the object is; in the tutorial's picture-mixing example (first method), the alpha value is 0.75. Basic OpenGL Lighting. By Steve Baker. Introduction. Many people starting out with OpenGL are confused by the way OpenGL's built-in lighting works, and consequently by how colour works. I hope to be able to clear up some of the confusion. What is needed to explain this clearly is a flow chart: lighting ENABLED or DISABLED? The first, and most basic, decision is whether to enable lighting or not: glEnable(GL_LIGHTING); or glDisable(GL_LIGHTING);. If it is disabled, then all polygons, lines and points will be coloured according to the setting of the various forms of the glColor command. glColor3f(1.0f, 0.0f, 0.0f); gets you a pure red triangle no matter how it is positioned relative to the light source(s).
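To see both branches of that flow chart side by side, here is a small sketch (again assuming GLUT; the key binding is mine, not Baker's): pressing 'l' toggles GL_LIGHTING, switching the same triangle between glColor-based and material-based colouring, which the article turns to next.

    // Build on Linux with: g++ light.cpp -o light -lglut -lGLU -lGL
    #include <GL/glut.h>

    bool lit = false;  // press 'l' to toggle lighting

    void display() {
      glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
      if (lit) {
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);  // default light
        GLfloat red[] = { 1.0f, 0.0f, 0.0f, 1.0f };
        glMaterialfv(GL_FRONT, GL_AMBIENT_AND_DIFFUSE, red);  // colour via material
      } else {
        glDisable(GL_LIGHTING);
        glColor3f(1.0f, 0.0f, 0.0f);  // pure red no matter where the lights are
      }
      glNormal3f(0.0f, 0.0f, 1.0f);   // lighting needs a surface normal
      glBegin(GL_TRIANGLES);
      glVertex3f(-0.5f, -0.5f, 0.0f);
      glVertex3f( 0.5f, -0.5f, 0.0f);
      glVertex3f( 0.0f,  0.5f, 0.0f);
      glEnd();
      glutSwapBuffers();
    }

    void key(unsigned char k, int, int) {
      if (k == 'l') { lit = !lit; glutPostRedisplay(); }
    }

    int main(int argc, char** argv) {
      glutInit(&argc, argv);
      glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
      glutCreateWindow("lighting");
      glutDisplayFunc(display);
      glutKeyboardFunc(key);
      glutMainLoop();
      return 0;
    }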

With GL_LIGHTING enabled, we need to specify more about the surface than just its colour: we also need to know how shiny it is, whether it glows in the dark, and whether it scatters light uniformly or in a more directional manner. The article goes on to cover glMaterial and glLight, glColorMaterial (it's slow), and glNormal. OpenKinect - Keyboard Anywhere. Invisible Piano (Keyboard Anywhere, a Kinect Piano). After writing my previous instructable, I was asked about installing some slightly different software to use with the Kinect. Since I'd already done it, I figured it wouldn't take too long to retrace my steps and write the instructable. After much frustration, I figured out a really easy process to get everything installed and talking.

This instructable will walk you through getting a virtual keyboard working with the current release (11.04) of Ubuntu. There are other ways of doing this (which I've done in the past), but in retracing the task I found many shortcuts to what I did on the command line previously. If you have any questions about the command line or getting around in Ubuntu, please see my previous instructable. Also, don't type anything between [ ] into the terminal; it's there for reference. KinoogleFinalReport. KinoogleProposal. ROS and Kinect - Ubuntu Installation | Project RobotaS. ROS Installation.
1.0 Install ROS.
1.1 Check/add repositories.
1.2 Set up your sources.list. For Ubuntu 10.10 (Maverick):
sudo sh -c 'echo "deb maverick main" > /etc/apt/sources.list.d/ros-latest.list'
1.3 Set up your keys:
wget -O - | sudo apt-key add -
1.4 Installation. Make sure you have re-indexed the ROS.org server:
sudo apt-get update
Desktop-full install (recommended; includes ROS, rx, rviz, robot-generic libraries, 2D/3D simulators, navigation and 2D/3D perception):
sudo apt-get install ros-electric-desktop-full
1.5 Environment setup:
echo "source /opt/ros/electric/setup.bash" >> ~/.bashrc
. ~/.bashrc
(To change the environment of your current shell only, type: source /opt/ros/electric/setup.bash)
Install Eclipse: Applications > Programming > Eclipse, then Window > Open Perspective > Other… > C/C++. Kinect Installation!!

Installation and setup. Openni_launch. Overview. This package contains launch files for using OpenNI-compliant devices such as the Microsoft Kinect in ROS. It creates a nodelet graph to transform raw data from the device driver into point clouds, disparity images, and other products suitable for processing and visualization. Starting with ROS Hydro, all the functionality of openni_launch has been moved to rgbd_launch, in order to allow other drivers such as libfreenect (freenect_launch) to use the same code. openni_launch itself contains one launch file: launch/openni.launch, which launches RGB-D processing through rgbd_launch with the OpenNI driver.

Quick start. Launch the OpenNI driver:
roslaunch openni_launch openni.launch
To visualize in rviz:
rosrun rviz rviz
Set the Fixed Frame (top left of the rviz window) to /camera_depth_optical_frame. Add a PointCloud2 display, and set the topic to /camera/depth/points. Alternatively, you can view the disparity image:
rosrun image_view disparity_view image:=/camera/depth/disparity
Launch files: openni.launch. Kinect driver for ROS. Getting the Kinect to Work. This post is about how I got the Kinect to work on my machine, which runs Ubuntu 10.04 Lucid, using ROS. What didn't work / what works: OpenNI does support the older Xbox 360 sensor, which we had in the lab. I tried it out and it worked (close to) perfectly. The major steps are outlined as follows. 2) Install the OpenNI drivers using apt-get install ros-fuerte-openni-kinect, or follow the directions here. That's it and you are done.
roslaunch openni_launch openni.launch
[ INFO] [1339168119.174802439]: Number devices connected: 1
[ INFO] [1339168119.174938804]: 1. device on bus 002:21 is a Xbox NUI Camera (2ae) from Microsoft (45e) with serial id 'B00367200497042B'
If you want to visualize the RGB image, you can use:
rosrun image_view image_view image:=/camera/rgb/image_color
For the depth image, use:
rosrun image_view disparity_view image:=/camera/depth_registered/disparity
Additionally, you can install rviz, the visualization utility that ships with ROS, using:
sudo apt-get install ros-fuerte-visualization
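If you would rather consume those topics from code than from rviz, a minimal roscpp subscriber might look like this sketch (the package scaffolding is assumed; the topic name matches the /camera/depth/points topic used above):

    // Minimal sketch of a ROS node that listens to the Kinect point clouds
    // published by openni.launch; depends on roscpp and sensor_msgs.
    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>

    void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg) {
      ROS_INFO("cloud: %u x %u", msg->width, msg->height);
    }

    int main(int argc, char** argv) {
      ros::init(argc, argv, "depth_listener");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("/camera/depth/points", 1, cloudCallback);
      ros::spin();  // process callbacks until shutdown
      return 0;
    }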

Kinect with ROS | Mailing List Archive. Finger tracking with direction (Kinect + EmguCV). Hand Tracking (IMPROVED). Hello from 3Gear Systems. We're a three-person team based out of San Francisco trying to fundamentally change the way people interact with computers. We're excited to kick things off on our blog by announcing the release of a software development kit (SDK) for adding gestures to your applications. It's easy to forget that the mouse is over 40 years old. (Source: Wikipedia / SRI International) While today's mice are smoother and have more buttons, they haven't changed all that much. But your hands can do so much more than point at things!

At 3Gear, we're creating technology that uses your entire hand (fingers, thumbs, wrists and all) for user interaction. We're using 3D camera hardware (e.g., the Microsoft Kinect) to make this possible. To make this work, we had to develop new computer graphics algorithms for reconstructing the precise pose of the user's hands from 3D cameras. Users can grab virtual objects and move them around in 3D with their hands.

Ubidisplays - Easily create interactive projected displays anywhere! This tool makes building interactive projected displays quick and easy! This is UbiDisplays, a prototype toolkit for building interactive displays using a projector and a Microsoft Kinect. The software is still in beta, so please report bugs on the issues page; we will resolve them as soon as possible. The tool enables you to drag and drop interactive web content into the world around you. Using a JavaScript API, displays are able to move around, appear and disappear. In the past, similar systems required a lot of time and skill to build. Hardware requirements: Microsoft Kinect; projector (external monitor); Windows 7 (or higher); PC with an i7 processor. Commercial use: this software is currently licensed for free for personal and academic projects.

This project was created as part of John Hardy's PhD research. News: 19th Nov 2013 - Mageca featured Ubi Displays on their website! REPORT_DARIA-Final.doc. Mit-ros-pkg/KinectDemos. This page describes how to set up your system to run the Kinect demos found in the mit-ros-pkg repository. Kinect demos that you might want to check out include: General Installation. All of the MIT Kinect demo software shares a base installation procedure. To run one of these demos, please follow these General Installation instructions, and then follow the directions specific to the demo you wish to run.

Software Setup. These instructions assume you are running Ubuntu, preferably 10.04. The Kinect demos can be installed with the following versions of ROS: click on a turtle to see the appropriate install instructions. Hardware Setup. You will need a Kinect for these demos. Specific Installation and Execution. If you followed the optional 'install all demos' step in the distribution-specific instructions, you are done with the installation. Hand Detection. Installation:
rosmake hand_interaction
Execution:
roslaunch hand_interaction hand_detector.launch
To view with rviz, use the configuration: Finger Detection. Augmented Reality – What all the fuss is about - Who could ever forget Tom Cruise's cool futuristic augmented reality computer in "Minority Report", where he effortlessly navigates the computer user interface using a series of natural gestures? I never thought it was practical until I saw Piano Reality: an app that uses the camera feed to recognize piano keys drawn on an ordinary piece of paper, and then lets you play that piano.

Piano Reality - Click to watch the video. When Transformers 3 – Dark of the Moon was released in July 2011, Paramount Pictures introduced an augmented reality app for iOS called Defend the Earth. Movie fans can download the app, locate a Transformers 3 poster, and scan the poster code to unlock an augmented reality first-person shooter game. App - Transformers Defend the Earth Poster - Click to watch video. And the excitement does not stop there. In addition to the Windows Kinect SDK, you can also develop Kinect apps using these two open-source APIs: 1. 2. About Steven Neo. Hci.usask.ca/uploads/286-KinectArms_CameraReady.pdf. Interaction Lab | KinectArms: a Toolkit for Capturing and Displaying Arm Embodiments in Distributed Tabletop Groupware. FingerTracker. Tuiokinect - A simple TUIO hand tracker for Kinect.

KinectArms: a Toolkit for Capturing and Displaying Arm Embodiments in Distributed Tabletop Groupware | Aaron Genest. KinectArms Introductory Video. Sstephenson/kinect. Ray Chambers - Kinect Goodies. Want to make a basic Kinect SDK App… now you can :) | Ray Chambers. Ray Chambers | Using Innovation as well as the Kinect In Education.

Lcna_co2012_dalal.pdf. Kinect. Kinect Open Source Programming Secrets. Patriciogonzalezvivo/KinectCoreVision. Kinect+OpenNI Study Notes. Kinect+OpenNI Study Notes, Part 8 (an analysis of Robert Walter's hand-extraction code) - tornadomeet. Finger Tracking with Kinect SDK for XBOX - Project Directory. Kinect « Search Results. Kinect 3D Hand Tracking. Hand and finger tracking using the Kinect - Freenect. Kinect - fingertip detection. Finger Tracking with Kinect SDK (and the Kinect for XBox 360 Device) | Coding4Fun Kinect Projects. KinectEDucation. Kinect Hand Tracking. 1. Kinect Controls Windows 7 - Win&I. 2. Kinect Hacks Are Going Way Too Far | Utopian Frontiers Foundation.

2. Installing the Kinect Sensor · sanghi/metalab_rgbdemo Wiki. Kinect tutorial 1: First steps - Robotica. 12 BEST Kinect HACKS. Kinect on a Pioneer Mobile Robot with Onboard SBC - MobileRobots Research and Academic Customer Support. How to: Install Kinect in Linux (Mint 12, Ubuntu 12.04) « Igor Barbosa. Kinect on the BeagleBoard (and Ubuntu) Setting up Kinect for programming in Linux (part 1) Freenect - Latest news covering Kinect projects, applications, programming and hacking.