The Kinect motion controller for the Xbox 360 has been a huge success for Microsoft, probably above and beyond what the company expected the peripheral to achieve in the short time it has been available. Removing the need to hold a controller clearly works for consumers and gamers alike. Even though we don’t know when Microsoft plans to replace the Xbox 360, there is one aspect of the new console we can be sure about: Kinect will feature, and will most likely come as standard, integrated into the machine. Eurogamer has been informed that not only is that the case, but that Kinect 2 is far superior in terms of the accuracy it can achieve. Its tracking will be so accurate that it will be able to read your facial expressions and even lip-read. It may sound like a giant leap forward for the technology, but the main difference appears to be the rate of data transfer the new console can handle from the controller.
REDMOND, Wash. — Nov. 14, 2011 — For the past several months, Microsoft’s Windows Embedded business has been laying the foundation for an entirely new category within the traditional embedded market — solutions known as intelligent systems that can extend enterprise software and cloud services out to everyday devices such as point of service (POS) terminals, in-car infotainment, medical equipment and even bar-top game machines. According to Windows Embedded general manager Kevin Dallas, like so many other transformations in the technology world, the move toward intelligent systems is all about information.
Microsoft is readying a new type of Kinect hardware to work with Windows that will build on the current Kinect for Xbox 360 sensor. That revelation comes from a November 22 post on the Kinect for Windows blog by Craig Eisler, the General Manager for Kinect for Windows (and a former leader in the Zune and Mac business units at Microsoft): "(W)e have optimized certain hardware components and made firmware adjustments which better enable PC-centric scenarios."
Kinect for Windows launches in early 2012 and today Microsoft announced that the Kinect hardware for PC is "specially designed to connect with PC." The list of changes includes "shortening the USB cable to ensure reliability across a broad range of computers" and adding a "small dongle to improve coexistence with other USB peripherals." But perhaps the most interesting change is a "near mode" that allows the Kinect sensor to focus on objects between 40 and 50 centimeters away from the camera.
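As a rough illustration of what that reported "near mode" range implies, the sketch below filters depth samples to the 40–50 cm band the article describes. This is hypothetical Python for illustration only, not the actual Kinect for Windows SDK API; the function name and millimeter units are assumptions.

```python
# Hypothetical illustration (not the Kinect SDK): keep only depth samples,
# in millimeters, that fall inside the 40-50 cm band the article
# attributes to "near mode".
NEAR_MIN_MM = 400
NEAR_MAX_MM = 500

def in_near_mode_range(depth_mm: int) -> bool:
    """Return True if a depth reading falls inside the near-mode band."""
    return NEAR_MIN_MM <= depth_mm <= NEAR_MAX_MM

samples = [350, 420, 480, 600]           # example raw depth readings (mm)
near = [d for d in samples if in_near_mode_range(d)]
print(near)  # [420, 480]
```

In the real SDK the sensor itself is configured for near range; a post-hoc filter like this only mimics the effect on already-captured depth data.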
To commemorate the one-year anniversary of Kinect and the Kinect Effect, I sent an email to my team earlier this week. I’d like to quote for you what I said to them: “It all started with a revolutionary sensor and amazing software that turned voice and movement into magic. With that magical combination, last year the Interactive Entertainment Business at Microsoft showed the world how to re-imagine gaming. This year, we’re showing the world how to re-imagine entertainment. Next year, with Kinect for Windows, we will help the world re-imagine everything else.”
Since announcing a few weeks ago that the Kinect for Windows commercial program will launch in early 2012, we’ve been asked whether there will also be new Kinect hardware especially for Windows. The answer is yes; building on the existing Kinect for Xbox 360 device, we have optimized certain hardware components and made firmware adjustments which better enable PC-centric scenarios. Coupled with the numerous upgrades and improvements our team is making to the Software Development Kit (SDK) and runtime, the new hardware delivers features and functionality that Windows developers and Microsoft customers have been asking for. Simple changes include shortening the USB cable to ensure reliability across a broad range of computers and the inclusion of a small dongle to improve coexistence with other USB peripherals.
Microsoft’s next-generation Kinect device may be accurate enough to read your lips. It has also been stated that future Xbox consoles will ship with Kinect 2 built in. Sources speaking to Eurogamer claim the unannounced Kinect 2 will be able to read facial expressions and detect a variety of emotions from the user’s tone of voice. Improved motion-sensing and voice-recognition technology are also expected. The depth sensor in the current Kinect has run at 30 frames per second since the November 2010 launch, with a resolution limit of 320×240.
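The suggestion that data-transfer rate is the limiting factor can be sanity-checked with simple arithmetic. A minimal sketch, assuming 16 bits per depth sample (an assumption; the article states only the 320×240 resolution and 30 fps frame rate):

```python
# Back-of-the-envelope bandwidth for the current Kinect depth stream.
# bytes_per_sample = 2 is an assumed 16-bit depth value; the article
# gives only resolution and frame rate.
width, height = 320, 240
fps = 30
bytes_per_sample = 2

bytes_per_second = width * height * bytes_per_sample * fps
print(f"{bytes_per_second / 1e6:.2f} MB/s")  # 4.61 MB/s
```

Even this modest stream consumes a noticeable share of a USB 2.0 link once color video and audio are added, which is consistent with the article's point that a higher transfer rate would be needed for lip-reading levels of detail.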
You've seen those eye tracking heat maps that show where most people look first when they land on a web page - why not turn eye tracking technology like that into a replacement for your mouse or your finger on a touchscreen? That's what a Danish startup called Senseye claims to be doing; they say they've got software for Android that uses the front-facing camera to track a user's eye movement and then uses that to control what happens on the phone's screen. They're not alone in working on doing that kind of work, either. Eye tracking could be a big new way that users interact with their devices. If the company can really pull this off, Senseye could join the ranks of Microsoft's Kinect, Surface and the touchscreen mobile devices in what people are calling the Natural User Interface (NUI). A Swedish company called Tobii announced in-car eye tracking technology this week as well and these aren't isolated innovations.
October 31, 2011 | By Chris Morris
Engineering embedded systems is an increasingly interesting, disruptive — and lucrative — field for designs ranging from bicycles to firearms to airplanes and beyond. In the following interview, “Making Embedded Systems” author Elecia White (@logicalelegance) describes how she came to write a defining book on the subject, how she came to learn the practice, and why she came to realize that the future is just around the corner. Why are embedded systems important right now? Elecia White: Embedded systems are where the software meets the physical world. As we put tiny computers into all sorts of systems (door locks, airplanes, pacemakers), how we implement the software is truly, terrifyingly important.
As revolutionary as the mobile ecosystem is, it’s the interactions of more-intelligent connected devices with people outside the context of phones or computers that will drive more innovation, says Mark Rolston, the chief creative officer at Frog Design. Rolston, speaking at the Mobile Future Forward conference on Monday in Seattle, described a future where devices become more contextually aware, thanks to embedded and connected sensors. Instead of thinking about the buttons on a phone or a laptop, manufacturers and designers need to think about what will happen when computers are embedded in everything and connected all the time.
Computers — the boxes that we consult in the form of tablets, mobile phones and desktops — are wonderful, but they take away from what it is to be human and to really connect with one another. So the challenge and opportunity that lies ahead is how to get the computers out of computing, said Mark Rolston, the chief creative officer at frog. Speaking at the GigaOM RoadMap conference in San Francisco, Rolston took the audience through a vision of omnipresent computing. “The room is the computer,” he said, as he described putting something like Apple’s Siri voice recognition system into an earpiece, and then being able to interact with a projector in a room to create a screen wherever the user needed one. “Computing is decoupling.”
What's selling many people on Apple's latest handset isn't the impressive hardware: It's the promise of an "invisible interface" through Siri, the iPhone's personal assistant software. Siri is arguably the first working example of how everyday people will interact with connected devices in the near future.