Google glasses: Surf the net, email, make calls - how the Google goggles work
By Jaya Narain. Updated: 08:55 GMT, 24 February 2012

If you’re the sort of person who spends ages looking for your mobile phone, Google may have the answer. And the solution will be right in front of your eyes, literally. The technology giant is close to launching a pair of futuristic glasses that would deliver all the services of a smartphone straight to the wearer’s eye.

Featuring a miniature display on one lens, the hi-tech specs allow users to surf the internet or deal with text messages and emails without lifting a finger. The screen is controlled with a ‘mouse’ that is moved simply by tilting your head.

Reports suggest the device, which would revolutionise the smartphone market, could be available by the end of this year costing less than £380 – making it cheaper than Apple’s iPhone. It has been reported that the glasses will resemble Oakley’s Thump sunglasses. Google declined to comment.
Yet2com Blog: Holograms On iPad Demoed On Video Using Sixth Sense Technology

I’ve heard many people talking about technology as a whole, claiming that it’s not advancing as fast as in recent years and will eventually stagnate. I don’t agree with this statement at all, and I believe that coming advances in technology will blow the minds of those who disagree. For example, take a look at a technology called Sixth Sense, developed by an MIT research assistant named Pranav Mistry.

At a TED conference a couple of years ago, Pranav demoed a portable projector that lets people interact with objects and with the information those objects display. Once Sixth Sense is adopted at large scale, it will allow people to see reviews of a book in real time while holding that book: the Sixth Sense projector recognises the book you are holding and searches the internet for reviews of it. According to inventor Pranav Mistry, Sixth Sense will be accessible to a big chunk of Earth’s population, as it will cost less than $350.
How Networks Of Biological Cells Solve Distributed Computing Problems

Distributed computing is all the rage these days. The idea is to break computational tasks into convenient chunks and distribute them across a network to a number of computers. The benefits are clear, such as easy, on-demand access to huge computing resources.

The conventional way to think about these systems is as independent Turing machines connected by a network that allows them to exchange large messages. This so-called ‘message passing model’ certainly applies to much of the distributed computing that takes place on the internet, in projects such as SETI@home and Einstein@Home.

But there is a growing awareness that many networks are much more limited, both in the size of the messages they can transmit and receive and in the processing capacity at each node. A biological cell, for example, can transmit and receive only limited amounts of information and can perform only rudimentary processing tasks. That could turn out to have far-reaching consequences.
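The round-by-round exchange that the message passing model describes can be sketched in a few lines of Python. This is a toy illustration, not any particular project’s code: nodes each hold a value and, in synchronous rounds, share only a small fixed-size message (the largest value seen so far) with their neighbours — the kind of limited communication a cell-like node could plausibly manage.

```python
def gossip_max(values, edges, rounds):
    """Toy message-passing model: each node holds one value.
    values: list of ints, one per node.
    edges: list of (i, j) undirected links between nodes.
    Each round, every node sends its neighbours the largest value
    it has seen so far -- a single, small, fixed-size message."""
    n = len(values)
    neighbours = {i: set() for i in range(n)}
    for i, j in edges:
        neighbours[i].add(j)
        neighbours[j].add(i)

    seen = list(values)  # best value known at each node
    for _ in range(rounds):
        updates = list(seen)
        for i in range(n):
            for j in neighbours[i]:
                # Receive the neighbour's one-number message.
                updates[i] = max(updates[i], seen[j])
        seen = updates
    return seen

# On a 4-node line network, the maximum spreads one hop per round.
print(gossip_max([3, 1, 7, 2], [(0, 1), (1, 2), (2, 3)], rounds=3))
# -> [7, 7, 7, 7]
```

Even with one-number messages, information percolates through the whole network given enough rounds; the interesting question the article raises is what such bounded nodes can and cannot compute.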
Startup Makes 'Wireless Router for the Brain'

Optogenetics has been hailed as a breakthrough in biomedical science—it promises to use light to precisely control cells in the brain to manipulate behavior, model disease processes, or even someday to deliver treatments. But so far, optogenetic studies have been hampered by physical constraints. The technology requires expensive, bulky lasers for light sources, and a fiber-optic cable attached to an animal—an encumbrance that makes it difficult to study how manipulating cells affects an animal’s normal behavior.

Now Kendall Research, a startup in Cambridge, Massachusetts, is trying to free optogenetics from these burdens. Christian Wentz, the company’s founder, began the work while a student in Ed Boyden’s lab at MIT. The device, which weighs only three grams, is powered wirelessly by supercapacitors stationed below the animal’s cage or testing area. The wireless capabilities allow researchers to control the optogenetics equipment remotely, or even schedule experiments in advance.
Scientists engineer nanoscale vaults to encapsulate 'nanodisks' for drug delivery

There's no question, drugs work in treating disease. But can they work better, and safer? In recent years, researchers have grappled with the challenge of administering therapeutics in a way that boosts their effectiveness by targeting specific cells in the body while minimizing their potential damage to healthy tissue. The development of new methods that use engineered nanomaterials to transport drugs and release them directly into cells holds great potential in this area.

And while several such drug-delivery systems — including some that use dendrimers, liposomes or polyethylene glycol — have won approval for clinical use, they have been hampered by size limitations and ineffectiveness in accurately targeting tissues. Now, researchers at UCLA have developed a new and potentially far more effective means of targeted drug delivery using nanotechnology. The UCLA research team was led by Leonard H. Rome.

"Vaults can have a broad nanosystems application as malleable nanocapsules," Rome added.
Watson's New Job: IBM Salesman

IBM’s Watson supercomputer reached a milestone in artificial intelligence last February when it beat two Jeopardy! champions. Millions watched, and while some experts dismissed it as a publicity stunt, IBM said Watson would soon be helping doctors diagnose illness, and hinted at talks with gadget companies about Watson helping consumers with questions.

As IBM prepares to celebrate the first anniversary of the televised contest on February 16, though, it is not yet offering the question-answering system for sale. Although limited trials using Watson technology are underway in health and financial services businesses, the AI prodigy is having its biggest impact by pulling in new customers for existing business products—as IBM persuades them to organize their data into formats that an AI like Watson can better understand. IBM hasn’t disclosed how much it spent developing Watson, but the lengthy research and development process is believed to have cost in the tens of millions of dollars.
Activision Shows Off Animated Human That Looks So Real, It's Uncanny

Activision showed off the state of the art of real-time graphics on Wednesday, releasing this mind-boggling character demo. The character's skin, facial expressions and eyes look so real, it's uncanny.

When you watch this video, see if you think this character has reached the other side of what's commonly called the "uncanny valley," a term coined by early robotics guru Masahiro Mori in 1970. It describes the range of sophistication of animated graphics, from one side of the valley where human figures simply look unrealistic, to the middle of the valley — where they look just realistic enough to be creepy — to our side of the valley, where animation is indistinguishable from reality.

Whenever the uncanny valley is mentioned, the animation techniques from the November 2004 movie The Polar Express come to mind. Most viewers noticed the characters weren't quite photorealistic enough to keep them out of the creepy zone. I think this is impressive, but not perfect yet.
Robot Hand Copies Your Movements, Mimics Your Gestures (Video)

Robot see, robot do. Only better. 2010 may go down in history as the year of gesture recognition. We’ve seen it in TVs, we have it in our video games (thanks, Kinect!), and now we have it in our robots.

The Biological Cybernetics Lab at Tsukuba University, headed by Kiyoshi Hoshino, recently demonstrated a robotic arm that can mimic the position and movements of your own. Using two cameras, the system tracks your hand and arm more than 100 times per second and relays the information to the robot so that it can repeat what you do.

The Hoshino Lab should be commended for its cool approach to robotic controls, but I’m not sure how impressed we should be with their camera setup. Also, while I’m totally in favor of gesture-controlled robots, the video above demonstrates that there are some serious limitations to the technology. Even with these limitations, I think that the Hoshino Lab shows us that non-traditional controls for robotics could arrive much sooner than we think.
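The track-then-mimic loop described above can be sketched in Python. This is a hedged illustration, not the Hoshino Lab’s actual software; the joint names and limits are hypothetical. A vision system estimates the operator’s joint angles many times per second, and each estimate is clamped to the robot’s safe range before being sent as a servo command.

```python
# Hypothetical joint limits in degrees for a mimicking robot arm.
JOINT_LIMITS = {"shoulder": (-90, 90), "elbow": (0, 135), "wrist": (-60, 60)}

def clamp(angle, low, high):
    """Restrict an angle to the closed interval [low, high]."""
    return max(low, min(high, angle))

def command_from_pose(estimated_angles):
    """Turn one frame of vision-based pose estimates into servo
    commands, clipping each joint to its safe mechanical range.
    estimated_angles: dict of joint name -> degrees, as a camera
    tracker might report ~100 times per second."""
    return {joint: clamp(angle, *JOINT_LIMITS[joint])
            for joint, angle in estimated_angles.items()}

# A noisy, out-of-range elbow estimate is clipped before being sent.
print(command_from_pose({"shoulder": 30, "elbow": 150, "wrist": -10}))
# -> {'shoulder': 30, 'elbow': 135, 'wrist': -10}
```

Clamping at the command stage is one simple way a real system guards against tracking glitches — a single bad camera frame then nudges the arm to a limit rather than driving it past its mechanical stops.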