The Internet of Things and the cloud — Cloud Computing News. We are in the early stages of the Internet of Things, the much anticipated era when all manner of devices can talk to each other and to intermediary services.
But for this era to achieve its full potential, operators must fundamentally change the way they build and run clouds. Why? Machine-to-machine (M2M) interactions are far less failure-tolerant than machine-to-human interactions. Yes, it sucks when your Netflix subscription goes dark in a big cloud outage, and it's bad when your cloud provider loses user data. But it's far worse when a fleet of trucks can no longer report its whereabouts to a central control system designed to regulate how long drivers can stay on the road without resting, or when all the lights in your building go out and the HVAC system dies on a hot day because of a cloud outage. The current cloud infrastructure could crumble under the data weight. In this new role, the cloud will have to step up its game to accommodate more exacting demands. Let's be clear. Design beyond the glowing rectangle - EuroIA2010. A Brief Rant on the Future of Interaction Design.
So, here's a Vision Of The Future that's popular right now.
It's a lot of this sort of thing. As it happens, designing Future Interfaces For The Future used to be my line of work. I had the opportunity to design with real working prototypes, not green screens and After Effects, so there certainly are some interactions in the video which I'm a little skeptical of, given that I've actually tried them and the animators presumably haven't. But that's not my problem with the video. My problem is the opposite, really — this vision, from an interaction perspective, is not visionary. This matters, because visions matter. This little rant isn't going to lay out any grand vision or anything. Before we think about how we should interact with our Tools Of The Future, let's consider what a tool is in the first place. I like this definition: A tool addresses human needs by amplifying human capabilities. That is, a tool converts what we can do into what we want to do.
That's right! Physiological Computing : Biometrics and evaluation of gaming experience part two: a thought experiment. Recent posts on the blog have concerned the topic of psychophysiology (or biometrics) and the evaluation of player experience.
Based on those posts and the comments that followed, I decided to do a thought experiment. Imagine that I work for a big software house that wants to sell as many games as possible and to ensure that its product (which costs on average $3–5 million to develop per platform) is as good as it possibly can be – and one of the suits from upstairs calls and asks me, “How should we be using biometrics as part of our user experience evaluation? The equipment is expensive, it's labour-intensive to analyse and nobody seems to understand what the data means.” (This sentiment is not exaggerated; I once presented a set of fairly ambiguous psychophysiological data to a fellow researcher who nodded purposefully and said, “So the physiology stuff is voodoo.”)
Here’s a list of 10 things I would push for by way of a response. Physiological Computing : Lifestreams, body blogging and sousveillance. Way back in June, I planned to write a post prompted by Kevin Kelly’s talk at the Quantified Self conference in May and a new word I’d heard in an interview with David Brin.
Between then and now, the summer months have whipped by, so please excuse the backtracking – those of you who have seen the site before will have heard of our bodyblogger project, in which physiological data is collected on a continuous basis and shared with others via social media sites or directly on the internet. For instance, most of the time the colour scheme for this website responds to the heart rate changes of one of our bodybloggers (green = normal, yellow = higher than normal, red = much higher than normal – see this for full details). This colour scheme can be mapped over several days, weeks and months to create a colour-chart representation of heart rate data – the one at the top of this post shows a month's worth of data (white spaces = missing data). “What emerges in this new model are Lifestreams.” People Are Walking Architecture, or making NearlyNets with MujiComp, January 2010.
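The traffic-light colour scheme described for the bodyblogger site can be sketched as a simple mapping from heart-rate samples to colours. The post does not give the actual thresholds or baseline used, so the `BASELINE_BPM` value and the 1.1/1.3 multipliers below are illustrative assumptions only:

```python
# Hypothetical sketch of the bodyblogger colour mapping described above.
# Baseline and threshold multipliers are assumptions, not the site's real values.

BASELINE_BPM = 70  # assumed resting heart rate for the bodyblogger

def heart_rate_colour(bpm, baseline=BASELINE_BPM):
    """Map one heart-rate sample to the traffic-light scheme."""
    if bpm is None:
        return "white"   # missing data
    if bpm <= baseline * 1.1:
        return "green"   # normal
    if bpm <= baseline * 1.3:
        return "yellow"  # higher than normal
    return "red"         # much higher than normal

# A run of samples becomes a strip of colour cells, like the
# month-long colour chart mentioned in the post:
samples = [68, 72, 85, None, 110, 74]
print([heart_rate_colour(s) for s in samples])
# → ['green', 'green', 'yellow', 'white', 'red', 'green']
```

Aggregating these per-sample colours over days or months yields the kind of colour-chart representation the post describes, with white cells marking gaps in the data.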