
Thursday, May 31, 2007

surface

So it has arrived. This is a landmark in the way we engineer our computing environments, and it will set the benchmark for things to come. And it comes from Microsoft. Yes, they beat Apple to it. It is SURFACE: a multi-touch enabled coffee table that lets you drop your digital camera onto the surface and have your images automatically downloaded for viewing on the table. You can then resize, zoom and crop them, or even upload them to another device. It is amazing. The drawback is that it is quite impractical, relying on mirrors, cameras and projectors to work, unlike that other multi-touch device, the iPhone, which will be accessible to the masses and will probably do really well. Madonna has been given a Surface. B**ch. Helsinki has a tourist info wall. I want one. It would look great in my flat. But you can be sure you'll have to use a coaster for your drink.

Saturday, March 10, 2007

the future is nigh

So with all the media attention firmly on Apple and the imminent release of the iPhone, we should really look at why this technology is important. It is not because the product looks pretty (which inevitably it does), and there is certainly nothing ground-breaking about what the device is capable of. What is revolutionary is the interface that has been developed, and with it an era of 'multi-touch' technology. What Apple is doing here is taking existing ideas and creating a viable product with an intuitive user interface (the iPod and the Newton are past examples of this). Multi-touch systems have been around for quite a while, though, with the likes of Jeff Han working on some incredible interfaces (check out his work in the video below). The system was created after its inventor studied the effect of his finger pads on the outside of a glass of water and noticed where his fingers actually made contact. You have to wonder about the influence of science fiction on many of these new technologies, Minority Report being an obvious example. Or is this just an evolution of the way in which we interact with our devices? Are we tired of using a mouse and keyboard, and is this the next step? Of course there are new things to learn. Gone are terms like 'point and click'; enter terms like 'pinch-zoom'. Are we bringing in a new, more energetic way to interact? It is certainly much easier to interact with the physical world, and actually being able to move objects around a screen using our fingers seems like a great (if a little tiring) way to interact.
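For the geeks: the maths behind 'pinch-zoom' is surprisingly simple. Here is a tiny, purely illustrative Python sketch (not Apple's code, obviously, just the general idea): the zoom factor is the ratio of how far apart your two fingers are now to how far apart they were when the gesture started.

```python
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y) touch points."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def pinch_scale(start_touches, current_touches):
    """Zoom factor for a two-finger pinch: greater than 1 when the fingers
    spread apart (zoom in), less than 1 when they squeeze together."""
    start = distance(*start_touches)
    current = distance(*current_touches)
    return current / start if start else 1.0

# Two fingers start 100px apart and spread to 150px apart: zoom in by 50%
print(pinch_scale(((0, 0), (100, 0)), ((0, 0), (150, 0))))  # 1.5
```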



So what is the future going to hold for us in this field? Will there ever come a time when we can simply look at an object and will it to move around a screen? Sounds far fetched? Well believe it or not scientists have developed a system whereby a person's actions can be represented as brain activity in a computer. This computer can then be trained to respond to that same activity even if it is only a thought and not the actual action: hence the ability to move objects with the power of your mind. Check out this crazy monkey brain control video here (thanks to blunt for this one). But this is not the stuff of small research labs, this technology was showcased this week at a gaming conference in San Fransisco. A device that scans the brain can interpret brain activity as intended action and move objects about onscreen. This is cutting edge stuff and we are a long way off having home computers with the computing power to read our minds in an effective way. But the work has begun and the technology is there. Science fiction becomes science fact.
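To give a feel for the 'training' idea, here is a toy Python sketch. It is nothing like the real lab systems and the numbers are made up: it simply averages a few recorded brain-activity readings per intended action, then matches a new reading to whichever average it sits closest to.

```python
from statistics import mean

def train(samples):
    """samples maps an intended action to example feature vectors;
    return one averaged 'template' per action."""
    return {
        action: [mean(channel) for channel in zip(*vectors)]
        for action, vectors in samples.items()
    }

def predict(templates, reading):
    """Return the action whose template is closest to the new reading."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda action: sq_dist(templates[action], reading))

# Hypothetical two-channel readings recorded while the user *imagined* each move
templates = train({
    "move_left":  [[0.9, 0.1], [0.8, 0.2]],
    "move_right": [[0.1, 0.9], [0.2, 0.8]],
})
print(predict(templates, [0.85, 0.15]))  # -> "move_left"
```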

Everything you can imagine is real.
Picasso

Saturday, February 17, 2007

multi-touch

Check out this unbelievable demo of multi-touch software. If you thought the technology behind the iPhone was cutting edge, you won't believe the tech behind this GUI (graphical user interface). I think that interacting using your hands and arms like this would get tiring over time, but it definitely has some interesting applications in the real world. Jared, in the agency, had a great idea that this tech could be used by doctors to perform remote operations: they could be halfway across the world and yet control the operation using techniques that are far more human than fiddling with tiny remote controls. I also find it interesting to be able to expand and explore images onscreen using gestures similar to the 'pinch' technique shown on the iPhone. The exploration of Google Earth using this technique is also really cool.