Andy Piper brought his new toy to the lab today. While on a whistle-stop tour of China recently he called in at Hong Kong on the way back, where he picked up one of the Parrot AR.Drones which were released this month.
The AR.Drone is a quadricopter with 2 video cameras, one mounted in the nose and one downward-facing. The drone acts as an ad-hoc Wi-Fi access point, allowing it to be controlled from any device with Wi-Fi. At the moment Parrot are only shipping a client for the iPhone, but there is an API available and there is already footage on the net of an Android Nexus One controlling one. It’s loaded with a bunch of other sensors as well: an accelerometer to help keep it stable and an ultrasound altimeter to help it maintain altitude over changing ground.
The iPhone interface for flying the drone uses the accelerometer and is a bit tricky to start with, but I think with a little practice it shouldn’t take too long to get the hang of it. The feed from the video cameras is sent back to the handset, giving you a pilot’s-eye view. At the moment none of the software allows you to capture this video, but that’s expected to be added soon. You can also use the camera to play AR games, or have the drone hover and hold station over markers on the floor.
The whole thing runs an embedded Linux build on an ARM chip and you can even telnet into it. It comes with 2 chassis, one for outside and one with protective shrouds for the propellers for use indoors.
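For the curious, the published AR.Drone API is a simple one: you send ASCII "AT" commands over UDP to the drone's access-point address. Here is a minimal sketch of building a take-off command in Python; the port number and flag bits are from my reading of the SDK documentation and are worth double-checking before flying anything.

```python
import socket

DRONE_IP = "192.168.1.1"   # address the drone gives itself as a Wi-Fi AP
AT_PORT = 5556             # UDP port the drone listens on for AT commands

# The AT*REF flag word: a base value with bits 18, 20, 22, 24 and 28 set,
# plus bit 9 to request take-off (per the SDK docs; verify for your firmware).
REF_BASE = (1 << 18) | (1 << 20) | (1 << 22) | (1 << 24) | (1 << 28)
REF_TAKEOFF = REF_BASE | (1 << 9)

def at_ref(seq, value):
    """Build an AT*REF command: sequence number, then the flag word."""
    return "AT*REF=%d,%d\r" % (seq, value)

# To actually send it (when associated with the drone's Wi-Fi network):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(at_ref(1, REF_TAKEOFF).encode("ascii"), (DRONE_IP, AT_PORT))
```

Sequence numbers must increase with each command sent, which is why `at_ref` takes `seq` as a parameter rather than hard-coding it.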
I think some very cool stuff should be possible with a platform like this.
Here are 2 short videos I shot of a few of us having a go with it on the lawn in front of Hursley House.
The theme for the day was location and augmented reality.
A particular highlight was a talk by Paul Golding on Augmented Reality & Augmented Virtuality, covering a variety of topics such as the state of Virtual Worlds today, and the potential of mobile augmented reality apps to move us from a “Thumb Culture” to a camera-led “Third Eye culture”.
A number of mobile augmented reality platforms were discussed, such as Nokia’s MARA research project, the QR-based Insqribe, the real-world / virtual-world mobile mashup platform junaio, and the ‘world browser’ Wikitude.
Another platform that got several mentions, including a developer’s crash course in the afternoon from Richard Spence, was Layar.
I had a quiet afternoon in the office, so I thought I’d give the Layar API a quick try for myself.
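A Layar layer is essentially a web service: the Layar client calls your endpoint with the user's position and you return JSON describing nearby points of interest. Below is a rough sketch of the shape of that response from memory of the developer docs; the exact field names (and the microdegree integer coordinates) are assumptions to verify against the current API reference.

```python
import json

def make_poi(poi_id, title, lat_deg, lon_deg):
    """One hotspot. Early versions of the Layar API expected coordinates
    as integers in microdegrees (degrees * 1e6) - an assumption here."""
    return {
        "id": poi_id,
        "title": title,
        "lat": int(round(lat_deg * 1e6)),
        "lon": int(round(lon_deg * 1e6)),
    }

def get_pois_response(layer_name, hotspots):
    """Minimal JSON body in the rough shape of a Layar getPOIs response."""
    return json.dumps({
        "layer": layer_name,
        "hotspots": hotspots,
        "errorCode": 0,        # 0 signals success to the client
        "errorString": "ok",
    })

# Hypothetical example: a single POI at Hursley House
body = get_pois_response(
    "eightbar", [make_poi("1", "Hursley House", 51.0256, -1.3968)]
)
```

In a real layer this body would be returned by a small web server registered as the layer's POI URL in the Layar publishing site.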
Last night I was invited to speak as a guest to a very diverse class of MBA students and entrepreneurs at Babson (thank you, Linda, for the invite). The conversation of course happened in Second Life, and also happened to be around midnight my time. That in itself is almost routine now, though for a change I was using voice and watching for text questions. What was great was that the subject was not the metaverse itself, though I did throw in some futures like 3D printing and augmented reality. No, the subject was the story of eightbar: the steps to get to the point we are at, and how, despite various things stacked against many of us, we just carried on and did the right thing. ***UPDATE: @abelniak, who was at the meeting, tweeted that he had written this post on his perspective as a member of the class. Once again the power of social media and the willingness to share and build is in action.
There were also some honest statements about the risk of being a pioneer, and the fear that self-organized groups can generate in traditional control structures.
I really enjoyed talking to the group, and there were some great questions from some clever minds.
We discussed leadership in particular, and what differs or stays the same in virtual worlds. My general answer is that a good leader will adapt; those true leaders already in traditional places of power have the emotional skills to lead and inspire anywhere. However, the new connected world and the removal of location as a barrier unleash the abilities of anyone who wants to lead.
Anyway, a huge thanks to everyone who came along; thank you for listening and following up. It became clear to me there is huge value in sharing this story now. It is constantly evolving for me: epredator, eightbar, metaverses, IBM. At any point in time it has things to learn from and things to share.
Thank you to AnnieOk for pointing me towards the video and articles on the MIT Fluid Interfaces work that got such a good reception at TED 2009. This is brilliant work. You have to see this, and go to Wired to read the rest of the article.
Projection mixed with gesture and finger tracking: whilst it looks a little cumbersome, this is showing some very clever things actually working.
What I like about projection (though I find the personal ways to get an AR experience relevant too) is the potential to share with others, just as it has become common, as I have mentioned before, to see people gathered around an iPhone on the table.
It’s been quite a week for seeing things often talked about actually working.
Over at Redmonk, James Governor has written a very interesting piece on what has happened to the Microsoft ESP platform. Mirror worlds, accurate representations of real things, ideally instrumented by a raft of sensors from the real world, are a very specific, and obvious, use of virtual worlds. After all, pilots already spend a large amount of time training in such environments and we entrust our lives to them. (It would be interesting to know how much virtual training the Hero of the Hudson had had regarding water landings.)
James said of ESP that it was “the single coolest initiative I have seen from Microsoft in the 13 years I have been watching the firm”, but now it appears there is a drop in focus on it.
There are of course lots of other mirror worlds and hybrid mirrors out there, but as yet there is not a good commercial high-fidelity toolkit that can be used to build specific mirrors.
Google Earth is clearly the richest in terms of global-level instrumentation, but it is at a much more fine-grained and real-time level that we will see the benefits.
I am not sure what we would do with a live, as-is model of the world, accurate and instrumented in every way possible, but as a concept, and given the fascination people have for maps, photos, and satellite images of their part of the planet, it seems a worthwhile goal to make a true mirror world.
An accurate model of an environment is also a base requirement for enhancing the real world with augmented reality systems, like the ones we already have for GPS tracking. Without the accurate map (a digital model of the world), the GPS position is of less use to the average user.
As James also says, though, there is some speculation about the future of the ESP platform. So I guess we will have to wait and see.
Whilst we may have missed the advertising-fest that is the US Super Bowl, we do at least get to see some of the great ideas courtesy of YouTube. For me the most significant was this one. (Thanks Roo for finding it first 🙂 )
It speaks for itself, but does have some subtle little niceties. What it does show is a mainstream appreciation that we all have various avatars and visual personas that we engage with anywhere and everywhere, on mobile devices and in coffee shops.
Mainstream appreciation of the adoption of this way of interacting?
I moved offices today and having a bright new whiteboard I could not leave it clean for long.
It’s not really a mindmap, just some association of thoughts and bits of linkage. I am sure it will alter, but right now this is what was in my head in a mad flurry. The underlying red part is really the substrate of the whole thing: just my personal thoughts linked to some of the things I have seen and been involved with one way and another.
Note: edited to show a smaller version of the board, as it was cropping the important right-hand side for those who did not click through to Flickr. 3D printing FTW, and high-value professional social networks are on there too!
Thanks again to Malburns for spotting this and tweeting it. Rivers Run Red have released an example of an application layered onto immersive workspaces in Second Life. In this case it is around retail planning and visualization.
This is an example of the next layer of toolsets that we can expect to see across virtual worlds, as those virtual worlds become a platform not just a place.
Producing what-if scenarios, or mirror-world scenarios, needs the ability to simply sketch and examine the possibilities, whether it’s a retail store, a machine room or an intricate business model that cannot normally be visualized.
The exciting thing about this for us here at eightbar is that it brings us a step closer to being able to instrument the model with real live data via publish/subscribe methods such as MQTT. Merging data from a smart planet into immersive visualizations that can be explored together, rather than standalone, is clearly a direction we have been pushing since even before the 2006 Wimbledon. Hursley is (for those who don’t know) the home of messaging: pub/sub and reliable MQ messaging.
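To make the pub/sub idea concrete, here is a small sketch of the sort of bridge being described: a sensor reading shaped into an MQTT topic and JSON payload, which an in-world visualization would subscribe to. The topic layout, payload format and broker address are my own assumptions, not a fixed standard, and the publish step is shown only as a comment.

```python
import json

BROKER_HOST = "broker.example.com"   # hypothetical MQTT broker address

def sensor_message(store_id, sensor, value):
    """Shape one reading as (topic, payload). The store/<id>/sensor/<name>
    topic layout and the JSON payload are illustrative assumptions."""
    topic = "store/%s/sensor/%s" % (store_id, sensor)
    payload = json.dumps({"value": value, "units": "celsius"})
    return topic, payload

topic, payload = sensor_message("hursley-01", "temperature", 21.5)

# With a real MQTT client library the reading would then be published,
# and the virtual-world client would subscribe to the same topic, e.g.:
#   client.connect(BROKER_HOST)
#   client.publish(topic, payload)
```

The point of the topic hierarchy is that a visualization can subscribe with a wildcard (e.g. everything under one store) and drive many in-world objects from a single subscription.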
Andy Piper was telling me that he had been listening to a podcast with Leo Laporte in which Leo basically dissed Second Life as a gimmick and suggested it was not all that. Well, I did have a listen to the end of the podcast, and sure enough both Leo and Amber MacArthur made some throwaway comments about the value of Second Life. To be fair, it was not a rant. In some ways it was about the time and effort required to engage, and they did give a shout-out to the great communities that have formed. Though Leo did use the word gimmick.
Clearly they only said Second Life, not virtual worlds in general, so they may have been dealing with a specific experience and press bubble, but it is an odd position to take for someone with a reputation for transitioning across media. Yes, it can take a little bit of time to engage with people in virtual worlds, but that is the point in many ways.
There is room for text, for Twitter, for podcasts (which I sometimes find very time-consuming to have to listen to), for virtual world events, and for whatever comes next.
Without virtual worlds we have no place to take this further: no mirror worlds, no augmented reality, no 3D printing/rapid fabrication.
It is of course horses for courses, but I don’t think any of us in any field should consider excluding any of the others or writing them off out of hand. Text and voice still work, of course.
The journey of discovering good ways to interact with one another is one I think we are all on, so I will just let them off this time 🙂
If they want a metaverse evangelist on the show to explain… well, happy to help.