Wimbledon roof garden – Pimms or Prims

Judge has built a roof garden based on the real-life media centre roof garden. When Tara5 Oh came to visit we had a misunderstanding over a bottle of Pimms (a Wimbledon tradition of sorts) and the word prim (somewhat popular in Second Life).
I cut a video of both the real and the virtual to illustrate the two interpretations.

Real Life Wimbledon still Rezzing

Wimbledon is building a new court number 2, and I got to go and have a look. The whole place is currently plain concrete and also has some terracotta-warrior-style statues of the players. All in all it looks like a Second Life render of a place that has not had any textures applied yet. (It was Ricky who pointed this out to me yesterday, so I had to go and see; he is right, I think.)
Court 2
The avatars of the players are in the video below, along with a nice particle effect (they were watering the real grass with real water and getting a real rainbow).
Replacement video, as the other one was incorrectly identified as copyright infringement. Still doing the paperwork to clear my good name.

Eating the IT Elephant

I just got the review copy of Richard Hopkins and Kevin Jenkins's book Eating the IT Elephant: Moving from Greenfield Development to Brownfield.
Eating the IT Elephant
I have known and worked with Richard for quite a few years, so I was very interested when he started to talk about this book. When the idea exploded into using Second Life for visualization of existing system architectures (starting on Hursley island), it got me even more interested.
turner boehms original build
Image from snapzilla
The book is not solely about using virtual worlds to visualize systems, but that is part of the whole. For any IT architects and software engineers out there, many of the themes around complexity will be familiar. As will the not-so-good solutions of representing complex architectures in reduced-down PowerPoint slides or stickers on a wall.
There is a lot more to the book, and I need to read the rest properly. It has forewords by Grady Booch and by Chris Winter, one of the UK-based IBM Fellows. They make interesting reading. Though I really like Richard's family dedication; I won't spoil that for you 🙂
They have their own site and blog over at elephanteaters.org. The book is on Amazon; the UK link is here.

The Eightbar brand – part 3

Following on from Ian's posts about the Eightbar gang sign, large hands, and sign language in Second Life, I have created the Eightbar gang sign using my good friend Anna.

Anna is an animated avatar developed by the University of East Anglia's eSign project to synthesize sign language; she was used as part of an Extreme Blue project to convert speech to sign last summer. Anna is animated using Signing Gesture Markup Language (SiGML), which is based on the internationally established notation for sign, HamNoSys. Currently, to create signs for Anna, eSign provide an editor which is very good but requires a reasonable amount of time to learn to use efficiently. For a few days per week I have been working on a way for people who are not familiar with the eSign editor to create signs for Anna, in the hope that creating signs can be the sort of thing you just dip into when you have a spare minute and a sign you would like to create. With this in mind I decided, as a test, to create the Eightbar gang sign using my interface.

After 15 minutes of playing about, I had a gang sign! It took a few attempts to get there, mind…

But finally Anna was throwing the gang sign like a pro…

Eightbar gang sign

Metaverse object creation

For a while now our CIO Innovate Quick team have been busy on the Torque-based metaverse. As we have explained before, this is primarily aimed at having a set of resources whose code we can delve into a bit deeper. The aim is not to create an all-encompassing platform; personally, I don't think there will ever need to be an all-encompassing one.
The team creating, writing, and enhancing what we have has grown recently, which means some of the things we were missing, but wanted to put in, are now getting done.
The biggest of these just went live: object manipulation. We tend to develop the Web 2.0 way and put things out there for people, fellow IBMers, to experiment with and break at will. This principle has been used for lots of things under our Technology Adoption Program (TAP). Applications get floated out under this banner, and successful ones get adopted in a more serious fashion.
Torque as a game engine already had a live editing function and interface, but this was only really for the server, not for each client to interact with.
Now we have the start of the ability to create objects from the palette in real time. The strange thing was that, despite having been able to do this on the server for some time, I felt different rezzing and moving an object as an avatar. I knew that I missed the ability to do this, having been spoilt in that sense by Second Life. But being able to enter a world and just rez, even if you are not going to do it all the time, seems to have a different psychological impact (at least on me!) to more static virtual worlds where it is merely your participation that is allowed.
Of course, as soon as you can create, a power-mad spark fires up and you want to build lots of things, more and better. However, for now we are able to investigate what this will mean for our colleagues entering this environment, for a comfort factor, and then move on to examine the business value of creation and manipulation in our own environment.
penguin ball
So yes, this is not a complex business object that we are interacting with, but I was able to build it live (from the objects we have), and that is an interesting start to another part of the journey.

Sport Relief 08 – Charity Fundraiser in Second Life

The excellent charity Sport Relief is holding a charity event 14-16th March in Second Life. There are various press releases flying around, but as this is for a good cause a little bit of viral marketing does not go amiss. The sims themselves are not yet public, but the avatar-based racing looks like it will probably need you to join the group "Sport Relief Fund Raisers" owned by Lottie WeAreHere.
I am sure I will be able to get along and participate, and as with many of the other charity events we can all do some good with our virtual world presences and get donating.
If you are not sure what this is all about then check out the official Sport Relief site.
More to come over the weekend and good luck to everyone running it.
*Updated the dates and the group name 🙂

Powerup – A serious game out in the wild

Powerup, a game created by our colleagues here in IBM using the Torque game engine, recently went live.
It is a scenario based game, with a multiplayer online team approach to help students understand the role of science in saving a planet.
This fits into the growing usage and acceptance of game-based technology, and into finding ways to engage people in serious subjects in a more entertaining way.
The game is a downloadable Windows client that then connects to the main servers.
I captured a little video of some of the elements; the frame rate is much better live, of course.

You get to select a first and second name from a list; this is to avoid any risky names, I believe. So for once I am not epredator, but I did log on as Ian Hughes, imagine that! Though I did manage to select only the female avatar.
The rest of the communication is via menu systems, not freeform chat, again this is a protection mechanism given the target audience for the game.
Players can join missions, which are typically sharded, and tasks are designed for more than one person to complete.
The official website has much more information for parents and teachers. We here at eightbar cannot claim to have built any of this, though we have known about the development, as it's obviously closely related to our Torque-based internal metaverse and is another spin-off of bringing IBM into virtual worlds and games in general (which I think we can rightly claim to have set the ball rolling on 🙂 )

Long Live the infocenter !

I’ve always been a bit scared of infocenters – even though, deep down, I know they’re “just HTML”; they never quite seem that way. Javascript and to-the-pixel object placement is just getting too good these days. You could almost mistake it for a java applet or at least some kind of fancy AJAX application.

But no, it’s just a set of good-old framesets, frames, HTML content, hyperlinks and images, bound together with some javascript eggwhite and stirred vigorously for a few minutes to make the infocenters we know and (some, I hear) love.

However, to make it seem like it’s “alive”, there is a Java servlet lurking back at the server, generating parts of the Infocenter dynamically, including rendering the Table of Contents from a behind-the-scenes XML description, and running search and bookmarks and things like that.

What I became curious about, then, were two things:

  • Could we extract a sub-set of an infocenter and just display that, rather than having to wade through everything we were given? For example, I might only be interested in the administration section of a product, or might only need to know about one component of a toolkit of many components. Having a more navigable and less intimidating sub-set would greatly improve productivity.
  • Rather than having to install an Eclipse infocenter run time on a server to host a set of documentation, is there a way to run it on any plain old HTTPd (e.g. Apache)? I accept that search, bookmarks, and other dynamic features won’t work, but the real information – the useful stuff in the right-hand window, which we use to do our jobs with the products we’re trying to understand; and the all-important navigational Table of Contents structure in the left-hand window – would be available to us “anywhere” we can put an HTTPd.

With a ThinkFriday afternoon ahead of me, I thought I’d see what could be done. And the outcome (to save you having to read the rest of this!) is rather pleasing: Lotus Expeditor micro broker infocenter.

This is a subset of the Lotus Expeditor infocenter containing just the microbroker component, being served as static pages from an Apache web server.

First the information content. The challenge I set was to extract the sections of the Lotus Expeditor documentation which relate to the microbroker component. It has always been a bit of a struggle to find these sections hidden amongst all the other information, as it’s in rather non-obvious places, and somewhat spread around. This means creating a new navigation tree for the left-hand pane of the Infocenter. When you click on a link in the navigation tree, that particular topic of information is loaded into the right-hand window.

However, it quickly became apparent that just picking the microbroker references from the existing nav tree would yield an unsatisfactory result: the topics need to be arranged into a sensible structure so that someone looking for information on how to perform a particular task would be guided to the right information topic. Just picking leaf nodes from the Lotus Expeditor navigation tree would leave us with some oddly dangling information topics.

Fortunately Laura Cowen, a colleague in the Hursley User Technologies department for messaging products, does this for a living, and so was able to separate out the microbroker wheat from the rest of the Expeditor documentation and reorganise the topics into a structure that makes sense outside the context of the bigger Expeditor Toolkit, and also, to be honest, into a much more meaningful and sensible shape for microbroker users.

First we needed to recreate the XML which the infocenter runtime server uses to serve up the HTML of the navigation tree. Laura gave me a sample of the XML, which contains the title and URL topic link. From the HTML source of the full Expeditor navigation tree, using a few lines of Perl, I was able to re-create XML stanzas for the entries in the navigation tree. Laura then restructured these into the shape we wanted, throwing out the ones we didn’t want, and adding in extra non-leaf nodes in the tree to achieve the information architecture she wanted to create.

Wave a magic wand, and that XML file becomes a plug-in zip file that can be offered up to an infocenter run time, and the resulting HTML content viewed. After some iterative reviews with potential future users of the microbroker infocenter, we finalised a navigation tree that balanced usability with not having to create new information topics, apart from a few placeholders for non-leaf nodes in the new navigation tree.
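For the curious, the "magic wand" is essentially the standard Eclipse help plug-in mechanism: the TOC XML plus a small manifest, zipped up. A minimal sketch, with the file name below invented for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- plugin.xml: registers the restructured navigation tree with the
     help system (microbroker_toc.xml is a made-up example name) -->
<plugin>
   <extension point="org.eclipse.help.toc">
      <toc file="microbroker_toc.xml" primary="true"/>
   </extension>
</plugin>
```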

So far so good – we had an infocenter for just the microbroker component of Expeditor, and it was nicely restructured into a useful information architecture.

Now for phase two of the cunning plan: can we host that on a plain-old HTTPd without the infocenter run time behind it? The information topics (the pages that appear in the right-hand window) are static already, and didn’t need to be rehosted – the existing server for the Lotus Expeditor product documentation does a perfectly good job of serving up those HTML pages. It’s the rest of the Infocenter, the multiple nested framesets which make up the Infocenter “app”, and the all-important navigation tree, which are dynamically served, from a set of Java Server Pages (JSPs).

A quick peek at the HTML source revealed that several JSPs were being used with different parameter sets to create different parts of the displayed HTML. These would have to be “flattened” to something that a regular web server could host. A few wgets against the infocenter server produced most of the static HTML we would need, but quite a few URLs needed changing to make them unique when converted to flat filenames. A bit of Perl and a bit of hand editing sorted that lot out.

Then it transpired there are "basic" and "advanced" modes which the back-end servlet uses to (presumably) support lesser browsers (like wget 😐 ). Having realised what was going on, a bit of tweaking of the wget parameters to make it pretend to be Firefox brought the "advanced" content through from the server.

Then we had to bulk get the images – there are lots of little icons for pages, twisties, and various bits of window dressing for the infocenter window structure. All of this was assembled into a directory structure and made visible to an Apache HTTPd.

Et voila! It worked! Very cool! An infocenter for the microbroker running on a straight HTTPd. Flushed with success, we moved it over to MQTT.org (the friendly fan-zine web site for the MQ Telemetry Transport and related products like microbroker). Tried it there…

Didn’t work. Lots of broken links, empty windows and error loading page stuff. Seems the HTTPd on MQTT.org isn’t quite as forgiving as mine: files with a .jsp extension were being served back with the MIME type text/plain rather than text/html, which may not look like much, but makes all the difference. So a set of symlinks of .jsp files to .html files, and another quick wave of a perl script over the HTML files put everything right.

So with an afternoon’s work, we were able to demonstrate to our considerable satisfaction, that we could excise a sub-set of an Infocenter from a larger book, restructure it into a new shape, and take the resulting Infocenter content and flatten it to a set of HTML pages which can be served from a regular HTTP server.

IBM at the NRF

Does your avatar know how to make actual money? Bernadette Duponchel’s does. She was recently at the National Retail Federation conference with the rest of her team, presenting IBM’s take on virtual worlds for the fashion design industry.


This is the second consecutive year IBM has demonstrated the use of virtual worlds at the NRF. The brief demo highlights the benefits of real-time collaborative design, short feedback loops when tweaking materials and costs, and even pre-selling the item before it is physically manufactured.

VR Cave, Second Life and Retail at the NRF

All things virtual featured in the latest NRF retail show. I spent some time talking in Second Life to my colleague and fellow member of eightbar, Siobhan Cioc, about the Dallas-based GSC demo centre.
There is a background story to this that is also of interest to the Web2.0 community trying to establish the value of both blogging and virtual worlds.
I had seen a press release about IBM at NRF. Now that metaverse acceptance has spread, there are too many things and spin-offs for even us tuned-in metaverse evangelists to keep up with. So here is what happened…
I saw the press release and the mention of a CAVE demo. I blogged internally about it to see if anyone had anything. Of course, many of the people involved were at the show, so that was always going to be a slow-burn approach. However, the question was out there: what are we doing? Then an article about NRF appeared on our intranet with a link to… yes, an NRF blog. I tracked back on the blog entry, asking the question again about the SL/CAVE piece to try and connect the threads. Siobhan Cioc's real-life presence both replied on my blog and Sametime instant messaged me. We then both dived into our public SL islands, where she explained what the project was all about. I listened and also took a small snap of film, which I have just put on YouTube.
Now we have connected, discussed who we know in common, and worked out some ways to help one another, because we both used all the available technology and approaches to connect. Why blog inside a company firewall? Well, there is your answer. We get people connected and questions answered, some instantly.
I digress (I think I may have turned into Ronnie Corbett)
So, the CAVE project. You may be able to see this here, though I am not sure how well the link to the Fox News item will work; I will post a better one when I find it, as this is hot off the press after all.
So, just in case, here is the explanation with some SL footage.
The team have created a configurable room in SL. The room is HUD-controlled: the HUD allows elements in the room, in this case TVs, speakers, and even the starlit sky above it, to be adjusted. This approach has been shown before in various ways, such as the Circuit City couch and the Sears kitchens over on IBM 10. The difference here, though, was that the demo was built specifically to integrate with a head-mounted display, and the booth was built for this sort of interaction to occur.
Yes, we have seen 3D rooms in VR before, but the difference here is that this is on a public multi-user platform. Much of what we saw with VR before was single-user or just very expensive. This example was at a general retail conference, mainstream.
Being able to configure a room or a product experience and share it with others, whether they are friends, other people with a vested interest, and/or experts from the store or business, is a very significant point.
Having a booth with a single SL or metaverse experience is good, but multiple headset stations might have made it more obvious to people that the experience is shared and that others need to be involved in the process.

I think we will see lots more concrete examples this year both of additional interfaces into virtual worlds, but more importantly interaction with existing enterprise systems such as product fulfilment.
Another thing to consider here is scale. People often worry about how many people you can get in one space; Roo has a good post on that on the way very soon. Here we have an example where, if you are having a personal-shopper experience, you do not want a huge crowd of thousands around you.