HoloLens 2 and Kinect for Azure

Microsoft have been busy. And last night they announced what they had been working on: a new HoloLens device and a cloud-powered reprise of an old favourite.

The new HoloLens looks lovely. It is apparently more comfortable, with the processing power (which is completely contained in the device itself) at the back of the headset to improve balance. It also has a neat “flip up” mechanic that makes it easy to step out of the augmented reality it gives you. They’ve also announced a major improvement in the single principal issue I had with the original device. The augmented display (the computer generated bit that is laid over your view of the world) is now around twice the size of the original, vastly improving your field of view and making the virtual elements much more impressive. And, as a final flourish, the HoloLens now also includes eye-tracking, so the device can work out where you are looking, which should greatly ease interaction and reduce the amount of hand waving you have to do to attract its attention. It looks like another wonderful device that I can’t afford, having a business oriented price tag of around 3,500 dollars.

However, with a bit of luck it might encourage lots of people to upgrade, leading to a flood of cut-price HoloLens 1 devices coming onto the market. I wouldn’t mind that at all…

The second big announcement was the return of the Microsoft Kinect sensor, at a slightly less eye-watering (but still clearly business focused) 399 dollars. I loved the original Kinect sensor so much I wrote a book about it. For the second sensor I created the awesome Carbonite machine, which used the sensor to make a 3D printable object of you embedded in carbonite. The first Kinect was, at the time of release, the fastest selling peripheral ever. Microsoft had high hopes for its successor; in fact for a while it was impossible to buy an Xbox One without getting a Kinect as well. Unfortunately the device never really lived up to its promise on the home front, and the Kinect sensor was quietly retired a while back.

But industry has always found the Kinect very useful, and so Microsoft has brought it back. It has the same depth sensor as the Xbox One version, which is fine by me, but that is paired with a much higher resolution video camera and a bunch more microphones. It’s a great way of generating spatial data that can then be processed in the cloud. I’m really pleased to see it back again, and might even see if I can afford to get one to play with.

Developing for Unity and HoloLens at NASA Space Apps Challenge

I've always said that a hackathon is a great place to investigate new technology. It's an occasion where you can spend time concentrating solely on something, and that can be both instructive and useful. So, when I signed up for the NASA Space Apps event I was keen to try something new.

I had a tiny go with Unity a while back, which was fun, but I've never written code for a Microsoft HoloLens. Number one son had an idea for an app that lets you see where satellites are, whether above you in the sky or below the horizon. The idea was that it would use satellite data to predict positions and then render them in a way that was locked to your present position and orientation. Then you could look around and see what is up there, even through the surface of the earth you're standing on.

It was quite an ambitious objective, particularly as neither of us had developed for the HoloLens before. But we thought we'd have a go. Number one son was in charge of getting the satellite data and doing things with it, while I looked at finding and displaying a globe.

You can get the Unity framework here. It's free for personal use. If you want to make HoloLens applications you'll need some other things, which you can find out about here. You don't need a physical HoloLens to get started; there's an emulator you can use to find out what your apps will look like. I managed to run the emulator, Unity and Visual Studio on my Surface Pro 3 with 8GB of memory and it worked OK (although it got a little upset when I tried to load Adobe Lightroom as well...).

Number one son was using a MacBook Pro for his part of the development, so he installed the Mac version of Unity and the .NET framework and set to work. I was amazed that you can do HoloLens development on a Mac, but when we took his code and moved it to the Surface Pro it worked fine, which I found astonishing. Even compiled DLL files added as assets moved across.

A Unity solution is driven by the assets that it contains. These can be images, models, scripts, DLL files, shaders, sounds and lots of other different things. You create scenes by bringing assets together and create behaviours by binding scripts to events. The scripts can be written in C#. I really like that.

The items in a scene are fiercely hierarchical. Changes to a container will affect the things inside it. Scripts can be bound to objects, and there are start and update behaviours that you fill in to make your scripts act on your objects. Variables in your scripts can be mapped onto elements in game objects and used to affect their appearance and behaviour.
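To give a flavour of that, here's a minimal sketch of a Unity script (the class and field names are my own inventions, not from our project). Attach it to a GameObject in the scene and the public field appears in the Unity Inspector, where it can be tweaked per-object without touching the code:

```csharp
using UnityEngine;

// Sketch of a minimal Unity behaviour script.
public class PulseColour : MonoBehaviour
{
    // Public fields are mapped into the Inspector, so designers
    // can change them per-object without recompiling.
    public Color baseColour = Color.white;

    Renderer myRenderer;

    // Start runs once, just before the first Update for this object.
    void Start()
    {
        myRenderer = GetComponent<Renderer>();
        myRenderer.material.color = baseColour;
    }

    // Update runs every frame.
    void Update()
    {
        // Fade the brightness up and down over time.
        float pulse = 0.5f + 0.5f * Mathf.Sin(Time.time);
        myRenderer.material.color = baseColour * pulse;
    }
}
```

This only runs inside the Unity engine, of course, but it shows the shape of every script you write: set things up in Start, then do per-frame work in Update.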

If you've played with game development in XNA you'll find the "set things up and then update them every frame" way of working very familiar. But it is both more powerful and more confusing, in that every item in your game can have its own start and update behaviours, rather than being driven from a single, central game engine. You can run your game in the editing environment at any time, and you can turn elements on and off at will.

Unity have created an asset store that plugs directly into game projects so that it is very easy to find paid (and free) items that you can include in your game. In no time at all we'd found a really nice globe and I'd kind of managed to get it into a Unity project.

There are some settings that need to be customised in Unity for HoloLens development. You can do this by hand (there are instructions here) or you can find a toolkit that automates the process.

You create your software in Unity and then use it to build a Visual Studio solution that is compiled and deployed to the target device. We had the HoloLens attached to the Surface Pro via a USB cable, and we ran the program that way. You can use WiFi deployment too, but one of the golden rules of hackathons is that once you've got something to work you stop working on it and move on to the next problem.

Number one son made awesome progress. He found some tools online for computing orbits and even tracked down some 3D models for the satellites themselves. I learned a lot (which is software engineer speak for went more slowly) but I did manage to get a globe displayed and spinning.
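The spinning part needs surprisingly little code. Something along these lines (a hypothetical sketch; the name and speed are mine, not the actual project code) does the job when attached to the globe object:

```csharp
using UnityEngine;

// Hypothetical sketch of a globe-spinning script.
// Attach to the globe GameObject; the speed is adjustable in the Inspector.
public class GlobeSpin : MonoBehaviour
{
    public float degreesPerSecond = 10.0f;

    void Update()
    {
        // Rotate about the object's local up axis.
        // Scaling by Time.deltaTime makes the spin frame-rate independent.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime, Space.Self);
    }
}
```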

With half an hour to go before judging we brought the software over from the Mac, fixed a tiny issue with exceptions in the satellite code and then built and deployed the program to the HoloLens. And then the problems started.

Everything was upside down and wrong way round. The code worked fine in Unity on the PC, but on the device it was wrong. And, since we'd not done anything that could cause this behaviour, we didn't really know how to fix it. Not good. 

After a bit of frantic searching we managed to find this, which fixed the problem. By turning off an apparently irrelevant option (MSAA) we got the code to work. This was very annoying. There is no mention of this issue in any of the release notes anywhere. It means that anyone who carefully follows the "getting started" sequence for the HoloLens would be rewarded with a solution that does not work properly and no information as to how to fix it. Not good.
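For reference, we made the change through Unity's Quality settings panel, but the same value can be set from code via the QualitySettings class. A sketch (the class name is my own) that switches multisample anti-aliasing off when the scene loads might look like this:

```csharp
using UnityEngine;

// Sketch: disable multisample anti-aliasing (MSAA) from code.
// We actually changed this in the editor's Quality settings,
// but QualitySettings exposes the same value at runtime.
public class DisableMsaa : MonoBehaviour
{
    void Awake()
    {
        // 0 means MSAA off; the valid values are 0, 2, 4 and 8.
        QualitySettings.antiAliasing = 0;
    }
}
```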

I was really impressed by the ease with which you can get started and the power of the HoloLens itself. I'm going to try and hang on to my loaned device for as long as I can.

HoloLens and Oculus Rift at c4di

I got to the c4di HoloLens and Oculus Rift demos slightly later than planned thanks to a succession of red lights on the way. By the time I arrived the room was pretty much full and there were queues to try out the latest in virtual and augmented reality.

The interesting thing for me was the contrast between the devices. The Oculus Rift is a fairly bulky device attached to a large, powerful PC. The HoloLens just sits on your head with no cables and no external computer, just the device itself. It ran happily on batteries for the time I was there. I had a brief go with it and the experience was just as impressive as it was when I played with it last year. The thing about these devices for me is that, unlike things like stereo TV or multi-channel sound, people try them and just decide that they want more of this.

The Rift (and my weapon of choice, the HTC Vive) take you somewhere else. The HoloLens takes where you are and adds value to the surroundings. They are both awesome technologies and I'm racking my brains to think of an area where they couldn't have an impact. Interesting stuff. Thanks to Trident for arranging the session.