The test scene for the Microsoft HoloLens is finalized. Within the last week, we evaluated the usage of the HoloLens in combination with Unity. For some teammates, it was their first time working with Unity itself. As I have been using Unity for my own game project for a few months, my focus was on Microsoft’s toolkits for the HoloLens.
The results of this week of learning:
The implemented voice commands operate on the currently focused object (focused by the gaze):
- “Start”/”Stop” of the video playback
- “Start”/”Stop” of the audio source’s clip
- “Next”/”Previous” for switching between clips
- “Volume Up”/”Volume Down” for setting the volume
Microsoft’s HoloLens 212 tutorial and its recommendations say that single-syllable commands should be avoided. I used them in this case anyway, as we don’t have many objects and voice commands.
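Our actual scripts differ, but the pattern can be sketched with Unity’s `KeywordRecognizer`. The `Focused<T>()` helper below is a simplified stand-in for the HoloToolkit’s gaze handling, using a plain raycast from the camera:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.Windows.Speech;

public class VoiceCommandManager : MonoBehaviour
{
    private KeywordRecognizer recognizer;
    private readonly Dictionary<string, Action> commands = new Dictionary<string, Action>();

    void Start()
    {
        // Each phrase maps to an action on the currently gaze-focused object.
        commands["Start"] = () => Focused<AudioSource>()?.Play();
        commands["Stop"] = () => Focused<AudioSource>()?.Stop();
        commands["Volume Up"] = () => ChangeVolume(+0.1f);
        commands["Volume Down"] = () => ChangeVolume(-0.1f);

        recognizer = new KeywordRecognizer(commands.Keys.ToArray());
        recognizer.OnPhraseRecognized += args =>
        {
            Action action;
            if (commands.TryGetValue(args.text, out action)) action();
        };
        recognizer.Start();
    }

    // Simplified gaze lookup: raycast from the camera along the view direction.
    private T Focused<T>() where T : Component
    {
        RaycastHit hit;
        var gaze = Camera.main.transform;
        return Physics.Raycast(gaze.position, gaze.forward, out hit)
            ? hit.collider.GetComponent<T>()
            : null;
    }

    private void ChangeVolume(float delta)
    {
        var source = Focused<AudioSource>();
        if (source != null)
            source.volume = Mathf.Clamp01(source.volume + delta);
    }
}
```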
The last voice command is something I might look into in the future, as a command with a value would fit very well here. Instead of saying “Volume Down” several times (my implementation just reduces the volume in 10% steps), I want to say “Volume Down by [X]%”. It seems the DictationRecognizer could be capable of this.
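I have not built this yet, but a rough sketch of the idea with Unity’s `DictationRecognizer` could look like the following; the regex and the volume helper are my own assumptions, not project code:

```csharp
using System.Text.RegularExpressions;
using UnityEngine;
using UnityEngine.Windows.Speech;

public class VolumeDictation : MonoBehaviour
{
    private DictationRecognizer dictation;

    void Start()
    {
        dictation = new DictationRecognizer();
        dictation.DictationResult += (text, confidence) =>
        {
            // Expect phrases like "volume down by 30 percent".
            var match = Regex.Match(text.ToLower(),
                @"volume (up|down) by (\d+)( percent|%)?");
            if (!match.Success) return;

            float delta = int.Parse(match.Groups[2].Value) / 100f;
            if (match.Groups[1].Value == "down") delta = -delta;
            AdjustFocusedVolume(delta);
        };
        dictation.Start();
    }

    private void AdjustFocusedVolume(float delta)
    {
        // Simplified gaze lookup; the real project uses the HoloToolkit's gaze.
        RaycastHit hit;
        var gaze = Camera.main.transform;
        if (Physics.Raycast(gaze.position, gaze.forward, out hit))
        {
            var source = hit.collider.GetComponent<AudioSource>();
            if (source != null)
                source.volume = Mathf.Clamp01(source.volume + delta);
        }
    }
}
```

One caveat I would have to work around: as far as I know, keyword recognition and dictation cannot run at the same time, so switching between the two modes needs extra handling.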
As for the TV, I learned one thing: videos are messy! To play the video on the 3D TV model, I attached a movie texture to the plane representing the screen. As you see in the video above, the frame rate is horrible. And the reason? I haven’t dug deep into it, but I have one guess:
Unity’s handling of videos!
In the VR sector and also in the mobile app sector, videos seem to be a big performance problem at times when used as a movie texture.
Another thought was the HoloLens’ CPU power (something I don’t really see as the problem, as the HoloLens’ CPU/HPU architecture should handle this). We tested simple YouTube videos via the Edge browser and also recorded videos. Those work as intended: stable and with a high frame rate! So the problem does lie within Unity’s processing here.
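For reference, the movie-texture approach described above boils down to something like this sketch, assuming Unity’s legacy `MovieTexture` API (the exact setup in our scene differs, and the clip is assigned in the Inspector):

```csharp
using UnityEngine;

// Plays a MovieTexture on the plane that represents the TV screen.
[RequireComponent(typeof(Renderer), typeof(AudioSource))]
public class TvScreen : MonoBehaviour
{
    public MovieTexture movie; // assigned in the Inspector

    void Start()
    {
        // The video frames render through the plane's material...
        GetComponent<Renderer>().material.mainTexture = movie;

        // ...while the movie's audio track plays through an AudioSource.
        GetComponent<AudioSource>().clip = movie.audioClip;
    }

    public void PlayPause()
    {
        var audio = GetComponent<AudioSource>();
        if (movie.isPlaying) { movie.Pause(); audio.Pause(); }
        else { movie.Play(); audio.Play(); }
    }
}
```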
In addition, my colleague did some fine-tuning for the experience. For placing the objects, he edited the “TapToPlace” script (which enables the user to air-tap and move the objects around while they are focused). His edits allow us to query the normals of the focused objects and also of the spatial map. The result: objects can only be placed on flat ground.
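The core of the flat-ground check can be sketched as follows; this is not his actual edit of TapToPlace, just the idea, and I leave out details such as a layer mask restricting the raycast to the spatial mapping mesh:

```csharp
using UnityEngine;

public class FlatPlacementCheck : MonoBehaviour
{
    // How closely the surface normal must align with "up" to count as flat.
    [Range(0f, 1f)] public float flatThreshold = 0.95f;

    // Raycast along the user's gaze and check whether the hit surface
    // is (roughly) horizontal.
    public bool CanPlaceAtGaze(out Vector3 position)
    {
        position = Vector3.zero;
        var gaze = Camera.main.transform;

        RaycastHit hit;
        if (!Physics.Raycast(gaze.position, gaze.forward, out hit, 10f))
            return false;

        position = hit.point;
        // A dot product near 1 means the normal points straight up: flat ground.
        return Vector3.Dot(hit.normal, Vector3.up) >= flatThreshold;
    }
}
```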
To indicate to the user whether placement is allowed, he also had the smart idea of surrounding the object with a parent GameObject, a box, which has a transparent material attached by default. While TapToPlace is in use, the box changes its color to either red or green.
- Red: Object cannot be placed
- Green: Object can be placed
A smart idea instead of going into the 3D model’s meshes! He just had to remove the box collider of this indicator box, and it worked out fine!
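In essence, the indicator works like this minimal sketch (the alpha values and method names are my own; it assumes a material whose shader supports transparency):

```csharp
using UnityEngine;

// Transparent parent box that turns green or red while the child is placed.
public class PlacementIndicator : MonoBehaviour
{
    private Renderer boxRenderer;

    void Start()
    {
        boxRenderer = GetComponent<Renderer>();
        // The indicator must not block placement raycasts itself.
        Destroy(GetComponent<Collider>());
        SetIdle();
    }

    public void SetIdle()    { SetColor(new Color(1f, 1f, 1f, 0f));   } // invisible
    public void SetValid()   { SetColor(new Color(0f, 1f, 0f, 0.3f)); } // green
    public void SetInvalid() { SetColor(new Color(1f, 0f, 0f, 0.3f)); } // red

    private void SetColor(Color c)
    {
        boxRenderer.material.color = c;
    }
}
```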
Our week-long HoloLens project has ended with the results above. We learned a lot during programming, designing and testing our small test app.
For me especially, it was the first time working with others on one Unity project. The creative flow has been fantastic, and Unity enabled us to bring our ideas to the screen quickly.
The basis of the whole week was learning about Microsoft’s HoloLens. Combining Visual Studio, Unity and the hardware was time-consuming at the very beginning of the project. Creating and deploying the first demo application in Unity took us more or less a day in total. From that day on, we focused on understanding the HoloToolkit (a massive AR asset for the HoloLens) and its usage. Its features were hard to understand, as documentation is mostly done by the community.
Microsoft does provide a great introduction to the HoloLens through the API and a great developer area. The HoloLens 101 tutorial was easy to understand and to apply to get the Origami example up and running. The second series of tutorials, which goes deeper into the HoloLens’ functionality, is not as easy to follow anymore, and some steps seem deprecated, as the HoloToolkit or the files from the tutorial do not work as described. This made the part of “getting into it” frustrating.
(Related: “Microsoft HoloLens – Loudspeaker Test”)
The last bit of learning concerns the Unity environment. The mentioned movie texture is something developers are definitely going to use (a TV in AR! Throw all the digital goods in your home out of the window, you don’t need anything but the HoloLens!). This has been one barrier we experienced. I know there are video assets for Unity, but as the most recommended one, AVPro, has a decent price tag on it, we did not try it.
To sum it up: the HoloLens is alien technology! Spatial mapping and the sound engine work great! Voice commands are done very well, and the efficiency of developing an application is good! Still, there are some performance-related issues, caused either by the deployment or by Unity itself. But in the end, it is nothing that future developments of the HoloLens and Unity could not eradicate!