To make the architecture from the first Holograms Catalogue post actually usable, we'll need a real cloud catalogue to retrieve the 3D models from. I'm not going to create a Web API at this point, but we can simulate one using the Azure Storage SDK.
I first added the GitHub repo (created by my Microsoft colleague Dave Douglas) at https://github.com/Unity3dAzure/StorageServices as a git submodule, by opening a git command prompt in my Assets folder and typing:
`git submodule add https://github.com/Unity3dAzure/StorageServices.git`
I then followed the instructions to create an Azure Storage Account and an associated blob container. My intention is to create read-only access, so I will upload the 3D glTF files using the Azure portal interface.
N.B. I intend to use binary glTF files (.glb) with embedded resources for convenience, but the details will come in a subsequent post.
So I got hold of some sample models from the Sample glTF Models repository and uploaded them to my blob container.
When I returned to my code it slowly dawned on me that the Azure Storage access code uses coroutines, which in turn require a MonoBehaviour-derived object, and that breaks the repository pattern architecture I have been striving to create. It seems to me that a MonoBehaviour is really a construct for visual elements, and here I would be tying it to my data. At this point I could have implemented my own StartCoroutine function and had Zenject wire up a Tick() entry point for me, but I came to the conclusion that this code should be asynchronous and therefore shouldn't be in a coroutine at all. The upshot is that I decided to use the experimental .NET 4.6 framework support inside Unity, load up the Azure Storage SDK and use it directly. This way I could keep the implementation of IHologramCollection completely separated from the rest of the code.
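To make the trade-off concrete, here is a sketch of the two shapes side by side. The class and method names are illustrative, not the real SDK types:

```csharp
using System.Collections;
using System.Collections.Generic;
using System.Threading.Tasks;
using UnityEngine;

// Coroutine-based access needs a MonoBehaviour to host StartCoroutine,
// dragging a scene object into what should be pure data access.
public class CatalogueBehaviour : MonoBehaviour
{
    IEnumerator LoadModelList()
    {
        yield return null; // the storage call would go here
    }
}

// Task-based access is a plain class: no MonoBehaviour required, so the
// repository implementation stays independent of the scene graph.
public class TaskBasedCatalogue
{
    public async Task<List<string>> GetModelNamesAsync()
    {
        await Task.Yield(); // the storage call would go here
        return new List<string>();
    }
}
```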
This decision didn't come completely free of challenges, though. The first was that Unity doesn't support NuGet, the package manager that works with Visual Studio, and Unity doesn't currently have its own package management system either. Ordinarily I would use NuGet to download the Azure Storage SDK and manage the versioning, updates and dependencies for me. Instead I had to do this manually, so I used the nuget.org website to work out what the dependencies and versions were, and I used a blank Visual Studio project to pull those down to my machine.
I copied the DLLs into the Plugins folder of my Unity project and was then able to write code that accesses the SDK from my scripts.
For a while I was a bit confused about which assemblies I actually needed, having become used to letting NuGet make that decision for me. To help diagnose this I created a new Visual Studio project based on the C# Holographic DirectX 11 App (Universal Windows) template, which gets installed with the HoloLens emulator.
Then, after using NuGet to add the Azure Storage SDK, I looked in the bin folder for the assemblies included in the app package. Comparing the Microsoft.WindowsAzure.Storage.dll there with the ones I had downloaded manually from NuGet, I could see that it was the win8 target assembly that was required. So I copied this assembly into my Unity project and set it to 'Don't process', as it doesn't work in the editor. I used the net45 target version of the DLL as a placeholder to be used in the editor, so that I could keep developing inside the Unity editor.
Note: this is important when developing for HoloLens, as the iterative build-and-deploy loop is relatively lengthy, so working in the editor can really speed up your process.
I was expecting that this would all just work now, but any attempt to make an asynchronous call from the Unity Editor resulted in a faulted Task being returned.
The fault reported that 'authentication or decryption has failed'. After some research, a Stack Overflow answer alerted me to the following: 'The main reason is that Mono, unlike Microsoft's .NET implementation, does not include trusted root certificates, so all certificate validation will fail by default.' It is possible to hook into the certificate validation process and, since this is just a sample, I can allow all requests through quite simply by executing the following code:
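The hook is essentially a one-liner on ServicePointManager; a minimal sketch (the wrapping helper class and its name are my own) looks like this:

```csharp
using System.Net;
using System.Net.Security;
using System.Security.Cryptography.X509Certificates;

public static class CertificateHandler
{
    // Editor-only workaround: Mono ships without trusted root certificates,
    // so accept every server certificate. Never ship this in production code.
    public static void AllowAll()
    {
        ServicePointManager.ServerCertificateValidationCallback =
            (object sender, X509Certificate cert, X509Chain chain,
             SslPolicyErrors errors) => true;
    }
}
```

Call `CertificateHandler.AllowAll()` once at start-up, ideally guarded by `#if UNITY_EDITOR` so the device build is unaffected.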
Although this would only cause a problem in the Unity Editor, since on the HoloLens we would not be using Mono, it is nevertheless nice to have all of this working in the Editor for the reasons stated earlier, and so we can more easily define the user experience there.
So let's review the final code for the Cloud Catalogue, starting with the repository interface:
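The original listing isn't reproduced here, but an async repository interface along the lines described would look something like this. The method names are my assumption, not the original code:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical shape of the repository interface: pure Task-based
// data access, with no dependency on MonoBehaviour or the scene.
public interface IHologramCollection
{
    // List the names of the .glb models available in the cloud catalogue.
    Task<List<string>> GetHologramNamesAsync();

    // Download the raw bytes of a single binary glTF model.
    Task<byte[]> GetHologramAsync(string name);
}
```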
First, note that the account name and account key, which I extracted from the connection string accessible via the Azure Portal, are supplied via environment variables. It isn't good practice to add keys to source control, especially on GitHub in a public repo. Having said that, I couldn't see a way to set environment variables on my HoloLens device, so I had to resort to pasting my key here. You will need to do the same if you want to run my sample!
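A sketch of that credential handling, assuming environment variable names of my own choosing rather than the original ones:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

// Resolve storage credentials from environment variables so the
// account key stays out of source control.
public static class StorageAccountFactory
{
    public static CloudStorageAccount Create()
    {
        var accountName = Environment.GetEnvironmentVariable("STORAGE_ACCOUNT_NAME");
        var accountKey = Environment.GetEnvironmentVariable("STORAGE_ACCOUNT_KEY");

        // On a HoloLens device there is no obvious way to set environment
        // variables, so you would paste your values here instead (not ideal!).
        var credentials = new StorageCredentials(accountName, accountKey);
        return new CloudStorageAccount(credentials, useHttps: true);
    }
}
```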
Also note that the code uses async/await, which comes with the experimental .NET 4.6 support in Unity and makes asynchronous code a pleasure to write.
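To show the shape of those async/await calls against the SDK, here is a sketch of a catalogue implementation. The container name "holograms" is an assumption for illustration:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Sketch of an Azure Storage-backed catalogue using async/await.
public class CloudCatalogue
{
    readonly CloudBlobContainer _container;

    public CloudCatalogue(CloudStorageAccount account)
    {
        _container = account.CreateCloudBlobClient()
                            .GetContainerReference("holograms");
    }

    public async Task<List<string>> GetHologramNamesAsync()
    {
        var names = new List<string>();
        BlobContinuationToken token = null;
        do
        {
            // ListBlobsSegmentedAsync pages through the container contents.
            var segment = await _container.ListBlobsSegmentedAsync(token);
            foreach (var item in segment.Results)
            {
                var blob = item as CloudBlockBlob;
                if (blob != null)
                    names.Add(blob.Name);
            }
            token = segment.ContinuationToken;
        } while (token != null);
        return names;
    }

    public async Task<byte[]> GetHologramAsync(string name)
    {
        // Download a single .glb blob into memory as raw bytes.
        var blob = _container.GetBlockBlobReference(name);
        using (var stream = new MemoryStream())
        {
            await blob.DownloadToStreamAsync(stream);
            return stream.ToArray();
        }
    }
}
```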
In the next post we’ll look at how to dynamically load those 3D models onto a HoloLens and by extension any Windows Mixed Reality device.
6 thoughts on “Holograms Catalogue–To the Cloud”
Hi Pete, thanks for your interesting series of articles on glTF and cloud storage. I am looking for a fast, secure cloud store for my glTF objects, with potentially (I hope) thousands of users. What, in your view, would be the advantage of Azure storage compared with, for instance, using GitHub as a glTF repository (using the Raw option to get a URI)?
And a second question: would it be possible to get URIs to the glTF (or glb) objects in this Azure storage, instead of or in addition to this rather difficult (for lesser gods than you) solution in Unity with the Azure SDK? I do not want to offer solely Unity apps, but also A-Frame browser apps to end users.
Thanks Pete. You are correct: I do not want to use public strings. Another point against that approach is that you are not allowed to make commercial TurboSquid or Sketchfab models downloadable to the public. So raw GitHub is out.
Then there are the options: 1) request, with security, a temporary URI, or 2) use the REST API.
The advantage of option 1), using a secret URI to a .glb, is being able to use the rich abundance of glTF readers at every level (Unity, A-Frame, three.js, Babylon.js, WebGL; you name it, there is a loader). This is also how Sketchfab's runtime loading works, except there you get a zip file containing the glTF, so there is some unpacking to do, and that on every platform. The advantage would be that zip gives roughly 25% smaller file sizes compared with glb.
But the REST API, option 2), seems more appealing from a security point of view, I think. You do have to tweak the code of your chosen glTF loaders, but since the source of these loaders is normally open, that should be doable.