Open Scale VR for Architecture: Test 2

A few weeks ago, we attempted to break out of room-scale VR. Why? We wanted a teleportation-free architectural walkthrough experience. Since the new Microsoft Mixed Reality headsets are not tethered to external hardware, with a little finagling we were able to walk the full length of an 82' × 36' building in a park without teleportation.


In our first Open Scale Test, we did the impossible, but only technically. The unusual (and definitely unsupported) conditions we subjected the Windows MR headset to caused it to lose tracking, and the laptop wasn't ideal either: it abruptly stopped its screen recording to preserve battery life, which is why the video cuts off short.

But the amount of interest in our first test surprised even us. Architects and LEAN consultants began to get as excited as we were about the potential of Open Scale. We decided to adjust our approach and test again.

Backpacks Are for Nerds

But first, we knew we needed a computer made for mobile VR. We put the word out to the Portland tech scene and almost immediately found a volunteer willing to lend us one. (Portland really is lovely. But the brunch lines indicate that we have plenty of people. Don't move here.) 360 Labs loaned us this absolutely stunning HP backpack.

The night before our Open Scale test, we tried out the backpack computer in a gaming environment. For some unknown reason, Windows 10 does not currently recognize its own headset as a display and instead required us to plug into an external monitor. We used a 7-inch monitor, which severely hampered the sleekness of the backpack but worked in a pinch. Once the external monitor was connected, we could control the headset from within it.

How we modified for Test 2

We prepared for our second Open Scale test with these modifications:

  1. Feature Tracking. The Microsoft MR headset tracks itself in space by looking at visual features in its surroundings. We wanted to see whether feature tracking outdoors would improve with the sharper corners and cleaner lines of a basketball court. Then, just to be sure, we placed fiducial markers all over the place, hoping they would help its tracking. Spoiler alert: they didn't.
  2. Slope Adjustment. Slight elevation slopes in the real world meant that when Logan explored the far end of his building in Test 1, his legs appeared in the headset to sink a foot through the floor. Hilarious, but not exactly helpful for our purposes. To correct for this, he wrote a script that sampled the location of the ground beneath the headset and kept a running average to adjust the floor height. Thus, as he moved slowly downhill, the visualization should adjust to keep him grounded on the floor.

  3. Dynamic Chaperone. Without the boundaries you have in normal VR, you're unprotected from running into things. So, again using the mesh generated by the inside-out tracking technology, he wrote a dynamic chaperone. The chaperone compares, collects, and synthesizes the points the headset generates to identify objects (desks, curbs, lazy pets) above the ground plane and alerts the user to avoid stepping on or into them.
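The slope adjustment described in item 2 can be sketched in a few lines. This is a hypothetical illustration, not the actual script: the class name `SlopeAdjuster`, the window size, and the idea of feeding it heights directly (rather than raycasting against the tracked mesh, as a real headset script would) are all assumptions for the sake of the example.

```python
from collections import deque

class SlopeAdjuster:
    """Keep a running average of recent ground-height samples so the
    virtual floor can track a gently sloping real-world surface.

    Hypothetical sketch: in a real headset script the samples would come
    from raycasting the tracked environment mesh under the headset.
    """

    def __init__(self, window=60):
        # Only the most recent `window` samples contribute to the average.
        self.samples = deque(maxlen=window)

    def add_sample(self, ground_height):
        self.samples.append(ground_height)

    def floor_offset(self):
        # The averaged height becomes the new virtual floor level.
        if not self.samples:
            return 0.0
        return sum(self.samples) / len(self.samples)
```

Averaging (rather than snapping to the latest sample) smooths out tracking noise, so the floor drifts gradually as the user walks downhill instead of jumping on every bad reading.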
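The dynamic chaperone in item 3 boils down to filtering the tracked point cloud. The sketch below is an assumption about how such a check could work, not the author's actual code: the function name, thresholds, and flat list-of-points input are all invented for illustration.

```python
import math

def obstacle_warnings(mesh_points, head_pos, floor_y=0.0,
                      height_thresh=0.1, warn_radius=1.5):
    """Hypothetical dynamic-chaperone check.

    Flags tracked mesh points that rise above the ground plane (desks,
    curbs, lazy pets) and sit within a warning radius of the user's
    horizontal position.

    mesh_points: iterable of (x, y, z) points from inside-out tracking
    head_pos:    (x, y, z) position of the headset
    Returns the above-floor points close enough to warn about.
    """
    hx, _, hz = head_pos
    warnings = []
    for (x, y, z) in mesh_points:
        if y - floor_y < height_thresh:
            continue  # part of the ground plane; safe to walk on
        # Horizontal distance only: head height is irrelevant here.
        if math.hypot(x - hx, z - hz) <= warn_radius:
            warnings.append((x, y, z))
    return warnings
```

A real implementation would run this against the headset's live mesh every frame and fade in a visual boundary near flagged points, but the core test — above the floor plane, and near the user — is the same.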

What We Learned

Despite our best efforts, nothing we tried helped. But here were a few insights we gained from this exercise:

  1. Outdoor inside-out tracking is a no-go, and the culprit wasn't the leafy park or the grass. It was shadows. We didn't realize it at the time, but the fact that our first successful test was performed on an overcast day was more significant than any of our modifications. Our tracking was slightly worse the second day, and the motion controllers struggled to track their position in space. The long, rapidly moving shadows definitely hurt tracking.
  2. The dynamic chaperone and slope adjustment didn't work in practice. In our experience developing for the HoloLens, you can access the data it uses to track itself. We discovered too late that accessing this data on the MR headset is not supported.

    In the future, we're going to beg Microsoft to allow access to this data (please?), or we will use an external sensor. If we were to continue using the HP backpack, that would mean shuffling a sensor, a headset, two controllers, a tiny external monitor, an external keyboard, and a mouse just in case voice commands don't work (and occasionally they won't; developer headset bugs). In any case, with all this equipment it's starting to feel a bit like a circus.

Failing While Freezing

If we had known then that none of our attempts would improve our Open Scale test experience, we might have saved it for a less chilly day. The feeling of failing while freezing was somewhat demoralizing. Indeed, I was going to delay writing about this test until we had another Success with a capital S. But it's important to share just how complicated this stuff still is, and to remind ourselves and our clients what we mean when we say VR is easy.

Opening up a mixed reality headset with an HP backpack computer outdoors reminded us of three years ago, when VR was fresh: everything was hard and nothing worked consistently. Since then, software and hardware have improved such that getting your building from Revit into VR is basically instant.

So yeah, compared to 2½ years ago, VR is easy. But as we canvass architecture firms, we have learned that for architects, knowing how and when to use VR is still a challenge. One our team intends to address.

As for our next Open Scale test: who has a very large empty warehouse we can borrow for a day?

Maret Thatcher