Why “what hardware?” is the wrong question when it comes to XR
At Bevel, our holidays are always punctuated with emails from our architect friends saying “Okay, my firm is finally using VR in 2019 – what hardware do we buy?”
The answer is easy: any of them. Seriously. Everyone has their preference, and no two Bevel founders will give the same answer to this question – we all have our favorites.
The real question is how?
Before developing custom software, and before Bevel was Bevel, we were busy working away at our architecture firms, training teams how to use spatial tech.
We continue to be in firms, building custom visualizations. From that vantage point, we get to see XR departments in action. We’re able to see what worked, what didn’t, and what amazing potential this tech has When. It. Is. Used.
There have been some huge wins, but on the flip side, there’s been some lost potential where spatial computing just failed to take off.
Here are a few VR for Architecture fails and how to fix them:
FAIL: Office culture doesn’t support wearing a VR headset. No one wants to risk looking silly.
That’s a damn shame. The good news is that there’s no time like the holidays to lighten up. Your best bet is to buy a device that is primarily for employee gaming in a break room. Once people are used to playing in three dimensions, walking through your own building is the obvious next step.
Look into handheld AR tools that can help ease into the utility of spatial tech. Pretty soon, you’ll want to get the screen out of the way and graduate up to wearable AR.
Try it out! Bevel has a free AR for AEC Demo. The iOS AR app overlays 3D digital models on top of the construction documents. It allows laypeople and experts to communicate in a common, visual language.
FAIL: VR is reserved for the end of the design process. “Let’s wait until the Revit model is ready.”
Schedule regular VR walkthroughs with the team as part of your BIM quality control. You’d be surprised at how many modeling mistakes, or even design oversights, you might find.
Not sure what materials you’re using yet? Don’t want your client to focus on the digital wood grain? Use simple white to acclimate the client and the team to focusing on spatial issues instead of fussing over details.
Brush up on your Revit skills for AR and VR with Bevel’s Technology Director, Logan Smith, in his newly released LinkedIn Learning course. The course is a comprehensive dive into Revit AR and VR workflows.
- Why model for VR in Revit?
- When to use VR and AR
- How to use your Revit model with VR and AR
- Making the most of Enscape
- Using IrisVR Prospect for multiuser virtualization
- Simple 3D shortcuts
- Efficiently applying materials to Revit projects
- Using RPC components in Revit
- VR modeling for Revit
Interested in learning more? Take a look at this course and more on our Spatial Technology Learning page.
We would love to hear more from you. Are you just starting to use XR at your design firm? Are you old pros? We want to know what is working in your firms. Comment below!
Autodesk University was quite the success last week, but we’re just getting started. Industry Insights by UNIFI invited Simon and Logan to be on their webinar. The topic: “BIM and Virtual Reality.”
BIM and Virtual Reality webinar highlights:
Why VR isn’t just about being showy—when you have BIM, it’s about data.
There’s some perception that this is showy, but rest assured that VR is not just a novelty that is cool to look at. The potential we see and we are working on is for tools that are useful, interactive, and data-driven.
There is a special kind of ergonomics that comes from this… Instead of working from plans and elevations, you add a human element – it enhances the human data.
It’s amazing the amount of info you can learn from subtle head movements. From your view in a theater to the placement of your cabinets.
There’s no better way to communicate things like scale, distance, and proportion than VR and AR.
When it comes to Virtual and Augmented reality for AEC, good BIM is important. What’s good BIM? Bevel clarified:
…depends on what you want out of your AR and VR.
If you’re going for realistic and pretty, having all your materials modeled in a more filled out way makes a big difference….
If you’re going for something more data focused, then having all those metadata points filled out would be the crucial thing… We can automatically set up those exports, our algorithms can read those data points without having to do it manually.
BIM nerds and those of you spearheading VR and AR efforts at your firm, listen to the full webinar below for more tech talk.
Learn how Logan and Simon decided to become spatial technologists for AEC. Find out their favorite VR meeting room platform. And see which programming languages and game engines they use for each project.
Hospitals are undoubtedly special building types. Between the number of regulations and the breadth of stakeholders, the job of a healthcare designer is a balancing act.
Here is how savvy architects are using Virtual and Augmented Reality to connect with their clients throughout the life of a healthcare project.
6 Uses of XR in Healthcare Design
- Design Development
- Virtual Reality Mockups
- Augmented Reality Mockups
- Telepresence
- Simulation and Testing
- Stakeholder Communication
1. Design Development with Virtual and Augmented Reality Mockups
In the early days of VR for architecture, healthcare projects were among the first to see dollar impacts by supplementing or even replacing some mockups with VR. Simon Manning recalls a project at ZGF when it was impossible to find a warehouse to host a cardboard mockup Lean design event. That’s when he first started making virtual reality mockups.
When we work in VR, we are able to keep a record of design changes. Old physical mockups necessarily get destroyed; the virtual mockups get saved, week after week, helping inform stakeholders at various stages of the project.
When we did wayfinding studies, we ran a couple of VR design review sessions. In one project, the first design review meeting had only 3 people and still resulted in a bunch of changes.
VR to the rescue—“we were able to walk nurses and doctors through the design, get feedback, make changes in Revit and then test those changes after lunch.” What initially began as an emergency fix soon became an unexpected asset to the project.
“It was incredible to not have to wait a week or more for the reconstruction of a cardboard space and to instantly see the impact of design changes. It took way less time out of the doctors’ and nurses’ schedules, and saved over $45,000 in warehouse rental alone.”
– Simon Manning
Best of all, we can keep records of all the design changes so that people new to the design can watch how the design process has evolved. Virtual mockups and documentation help ensure everyone is on the same page for the most constructive feedback.
With augmented reality, we enhance existing spaces or cardboard mockups with virtual 3D models and design options. AR adds dimension and detail to a cardboard mockup experience and allows designers and clinicians to play with equipment arrangement in already built spaces.
Using a Magic Leap or an iPad, we use AR to create a shared design experience. And since this computing is truly spatial, we’re able to extract data about where people are walking and looking—making AR and VR mockups perfect for wayfinding studies.
For already built conditions, iPad AR through the CareConnect app lets nurses and home health care patients communicate and place medical equipment in their homes.
Designers often forget how technically savvy clinicians are—with the advent of telemedicine, doctors are frequently popping onto Skype for check-ins with their patients. VR is just what the doctor ordered for helping designers stay connected with their end users throughout the design process.
With a couple of simple headsets, designers, end users, and client reps are meeting one another within the virtual space as it’s being designed.
Any stakeholders who can’t make it to 3P lean design meetings get to review them in real time or asynchronously after the meeting. It’s helpful to not have to take a busy surgeon off work for a full day to test the new surgery suite when it’s quickly and easily understood with a brief virtual exploration.
CareConnect AR medical design paired with telepresence with healthcare professionals
When Bevel thinks of XR, we don’t just think of Augmented and Virtual Reality, we also think of all the spatial computing that happens in the interactivity engines. Interactivity engines or “game engines” like Unity and Unreal allow us to create virtual environments and robust simulations.
For healthcare design clients especially, we’re simulating Lean data in 3D environments to provide visual proof of design interventions.
With your 3D model in an interactivity engine, we study the impacts of various design iterations on clinician and patient flow. We can interact with equipment and instruments. We can even test new procedures. Early simulations shape the efficiency of the design and happiness of its users.
6. Stakeholder Communication
We’re seeing a huge impact on client communication with mobile augmented reality. Augmented reality overlays digital models with the real world in a spatially aware way. With AR, tablets and phones transform into magic windows—transforming a real-world site into a completed building in its actual context. Mobile Augmented reality also allows us to make floorplans into mini interactive models.
Simply put, we haven’t found a simpler, more accessible tool for stakeholder communication than handheld AR.
We worked with a firm whose client was extremely confused about the scale and scope of the work. After weeks of fruitless meetings trying flags, paint on the ground, and the like, they used AR to place the design in 3D on the site, where he could wander around a bare patch of dirt and see his future building. A 5-minute iPad experience was ultimately all he needed, and it became a trusted tool for their client interactions going forward.
* Seeking great pilot partners
Email us if…
- Your team loves feedback and craves better ways to communicate design
- You use Lean design principles or consultants and want a way to visualize this data and test it across design iterations
- Your team wants to immerse their clients in the design from the beginning
Above and beyond but not required:
- Your team is full of Revit power users and you want to leverage your robust 3D models like never before
In search of a seamless user experience for our Augmented Reality and Virtual Reality applications, we re-tested the Leap Motion and were thrilled with the software improvements.
The Leap Motion’s hand tracking prowess simplifies interaction, aids the sense of immersion, and allows for creative interaction concepts.
Read on to find out how we’re using Leap in our custom VR development.
WHAT IS LEAP MOTION
The Leap Motion is a peripheral that mounts to any VR headset and enables accurate hand tracking within VR. Instead of interacting with controllers and buttons, the Leap Motion allows you to use natural gestures such as pinching, grabbing, waving, and swiping without having to handle hardware.
The Leap Motion is an expert at detecting hands – it was even designed to track how hands use tools, like a hand holding a stylus in virtual reality.
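To make the gesture idea concrete, here is a minimal sketch of how a pinch can be derived from tracked fingertip positions. This is not the Leap SDK’s own API – just the underlying geometric test, with made-up coordinate tuples and a threshold we chose for illustration.

```python
import math

def is_pinching(thumb_tip, index_tip, threshold=0.03):
    """Return True when the thumb and index fingertips are within
    `threshold` meters of each other - the core geometric test behind
    a pinch gesture. Positions are (x, y, z) tuples in meters."""
    dx, dy, dz = (a - b for a, b in zip(thumb_tip, index_tip))
    return math.sqrt(dx * dx + dy * dy + dz * dz) < threshold

# Fingertips 1 cm apart read as a pinch; 10 cm apart do not.
print(is_pinching((0.0, 0.0, 0.0), (0.01, 0.0, 0.0)))  # True
print(is_pinching((0.0, 0.0, 0.0), (0.10, 0.0, 0.0)))  # False
```

A real hand-tracking SDK feeds joint positions like these every frame; grabs, waves, and swipes come from similar tests on joint distances and velocities.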
LEAP MOTION IN ENTERPRISE VR/AR APPLICATIONS
We like using Leap Motion whenever less equipment and less complexity is essential to the VR and AR experience. Controllers = Complexity.
When showing VR to clients, it’s often the first VR experience they’ve ever had. While frequent users and gamers will get used to Vive controllers easily, there is a learning curve for a novice. And when it comes to enterprise VR and AR experiences, you can’t rely on familiarity with VR hardware for interaction.
Leap Motion attaches to the headset and recognizes your hands as they type on virtual keyboards, push virtual buttons, or open virtual doors. Manipulating virtual objects with your hands reduces the barrier to interaction and simplifies the experience.
Consider the VR conference booth or an interactive kiosk—At a conference demonstration you only have a few seconds with each person or none at all. Leap Motion enables seamless user interaction with the virtual environment—without having to know that the trigger button grabs and the thumb pad slides.
Leap Motion is also excellent whenever controllers would break complete immersion. Whether it’s a high-end immersive experience or you’re prototyping new processes in VR, use hand tracking instead of controllers for a truer mockup and sense of immersion.
HOW TO USE LEAP MOTION
The Leap Motion hardware attachment runs around $70. All you need to try it is an application that’s programmed for it. Because Leap Motion requires an addition to a headset, you won’t find it in many off-the-shelf consumer applications, but one way to try it is in AltspaceVR.
If you’re building your own applications, Leap Motion support is built into the Unreal engine’s plugins and input devices: simply check the Leap Motion checkbox and set your pawn options. The Leap Motion is astoundingly programmable.
Its use in custom enterprise software makes a ton of sense. In enterprise, you’re not working with gamers. You’re much more likely to have a total VR novice. With a Leap Motion, VR novices can put on a headset and be proficient using their hands without fumbling around controllers.
A big hassle of VR is that you’re blind to the world. You have to find yourself in the virtual space, orient yourself to your controllers, and know how to interact with the controllers. Then there is additional interface confusion in dealing with hardware you can’t see.
Once the Leap Motion is applied, it reduces the hardware complexity to just putting on a headset and going. With the attached Leap Motion, you can see your hands through the headset and immediately know how to interact with virtual space: push buttons with your fingers, open doors with your hands.
CAVEATS – WHERE WE PREFER VR CONTROLLERS
There are some applications where we actually prefer controllers over the Leap Motion for interaction—and that’s when we need precision and fine-grained interaction.
When you want precision and control, controllers still win at this point. Controller tracking is precise enough to drop VR furniture in exactly the right place, for example, and when drawing and modeling in VR, the controller is still essential. We anticipate the software will get better, but at present, the action of releasing an object can be a struggle. The Leap Motion will occasionally rotate objects or drop them at slightly incorrect angles.
That said, this software is improving all the time and it wouldn’t surprise us at all if within a few years it was a VR interaction standard.
INTERACTION CONCEPTS FOR LEAP MOTION
- Multiple fingers for typing on a virtual keyboard (chicken pecking). Even though there isn’t (currently) haptic feedback, typing this way is much more comfortable than keying in information with a controller.
- VR calculator – same as virtual keyboard —working with a wand is obnoxious, but punching in the numbers with fingers is natural.
- Any interaction with UI – things such as the TiltBrush UI would do well with the Leap Motion. You don’t have to deal with the abstractions that motion controllers create; you can simply point and go.
- Interface heavy applications – Virtual Showrooms where there are many different options for materials, furniture and schemes.
- The frustration of every VR showroom we try is finding the interaction buttons.
- With the Leap Motion and clever spatial programming, you can change settings and parameters from one stop. Instead of a static UI, you can implement a context-aware interface.
- Kiosks – The Leap makes for better demonstrations at an unsupervised kiosk. If you want someone to walk up to an experience, put on a headset, and go, Leap Motion means less setup.
- Extended use – holding controllers for a long time can be a pain. The Leap Motion can give long-session VR users better ergonomics.
- Finally, the Leap Motion enhances immersion and presence in high-end virtual experiences by forgoing extra handheld hardware and instead letting the user exist and interact naturally.
If you’re solving spatial problems, collaborative VR is the start of that solution.
VR in architecture has been around for years — often used as an impressive showpiece with nubby textured carpets and light glinting off impossibly shiny floors. These experiences are impressive–we make them ourselves–but what excites us most about VR is its use in early stage collaborative design experiences.
The idea that VR should be reserved for when a building is completely modeled is a huge lost opportunity to iterate design and create people-centered buildings. By getting more than one person into the virtual space, you can start with something more elemental, less designed and flesh it out collaboratively — rather than just assessing something already complete.
Particularly now that multiuser software like IrisVR exists, Virtual Reality can improve design by letting teams experience the feel of a project from its earliest stages. With more interested stakeholders in the space, you get better feedback and the client feels involved and confident in the process.
Having supported hundreds of architectural VR meetings, we have some tips for getting the most out of Virtual Reality.
The Soft Skills of VR Collaboration
1. Know what VR is good at.
VR can answer spatial questions better than anything short of actually constructing a full scale model.
It’s uncanny how quickly and confidently questions are answered when you’re in the virtual space. Less time is spent trying to interpret renderings, drawings, and verbal explanations, and more attention is given to sensing and experiencing.
VR can feel so natural that our average time in the headset is usually less than 15 minutes — not because the meeting is cut short, but because generally, that is all that is needed to move forward with confidence.
This is why VR is excellent for any time you want your clients to focus on the feel and quality of a space instead of on the numbers. If you have a conversation in front of a spreadsheet, they’re going to tell you to reduce square footage. But if you’re having that same conversation in VR, your clients are much more likely to favor a quality end result.
Some of our favorite meetings have been watching clients happily increase the budget to make a better building.
2. Have an Agenda.
Before any client meeting, you probably have an agenda about what you want to talk about and which problems you want to solve together. A meeting that incorporates VR is no different. Use VR’s strengths to help you answer questions like:
- Does the entry feel inviting?
- How is the ceiling height?
- Does the kitchen feel cramped?
- Is it too isolated from the living room?
- Is the signage clear?
- What are the sightlines from the director’s desk?
Decide on your goals for the VR experience, and share the plan with your client before you get into VR. Once in VR, continue to lead the meeting by keeping focused on problem solving those important questions.
3. Set up your file to do the job.
Once you know which issues you would like to solve with your client, you will be able to set up your design files thoughtfully to support the meeting.
For example, unless you’re making a decision about materials, display your building in line work. It’s a simple trick, but it completely reframes the conversation to convey that the VR experience is a draft for collaboration not a finished product. Another example: If you’re trying to make an A/B decision, have those options ready to go.
File setup needn’t be tedious or a huge time suck, but make sure it is set up to help answer the meeting’s questions. A good way of not spending too much time on file cleanup is to do a quick VR test run to note anything obvious that might get in the way of your client’s experience, but don’t worry about perfection, particularly in early stages.
4. Go for Multiuser whenever possible.
I am now used to wearing a dorky looking computer on my face, but I never forget that it can be a big ask for people when they first put on a headset. Take the lead and put on your headset first. And when they get inside of VR, there you are — much more friendly than being alone in a virtual space.
Aside from just being friendlier, multiuser is a must for collaboration. The conversations in multiuser VR are more natural, and it is much easier to guide your client’s attention when you are there in the space with them.
5. Know VR’s limitations.
VR is not particularly good at enabling note taking and documentation of decisions, so factor that in when you plan your meeting. You may want another person to attend the meeting out of the headset just to take notes or get your client’s permission to record the meeting.
Of all the turnkey CAD-to-VR software we have tested, IrisVR Prospect does the best job at this note-taking piece, but it is still limited. You can export the highlights made or take 360 photos to document your drawings. But erasing the drawings isn’t possible except through “undo,” and there is not currently a drawings layer that can be turned on and off.
Regardless of which software you use, it’s worth doing a practice run without your client so you are comfortable with the tools and know their limitations.
As we assist architects using spatial computing like VR, we see a lot of hyperventilation about the design files not being “complete enough” or we see hours of time devoted to cleaning up or adding materials for one VR meeting.
Usually this is overkill. Some cleanup might be helpful, but properly introducing the VR experience is the best and easiest way to manage your client’s expectations. Try something like this:
“We’re going to go into a working draft of your building to check the sight lines from your new office and see if we want to keep the upper cabinets and skylights.” It’s simple. It’s reassuring. And it directs the focus of the meeting.
We also see a lot of unnecessary worry from architects that once their clients are in VR they will get distracted on the “wrong” thing or explore an area of the building that isn’t modeled yet.
Relax. It’s okay.
VR is a fairly new medium, so your clients will take your lead. If you’re not worried, they won’t be worried.
If you manage your client’s expectations, have an agenda, and proceed confidently, you will lead effective VR collaboration.
Open Scale VR for Architecture: Test 2
A few weeks ago, we attempted to break out of room-scale VR. Why? We wanted a teleportation-free architectural walkthrough experience. Since the new Microsoft Mixed Reality headsets are not bound to external hardware, with a little finagling we were able to walk the full length of an 82′ x 36′ building in a park without teleportation.
In our first Open Scale test, we did the impossible, but only technically. The unusual (and definitely unsupported) conditions we subjected the Windows MR headset to caused it to lose tracking, and the laptop wasn’t ideal, as it would abruptly stop its screen recording to preserve battery life, as evidenced by the video cutting short.
But the amount of interest in our first test surprised even us. Architects and LEAN consultants began to get as excited as we are for the potential of an Open Scale. We decided to adjust our approach and test again.
Backpacks are For Nerds
But first, we knew we needed a computer that was made for a mobile VR experience so we put it out into the Portland tech scene that we needed such a thing and almost immediately found a volunteer willing to let us borrow theirs. (Portland really is lovely. But brunch lines indicate that we have plenty of people. Don’t move here.) 360 Labs loaned us this absolutely stunning HP backpack.
The night before our open scale test, we tested the backpack computer in a gaming environment. For some unknown reason, Windows 10 does not currently recognize its own headset as a display and instead requires plugging into an external monitor. We plugged into a 7-inch monitor, which severely hampered the sleekness of the backpack but worked in a pinch. Once plugged into the external monitor, we could control the headset from within it.
How we modified for Test 2
We prepared for our second Open Scale test with these modifications:
- Feature Tracking. The Microsoft MR headset tracks itself in space by looking at features. We wanted to see if the feature tracking outdoors would be improved by tracking the sharper corners and cleaner lines of the basketball court. Then, just to be sure, we placed fiducial markers all over the place, hoping that doing so would help its tracking. Spoiler alert: it didn’t.
- Slope Adjustment. Slight elevation slopes in the real world meant that when Logan explored the far distance of his building in Test 1, it appeared in the headset that his legs were a foot through the floor. Hilarious, but not exactly helpful for our purposes. To correct for this, he wrote a script that took samples of the location of the ground underneath the headset and kept a running average to adjust the floor height. Thus, as he moves slowly downhill it should adjust the visualization to keep him grounded to the floor.
- Dynamic Chaperone. Without the boundaries that you have in normal VR, you’re unprotected from running into things. So again, he used the mesh generated by the inside-out tracking technology to write a dynamic chaperone. The chaperone compares, collects, and synthesizes the points the display is creating to identify things (desks, curbs, lazy pets) above the ground plane and alerts users to avoid stepping on/into them.
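The slope adjustment and dynamic chaperone above can each be sketched in a few lines. Below is a simplified Python illustration of both ideas – a running average of ground-height samples, and a point-cloud check for obstacles above the floor. The real scripts ran inside the engine’s update loop; the class names, window size, and clearance threshold here are our own, for illustration only.

```python
from collections import deque

class SlopeAdjuster:
    """Keep a running average of sampled ground heights under the headset
    and use it as the virtual floor height, so a gentle real-world slope
    doesn't sink the user's legs through the virtual floor."""
    def __init__(self, window=120):
        self.samples = deque(maxlen=window)  # most recent height samples

    def add_sample(self, ground_height):
        self.samples.append(ground_height)

    def floor_height(self):
        # Average of recent samples; 0.0 until any data arrives.
        return sum(self.samples) / len(self.samples) if self.samples else 0.0

def chaperone_alerts(points, floor_height, clearance=0.10):
    """From tracked mesh points (x, y, z) with y up, return those rising
    more than `clearance` meters above the averaged floor - candidate
    obstacles (desks, curbs, lazy pets) to warn the user about."""
    return [p for p in points if p[1] - floor_height > clearance]
```

In practice, both would run every frame, fed by the headset’s tracking mesh – which, as we learned below, is exactly the data the consumer MR headset turned out not to expose.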
What We Learned
Despite our best efforts, nothing we tried helped. But here are a few insights we gained from the exercise:
- Outdoor inside-out tracking is a no-go, and the culprit wasn’t the leafy park or the grass. It was shadows. We didn’t realize it then, but the fact that our first successful test was performed on an overcast day was more significant than any of our modifications. Our tracking was slightly worse the second day, and the motion controllers struggled to track their position in space. The long, rapidly moving shadows definitely hurt tracking.
- The dynamic chaperone and slope adjustment didn’t work in practice. In our experience developing for the HoloLens, you can access the data it uses to track itself. We discovered too late that accessing this data on the MR headset is not supported. In the future, we’re going to beg Microsoft to allow access to this data (Please?) or we will use an external sensor – which, if we were to continue using the HP backpack, would mean shuffling a sensor, a headset, 2 controllers, a tiny external monitor, an external keyboard, and a mouse just in case voice commands don’t work (which occasionally, they won’t – developer headset bugs). In any case, with all this equipment it’s starting to feel a bit like a circus.
Failing While Freezing
If we had known then that none of our attempts would improve our Open Scale test experience, we might have saved it for a less chilly day. The feeling of failing while freezing was somewhat demoralizing. Indeed, I was going to delay writing about this test until we had another “success” with a capital S. But it’s important to share just how complicated this stuff still is. And remind ourselves and our clients what we mean when we say VR is easy.
Taking a mixed reality headset and an HP backpack computer outside reminded us of 3 years ago, when VR was fresh. Everything was hard and nothing worked consistently. Since then, software and hardware have improved such that getting your building from Revit into VR is basically instant.
So yeah, compared to 2½ years ago, VR is easy. But as we canvass architecture firms, we have learned that for architects, knowing how and when to use VR is still a challenge. One our team intends to address.
As far as our next open scale test — who has a very large empty warehouse we can borrow for a day?
When room-scale virtual reality hit the scene in 2015, it was lauded as a game changer for architecture. Now it’s almost 2018 – headsets are cheap, software is better than ever, and most firms either have a VR headset or are planning on investing in one this year.
When you review the case studies and VR success stories, VR’s true value becomes clear–not as a marketing pitch or another way to bill clients, but as a cost savings tool. VR reduces risk. VR facilitates decision making. VR helps you evaluate constructibility and detect clashes faster and better. VR is the secret weapon against bad VE. Understanding how you can creatively use VR to address common problems in your practice is what makes the difference to a project’s bottom line.
Ok, I can see my building in VR. What’s next?
As we spent the last year canvassing firms, we learned that using VR tools and hardware is relatively easy–knowing how, when, and why to use them is another story. Many of the professionals we talked to weren’t sure what they could be doing in VR without us. Even if they knew how to see their buildings in VR, many were not confident or familiar enough with VR tools to know how to leverage them.
Seeing your building in VR for the first time is exciting. But many professionals fail to move past this stage to see the true utility of this power. This is understandable. It’s not just a matter of learning new hardware and new software – it’s a mental shift. Never before have we been able to experience our designs at scale while they’re still fluid enough to be changed. And yet we frequently see VR brought in at the very end of the project instead of being leveraged from the beginning.
Introducing: Bevel’s VR and AR Workshops
Our workshops teach you how to leverage VR for spatial communication so you save time, money, and hassle.
We now offer workshops, 2 ways:
1. In-Firm Workshops
We teach in your firm, on your schedule, about the topics you care about most – whether you want to spend an hour focusing on best practices to get your Enscape VR experiences up to snuff, or you want to dig deeper into the opportunities to give your healthcare clients more value. Firms can also schedule all or part of our professional workshop series. In the next week we will be adding more workshops.
Rates begin at $350/session, depending on length of course and number of attendees.
2. Professional Workshop Series
Beginning in January, 2018 in Portland and Seattle – These small group happy hour workshops are priced per individual. Our first series is VR for AEC — a comprehensive dive into using VR profitably. Whether you currently use VR or have been meaning to try it, this course looks at applied skills for VR beyond the basics of setup and software use. Beer is on us.
Whether you’re considering the in-firm workshop or the professional series workshop, if you schedule with us before Dec. 31, you will receive an Early Bird Discount!
Thanks to CoMotion Labs and Microsoft we got early access to a Windows Mixed Reality Developer Headset. We immediately began using it (not as directed) in pursuit of an “Open Scale” VR experience that would allow us to walk unbuilt spaces at full scale without teleportation.
We succeeded and have recommendations to refine this process.
Why Open Scale?
By opening up the scale, we can experience critical architectural moments like the flow of an entry sequence and the experience of moving through the building. The ability to study flow through a space is critical for practical purposes like wayfinding, gaze tracking in retail, and making data-driven efficiencies for Lean hospital design.
We have been using the Vive or Oculus as the standard for our room-scale architectural and construction walkthroughs. The magic of seeing unbuilt spaces at scale while walking through them on your own feet is amazing, but when you reach the edge of the physical room, you’re required to “teleport” to reach the edges of the virtual room. The Vive and Oculus are bound to their stationary external hardware. So while these tools have been great for understanding scale, they’re stuck. That’s not technically the case with the Windows MR headset.
Inside out tracking: Logan can’t see out, but the cameras in the headset are looking for “features” to figure out where the headset (and therefore Logan) is in space.
Instead of external sensors, or “outside in” tracking, the Windows MR headset has “inside out” tracking. It uses two cameras to look at the world for features to figure out where it is in 3D space. This simplifies setup and creates the potential for Open Scale virtual worlds that are as big as we need them to be.
A huge part of experiencing architecture is moving through it. If Goethe was right and architecture really is “frozen music,” we can now experience the rhythm of an architectural design — not through a CGI flythrough, but by walking on our own two feet.
We were ready to test the limits of Microsoft’s inside out tracking and the laptop’s outdoor capabilities. Equipped with a backpack, a gaming laptop with a beefy GPU and our Windows Mixed Reality Developer Headset, we set up at the park near our office.
Our Test Project: Principles of Wholeness
For our test, we loaded Logan’s old architecture school project into Unity — a Faculty Lounge with a footprint measuring 82’ x 36’. The project came from a University of Oregon Design Process class taught by Jim Givins, focused on Christopher Alexander’s Principles of Wholeness. As part of their design process, students performed a powerful exercise: studying proportion and scale by recreating their design’s footprint in an open field, using flags and ribbon for walls.
University of Oregon architecture students testing building size with ribbon and flags, May 5, 2012. Logan Smith attempts to mentally extrapolate building dimensions from the 2D exercise, May 2012.
The exercise of mapping out a building’s footprint at full scale is a practice as old as architecture itself, but it yields surprising insight. Walking among the ribbons and flags, Logan discovered his first design iteration was too long and skinny. Extrapolating the 3D building from the field proportions, he could tell that proceeding with his initial design would result in a cramped-feeling space. He went on to make multiple iterations in the field with flags, thinking through the implications in three dimensions. Unfortunately, the exercise took a long time, and he couldn’t iterate quickly. Mentally extrapolating 2D flag and ribbon into a full building is still an exercise in educated guesswork. Enter: Virtual Reality.
Open Scale Workflow: How We Did It
Before going to the park, Logan imported his Faculty Lounge SketchUp model into Unity. Setting Unity to build for Windows Mixed Reality was more annoying than we have time to discuss today, but once that was complete, the process of working in Unity was familiar. After adding a few materials and some lighting, it was ready to go into the backpack and out to the park. It was time to see if we could actually walk around the Faculty Lounge in open space.
After a minute of setup, Logan put on the MR headset, shut out the park, and started walking around his faculty lounge. All 82’ x 36’ of it. He even walked out of the building to see its exterior proportions. The actual area he walked was about 60’ x 110’ all without teleportation.
From the architect’s perspective, Logan remarked on how different the space felt when walking it versus teleporting around it. He said he gained a truer understanding of distance and space.
How We Would Repeat It: Improvements
There were a few glitches that we think could be prevented with a different workflow to make this process client-ready.
1. Laptop Size. The laptop we have was never intended to be worn around. It is, in a word, enormous. In a client-ready workflow, we would invest in one of the new lightweight backpack VR computers that are becoming the latest fashion trend, if Autodesk University is any indication.
2. Laptop Settings. Even though the laptop was set not to sleep when the lid was shut, it treated that setting as more of a mild suggestion — likely to prevent idiots like us from using the headset in the unapproved way we were about to. Undeterred, we created a stopper out of a glove and a mouse to keep the laptop’s lid from shutting all the way. It did the trick. In the future we will probably use one of the cool foam stress balls we got at Autodesk University.
3. Location and Environment. There were times the headset would get lost and not be able to find itself without a system restart. This is more our fault than the headset’s — we were definitely not using it as intended. Using it in a leaf-strewn park on a windy day was a complicating factor, since the system tracks visual features — including the videographer, who was moving all around the place. The fact that the headset did as well as it did under the circumstances was impressive. Next time we will choose an environment with more stable features to track and make sure the program adjusts height to compensate for slope.
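For the laptop-settings problem in point 2, there is also a software-side fix worth trying on Windows 10: telling the active power plan to do nothing when the lid closes, instead of relying on a glove-and-mouse stopper. This is a hedged sketch, assuming the stock `powercfg` aliases (`SCHEME_CURRENT`, `SUB_BUTTONS`, `LIDACTION`) available on recent Windows builds; run it from an elevated prompt.

```shell
# Set the lid-close action to "Do nothing" (value 0) on both AC and battery power,
# so closing the lid in the backpack doesn't sleep the machine mid-walkthrough.
powercfg /setacvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 0
powercfg /setdcvalueindex SCHEME_CURRENT SUB_BUTTONS LIDACTION 0

# Re-apply the current scheme so the new values take effect immediately.
powercfg /setactive SCHEME_CURRENT
```

You can confirm the change afterward with `powercfg /query SCHEME_CURRENT SUB_BUTTONS LIDACTION`, and restore your usual behavior by setting the value back to 1 (sleep). No test is included since this only configures Windows power settings.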
Microsoft Mixed Reality Shows Promise
The Microsoft Mixed Reality headset and OS seem tailor-made to help businesspeople ease into spatial computing. We imagine a near future of working in Revit on a desktop, then finding the building model in an application window inside the new OS’s virtual house. This would allow a fluidity of experience and multitasking between applications that would be ideal for a spatial working environment. For projects where a more open scale is desirable, we would definitely refine this process.
Thank you Microsoft and CoMotion!
We work directly with AEC firms on implementing VR strategies, so every product on our list offers true room-scale VR capabilities that convert your CAD models into room-scale experiences with only a click or two. They really are that easy.