XR – A New Mix

Mixed reality for infrastructure and geospatial takes a huge leap forward with the announcement of the HoloLens 2, SYNCHRO XR, and the XR10.

This might sound a little cliché, but forget everything you knew about Microsoft’s HoloLens. The new model makes the original seem like a toy (albeit a very cool toy).

A not-so-surprising (and long-anticipated) announcement at the Mobile World Congress (MWC19, the premier global mobile technology exhibition) on February 24, 2019, was the HoloLens 2, the successor to the ground-breaking mixed reality headset.

Microsoft’s Alex Kipman—technical fellow, mixed reality and AI, regarded as the “inventor” of the HoloLens—announces the HoloLens 2 and many of its new features.

The bigger surprise was the emphasis placed on applications for infrastructure: planning, design, engineering, construction, and operations. This new model is not aimed solely at consumer and gaming communities. Microsoft (and partners) envision serious and substantive work accomplished with this device, the fruit of many years of R&D.

Evidence of this new focus came in two announcements: a partner-developed model that fits on a hardhat and is designed for industrial and construction environments, the Trimble XR10, and Bentley’s SYNCHRO XR application for visualizing construction “digital twins.”

You are probably wondering, “What can this do for the surveyor [general contractor, construction crew, etc.]?” We’ll get to that, but first let’s look at the technology, its evolution, and the aspects of the HoloLens announcements that could bring the most impact for AEC.

VR, AR, and XR

There are some misconceptions about the world of 3D immersive experiences—especially for practical applications in geospatial, engineering, and construction. There is more to immersive tech than just visualizations, and there has been for many years. The technology has evolved greatly since the days of simply putting a display screen up in front of your eyes in a headset (picture the clunky gaming headsets of the 1980s).

Developments have both followed and driven leaps forward in 3D processing, AI, remote sensing, and miniaturization of components—and have grown in acceptance with the comfort level of people for navigating and working in such environments.

Forgive the following simple characterizations, but we need to frame the distinctions in the context of AEC. VR (virtual reality) is mostly simple viewing of 3D representations of models and scenes, as with those headsets/goggles you can snap your smartphone into. VR is useful for, say, sharing proposed designs with team members and stakeholders. AR (augmented reality) superimposes the models over live (or recorded) real-world views captured with cameras and other sensors.

AR might be delivered via more sophisticated goggles/headsets, with cameras on the front to capture the live view, containing high-resolution 3D displays, on-board processors, MEMS sensors for orientation, and audio. There are many AR headsets and applications on the market; we tried several models at MWC19 (most major cell phone manufacturers offer these).

AR offers more for AEC than simple VR as it mixes in proposed features over the existing ones. AR can be used to examine design alternatives, to assist construction and inspections, and to view “digital twins” for operations and asset management. Think of what such tools could do for training.

Are we ready for another acronym? Mixed reality (cross reality, or “XR” as it is becoming broadly labeled) is all of the above plus your ability—through multi-sensor interfaces—to interact with the immersive environment: to view, control, operate, manipulate, edit, update, and analyze. XR learns from you, recognizing your eyes, eye movements, gestures, and voice, and learning patterns in your commands.

But there is another major distinction in XR—and specifically for the HoloLens: you see the real-world environment through the visor—essential for safety and context on construction sites—and the virtual/augmented elements are projected holographically over that view. The HoloLens was in many ways the first holographic headset, and the second generation adds much more.

The Announcement(s)

MWC19, held in the massive eight-hall Fira Gran Via convention complex near Barcelona, Spain, drew over 100,000 attendees and 2,400 exhibits for a week of mobile communications technologies, AI, 5G, immersive tools and content, robotics, IoT, and any number of related innovations. This premier global event was the perfect place for these announcements, which came at an invite-only press and analyst event the evening before the show opened.

The guest list featured the usual names you would expect at a big Microsoft announcement: The Wall Street Journal, the BBC, CNET, the Gartner Group, etc. So why was a modestly sized surveying and geospatial publication like xyHt invited? Because Microsoft and its partners in the infrastructure, AEC, and geospatial segments recognize the potential for substantial innovation and implementation of these enhanced technologies for infrastructure.

Shortly after the developer edition of the first HoloLens began shipping in March 2016, the first teams were developing applications for AEC. In fact, one of the first commercially released (non-gaming) applications was Trimble’s SketchUp Viewer for HoloLens. xyHt took our first look at the HoloLens in 2015, as well as at the work being done to develop applications for construction, inspection, mining, infrastructure operations, and BIM, in our feature Blending Realities.

The press/analyst pre-event at MWC19 opened with a speech from Microsoft CEO Satya Nadella, who quoted tech sage Mark Weiser: “The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” We have seen the cloud, wireless communications, IoT, and AI woven into the fabric of many of our day-to-day activities, often without our realizing it.

Cool-as-hell, headline-grabbing innovations from a decade ago are now built into even consumer devices. Most of us first experienced motion-sensing gaming years ago with a Wii or with the Kinect on an Xbox console. Even then, developers were extracting Kinect components and using them for robotics, 3D modeling, mapping, AR, and VR applications.

The HoloLens 2 announcement duties were then handed to Alex Kipman, technical fellow, mixed reality and AI, regarded as the “inventor” of the HoloLens. He said that the journey of the development of this family of products started in 2008 with the simple question: “How can technology adapt to people, instead of people adapting to technology?”

Seeking the answer to that question led to the Kinect, then the HoloLens: devices that could better take cues from people—gestures, audio, eye movements—instead of us having to enter cryptic and restrictive alpha-numeric commands through keyboards. To truly get to any semblance of AI, devices will need to interact with us in much the same manner as we connect with other humans, as the HoloLens 2 seeks to do.

Microsoft’s Julia White introduced the latest update to the Kinect family of devices and embedded technologies: the Azure Kinect. This small ($399) IoT device contains multiple sensors: the Kinect “scanner,” camera, microphone, and processors that can detect and analyze movement in 3D space, as a single unit or in groups. These are connected through Microsoft’s Azure cloud services (with high digital security). An example White gave was the potential for placing sets of these in hospital rooms to detect the precursors to a patient fall. I can imagine quite a few potential applications for AEC, construction sites, monitoring, and perhaps rudimentary real-time 3D mapping.
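As a rough illustration of the kind of processing such a device enables, here is a crude movement detector over consecutive depth frames (a generic NumPy sketch of the concept, not the Azure Kinect SDK):

```python
import numpy as np

def motion_mask(prev_depth: np.ndarray, curr_depth: np.ndarray,
                threshold_mm: float = 50.0) -> np.ndarray:
    """Flag pixels whose depth changed by more than threshold_mm between
    frames: a crude movement detector over a depth camera's output."""
    valid = (prev_depth > 0) & (curr_depth > 0)  # 0 = no depth return
    change = np.abs(curr_depth.astype(int) - prev_depth.astype(int))
    return valid & (change > threshold_mm)

# Fake 4x4 depth frames (millimeters) standing in for real sensor captures
prev = np.full((4, 4), 2000, dtype=np.uint16)
curr = prev.copy()
curr[1:3, 1:3] = 1800  # something moved 20 cm closer in the middle
moved = motion_mask(prev, curr)
print(f"{moved.sum()} of {moved.size} pixels show movement")  # 4 of 16
```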

The Basics

First, the visualization capacity has been dramatically improved. One of the major heartbreaks about the first HoloLens was the small field of view (under 40 degrees) and limited rendering capabilities. For folks wishing to develop applications for, say, 3D digital cities and BIM, there were limits on the number of polygons rendered, which often resulted in seeing only a blurry slice of a subject model.

HoloLens 2 can render much more, both on-board and through cloud services: up to tenfold (or more) the capacity of the original. Key, though, is that the HoloLens 2 maintains the industry-standard 47 pixels per degree (a sweet spot in retinal resolution; higher is typically not perceptible) while doubling the field of view.

As Kipman stated, “It’s like going from a 720p display to 2K.”
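The arithmetic behind that comparison is straightforward: effective display width is pixel density multiplied by field of view. Here is a quick back-of-the-envelope check (the roughly 30- and 43-degree horizontal fields of view are commonly cited approximations, not official specifications):

```python
# Effective display width = pixels per degree x horizontal field of view.

def pixels_per_degree(width_px: float, fov_deg: float) -> float:
    """Angular pixel density across the horizontal field of view."""
    return width_px / fov_deg

# HoloLens 1: roughly a 720p-class display (~1280 px wide) over a ~30-degree FOV
hl1_ppd = pixels_per_degree(1280, 30)  # ~43 pixels per degree

# HoloLens 2: hold ~47 px/deg while widening the horizontal FOV to ~43 degrees
hl2_width_px = 47 * 43  # ~2021 px, right around "2K"

print(f"HoloLens 1: ~{hl1_ppd:.0f} px/deg; HoloLens 2: ~{hl2_width_px} px wide")
```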

There are folks who tend to gauge the utility or quality of things purely on a bigger-the-number-the-better basis but miss the point on desired workflow, the user’s focus within the field of view, and practical considerations for enhanced interactions. Another area of improvement, requested by early adopters of the first HoloLens, was the polygon rendering limit: the HoloLens 2 renders more on-board, and through cloud services it can yield rendering of up to 100 million polygons.

The HoloLens 2 is much lighter than the original, with an all-carbon-fiber housing, much better suited for mixed industrial environments. Kipman cited design studies that scanned the heads of over 2,000 people of all body types and ethnicities, with and without glasses, to make the headset as comfortable as possible.

The center of gravity was shifted, in part by putting batteries and storage at the back of the headband. I can testify first-hand: the HoloLens 2 was as comfortable as a standard set of headphones, far more comfortable than any VR/AR wearables I’ve ever tried. And the visor flips up out of the way when needed, something not available on the older model. Somebody was listening to their customers.

Addressing one of the goals from the early R&D days, Kipman explained the many ways the HoloLens 2 adapts to the user. When you put it on, it looks at your eyes (scanning and recognizing your iris) and logs you securely into the system, automatically. It tracks your eye movements to anticipate, for instance, which objects you are interested in interacting with. It learns the shape of your hands down to individual fingers, allowing you to grab objects, press virtual buttons, move sliders (designed in different styles, some with click-style increments), and even play a virtual piano.
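Gaze targeting like this is, at its core, ray casting: intersect a ray from the eye with the scene and pick the nearest hit. A simplified sketch of the idea (my illustration, not Microsoft’s implementation):

```python
import numpy as np

def gaze_target(origin, direction, objects):
    """Return the name of the object whose bounding sphere the gaze ray hits
    first; objects is a list of (name, center, radius) tuples."""
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    best_name, best_t = None, np.inf
    for name, center, radius in objects:
        oc = np.asarray(center, float) - origin
        proj = d @ oc                           # distance along ray to closest approach
        disc = proj**2 - (oc @ oc - radius**2)  # ray-sphere discriminant
        if disc < 0:
            continue                            # the gaze ray misses this object
        t = proj - np.sqrt(disc)                # nearest intersection distance
        if 0 < t < best_t:
            best_name, best_t = name, t
    return best_name

eye = np.zeros(3)
looking_at = np.array([0.0, 0.0, 1.0])  # gazing straight ahead
scene = [("valve", (0.0, 0.0, 2.0), 0.2), ("pump", (1.0, 0.0, 2.0), 0.3)]
print(gaze_target(eye, looking_at, scene))  # 'valve'
```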

The HoloLens uses its Kinect-like sensors and cameras to “scan” the space in its view; you can watch it creating polygons to build a live model, oriented in 3D space and aligned to a 5:3-ratio section of your field of view. This incrementally rendered 3D model also provides a spatial reference for interactions with your hands and tracked eye movements. While there is no absolute positioning at this time (though I was assured that would certainly come, likely through partner developers), a model (or site) is registered, and the internal MEMS, Kinect-like sensors, and cameras provide orientation for the headset and then track the movements of your body, eyes, and hands.
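That registration step is conceptually close to the best-fit transformations surveyors already use. As a minimal sketch, assuming matched points between the headset’s local frame and surveyed site coordinates, a rigid-body fit (the Kabsch algorithm) might look like this (illustrative only, not how the HoloLens works internally):

```python
import numpy as np

def rigid_fit(local_pts: np.ndarray, site_pts: np.ndarray):
    """Best-fit rotation R and translation t mapping local -> site coordinates
    (Kabsch algorithm); inputs are Nx3 arrays of matched point pairs."""
    c_local, c_site = local_pts.mean(axis=0), site_pts.mean(axis=0)
    H = (local_pts - c_local).T @ (site_pts - c_site)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_site - R @ c_local
    return R, t

# Four matched points: headset-local vs. surveyed site coordinates
local = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
site = local @ R_true.T + np.array([100.0, 200.0, 50.0])

R, t = rigid_fit(local, site)
print(np.allclose(local @ R.T + t, site))  # True: the model is registered to the site
```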

Kipman also announced that, from the start, HoloLens 2 was developed with the idea of complementing edge technology with cloud solutions. He then detailed an entire suite of Microsoft Azure, cloud-based solutions. One that may be of particular interest for AEC applications is Azure Spatial Anchors, which could help foster, as Kipman put it, an “internet of holograms.” Picture a space where libraries of holograms, some encompassing entire sites (or cities), could reside for access by many people wearing these devices for collaboration, training, and operations.
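To make the “internet of holograms” idea concrete, here is a toy sketch of what a shared anchor registry might look like; the names and fields are hypothetical illustrations, not the Azure Spatial Anchors API:

```python
from dataclasses import dataclass

@dataclass
class SpatialAnchor:
    """Hypothetical record: a pose in a mapped space plus the hologram tied to it."""
    anchor_id: str
    position: tuple   # x, y, z in the shared space's frame
    rotation: tuple   # orientation as a quaternion (x, y, z, w)
    payload_uri: str  # where the hologram/model content lives

class AnchorRegistry:
    """Toy shared registry: many devices publish to and query one cloud store."""
    def __init__(self):
        self._anchors = {}

    def publish(self, anchor: SpatialAnchor) -> None:
        self._anchors[anchor.anchor_id] = anchor

    def nearby(self, pos, radius: float):
        """Anchors within radius of a device's current position."""
        return [a for a in self._anchors.values()
                if sum((p - q) ** 2 for p, q in zip(a.position, pos)) <= radius ** 2]

registry = AnchorRegistry()
registry.publish(SpatialAnchor("turbine-3", (12.0, 0.0, 4.5), (0, 0, 0, 1),
                               "https://example.com/models/turbine-3.glb"))
print([a.anchor_id for a in registry.nearby((11.0, 0.0, 4.0), radius=5.0)])  # ['turbine-3']
```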

Julia Schwarz, senior researcher in Microsoft’s mixed reality group, demonstrates “touchable holograms” in the form of hand-motion-activated controls for the digital twin of a wind farm.

An on-stage demonstration by a very enthusiastic Julia Schwarz, a senior researcher with the Microsoft mixed reality team, included an example of the operation of a wind farm: changing environmental variables with virtual slider controls/buttons and voice commands. These and other tools inhabit a collaborative space Schwarz called “the playground.”

There was a demo of replacing a virtual bearing in a digital twin of a pump housing, showing how an operator could remotely guide someone through the repair. Then Schwarz reacted to a surprise from her development team (delivered through a service called Microsoft Teams): a holographic hummingbird that reacted to her voice and hand movements.

As Schwarz noted, when people put on a HoloLens for the first time, one thing they try to do is to touch the holograms. One of the most useful (and cool) features of the new model is that you can do just that. At one point she exclaimed, “We are touching holograms!” It’s not in a tactile sense, but the hologram reacts to your touch (by spatial proximity); you can grab something, take two hands to stretch and rotate it, remove parts, and more. Feedback from others in the audience indicated that this was one of the most exciting developments, and one I could see being very useful for future AEC applications.
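Under the hood, “touching” a hologram is spatial math rather than magic: a proximity test against the hologram’s bounds, with two-hand stretching derived from how far your hands move apart. A rough sketch of the idea (my illustration, not Microsoft’s code):

```python
import numpy as np

def is_touching(fingertip: np.ndarray, center: np.ndarray, radius: float) -> bool:
    """Proximity 'touch': is the tracked fingertip inside the hologram's bounding sphere?"""
    return bool(np.linalg.norm(fingertip - center) <= radius)

def two_hand_manipulation(l0, r0, l1, r1):
    """Given left/right hand positions at grab time (l0, r0) and now (l1, r1),
    return the uniform scale factor and the new midpoint for the held hologram."""
    scale = np.linalg.norm(r1 - l1) / np.linalg.norm(r0 - l0)  # stretch/shrink
    midpoint = (l1 + r1) / 2.0                                 # carry/translate
    return scale, midpoint

fingertip, holo_center = np.array([0.0, 0.0, 0.48]), np.array([0.0, 0.0, 0.5])
print(is_touching(fingertip, holo_center, radius=0.1))  # True: close enough to react

l0, r0 = np.array([0.0, 0, 0]), np.array([0.3, 0, 0])   # hands 30 cm apart at grab
l1, r1 = np.array([-0.1, 0, 0]), np.array([0.5, 0, 0])  # pulled 60 cm apart now
scale, mid = two_hand_manipulation(l0, r0, l1, r1)
print(scale, mid)  # 2.0: the hologram doubles in size around the new midpoint
```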

The XR10

Tech media had been speculating about the announcement of the HoloLens 2 in the months leading up to MWC19, but not one whisper got out about one big AEC-focused development. Roz Buick, vice president at Trimble, took to the stage before a giant image of a hard-hat-ready version of the HoloLens 2, dubbed the Trimble XR10 (complete with the distinctive yellow-grey housing and logo). Trimble is the first partner to take advantage of the HoloLens Customization Program.

Joining Alex Kipman (left) was Roz Buick, vice president at Trimble, Inc., introducing a product of the Microsoft HoloLens Customization Program: the Trimble XR10 with HoloLens 2.

Buick announced that the XR10 will ship at the same time as the HoloLens 2 (in coming months). She also presented a real-world example using the HoloLens and Trimble applications for an inspection of the HVAC system of a recently constructed building. The designed system was displayed over the as-built structure, and actual clashes were detected.

Afterwards, I asked Buick about the process of working with Microsoft on the XR10: “We started working with them about four years ago with the first [HoloLens]. We have applications like the SketchUp Viewer for HoloLens and Trimble Connect for HoloLens.” The latter is a cloud “hub” service for BIM and construction models and data (see Logic inside the Labyrinth of BIM).

Buick added, “Working with the Customization Program on the XR10 was within the last year.”

Aviad Almagor, director of Trimble’s mixed reality programs, said, “The driver came from the market, a need to combine the physical and digital world—the customers saying they need this technology onsite, so we worked with Microsoft to be able to bring those two together.”

The key to the XR10 is the “onsite” part: a worksite-wearable needs to complement but not get in the way of safety and PPE—it had to fit on a hard hat. These were the main customized elements Trimble brought to this model. But is this new device wearable, comfortable, and practical?

After much nagging and pestering, I was able to get a hold of the only XR10 at MWC19 that was not locked up in a glass case. It fit just fine. It felt no heavier than a standard hardhat, like when you add a lamp or safety flasher, but very well balanced. The visor adjusted into place, with or without glasses or safety glasses. And as it does on the standard HoloLens 2, the visor flips out of the way when needed.

One of my misgivings about some of the VR/AR/XR wearables I’ve tried (as demonstrated for AEC applications) is that they are mostly large, heavy, and cumbersome, and some act like a black box over your eyes (you see the physical world relayed through a camera—a limited view). I have heard such systems called “face boxes.” By contrast, the HoloLens 2 and XR10 give you the full and peripheral view of the physical site through the visor, with the digital elements appearing in the middle section. I’ve spent many days in the field surveying and daydreaming about heads-up displays—it seems that day has come.

I asked Buick and Almagor if we might see applications for surveying in the near future; they said that Trimble is looking at a whole range of potential applications but that developing for BIM was a logical starting point.

“We’ve always been in the field, working to provide detailed 3D models so accurate that you can take them out to the construction equipment to build to them. But there is this whole idea that BIM is in the office—pretty pictures, and maybe not so accurate—and when it gets handed over to contractors, they might go back to 2D drawings. One of our key goals is to bring BIM to the field,” said Buick.

“The process goes all the way through. We say white-collar BIM and blue-collar BIM both count, but you have to bring it out to the field; otherwise why have a 3D design? This is what we do with Connect, SiteVision, and now the XR10.”

Test Drive

Later, in a demonstration booth, I was paired with a journalist from Japan. We put on HoloLens 2 headsets, went through a short process calibrating the headset to our eye movements and our hands, and adjusted the visor for optimal view. This took only about a minute.

Over a table in the demo booth appeared a 3D model of a building with a tower crane and external storage tanks. We were encouraged to move components around, scale them up, rotate them into place, and even hand them to each other using hand movements. I slid a time-lapse time scale, watched virtual progress of construction, and looked into windows at the structural beams and walls. That holographic hummingbird appeared near the other participant; he had it land on his hand, and when I looked over at it and said, “I wonder if I can make it come over here?” it did.

Okay, these might have been gimmicky examples, but they certainly did showcase the mixed reality and multi-sensor interaction tools.

Bentley Systems’ SYNCHRO XR 4D app for HoloLens is an immersive construction-management suite that provides the same functionality as the desktop version and also enables collaboration and virtual “hands-on” interactivity with models.

The demonstration used Bentley Systems’ SYNCHRO XR application for HoloLens 2, also announced at MWC19. I was familiar with SYNCHRO PRO 4D and had seen it run on a workstation for visualizing and tracking construction projects. SYNCHRO can bring in all manner of construction models: iModel, Revit, IFC standard models, and many more. It ingests geometries, tasks, task durations, and logical links between tasks; all of this is stored in a central repository, such as in the cloud. And now, through APIs, they have brought this to mixed reality.
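At its core, a 4D model like this pairs geometry with a schedule. A toy sketch of the concept (illustrative only; SYNCHRO’s actual data model is far richer):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Task:
    """A scheduled construction task tied to the model elements it builds."""
    name: str
    start: date
    finish: date
    element_ids: list  # geometry created by this task

def built_by(tasks: list, when: date) -> set:
    """Element IDs whose tasks finish by `when`: what a 4D time slider displays."""
    return {eid for t in tasks if t.finish <= when for eid in t.element_ids}

schedule = [
    Task("Foundations", date(2019, 3, 1), date(2019, 4, 15), ["slab", "footings"]),
    Task("Steel frame", date(2019, 4, 16), date(2019, 6, 30), ["beams", "columns"]),
    Task("Envelope", date(2019, 7, 1), date(2019, 9, 30), ["walls", "windows"]),
]
# Slide the "time" to mid-July 2019: foundations and frame are up, envelope is not
print(built_by(schedule, date(2019, 7, 15)))  # {'slab', 'footings', 'beams', 'columns'}
```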

The SYNCHRO team had been an early partner-developer for the first HoloLens. Greg Demchak, technical architect with Bentley Systems, related a story from years ago when SYNCHRO (acquired by Bentley in June 2018) worked with an early HoloLens app in a hackathon. Since then, they have developed fully functional, field-ready applications. I will follow up on this in a subsequent story.

Picture This

The HoloLens 2 team relied heavily on feedback from customers in many segments: early adopters and development partners of the first model. For this launch they invited press and analysts from a broad range of disciplines, including AEC. They wanted to know what we would write and say about it and what our readers/audiences would say about potential future uses. Smart people in smart suits pulled me aside and asked me to speculate on potential applications for the disciplines our publication covers. Too many came to mind; they politely listened but also asked for more details on some.

Imagine you are out in the field with your GNSS survey rover. A virtual tint is projected over the sky indicating current (or predicted) ionosphere activity. You see the satellite constellations, their health, and which are also in the view of your base or network. You draw an elevation mask with your finger, turn on and off sats, and you see live updates for DOP and even multi-path returns off objects in view. You are working through voice commands but also virtual controls with your hands.
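None of that is far-fetched; DOP, for one, is computable from satellite geometry alone. A minimal sketch, assuming only satellite azimuths and elevations, of PDOP with an elevation mask applied (illustrative, not any vendor’s code):

```python
import numpy as np

def pdop(sats_az_el_deg, mask_deg: float = 10.0) -> float:
    """Position dilution of precision from satellite azimuths/elevations (degrees).
    Satellites below the elevation mask are excluded, as a finger-drawn mask would."""
    rows = []
    for az_d, el_d in sats_az_el_deg:
        if el_d < mask_deg:
            continue
        az, el = np.radians(az_d), np.radians(el_d)
        # Line-of-sight unit vector (east, north, up) plus the receiver-clock term
        rows.append([np.cos(el) * np.sin(az), np.cos(el) * np.cos(az), np.sin(el), 1.0])
    G = np.array(rows)  # needs at least 4 satellites above the mask
    Q = np.linalg.inv(G.T @ G)  # cofactor matrix of the position/clock solution
    return float(np.sqrt(np.trace(Q[:3, :3])))  # position terms only

sats = [(45, 60), (135, 35), (225, 50), (315, 40), (10, 15), (180, 12)]  # (az, el)
print(f"PDOP with 10-degree mask: {pdop(sats):.2f}")
print(f"PDOP with 30-degree mask: {pdop(sats, mask_deg=30):.2f}")  # fewer sats, higher PDOP
```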

Think about scanning operations: watching the virtual progress of the scan live, walking around in the point cloud and identifying things you missed that you can pick up with the total station and/or rover. Imagine for geodesign that you are walking through a model, correcting data as you go, “painting” features with your hands. Think of how you could train someone on a new instrument, one person operating a physical instrument and one its digital twin.

One could go on and on—but that is the point. We’ve been handed a whole new toolkit, and every time that happens, folks in user and developer communities come up with applications that the original tool designers never dreamed of.

So, dream out loud; people are listening.
