
Interview with Esri’s Eric Wittner

This entry is part 3 of 5 in the series Esri Interviews

Eric Wittner is the CityEngine and Procedural Technology product manager at Esri.

He works to expand public understanding of how Esri’s 3D products can enhance an organization’s products and processes and help bring it success.

Nick Duggan: What exactly is CityEngine?

Eric Wittner: CityEngine has been with us for a while. It started at an entertainment-focused organization. We peeked over the fence and said, “You guys have a tool for creating 3D cities, powerful 3D cities.” And so, after the acquisition we said, “Okay, we’re going to repurpose this software to help people get from 2D to 3D, and we want to make that as easy as possible and let them do rule-driven generation of 3D cities.”

And that was a pretty compelling story, I would say, four years ago, five years ago. Now what we’re seeing is 3D is becoming ubiquitous. Reality capture is getting better and better, and people are able to produce data kind of out of thin air without having to do this modeling process.

From the city industry perspective, where is the sweet spot? The sweet spot now is change: being able to use all these procedural rules to start creating and representing change really rapidly. We’re focusing CityEngine specifically as an urban design and planning tool.

You’re starting to see that fleshed out, right? Things like the Visibility Analysis, which we did as a cross-platform story. Pro has viewshed tools. CityEngine has viewshed tools. The web team is working on a similar viewshed tool that’s going to be part of the Web Scene Viewer and the JavaScript API. The Runtime guys are working on it as well.

But then, for something like CityEngine, we expanded. Now you can do a viewshed. You can tell what part of the scene is visible, but you can also get metrics on how visible it is. You can get metrics on what’s occluded. It’s all scenario-based and scenario-driven, and it’s all instantaneous. The desire is that people use that to start doing design. That’s an example of one tool in a suite of tools we want to build, all aimed at making this a really focused planning and design tool.

ND: What is ArcGIS Urban?

EW: Now we have a companion to CityEngine, which is a web-based planning tool that makes and manages plans and projects. It’s based on the same procedural technology used in CityEngine. The plans and projects you make in ArcGIS Urban can be opened and used in CityEngine. The designs created in CityEngine can be shared in this focused web app that’s specifically built for urban planners and designers. Part of this effort is enhancing the technology behind CityEngine, the Procedural Runtime, to better support the complexity of zoning, so we can utilize it in both apps. However, it’s all focused on making this technology more accessible to planners.

ND: How is CityEngine accessible to somebody who doesn’t know CGA? It’s more of a gaming environment; it’s more like using something like Unreal or Unity.

EW: Yeah, there’s a bit of a lift there. There are three initiatives around that. One is ArcGIS Urban, which is where we’re going to provide focused CGA rules that do zoning representation and building construction, that are standardized and data-model-driven. They’ll be available on the web, and they’ll exist in CityEngine. So the idea is you don’t actually have to write any CGA; you just basically massage your data into the right set of attributes, and you’re going to get useful and attractive procedural 3D models from these rules. That’s one piece.
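The “massage your data into the right set of attributes” step can be pictured as a simple normalization pass. The sketch below is purely illustrative, assuming a hypothetical data model: the field names (ZONING, HT_FT) and the target attribute names (zone_type, max_height_m, far) are invented, not Esri’s actual schema.

```python
# Illustrative sketch (not Esri's actual schema): mapping raw parcel records
# into the standardized attributes a data-model-driven zoning rule might expect.
# All field and attribute names here are hypothetical.

def massage_parcel(record):
    """Normalize a raw parcel record into rule-ready attributes."""
    zone_map = {"R1": "residential", "C1": "commercial", "M1": "industrial"}
    return {
        "zone_type": zone_map.get(record.get("ZONING", ""), "unknown"),
        "max_height_m": float(record.get("HT_FT", 0)) * 0.3048,  # feet -> meters
        "far": float(record.get("FAR", 1.0)),  # floor-area ratio
    }

raw = {"ZONING": "C1", "HT_FT": "120", "FAR": "3.5"}
print(massage_parcel(raw))
```

Once the attributes match what the rule expects, the rule itself does the 3D construction; no CGA authoring is involved on the user’s side.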

We’re refocusing our effort on content. We’re adding another content person to help produce more example rules that you can use out of the box. So if you’ve ever touched an example like building mass with detail, you’re going to see more of those kinds of examples. Did you ever see that rule?

ND: No.

EW: Okay. Building mass with detail was a rule developed by one of our guys in the Redlands office. You’ve got all these cool push/pull tools where you can build this complex mass through editing, right? But then in the SketchUp workflow or the CATIA workflow, you would go in and cut out windows, etc. I’ve got that window form, then I cut outside of the frame, then I cut inside of the frame, and then I cut panels. And at some point it gets exhausting, right? I’m tired of doing this.

You drop the rule on a mass and it skins it with a nice façade. It puts in the windows for you. It puts in the balconies. It puts in the doors. Two years ago, we released Handles, which is the ability to apply visual controls. Last year we released local edits.

On this massing rule—the one that we’re going to release this year—you can go in and touch any part of it and say, “Okay, make this part of the wall a window and then make that window smaller than the other, and get rid of one of the panes.” And you can do it all with this handle-ized workflow in the viewport. The procedural complexity for the architecture comes out of the rule, but the base form is coming out of these really simple push/pull tools.

That’s an example of where we’re trying to go. And in the Procedural Runtime itself, there are gaps for doing zoning and planning. One of the great limitations of CityEngine is the limit of what the shape itself knows and understands. “I know that I’m a parcel and I can build something on my attributes. But I don’t know anything about parcels around me.” So we’re adding these new things that are called context queries. Individual parts of a rule can query things that are adjacent for context and then drive what they do based on that context. So the parcel can say, “I know that I’m next to a residential area, and if I’m commercial I have to set myself back eight meters instead of four meters.” And then we build that into the rule, and the person doesn’t have to do complex workflow tasks to accomplish that.
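The context-query idea above can be sketched in a few lines: a parcel inspects its neighbors and adjusts its own behavior. This is a conceptual illustration only; the function and data shapes are invented, and only the numbers (8 m vs. 4 m setbacks) follow the example in the interview.

```python
# Conceptual sketch of a "context query": a parcel adjusts its setback based
# on what its neighbors are. The API here is invented for illustration; the
# 8 m / 4 m figures come from the example in the text.

def setback_for(parcel, neighbors):
    """Commercial parcels adjacent to residential get a deeper setback."""
    if parcel["zone"] == "commercial" and any(
        n["zone"] == "residential" for n in neighbors
    ):
        return 8.0  # meters
    return 4.0  # meters

parcel = {"id": 1, "zone": "commercial"}
neighbors = [{"id": 2, "zone": "residential"}, {"id": 3, "zone": "commercial"}]
print(setback_for(parcel, neighbors))  # -> 8.0
```

The point is that the adjacency logic lives inside the rule, so the end user never has to run a separate spatial analysis to get context-aware setbacks.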

ND: Do you work only with CityEngine, or do you work with the rest of the platform, as well?

EW: I work with the rest of the platform. I started on the 3D Analyst team. I worked for Nathan Shephard on viz and Clayton Crawford on the analysis team, building some of the analytical GP tools that were part of 3D Analyst, working on the viz capabilities in ArcScene, and working on the KML support in ArcGlobe. From there I jumped to professional services. The guy who originally hired me, Bill Miller, is the guy who pioneered the driving concept of geodesign at Esri. I worked for him a couple of years implementing projects and then worked with Dominic Tarolli and team on the biz dev side as the 3D tech evangelist. And I loved 3D tech evangelism. I loved touching real-world projects, but I wanted to be able to drive the technology in new directions so that I could do more. I wanted to be a product manager.

CityEngine is one of my products. ArcGIS Urban is one of my products. I’m responsible for the procedural capabilities of ArcGIS Pro. I’m helping drive the development of the interactive analytic tools across the platform. I touch AR and VR through 360 VR and some of the broader gaming-related efforts, and CityEngine has the Unreal VR templates that it’s supporting.

Just bits and bobs.

ND: That’s a lot!

EW: PMs at Esri typically touch three or four products. That’s not uncommon. I mean, at least I’m not Chris Andrews. I don’t own the whole 3D effort.

So that’s what I’m working on. But you’ll notice that there’s kind of a theme there which is everything’s around planning.

That’s really what I’ve been trying to drive: building tools into all of our products that are going to enable planning and design. That’s why I originally came to Esri back in 2006, to do design-focused tools.

ND: It’s really cool to see somebody so enthused about what they do. 

EW: I moved back to the Bay Area a couple weeks ago, changed my location, and instantly got a few job offers. And I went and talked to one of these guys, not because I wanted a new job, but because what he said he was working on was interesting. He was working on some indoor stuff. We launched into this huge discussion on indoor mapping and positioning accuracy and what they were doing as a product. In fact, it was very similar to some of the stuff we were doing with our Indoors product. He said, “It’s like you’re the perfect guy.” And I said, “You’re a startup, and your aim is to gain a certain amount of market share, get acquired, and grow yourself, and I work for a company that’s about fixing the world.” He had a nice idea, but I think I’ll just stay where I am.

ND: The AuGeo app from Esri for AR is brilliant, but you can’t share it with other people.

EW: Yeah, we’ve created a constraint to use based on the login pattern, and people really do want to take it offline. In ArcGIS 360 VR, if you make it public (Public 360 VR on ArcGIS Online), anybody can touch it, download it locally, and then they’re off the network and never need to connect again.

Part of the thing with augmented reality is that these are the applications it’s ideal for: the kind of people who pick up the camera, point it at features, and mark them. “That’s where this is, that’s where those are, that’s what we did.”

One of the military guys I was talking with, he looked at it and said, “This is awesome. If I could take this offline, I could just drop a reference point on every single building in an urban environment and say what it is. You know, this is a restaurant. This is a hotel. This is a residential complex.” Even just that basic level of information is so useful to have.

AuGeo is an Esri Labs project that came out of the Prototype Lab. Maybe it will be a product one day.

There is something potentially related coming. It’s an expression of capabilities inside the ArcGIS Runtime. You can grab the runtime and build AR/VR apps with it. You’d need to develop that app though.

ND: But that’s what’s frustrating, because there are quite a few things within the Esri environment that you have to be quite advanced to utilize. Whereas I thought the principle of ArcGIS is that it’s quite simple to use.

EW: There are three focuses, I guess. If we look at the web-based products, right? And the mobile products. The ones that we’re delivering. Those are tightly focused on specific users who need to be able to do discrete tasks, and we do pretty well with those, I would say. Then the rest of our suite is professional tools, which are for folks like you and me. And then there’s the developer tools, which are even more deeply capable. You can do more, but you have to have real expertise.

We used to have a guy that worked for us—his name was Milad. He was a product engineer. And the stuff that he could do with multipatches because he knew ArcPy—and the whole framework of how to touch ArcPy directly and manipulate multipatches directly—was stunning. It was just stunning. But it wasn’t exposed in a way that was easy. But it’s hard because we wind up in a dilemma where if you’re building something—so let’s take AuGeo for an example. We can push that towards a generic product, a generic and released product. It would meet some simple needs, but very, very quickly the military guy would come to us and say, “I need this, this, and this.” And the police guy would come say, “I need this, this, and this.” And you wind up getting branched into so many different directions on a product; it becomes hard from a product management perspective and you know, it’s, “Where do I push it as a generic product?”

That’s part of the way we’ve been successful working with our partners: saying, “We’ll build the framework, but for really focused apps there are some we’ll do, and the rest we’re going to let our partners tackle.” Because they can go after a specific customer and say, “Okay, you want augmented reality and you work in storm water, and I know you need these things and I’m going to help you get there.” And we may never do that specifically for storm water.

So hopefully that will be true for AR/VR: our partners will take our tech and run with it.

ND: I’ve heard that augmented reality is going to be slowly integrated into ArcGIS Pro. Is that something that may happen?

EW: That’s already what we do, but it’s in the form of published services or published data. I don’t think Pro will be an AR app in and of itself; it’ll be a supplier of information to AR apps. That app specifically, probably not. It’s not worth it.

What I think will be more interesting is as the Runtime team and Rex and company move forward with the Runtime’s ability to do AR. That’s one development path. The other one is just using the Runtime as a way to access all the GIS information from some other AR application built on Unreal or Unity. You know, being able to reach in and touch a set of published scene services and pull them directly into a VR environment to explore, or project onto an AR environment where I’ve got 26 projects that are hosted in Urban and they’re all scene services with good buildings and trees and stuff, and then I say, “Show this to me.” I want to stand on the street corner and I want to see the projects that are around me.
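The “reach in and touch a set of published scene services” pattern boils down to an external engine hitting a REST endpoint. The sketch below only assumes the general shape of an ArcGIS-style scene service URL; the host name, folder, service name, and helper function are all placeholders, not a real deployment or official API.

```python
# Hedged sketch: how an external app (e.g. a Unity or Unreal integration)
# might address a hosted scene service over REST. Host, folder, and service
# names are placeholders; only the general
# ".../rest/services/.../SceneServer/layers/{id}" URL shape is assumed.
from urllib.parse import urlencode

def scene_layer_url(host, service_name, layer_id=0, **params):
    """Build a query URL for one layer of a hypothetical scene service."""
    base = (f"https://{host}/arcgis/rest/services/Hosted/"
            f"{service_name}/SceneServer/layers/{layer_id}")
    return f"{base}?{urlencode(params)}" if params else base

url = scene_layer_url("gis.example.com", "DowntownProjects", f="json")
print(url)
```

In practice the game engine side would fetch that URL, parse the JSON layer description, and stream the mesh tiles into the VR or AR scene.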

You’ve seen the one that we’re doing at the island, right? Built in Unreal for the HTC Vive, the tabletop experience. You’ve not played with that yet?

ND: No, where’s that?

EW: Here at the UC over at 3D Island. 

We had that collaboration with Epic for the Datasmith export from CityEngine. We built an Unreal gaming template specifically for CityEngine projects. You go into CityEngine, you build your scene, you build your scenarios, then you export to Datasmith. It drops into Unreal. You download the template, and then you get a tabletop experience. Your city projects down onto a virtual table and you can pan and zoom around it. You know, pull in with both hands, push out, pan, rotate.

You can switch between scenarios to see the design alternatives. You have a slider bar with your left hand. You can change the time of day on the bottom to see different shadows. Then there are hotspots down there that you look at. And with your left hand, you can point at one, and it zooms you down onto the ground. That changes your navigation mode. You’re in what we call a hop. In your right hand, you have this little loop thing and you can just hop around, because people get so nauseous when you do it the other way. You can’t accelerate. If you do the acceleration, people throw up. And then tomorrow we’ll probably have two people running, because we need an extra machine, but you can have multiple people in the same environment and you see each other. So I can jump down here and then say, “Hey,” and pull that person into my view immediately. We both stand and look at the place from the same perspective.

This is kind of the first effort. It’s VR immersion, but it starts on a tabletop in VR. Because our experience was—I think the same as yours, and I think Adrian demonstrated it really well in the talk—people are used to interacting from an overview perspective.

And when you drop them all the way in, they get lost very quickly or they end up in a corner or in a place that doesn’t look good. That’s why 360 VR has this highly curated view perspective. We’re not going to let you do that. So that’s kind of an interesting experience.

ND: It’s just crazy where we’re going now. I’m at a stage in my career where I’ve spent years trying to get people to use 3D. Now people are using 3D and they’re using AR and VR. I’m thinking, where on earth do we go from here? Because there’s so much immersive tech going on that in five years’ time, it’s going to be just naturally consumable. So where do we go from here?

EW: If you think about the way computers evolved, right? You know, from line-printer output, to dot-matrix print, to the CRT command line, to black-and-white CRT, to color CRT. Then there was the input revolution from the keyboard to the rolling ball to the mouse. We’re kind of in the same space, where there’s going to be 10 years of iteration on this before the next sudden leap. And the leaps are going to be sudden, obvious, and not necessarily intuitive from the designer/developer’s perspective.

Again, it’s going to come at us from the gaming world. Once Xbox releases their headset, Xbox and PlayStation will both be fighting for innovative experiences, and we’ll just see them evolving. Then the next one, the next one, and the next one. And the rest of us will be looking at that, going, “Man, I can use that for urban planning.”

ND: You say that Xbox and PlayStation will kind of influence where we’re going, and you’ll realize I’ve asked this of everybody I’ve spoken to. We’ve got this tool called CityEngine which is very close to being a kind of interface between the gaming industry, GIS, and the visualization side of the world. But there’s no real gaming tool or gaming interface for ArcGIS, where it would be ideal for visualizing things.

EW: If you look at the long term for CityEngine—let’s say five years from now—CityEngine is no longer a standalone application. Maybe it’s an extension that sits inside Unreal. You do it all in Unreal, and you use all the cool viz that Unreal has. I think about visual effects. I mean, we’ve spent a lot of time thinking about cartography, and right now in Urban we’re spending a lot of time thinking about how to express condition and change at different time scales and in intelligent ways. But if you spend some time in any of the modern video games and see the way they indicate where you need to pay attention, what is changing, the way things outline, flash, highlight themselves … that’s all going to be incorporated in modern apps over time.

 
