Drones are being used to test a space rover in familiar ways.
By Martin Azkarate
UAV manufacturer senseFly talked to Martin Azkarate of the European Space Agency about his work developing rover prototypes for planetary exploration, including how he employs a senseFly drone to test these cutting-edge robots.
Martin is a space automation and robotics engineer at the European Space Agency. Based in the Netherlands at the European Space Research and Technology Centre (ESTEC), he is in charge of the laboratory that develops and tests the agency’s rover prototypes, the cutting-edge ground robots that explore faraway planets, with a particular focus on the rover that will fly on the agency’s forthcoming Mars mission: ExoMars.
senseFly caught up with him to learn more about his role, planetary robotics, and how a drone aids the lab’s research activities.
senseFly: Why don’t you start by telling us a little about your role?
Martin Azkarate: Sure. I’m responsible for our Planetary Robotics Laboratory, where we mainly focus on space exploration using rovers. I’ve worked here for three and a half years, having started as a trainee on a scholarship from Spain. We work on using rovers, robotic vehicles, to explore faraway, largely unexplored planets, which these days usually means Mars.
S: What mission are you currently working towards?
MA: The agency’s next mission is called ExoMars [which stands for Exobiology on Mars]. It features two separate phases: the first, in 2016, launched an “orbiter” out to Mars, a satellite that will relay the future rover’s data back to Earth, since the rover doesn’t have the means, in power or antenna, to communicate directly. Then in 2018 there will be a follow-up launch, in which we will send the final landing module containing the rover.
S: What will your rover do on the red planet? What are its goals?
MA: It will explore a specific area of the red planet, taking samples, drilling down to two meters below the surface, and then analyzing these materials on-board the rover itself. Our lab mainly focuses on the robotics technology that can be applied to this rover system.
There are already some areas that the agency has selected as preliminary target locations, but the final landing location has not been selected yet. Whatever that final location is, the rover will drive around that area and perform its drilling and sample analysis.
Its drill system is pretty complex. It takes samples from up to two meters below the surface, which is deep for a rover system. Then there is a full miniature lab or “analytical drawer” onboard the rover, which it will use to analyze these samples, searching for signatures of life using instruments such as microscopes and spectrometers. The results of this analysis will then be sent back to Earth via the orbiter.
S: What are the technical challenges you face when developing a rover? There must be so many.
MA: It depends on whether you do the science in-situ, as with ExoMars, or one day send samples back to Earth. That would call for different kinds of subsystems and navigation, but generally speaking a key requirement is some kind of autonomous navigation.
We can tell the rover which area to explore and give it coordinates, but it has to be automated enough to understand directions, measure distances, and recognize and navigate around obstacles. We can’t control a rover on Mars remotely like we could on the moon.
S: How does a rover localize itself?
MA: There are different ways of doing this. The main method is via visual information, sourced from its cameras. First, the rover uses visual odometry. This means it updates its relative position based on what it saw previously. It will take an image, move a meter, take another image, compare these two images, and then compute the transformation matrix that matches the motion it has performed from one step to the next.
By doing this repeatedly, it can constantly update its position with respect to its original position. This is all relative positioning, of course, based on where it landed.
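The chaining of per-step motions Martin describes can be sketched, very loosely, in Python. This is illustrative only, not ESA flight code: each 3x3 matrix stands in for the transformation the rover would actually estimate by comparing consecutive camera images.

```python
import numpy as np

def relative_transform(theta, tx, ty):
    """One step of motion (rotation theta, translation tx, ty) as a 3x3
    homogeneous matrix. In a real visual-odometry pipeline this matrix is
    computed by matching features between the before and after images."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

def accumulate(steps):
    """Chain the per-step transforms to track the pose relative to landing."""
    pose = np.eye(3)
    for step in steps:
        pose = pose @ step
    return pose

# Drive 1 m forward, drive 1 m while turning 90 degrees left, then 1 m forward.
steps = [relative_transform(0.0, 1.0, 0.0),
         relative_transform(np.pi / 2, 1.0, 0.0),
         relative_transform(0.0, 1.0, 0.0)]
pose = accumulate(steps)
x, y = pose[0, 2], pose[1, 2]  # position with respect to the landing site
```

Because each step is only relative, small errors in the individual transforms accumulate over a traverse, which is exactly why the global positioning Martin turns to next is needed.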
We also need to know where its first position was; this is called global positioning. One way is to find reference objects in the nearby surroundings; on Earth we use ground control points. But on Mars it’s not easy to locate a nearby tower!
We don’t have a very detailed map of the terrain on Mars either, but we do have maps of up to 1 meter per pixel thanks to satellite imagery. On these maps we can identify big rocks, craters, etc. Then, if we can see any of these with the rover, we can triangulate its position with respect to these landmarks.
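As a rough sketch of that idea (again illustrative, not the lab’s code): if the rover can measure its range to two landmarks whose map coordinates are known, its own position falls on the intersection of two circles.

```python
import numpy as np

def trilaterate(p1, r1, p2, r2):
    """Candidate rover positions given ranges r1, r2 to two landmarks at
    known map coordinates p1, p2. Two mirror-image solutions exist; a third
    landmark (or the odometry estimate) disambiguates them."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = np.linalg.norm(p2 - p1)               # distance between landmarks
    a = (r1**2 - r2**2 + d**2) / (2 * d)      # offset along the baseline
    h = np.sqrt(max(r1**2 - a**2, 0.0))       # offset perpendicular to it
    base = (p2 - p1) / d
    mid = p1 + a * base
    perp = np.array([-base[1], base[0]])
    return mid + h * perp, mid - h * perp

# Two crater landmarks 8 m apart on the map; the rover sees 5 m to each.
sol_a, sol_b = trilaterate((0.0, 0.0), 5.0, (8.0, 0.0), 5.0)
```

In practice a rover measures bearings and ranges through its stereo cameras, so the real computation is noisier, but the geometric principle is the same.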
S: So where do flying robots, meaning your drone, come into play?
MA: Our eBee has two main applications. The first is creating high-resolution maps of our rover test sites.
In the case of Mars, there is an orbiting satellite that takes images of the areas we want to explore. However, we need something here on Earth to capture imagery of the terrain we’re going to use when testing the rover, and the higher the resolution the better. At a smaller scale, we need to identify and geo-reference landmarks that the rover has to be able to see and target. Let’s say we choose a parking lot. We first use the eBee to map this area, maybe covering a hectare or even more (up to a square kilometer, our rover’s realistic maximum). From there we can identify landmarks that the rover can use to localize itself.
The second application, however, is all about enriching the work of the operator. We use the digital elevation models the eBee generates to feed the rover’s ground control station. This gives the operator a better understanding of where the rover is and how well it is traversing the terrain.
If you have a full DEM of the terrain and you place the rover somewhere, you get a better idea of how the rover is operating. Whatever the rover will see, in terms of obstacles, terrain etc., we can crosscheck that via the DEM.
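A minimal example of the kind of crosscheck this enables (a sketch under assumed names, not the ground station’s actual code): sample the eBee-derived DEM at the position the rover reports and compare it with what the rover believes about the terrain there.

```python
import numpy as np

def dem_height(dem, resolution, x, y):
    """Bilinearly sample a gridded DEM at world position (x, y), where
    `resolution` is metres per grid cell. The DEM here would come from the
    eBee's photogrammetry output."""
    col, row = x / resolution, y / resolution
    c0, r0 = int(col), int(row)
    fc, fr = col - c0, row - r0
    top = dem[r0, c0] * (1 - fc) + dem[r0, c0 + 1] * fc
    bot = dem[r0 + 1, c0] * (1 - fc) + dem[r0 + 1, c0 + 1] * fc
    return top * (1 - fr) + bot * fr

dem = np.array([[0.0, 1.0],
                [2.0, 3.0]])              # a tiny elevation grid, in metres
h = dem_height(dem, 1.0, 0.5, 0.5)        # height at the centre of the grid
```

If the elevation or slope the rover reports at its believed position disagrees with the DEM sample, the operator has a concrete signal that the localization has drifted.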
S: What are you checking when conducting such rover tests?
MA: Everything within the scope of autonomous navigation. For instance, how well the rover can localize itself, its performance when traversing different terrains, and how accurately it is analyzing where it is. The rover may believe it is in a certain position, or facing a certain direction, but we may see that differently on the DEM, in which case we know there’s a problem.
S: What kind of areas are you mapping to test the rover on? You mentioned a parking lot?
MA: Yes, we use a very nice parking area, owned by a company called DECOS, which is close to ESTEC. This has the color and characteristics of a Martian landscape, a design choice by the owner.
We also target more sandy areas like nearby beaches. Mars has a mix of different terrain, from hard rock to sandy areas, so it’s difficult to find natural terrain that has all of these. We have also tested in the Atacama desert in Chile, areas of Spain south of the Pyrenees, and the Canary Islands.
Using the eBee in such areas, you might face certification or approval issues. We had that here in the Netherlands at first, before we specified that our use of the eBee was non-commercial.
Field-testing is something we plan over the entire year. We’ll typically carry out two big field-testing campaigns, of one to two weeks each, per year. That’s when we’ll use the eBee to create those maps and models. Our internal lab activities, by contrast, which relate more to sub-system work like testing cameras and algorithms, happen every day here in the lab.
S: Lastly, looking even further ahead than 2018, what might future Mars missions look like?
MA: In the future we’re looking to bring full physical samples of Mars back to Earth. This is a very complex undertaking because it requires first having a full working launcher system, usually called an Ascent Vehicle, on the surface of Mars.
The article originally appeared in senseFly’s blog, Waypoint, October 5, 2015. It’s been edited slightly.
For more on using photogrammetry to measure distances on Mars, read this article. For more on the future of surveying Mars, read this piece!