Jeffrey Walker, a Professor of Civil Engineering at Monash University, aims to address a gap in natural disaster surveillance by laying the groundwork for near-space monitoring.
I never planned to be an academic. I was going to be a land surveyor, so that I could work outdoors. But when I got to the end of that five-year degree, my supervisor said I should think about doing a PhD. I was like, “What’s a PhD?”
I soon found out and thought, why not? My topic sat at the overlap between civil engineering and surveying, marrying the two fields around soil moisture mapping and hydrology. The surveying element related to the satellite data used to measure soil moisture, and the civil engineering element concerned how that data might be used to improve models for applications like flood forecasting.
Somewhat by coincidence, I learned that people at NASA were interested in doing similar work, merging satellite data with land surface models to make better environmental predictions, and as one of the early people working in this area I was recruited to help. However, the NASA role involved combining actual satellite data with land surface models on a global scale, rather than ground measurements for a small catchment.
Back then we didn’t have a dedicated soil moisture satellite, and I had to do a lot of development work with non-ideal data. But a satellite mission was finally approved, and SMOS, the Soil Moisture and Ocean Salinity satellite, eventually launched in 2009.
By measuring the naturally emitted microwave energy of Earth from space we can derive the soil moisture, which tells us a lot about what’s happening on the surface of our planet.
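To make that idea concrete, here is a minimal sketch of the underlying physics, using an assumed soil temperature and a deliberately simplified, hypothetical dielectric-mixing relation (real retrievals such as SMOS’s use far more detailed models): wetter soil has a higher dielectric constant, which makes the surface more reflective and therefore less emissive, so the measured brightness temperature drops.

```python
# Minimal sketch (not the actual SMOS retrieval): why wetter soil looks
# "colder" to a microwave radiometer. More water raises the soil's dielectric
# constant, the surface reflects more and emits less, and the measured
# brightness temperature drops.
import numpy as np

def brightness_temp_k(soil_moisture, soil_temp_k=300.0):
    # Hypothetical linear dielectric mixing; real work uses models like Dobson or Mironov
    eps = 3.0 + 45.0 * soil_moisture                       # ~3 when dry, ~20+ when saturated
    reflectivity = np.abs((1 - np.sqrt(eps)) / (1 + np.sqrt(eps))) ** 2  # smooth surface, nadir view
    emissivity = 1.0 - reflectivity
    return emissivity * soil_temp_k                        # brightness temperature in kelvin

for mv in (0.05, 0.20, 0.40):                              # volumetric soil moisture, m3/m3
    print(f"soil moisture {mv:.2f} -> T_B ~ {brightness_temp_k(mv):.0f} K")
```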
But satellites have limitations. The lowest you can fly one is about 160km or so. SMOS, for example, orbits at around 750km above Earth. At best we get a pixel size of about 40km, which is fairly coarse. So my research is now looking at how we can improve on that resolution by making our measurements from near-space.
Near space? This is the area between about 20 and 80km above Earth, well above any commercial air traffic, but not quite reaching space. This is a new frontier that we’re just beginning to explore.
High-Altitude Platforms (HAPs) are unmanned aircraft with very big wingspans, up to 30-plus metres – very fragile-looking, lightweight structures capable of operating at these near-space altitudes. One example is the Zephyr, developed by Airbus. The top of the wing is completely covered in solar panels to power its small electric motors – they can basically fly for up to a year at a time, and be positioned to loiter over any area of interest.
The potential is huge. If you make the entire underside of that wing into an antenna, then because your resolution is a function of how big your antenna is, and how close you are to the ground, you can get a resolution that is much, much finer than from a satellite. Instead of 40km in a single pixel, we can get to around 200 metres.
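As a rough back-of-envelope check, with my own illustrative numbers rather than the project’s actual design: for a diffraction-limited antenna the footprint is roughly the wavelength times the altitude divided by the antenna size, so flying lower and carrying a bigger antenna both shrink the pixel.

```python
# Back-of-envelope footprint arithmetic, using assumed numbers:
# an L-band radiometer (wavelength ~0.21 m), a diffraction-limited antenna,
# and footprint ~ wavelength * altitude / antenna size.
WAVELENGTH_M = 0.21                              # L-band (~1.4 GHz), the band used for soil moisture

def footprint_m(altitude_m, antenna_m):
    return WAVELENGTH_M * altitude_m / antenna_m

# Hypothetical HAP case: ~20 km altitude, an antenna spanning a ~25 m wing
print(f"near-space footprint ~ {footprint_m(20_000, 25):.0f} m")   # on the order of 200 m
```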
Our challenge then is designing an instrument that’s light enough and low-power enough to operate on that platform. However, the main part of my ARC laureate project is about how this 200-metre resolution data can be used to provide real-time information for natural disaster risk prediction and monitoring. This includes providing the inputs needed by fire, flood and landslide risk prediction models, as well as monitoring the actual locations and spread of fires and floods as they occur.
While passive microwave measurements provide an all-weather, day-and-night observing capability that can see through cloud and smoke, there are a lot of other things that affect them. If you’ve got a layer of vegetation, it will add some emission of its own on top of that from the soil, because it’s got water in its leaves. It will also attenuate some of the signal coming from the soil.
This is a problem – but it’s also valuable. It’s a problem because if you’re after the soil moisture, you need to remove the effect of the vegetation. But if you want to do fire prediction, then you want to know the moisture content of the fuel and the soil.
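For readers who want the mechanics, the standard zeroth-order "tau-omega" radiative transfer model used in passive microwave soil moisture work captures exactly these two effects: the canopy attenuates the soil’s emission and adds emission of its own. The sketch below is a simplified nadir-view version with illustrative parameter values, not the project’s calibrated model.

```python
# Simplified "tau-omega" model: a vegetation canopy both attenuates the soil's
# microwave emission and adds its own. Parameter values here are illustrative
# only; real retrievals calibrate them for each land-cover type.
import math

def tau_omega_tb_k(soil_emissivity, soil_temp_k, veg_temp_k, veg_water_kg_m2,
                   b=0.12, omega=0.05):
    tau = b * veg_water_kg_m2            # vegetation optical depth grows with canopy water content
    gamma = math.exp(-tau)               # canopy transmissivity: how much soil signal gets through
    soil_reflectivity = 1.0 - soil_emissivity
    soil_term = soil_emissivity * soil_temp_k * gamma
    veg_term = (1.0 - omega) * veg_temp_k * (1.0 - gamma) * (1.0 + soil_reflectivity * gamma)
    return soil_term + veg_term          # brightness temperature seen from above the canopy

# Wet soil under a moderately wet canopy (hypothetical inputs)
print(f"T_B ~ {tau_omega_tb_k(0.7, 300.0, 295.0, veg_water_kg_m2=3.0):.0f} K")
```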
This is still largely uncharted territory, and so we will look at grasslands, the sclerophyll forests common down the Great Dividing Range, and the savannah typical of Northern Australia – three dominant fire regimes.
Another factor that affects the microwave emission is the actual temperature of the scene. So if you’ve got a hot bushfire, it will show up in this high-resolution data.
At the moment, the only way to track a bushfire’s path is by waiting for the next satellite to go past, if it can see through the smoke and cloud, or by putting aeroplanes up and relaying the information back for the hour or two before they run out of fuel. This is also dangerous work, as you’re flying in a smoky and probably turbulent environment.
If we can monitor these readings from near-space, then we should be able to see where the actual fire front is, what direction it’s moving in, and how fast it’s moving. This information can then be given to firefighters so that they can track wildfires in real time, which will undoubtedly save lives.
The same measurements can also be used for floods. The type of saturation run-off that causes floods typically occurs at smaller scales than satellites can measure. And if we are monitoring from near-space at high resolution, then once a river starts to break its banks, we can map exactly where the flooding is occurring and how it’s spreading. If we’re doing this in real time, we’re able to alert people downstream when they need to get out of an area that’s about to be inundated. Again, the potential lifesaving benefits are huge.
The HAPs already exist, so in principle we could make this monitoring system happen inside the next decade. That might sound like a long time away, but most satellite missions take about 10 to 20 years from when they’re first proposed to when they’re actually launched.
But a 10-year timeframe is plausible if somebody said this is what we want to do, and we’re going to put the resources behind it. It all comes down to political will.
As told to Graem Sims.