“In an urban, BVLOS-based drone eco-system, machines will play a much larger role in regulating UTM”

Dr Tim McCarthy is a Senior Lecturer, Department of Computer Science, Maynooth University, Ireland

The U-Flyte research programme (see box) has set itself many different targets, more than most academic programmes of a similar type. Isn’t there a danger you are trying to do too much?

I was asking myself this back at Eurocontrol when we were learning about what Europe is planning in this area. We’re aware of what the FAA and NASA are doing with UTM and the seven different test sites looking at different things alongside the Integration Pilot Program, while here in Europe we have the SESAR work. However, the feeling here is that you’ve got to consider the broader UTM eco-system which – in turn – comprises many technical, regulatory, operational, application, commercial and indeed wider societal components.

The problem at the minute is that the approach to UTM is fairly simplistic for what is quite a complex problem. I know some UTM companies are addressing some of these challenges through a more conventional approach of decision-support to the UTM director or drone operator, through presentation of the real-world very low level (VLL) flying environment. This appears as a stack of discrete data-layers – elevation, buildings, land-use and so on. That’s fine if you’re dealing with only a handful of drones in an unconstrained airspace. But what if you’ve got hundreds or (in time) thousands of drones over a large city? The big question then is: how do you architect a universal, scalable, adaptable UTM solution to handle VLL flying environments which will always come in different shapes and sizes?
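To make that layered, decision-support picture concrete, here is a minimal illustrative sketch – the class names, fields and grid cells are assumptions for the example, not U-Flyte’s design or any vendor’s API:

```python
# Illustrative sketch only: a stack of discrete VLL data layers queried per
# grid cell, as a decision-support display might do. Names are assumptions.
from dataclasses import dataclass, field

@dataclass
class DataLayer:
    name: str     # e.g. "elevation", "buildings", "land-use"
    cells: dict   # (x, y) grid cell -> value in this layer

@dataclass
class VLLView:
    layers: list = field(default_factory=list)

    def describe(self, cell):
        # Collapse the layer stack into one picture of a single grid cell.
        return {layer.name: layer.cells.get(cell) for layer in self.layers}

view = VLLView([
    DataLayer("elevation", {(0, 0): 42.0}),
    DataLayer("buildings", {(0, 0): "low-rise"}),
])
print(view.describe((0, 0)))   # {'elevation': 42.0, 'buildings': 'low-rise'}
```

A view like this is fine for a handful of drones; the scaling problem McCarthy describes starts when “describe one cell to a human” has to cope with thousands of simultaneous flights.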

We started off by designing a very simplistic system which presents information about the real world to the drone operator, and possibly the UTM director, so that the drone operator could decide how to conduct the mission and the UTM director could manage a number of drone operators and their respective activities. Then we realised there was a breakdown in the system, because this scenario does not readily scale and is incapable of adapting to likely future artificial intelligence (AI), drone and sensor technology developments. Let’s take the case of a busy urban VLL flying environment with a lot of drones travelling across the city – carrying out mapping surveys, delivering goods, repairing critical infrastructure and so on. Once the drone goes beyond visual line of sight (BVLOS), the system is entirely dependent on electronic sensors and systems onboard the drone platform and on the ground to ensure safe separation and a successful mission. Why do you need a human – for example, a UTM director – relaying instructions to drone operators to manoeuvre their drones to a particular flight level or position? The UTM director will have all the information and is better placed to optimise the overall UTM based on drone performance, priority, changing dynamic risk and so on. Let’s take this a step further – why don’t you just take the humans out of the loop and (in time) let machine learning/reinforcement learning carry out the split-second re-computation of the overall best performance for the various drone trajectories – always, of course, with human oversight at a higher level?
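As a toy illustration of that shift – not the U-Flyte design, and with drone IDs, priorities and flight levels invented for the example – a machine can resolve a separation conflict and issue the instruction directly, with no human relaying it:

```python
# Illustrative sketch only: a machine resolving a separation conflict by
# assigning each drone its own flight level, highest priority first.
# Drone IDs, priority scale and flight levels are invented for the example.
def resolve_conflict(drones, levels=(30, 60, 90, 120)):
    ranked = sorted(drones, key=lambda d: d["priority"], reverse=True)
    return {d["id"]: level for d, level in zip(ranked, levels)}

instructions = resolve_conflict([
    {"id": "survey-07", "priority": 1},      # routine inspection task
    {"id": "blood-run-02", "priority": 9},   # emergency delivery outranks it
])
print(instructions)   # {'blood-run-02': 30, 'survey-07': 60}
```

In practice that decision could come from a reinforcement-learning policy rather than a hand-written rule; the point is only that the instruction goes straight from machine to drone, with human oversight sitting above the loop rather than inside it.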

The AI UTM director could then transmit the instruction directly to the drone. In the future there may be a reduced requirement for drone operators in the loop. The point here is that artificial intelligence, drone and sensor technologies have tipped the balance in terms of inputs from humans and machines, and we need to approach the challenges of future drone traffic activities differently rather than shoe-horning these sometimes ill-fitting contemporary traffic management architectures onto future U-Space/UTM operations. If this is to be done correctly you’ve got to prove to society in towns and cities, and to the authorities, that you indeed have a safe system for all-encompassing monitoring of all things relevant to drone operations, whether these are static or dynamic in nature. And you need to demonstrate that you can keep things separated safely at all times – for example, the hobbyist in the park whose drone may not be registered in the UTM, or indeed a migratory flock of birds that has just flown in from Canada, not carrying any electronic transponders. I think we need to re-examine the whole approach to designing these new UTM eco-systems.

How do you do that?

We view U-Space as the typical physical VLL environment where drones operate, and the first step is to record a 3D digital model of that very low-level environment. That means looking at the static aspects – topography, buildings, land-use classification. This static U-Space can be brought to life by sensing near real-time phenomena that also affect drone operation, such as weather, drone traffic and human activity. Both static features and dynamic phenomena can then be encoded into the 3D model as risk – as it affects any drone operation, at any one location, at any point in time.
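A minimal sketch of what encoding risk could look like computationally – the weights, field names and the wind feed below are assumptions for illustration, not U-Flyte parameters:

```python
# Illustrative sketch only: combine static features and dynamic phenomena
# into one risk score per 3D location and time. All weights are assumptions.
from datetime import datetime

def risk(location, when, static_model, dynamic_feeds):
    """Return a 0..1 risk score for flying at `location` at time `when`."""
    cell = static_model.get(location, {})
    static_risk = 0.6 if cell.get("land_use") == "school" else 0.1

    # Dynamic feeds: weather, drone traffic, human activity, emergencies...
    dynamic_risk = sum(feed(location, when) for feed in dynamic_feeds)
    return min(1.0, static_risk + dynamic_risk)

# Example dynamic feed: a crude wind penalty during two windy hours.
wind = lambda loc, when: 0.3 if when.hour in (7, 8) else 0.0

static_model = {(315, 420, 60): {"land_use": "school"}}
print(risk((315, 420, 60), datetime(2018, 6, 1, 7, 30), static_model, [wind]))  # 0.9
```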

UTM operations can be based on two fundamental topologies: constrained or unconstrained. This simply means that if we’re rolling out a UTM over a built-up area like a city, we need to constrain the airway topology – essentially the route-corridors that drones fly along – to follow a certain pattern, very much like a kids’ climbing-frame in a playground. Real-time events can modify risk anywhere along this climbing-frame network, but this ‘change in state’ can be handled computationally much more easily if we have a robust pre-defined network to begin with. A region over a vast area like an ocean or a wilderness – where we expect much less dense traffic – can be left unconstrained. Airways for any particular drone trip can be ad hoc in design, with the ability to modify them on the fly as dynamic risk changes.
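The climbing-frame analogy maps naturally onto a graph of corridors whose edge weights carry the risk. A minimal sketch – the node names, risk values and use of the networkx library are assumptions for the example:

```python
# Illustrative sketch only: the "climbing-frame" of route corridors as a
# graph whose edge weights can be bumped by real-time risk.
import networkx as nx   # assumes networkx is installed

lattice = nx.Graph()
# Pre-defined corridors over a built-up area (constrained topology).
lattice.add_edge("A", "B", risk=0.1)
lattice.add_edge("B", "C", risk=0.1)
lattice.add_edge("A", "D", risk=0.2)
lattice.add_edge("D", "C", risk=0.2)

# A real-time event (say a street festival under corridor B-C) raises risk
# on that edge; the pre-defined network makes the re-route cheap to compute.
lattice["B"]["C"]["risk"] = 0.9
route = nx.shortest_path(lattice, "A", "C", weight="risk")
print(route)   # ['A', 'D', 'C']
```

Because the network is pre-defined, a real-time event only changes edge weights; the re-route is a cheap graph query rather than a re-design of the airspace.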

When we construct the initial U-Space model for built-up urban areas using static inputs, it should be possible to allow the constrained topology to be automatically or semi-automatically generated. This static constrained network can then be modified – in near real-time – with changing events (weather, drone traffic, human activity, emergencies). We can use multi-agent systems and optimisation to compute the best overall outcome for the UTM. Optimisation will include inputs such as desired track and drone flying performance, as well as the drone’s role (carrying emergency blood products versus a standard daily inspection task).
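One way to picture “best overall outcome”: when two drones contend for the same corridor slot, a priority-weighted delay cost decides which one holds. The drone names, priorities and 90-second hold below are invented for the example, and a real optimiser (multi-agent or learned) would be far richer:

```python
# Illustrative sketch only: a toy "best overall outcome" decision when two
# drones want the same corridor slot. Weights and fields are assumptions.
def total_delay_cost(assignment, drones):
    # Cost of an assignment = sum of priority-weighted delay per drone.
    return sum(drones[d]["priority"] * delay for d, delay in assignment.items())

drones = {
    "blood-run-02": {"priority": 9},   # emergency blood products
    "survey-07":    {"priority": 1},   # standard daily inspection
}
# Two ways to resolve the clash: one drone holds for 90 s, the other flies direct.
options = [
    {"blood-run-02": 0,  "survey-07": 90},
    {"blood-run-02": 90, "survey-07": 0},
]
best = min(options, key=lambda a: total_delay_cost(a, drones))
print(best)   # the inspection drone holds; the emergency flight goes direct
```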

What is the timescale for findings from your research and what’s going to happen to the results of the work?

This is a four-year research programme and we just started in March 2018. There are different aspects to this and we are being very careful not to understate the big-picture importance of what we are trying to do. We have to be progressive, but there is a research element and by definition that means there will be solutions which initially won’t work and will need to be corrected in the development phase. At the moment we’re looking at some of these problems and seeing how they could work from technical, operational, regulatory and commercial viewpoints. But we’re avoiding going into highly populated places; we’re looking at low-risk environments and being progressive with the demonstrators. We want to get those BVLOS demonstrators into low-risk places, start learning how to operate unconstrained, really simple UTM in low-risk BVLOS environments and then slowly, slowly build that up.

How could the commercial side work? The big cities will probably offer UTM licences to operate, in a similar fashion to telco mobile licences, but the winners will have to demonstrate to city managers that they’re able to put that platform in place – the hardware, the sensors – and also operate it safely. Once that’s been shown to work they can start operations and begin charging for services. New business models will be required to spool up next-generation, large-scale BVLOS drone operations, where we may see various shared services being promoted – for example, a move from an organisation maintaining its own drone services to time-sharing drones to carry out a range of tasks: mapping, monitoring, defect inspection, logistics, aerial robotics, specialist services and so on.

We are going to have so much data and so many research programmes – are we going to end up with certification agencies becoming confused by the amount of data available to them?

We think that you need to design big and build small. You need to demonstrate and put together a cohesive story around how this could possibly work, and at the same time have some practical examples of drone applications such as information gathering, timed package delivery and air taxis – although that last one is quite a long way off; we are still constrained by current hyper-localised operation, i.e. within line of sight.

We are going to try to design a UTM (constrained and unconstrained topologies) to see what it will look like, and then build an operating model here at Waterford Airport, Ireland. Back at the University in Maynooth we’ve actually captured LIDAR from drones and aircraft over both constrained and unconstrained environments, to reconstruct the real world in digital form for these VLL environments. This VLL space is very different from conventional aviation – if you look at a highly cluttered, three-dimensional, very low-level city environment, it becomes a different story, and conventional radar manufacturers may have to re-think how they approach these challenging environments.
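As an idea of what reconstructing the real world in digital form involves at the very simplest level, here is a sketch that bins LIDAR returns into a coarse 3D occupancy grid; the 5 m voxel size, the sample points and the two-hit threshold are assumptions for the example:

```python
# Illustrative sketch only: turning drone/aircraft LIDAR returns into a
# coarse 3D occupancy model of the VLL environment. Values are assumptions.
from collections import defaultdict

def voxelise(points, voxel_m=5.0):
    """Map (x, y, z) LIDAR returns to hit counts per voxel cell."""
    grid = defaultdict(int)
    for x, y, z in points:
        grid[(int(x // voxel_m), int(y // voxel_m), int(z // voxel_m))] += 1
    return grid

points = [(12.3, 40.1, 8.7), (13.0, 41.5, 9.1), (120.4, 6.2, 31.0)]
occupied = {cell for cell, hits in voxelise(points).items() if hits >= 2}
print(occupied)   # {(2, 8, 1)} – a voxel a drone route must avoid
```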

There are many competing technical solutions, but my preference is for ground-based detect and avoid. I have always thought this would be the biggest problem to solve. Connectivity is a major challenge, but I would go one stage further and say it’s knowing the exact location, all the time, of everything relevant to safe drone operation.

In other words, non-cooperative, low-level identification?

Absolutely. But we’re not jumping into that side straight away – it would cost too much. We’re saying let’s architect that digital twin, then try to find the building blocks and be progressive in demonstrating how these can work in low-risk environments, demonstrating it to the regulatory authorities. For example, what’s the cost of sending an aircraft out to carry out a 5km x 5km forestry plantation survey? We can do that with a drone in one and a half hours and prove that, from a safety, operational, regulatory and economic point of view, it makes good sense. Be progressive – start simply and then slowly, slowly deliver the more challenging environments.

Have you examined what sort of communications link would work best?

We’ve had initial discussions with the GSM providers and the DJI identification guys, and we’ve looked at ADS-B broadcasts for drones. But it’s still an open book.

The scope and goals of U-Flyte

U-Flyte is a strategic research partnership, coordinated by Maynooth University and funded by Science Foundation Ireland, together with key industry collaborators including Airbus, Irelandia Aviation and Intel. The project was launched in March 2018 at Waterford Airport by Maynooth University, SFI and the industry partners. The partnership will tackle the current global bottleneck impeding the wider development of drone operations and the roll-out of commercial services. The R&D work-plan is based around a series of inter-connected work-packages that investigate, build and test Unmanned Aircraft Systems (UAS) Traffic Management (UTM), underpinned by a 3D drone airspace model (U-Space). UTM operations will be tested around Waterford Airport, using drones and Air Traffic Control (ATC), as well as at selected mobile locations across Ireland. Drone services, based on rapid wide-area mapping, dynamic feature-tracking in drone video clips and package drop-off/pick-up, will be developed and tested using real-world end-use case scenarios.

U-Flyte (Unmanned Aircraft Systems Flight Research) 17/SPP/3460 is funded under the Science Foundation Ireland Strategic Partnership Programme.
