Zürich-based Daedalean and UAVenture have agreed to work together to extend the “Magpie” autonomous flight control, navigation and emergency landing system from air taxis to wider-ranging drone operations. Magpie is a lightweight version of Daedalean’s technology for personal transports, featuring:
- Real-time vision-based detection of suitable emergency landing locations.
- Detection of wires and other obstacles during the landing approach and landing.
- Vision-based navigation and attitude estimation for GPS-denied scenarios.
- Lightweight hardware (approximately 500 g).
- Low power consumption (approximately 30 W).
Future versions may incorporate functionality targeted specifically at real-time precision agriculture.
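The first feature above — picking a suitable landing spot from a real-time segmentation of the ground — can be illustrated with a minimal sketch. Assuming the on-board network outputs a per-pixel class mask (the class codes, the 4-connectivity choice and the `min_area` threshold here are illustrative, not Daedalean's), a candidate patch is simply the largest connected region of clear ground that is big enough:

```python
# Hypothetical sketch: find a landing candidate in a segmentation mask.
# A candidate is the largest connected region of clear ground ("gray
# patch" in Daedalean's visualization) that meets a minimum area.
from collections import deque

# Illustrative class codes (not Daedalean's actual label set)
CLEAR = 0
ROOF, LOW_VEG, HIGH_VEG, OBSTACLE = 1, 2, 3, 4

def largest_clear_region(mask, min_area):
    """Return (area, pixels) of the largest 4-connected CLEAR region,
    or None if no region reaches min_area."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    best = None
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] != CLEAR or seen[r][c]:
                continue
            # BFS flood fill over one clear region
            region, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and mask[ny][nx] == CLEAR
                            and not seen[ny][nx]):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if best is None or len(region) > len(best):
                best = region
    if best is not None and len(best) >= min_area:
        return len(best), best
    return None

# Example: a 3x4 mask with a 7-pixel clear patch
mask = [
    [ROOF,     ROOF,  CLEAR, CLEAR],
    [LOW_VEG,  CLEAR, CLEAR, CLEAR],
    [OBSTACLE, CLEAR, CLEAR, HIGH_VEG],
]
area, pixels = largest_clear_region(mask, min_area=5)  # area == 7
```

A production system would of course add geometric checks (patch shape, slope, approach path clear of the wires and obstacles the second feature detects), but the candidate-selection step reduces to connected-component analysis of this kind.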
According to a UAVenture press release, this will bring full “level-5” autonomy to drone operations. “Daedalean’s engineers apply insights from modern robotics, deep learning and computer vision to build an autonomous guidance system that meets the highest bar (DAL-A) for safety-critical aerospace systems (DO-178C, DO-254 and DO-160G). Their product provides visual-based guidance and navigation intended to enable both unmanned operations and personal transport certified for Visual Flight Rules (VFR) conditions. UAVenture’s AirRails autopilot system makes the commercial operation of unmanned Vertical Takeoff and Landing (VTOL) vehicles accessible to all with its advanced flight control and highly intuitive flight planning and monitoring software.”
“Today both parties have signed an exclusive partnership to integrate Daedalean’s vision-based technology into UAVenture’s AirRails VTOL autopilot system for unmanned applications and to make it available to all AirRails users. This partnership allows UAVenture to provide its customers unrivalled guidance, navigation and control for the most challenging UAV applications.
The cooperation allows Daedalean to gain engineering validation in real flight at scale, and to gather a high volume of high-quality imagery correlated with actual flight data in a multitude of realistic environments for training and testing of their systems.”
(Picture: Downward-facing segmentation by an on-board deep neural network, part of Daedalean’s landing guidance system. The system recognizes roofs, low and high vegetation, and other obstacles; a large enough gray patch is a candidate for (emergency) landing. The system updates in real time on a small on-board single-board computer.
© Daedalean AG 2018)