Powering the autonomous flight of the future with labeled data

Success Story




Type of service

Semantic segmentation

Platform used



Number of drone images annotated with ultra-high precision


Refugees and migrants in Bulgaria impacted through this work

The client

Daedalean is a Zurich-based startup founded in 2016 by a team of engineers who previously worked at companies like Google and SpaceX. Daedalean specializes in creating AI for autonomous flight, and more specifically in systems for in-flight decision making based on computer vision. These systems operate at a level above a classical autopilot – essentially the same level as a human pilot. With applications in classical airplanes, helicopters, and the urban air taxi of the near future, the electric vertical take-off and landing aircraft (eVTOL), the technology is both groundbreaking and exciting.

The challenge

Tatiana Yusupova from Daedalean told us:

“Our eventual goal is the creation of a fully autonomous AI pilot: a certified sensor and autopilot system that can completely replace the human pilot. Working within such a niche requires significant compliance efforts, high standards, and reliable suppliers such as Humans in the Loop.”

Tatiana Yusupova, Autonomous Flight Data Supervisor

“The core software for such a ‘visual cortex’ of an aircraft is built on neural networks, and training them takes enormous amounts of data. The visual recognition algorithms used by Daedalean require exhaustive training to gain ‘experience’ and to ensure their ‘knowledge’ of any imaginable landscape below and around the flying vehicle. Annotated data is the foundation of machine learning: only humans can teach neural networks to interpret images.”

The task at hand was preparing the segmentation of the environment for the system called ‘Where can I land?’, one of the most important functions for a helicopter pilot or a pilotless drone: both should be ready to land immediately if any problem occurs. The landing should be safe for the vehicle (it is dangerous to land in high grass, for example, which can hide a rock or a pit) and safe for the people and infrastructure below (no landing on a car or a cottage roof).

Semantic segmentation on a drone image

The solution

Humans in the Loop performed the full segmentation for this system, which meant assigning every pixel in each image to a class: buildings, flat surfaces, high and low vegetation, wires, masts, pedestrians, vehicles, etc. The work was done over several projects, starting with rural data and slowly moving towards urban scenes, steadily developing and maturing the system and its recognition of the real world below an aircraft.
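To make "classifying every pixel" concrete, here is a toy sketch of what a semantic segmentation annotation looks like. The class taxonomy and mask format below are purely illustrative, not Daedalean's actual data format:

```python
# Toy example: a semantic segmentation mask assigns exactly one class id
# to every pixel. Here a tiny 4x6 "image" is labeled with a hypothetical
# class palette similar to the one described above.
from collections import Counter

# Hypothetical class palette (real projects define their own taxonomy).
CLASSES = {0: "flat surface", 1: "building", 2: "high vegetation",
           3: "low vegetation", 4: "vehicle", 5: "wire"}

mask = [
    [0, 0, 1, 1, 2, 2],
    [0, 0, 1, 1, 2, 2],
    [0, 4, 0, 3, 3, 5],
    [0, 4, 0, 3, 3, 5],
]

def class_coverage(mask):
    """Fraction of pixels per class: a quick sanity check that a finished
    mask covers every pixel and matches expectations for the scene."""
    counts = Counter(pixel for row in mask for pixel in row)
    total = sum(counts.values())
    return {CLASSES[c]: n / total for c, n in counts.items()}

coverage = class_coverage(mask)
print(coverage)  # 'flat surface' covers 8 of the 24 pixels
```

A check like this (all fractions summing to one, no unexpected classes) is one simple way reviewers can verify that a mask is complete before it enters a training set.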

By using a small, dedicated team that underwent extra, client-specific training, it was possible to ensure consistent interpretation of edge cases across the images: for example, whether train or tram rails should be classified as flat surface or as an obstacle, or whether lamp posts, fences, and other roadside objects should be labeled as masts or as unknown. Having people who already have the expertise and know the correct interpretation of the data is crucial.

The applications Daedalean are developing are safety-critical. This is aviation: malfunctioning software could be fatal.

Needless to say, the certification requirements create a legal obligation for Daedalean to select a suitable supplier, to verify that the required quality is met, and to ensure that the process of working with data satisfies the relevant aviation standards (DO-178C, DO-200A). Humans in the Loop takes quality to another level at every step of the process, with multi-layer verification and continuous iterations based on client feedback.

“We needed to be sure the team we selected would be able to comply with our processes, meet our demands, and be ready to learn new tools we find appropriate. The last is where you won our hearts: you were ready to adjust.”

Tatiana also shared: “We have tested this relationship quite a bit. We have come up with urgent projects, we have had complex requirements and very complex data to process, and we have always received great service. Our personal manager from HITL has always been prompt with her responses. It was easy to start a new task, we discussed the requirements extensively to make sure they were clear to the annotators, and the project manager was always there at the end of the project to make sure all the necessary corrections were made.”

The result

The system based on the segmentation HITL helped with is now part of the product for large UAVs (Unmanned Aerial Vehicles, i.e. heavy professional drones) which Daedalean are launching to market by the end of this year. But it goes further than that: the ‘Where can I land?’ function is becoming part of larger systems targeted at other markets – helicopters, general aviation, and eVTOL.

Over the last few months, Daedalean have partnered with EASA (the European Union Aviation Safety Agency) and signed a contract to explore how existing regulations around safety-critical applications can be adapted to encompass modern machine-learning techniques, and to create concepts and safety standards for applying this branch of artificial intelligence in safety-critical avionics. Exciting times ahead!

Interested in implementing a human in the loop in your AI pipeline? Get in touch with us and we would be happy to have a call.