KI4FLIGHT – SAFE ARTIFICIAL INTELLIGENCE

KI4FLIGHT – PROOF OF CONCEPT FOR SAFE ARTIFICIAL INTELLIGENCE FOR ENVIRONMENT PERCEPTION IN AUTONOMOUS AVIATION

Environment perception for autonomous aviation with the help of artificial intelligence

For years, traffic jams have been a common sight in and around Germany's most populous and economically powerful cities during rush hour, and there is no sign of a trend reversal. According to figures from the German Federal Environment Agency, passenger traffic increased by 28.5% and freight traffic by 67% between 1991 and 2018. One promising innovation that could counteract this trend in the future is the airborne transport of goods and people by drones and air taxis, especially on highly frequented commuting routes. An essential prerequisite for the economic operation of such electric air vehicles is their automation, and a core component of automatic operation is a system for safe environment perception during take-off, landing and in flight. Artificial intelligence (AI) approaches, especially machine learning (ML), have the potential to become a key component here, as they enable aircraft to make human-like machine decisions in real time. However, the use of AI methods in safety-critical aircraft systems has so far not been possible for reasons of verification and certification.

Overview of KI4Flight

The aim of the joint project "KI4Flight" is therefore to enable the development of safe and efficient AI algorithms for environment perception and their testing on a compact sensor suite. In addition, the safety of these algorithms will be evaluated in the context of existing certification processes. During the project, attempts will be made for the first time not only to map the environment around an aircraft geometrically and semantically with the help of AI, but also to recognise special risk situations in real time and make them available to the flight control system. This is achieved by fusing different, complementary (dissimilar) high-resolution sensors (laser, camera, radar) with an AI unit embedded on the aircraft. Only in this way can cooperative and non-cooperative airspace participants be recognised in real time at distances of up to several kilometres, an essential step towards the sustainable automation of aircraft. The project goals are built up step by step, from a data-acquisition function to AI-based environment perception, in order to ensure early testing in practical use. For example, together with the associated partner Volocopter, system requirements for a sensor suite and AI models for semantic environment perception are initially being developed and successively integrated into a demonstrator (drone).
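
The article does not describe the fusion architecture in detail. As an illustration only, the following minimal sketch shows one common pattern for dissimilar sensor fusion: late fusion by voting, where an object is only confirmed if detections from at least two different sensor types agree within a spatial gate. All names, thresholds and the 2-D geometry here are hypothetical simplifications, not the project's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str       # e.g. "camera", "lidar" or "radar" (hypothetical labels)
    x: float          # position estimate in a shared frame (metres)
    y: float
    confidence: float

def fuse(detections, gate=5.0, min_sensors=2):
    """Late fusion by dissimilar-sensor voting: cluster detections that fall
    within `gate` metres of a cluster centroid, then keep only clusters
    confirmed by at least `min_sensors` different sensor types."""
    clusters = []
    for det in detections:
        for cluster in clusters:
            cx = sum(d.x for d in cluster) / len(cluster)
            cy = sum(d.y for d in cluster) / len(cluster)
            if (det.x - cx) ** 2 + (det.y - cy) ** 2 <= gate ** 2:
                cluster.append(det)
                break
        else:
            clusters.append([det])
    fused = []
    for cluster in clusters:
        sensors = {d.sensor for d in cluster}
        if len(sensors) >= min_sensors:
            # Report the cluster centroid and which sensor types confirmed it.
            fused.append((
                sum(d.x for d in cluster) / len(cluster),
                sum(d.y for d in cluster) / len(cluster),
                sorted(sensors),
            ))
    return fused
```

The point of the voting step is dissimilarity: a systematic failure of one sensor modality (e.g. a camera blinded by glare) cannot by itself produce a confirmed object, which is one way functional decomposition can support a safety argument.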

The sensor suite to be integrated takes into account the requirements for dissimilarity and redundancy as well as the separate verification of information chains. This system and its verification are being researched and tested in the project. The verification focuses on combining machine-learning techniques with deterministic methods in a heterogeneous system in order to achieve high safety levels with the help of functional decomposition and dissimilar sensor fusion. This is tested in well-defined use cases, which in turn are oriented to the flight phases and operational requirements of aircraft for "Continue Safe Flight and Landing", so that environment perception is used precisely where exact advance planning is impossible due to missing or inaccurate maps. Flight tests can be planned, for example, around detect-and-avoid (collision avoidance) or safe-landing-zone (emergency landing) scenarios, since a safe environment model consisting of sensor fusion, semantic mapping, SLAM and perception plays a central role here and is solved both partially and entirely with machine-learning methods.

According to a study by the Unmanned Aerial Vehicle Association, 31,000 commercial drones were expected in Germany alone by 2020, rising to 90,000 by 2025, with a significant proportion of so-called high-payload drones. In addition, a study by Porsche Consulting foresees 500 air taxis in use by 2025, a number expected to rise to over 15,000 by 2035. In principle, all of these vehicles require a correspondingly compact and safe sensor solution in conjunction with the ML applications enabled by KI4Flight for automated use. This step is essential on the roadmap of German drone and air taxi OEMs in global competition.
Spleenlab, DLR and Volocopter are thus taking on a pioneering role in the development of certifiable, ML-based sensor and control systems for autonomous flight and for corresponding assistance systems for safe movement in airspace. Thanks to the platform-independent development and broad applicability of the results, KI4Flight creates an advantage for Germany as a technology location that will support the widespread, economical use of automated aircraft in the future.
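
The detect-and-avoid scenario mentioned above can be illustrated with the standard closest-point-of-approach (CPA) check for two constant-velocity tracks. This is a textbook geometric sketch, not the project's method; the 2-D frame, separation distance and time horizon below are hypothetical values chosen for illustration.

```python
import math

def closest_point_of_approach(own_pos, own_vel, intruder_pos, intruder_vel):
    """Time (s) and distance (m) of closest approach between two
    constant-velocity tracks in a shared 2-D frame."""
    rx = intruder_pos[0] - own_pos[0]   # relative position
    ry = intruder_pos[1] - own_pos[1]
    vx = intruder_vel[0] - own_vel[0]   # relative velocity
    vy = intruder_vel[1] - own_vel[1]
    v2 = vx * vx + vy * vy
    # CPA time minimises |r + v*t|; clamp to "now" if tracks are diverging.
    t_cpa = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    dx = rx + vx * t_cpa
    dy = ry + vy * t_cpa
    return t_cpa, math.hypot(dx, dy)

def conflict(own_pos, own_vel, intruder_pos, intruder_vel,
             separation=150.0, horizon=60.0):
    """Flag a detect-and-avoid conflict when the predicted miss distance
    falls below `separation` metres within `horizon` seconds."""
    t, d = closest_point_of_approach(own_pos, own_vel,
                                     intruder_pos, intruder_vel)
    return t <= horizon and d < separation
```

In a real system, the intruder track feeding such a check would itself come from the fused, dissimilar sensor suite, which is why the safety of the perception chain is the project's central concern.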

KI4KMU

With the announcement "Research, development and use of artificial intelligence methods in SMEs" of 9 March 2020, the BMBF is pursuing the goal of supporting high-risk industrial research and pre-competitive development projects of small and medium-sized enterprises (SMEs) in Germany in the field of artificial intelligence (AI). At present, digitalisation poses major challenges for the entire SME sector, and the importance of data as a key resource is continuously increasing. In this context, AI technologies, as crucial core components of information and communication technologies (ICT), are key drivers of digitalisation. The funding is intended to ensure that significantly more SMEs transfer their own research results and new scientific findings into innovative industrial and socially important applications, thereby strengthening their growth and competitiveness. The funding of collaborative projects is also intended to strengthen and intensify cooperation between SMEs and science. The funding measure is part of the implementation of the Federal Government's AI Strategy and the High-Tech Strategy 2025.
