Explainable AI: How to implement ethical and legal principles for AI in defence

Fraunhofer FKIE and Data Machine Intelligence Solutions to launch joint FCAS AI project for Airbus

Airbus Defence and Space, the German prime for the tri-national Future Combat Air System Program (FCAS), has contracted Fraunhofer FKIE and DMI Solutions to develop an Explainable AI (XAI) demonstrator. The project is realized jointly with the FCAS Expert Commission on the responsible use of new technologies.

Fraunhofer FKIE and DMI Solutions will work with former fighter aircraft pilots, Airbus engineering experts and representatives of the Airbus/Fraunhofer FKIE expert group on the responsible use of new technologies in FCAS to demonstrate the versatile use of AI in a broader ethical and legal context. The goal is to lay the foundations for the future certification, admission and ethical approval of FCAS.

Prof Wolfgang Koch, Chief Scientist at Fraunhofer FKIE, says: “The decisive question is: How do we concretize and operationalize ethical and legal requirements in the technical design of FCAS? As a first step, we let the pilot experience real AI/XAI in a simulated targeting cycle and interactively explore ‘ethically-aligned’ engineering principles.”

Bert van Heukelom, Managing Director of Data Machine Intelligence Solutions, adds: “AI-based recommendations to the human operator need to be properly displayed in order to be well understood and accepted. We are delighted to work with Fraunhofer FKIE in the context of this European lighthouse project.”

First tangible results of the project are to be presented by the fourth quarter of 2021.
