Drones are high-tech devices based on embedded systems. Thanks to constant technological development, new fields of application keep emerging. In the future, drones will be able to analyse photos and detect irregularities directly in flight using artificial intelligence; platforms for this already exist.
The applications for drones are diverse, and new ones are constantly emerging. They are used to inspect structures such as bridges or to monitor railway tracks. They fly through sewage tunnels to locate problem areas. They are also used to provide medical assistance in accidents or emergencies, for example in mountainous terrain. In agriculture, they spot fawns before the combine harvester arrives. Even drones themselves are countered with interceptor drones, such as the German-developed ‘Talon’ from the Berlin-based company Germandrones. Their technology is becoming increasingly complex and sophisticated. When drones fly in swarms, they usually have special capabilities for coordinating with one another and exchanging data. The complexity of the formations is sometimes reminiscent of swarms of bees.
Special chip design allows for customisation to specific operational requirements
Drones are based on embedded systems, which have become indispensable in aviation: they are small, lightweight and powerful, making them a perfect fit for the requirements of the aviation sector. The microcontroller-based boards are responsible, among other things, for real-time flight control as well as attitude and stability control. Thanks to a special chip design, the system-on-a-chip (SoC), they can be tailored to specific requirements. This allows drones to be developed for specific tasks, such as flying through sewage tunnels, where GPS reception is usually not possible and radio remote control can also run into problems.
The IT technology behind drones, also known as unmanned aerial vehicles (UAV) or uncrewed aircraft systems (UAS), operates on the principle of division of labour. At its core, it consists of a highly integrated embedded system that combines avionics, sensor technology and wireless communication. Modern drones are flying computing platforms that process data in real time.
The embedded system is the heart of every drone
The heart of every drone is the flight controller, an embedded system with a specialised microcontroller that acts as a flight computer. Its tasks include processing sensor data, calculating the position in space and controlling the rotor motors via electronic speed controllers. It must enable stable flight, autonomous manoeuvres and automatic take-off and landing. To this end, drones use a variety of sensors to determine their position, perform flight manoeuvres and detect and avoid obstacles. Even in difficult conditions such as gusts of wind, they must be able to fly predefined routes or navigate autonomously, for example when carrying out drone deliveries or inspections. To do this, the embedded system must process data from various sensors in real time and correlate it to make decisions, fly around obstacles or identify objects.
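The inner loop described above, comparing the estimated attitude with the setpoint, computing a correction and distributing it to the rotor motors, can be sketched in a few lines of Python. The PID gains, the X-frame mixing signs and the scaling factor below are illustrative assumptions, not values from any real flight controller:

```python
class PID:
    """Minimal PID controller as used in attitude/rate control loops."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def mix_quad_x(throttle, roll_out, pitch_out, yaw_out):
    """Map controller outputs to the four motors of an X-quadcopter.

    Motor order: front-left, front-right, rear-left, rear-right.
    The sign table is the conventional X-frame mixing pattern.
    """
    return [
        throttle + roll_out + pitch_out - yaw_out,  # front-left
        throttle - roll_out + pitch_out + yaw_out,  # front-right
        throttle + roll_out - pitch_out + yaw_out,  # rear-left
        throttle - roll_out - pitch_out - yaw_out,  # rear-right
    ]

# One control step: the drone is tilted 5 degrees to the right, setpoint level.
roll_pid = PID(kp=0.8, ki=0.1, kd=0.05)
correction = roll_pid.update(error=0.0 - 5.0, dt=0.01)
motors = mix_quad_x(throttle=0.5, roll_out=correction * 0.01,
                    pitch_out=0.0, yaw_out=0.0)
```

In this step the right-hand motors end up spinning faster than the left-hand ones, which pushes the drone back towards level, while the average throttle stays unchanged because the roll terms cancel out across the frame.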
The basis for this is the inertial measurement unit (IMU). It consists of gyroscopes and accelerometers for determining attitude and movement. The IMU thus effectively functions as a built-in spirit level and motion sensor. This makes it possible, for example, to determine whether a drone is currently turning to the right or left, or is being pushed unintentionally in a certain direction by the wind.
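A common way to fuse the two IMU sensors is a complementary filter: the gyroscope integrates smoothly but drifts over time, while the accelerometer is noisy but drift-free, so the filter trusts the gyro short-term and the accelerometer long-term. A minimal Python sketch (the weighting factor `alpha` and the 100 Hz loop rate are illustrative assumptions):

```python
import math

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate with the accelerometer angle."""
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

def accel_pitch(ax, ay, az):
    """Pitch angle in radians from raw accelerometer axes (quasi-static)."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

# Level hover: the accelerometer sees only gravity on z, the gyro reads zero.
pitch = 0.1  # stale estimate left over from an earlier disturbance
for _ in range(500):  # 500 steps at 100 Hz = 5 seconds of flight
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_angle=accel_pitch(0.0, 0.0, 9.81),
                                 dt=0.01)
# The stale estimate decays towards the accelerometer's drift-free 0.0.
```

The decay illustrates the drift correction mentioned above: without the accelerometer term, the initial error of 0.1 rad would simply persist.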
Equally important are sensors for precise positioning and navigation, usually via a global navigation satellite system (GNSS), typically GPS. Optical sensors and cameras, known as ‘visual positioning systems’, assist with holding position without GPS, for example when there is no signal, and are also used for obstacle detection. LIDAR (light detection and ranging) and infrared sensors fall into the same category; they additionally enable 3D mapping and night vision.
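For GNSS waypoint navigation, the flight computer repeatedly needs the distance between its current fix and the next waypoint. One standard way to compute this from two latitude/longitude pairs is the haversine formula; the coordinates below are arbitrary example values roughly one kilometre apart:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GNSS fixes."""
    R = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# Two fixes 0.009 degrees of latitude apart, roughly 1 km on the ground.
d = haversine_m(52.5200, 13.4050, 52.5290, 13.4050)
```

On a real drone this distance, together with the bearing, feeds the navigation loop that decides when a waypoint counts as reached.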
AI platforms for drones
Modern drone technology is increasingly shifting data processing away from central servers in the cloud and directly onto the aircraft. Embedded systems with powerful graphics processors and image processing functions enable the analysis of video feeds, allowing obstacles to be detected more quickly and reliably, whilst also improving image analysis. In the future, these are expected to be increasingly supplemented by artificial intelligence (AI): integrated AI chips promise to improve real-time object recognition, obstacle avoidance and autonomous analysis during flight. To this end, the German Fraunhofer Institute for Microelectronic Circuits and Systems (IMS), for example, provides the AI software framework AifES (‘Artificial Intelligence for Embedded Systems’) as open source. It allows artificial neural networks (ANNs) to be deployed and trained on virtually any hardware. The embedded market now also offers AI platforms suitable for drones, such as NVIDIA’s Jetson platform or the FLYC-300 system from Neousys. The ultimate goal is to grant drones ever greater autonomy.
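Stripped of any framework-specific API, the on-device inference such frameworks perform is essentially a chain of small dense layers. The Python sketch below shows this with a hypothetical 3-4-1 network whose weights and ‘obstacle ahead’ interpretation are made up purely for illustration, not taken from AifES or any real model:

```python
import math

def dense(x, weights, biases, activation):
    """One fully connected layer: activation(W.x + b) per output neuron."""
    return [activation(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, biases)]

relu = lambda v: max(0.0, v)
sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

def obstacle_score(features):
    """Forward pass of a hypothetical 3-4-1 network: three sensor
    features in, one 'obstacle ahead' probability out."""
    w1 = [[0.9, -0.4, 0.2], [0.1, 0.8, -0.5],
          [-0.3, 0.6, 0.7], [0.5, 0.5, 0.5]]
    b1 = [0.0, 0.1, -0.1, 0.0]
    w2 = [[1.2, -0.7, 0.9, 0.4]]
    b2 = [-0.5]
    hidden = dense(features, w1, b1, relu)
    return dense(hidden, w2, b2, sigmoid)[0]

score = obstacle_score([0.8, 0.2, 0.5])  # a probability between 0 and 1
```

Networks of this size fit comfortably into a microcontroller's memory, which is precisely what makes on-board AI feasible without a cloud connection.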
Sources
The information on which this article is based draws on various specialist articles, product information and media reports. These include the article “Next-gen UAVs: High-tech drone innovations to watch out for in 2025” by Asteria Aerospace, which provides an overview of future technological developments in the drone sector. In addition, content from “Artificial Intelligence for Embedded Systems” by Fraunhofer IMS was consulted, which examines the use of AI in embedded systems. Technical insights are also provided by “Embedded Solutions for Drones & UAVs” from Nvidia, as well as the German product description “Drone Mission-Computer FLYC-300” from Neousys. Current social and political references are supplemented by the German Tagesschau report “Drones from Berlin-Tegel”.