Organizers

Evaluation Committee

Sponsors

Motivation

Drone racing is a recreational sport in which the goal is to pass through a sequence of gates in the minimum amount of time while avoiding collisions with the environment. In autonomous drone racing, this task must be accomplished by flying fully autonomously in an unknown environment, relying only on on-board resources. This makes autonomous drone racing an exciting case study that can motivate researchers to develop innovative solutions to complex problems. What makes drone racing such an interesting challenge is the cumulative complexity of the sub-problems to be solved: perception, localisation, path planning, and control.

Goal

The Autonomous Drone Racing Competition @ IEEE SSCI 2025 is an aerial robotics challenge in which research groups will validate their fuzzy logic controllers in a challenging scenario – autonomous drone racing.

Challenge

The participating groups can address two major challenges in autonomous drone racing:

  1. validate control methods for navigating the drone through the racing gates in the control challenge,
  2. demonstrate perception methods for gate detection under variable illumination in the perception challenge.

Participants can choose to tackle one or both sub-challenges.

Control Challenge

This challenge aims to design intelligent controllers that can operate in nonlinear regions of the drone's dynamical model at high speeds. The number and pose (position and orientation) of the gates will be predefined, and the drone will have to cross them in a given sequence. For gate detection, an on-board wide-angle camera with a 90° field of view is used. Trajectories at fixed velocities through the centres of all gates will be generated using a curve-fitting method and made available to the participants. However, each team is free to implement its own trajectory generation strategy to achieve better control performance. Noisy real-time localisation, comprising the estimated position and attitude, will be provided to simulate visual-inertial odometry, whose performance deteriorates at high speeds due to motion blur and accelerometer noise. The task of each team is to develop a fuzzy logic controller which takes the desired trajectory and the current pose of the drone as input and provides the commanded attitude and thrust as control commands, as depicted in Figure 1.

Figure 1. The racing drone with its reference frames and control inputs.
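To make the controller interface concrete, the following is a minimal sketch of a pose controller in the spirit of the provided sample PD controller: it takes a trajectory setpoint and the current pose and returns commanded attitude and thrust. The competition environment is MATLAB; this Python version, the gains, and the mass value are purely illustrative, not the competition interface.

```python
import numpy as np

# Illustrative PD position controller (not the competition's controller.m):
# maps a trajectory setpoint and the current pose to commanded roll, pitch
# (small-angle approximation) and collective thrust.
G = 9.81    # gravitational acceleration [m/s^2]
MASS = 1.0  # hypothetical drone mass [kg]

def pd_controller(p_des, v_des, p, v, yaw, kp=6.0, kd=4.0):
    """Desired/current position and velocity are 3-vectors in the world frame."""
    p_des, v_des, p, v = map(np.asarray, (p_des, v_des, p, v))
    # PD feedback on position and velocity, plus gravity compensation.
    a_cmd = kp * (p_des - p) + kd * (v_des - v) + np.array([0.0, 0.0, G])
    thrust = MASS * np.linalg.norm(a_cmd)
    # Rotate the horizontal acceleration demand into the body frame by yaw,
    # then map it to roll/pitch under the small-angle assumption.
    ax = a_cmd[0] * np.cos(yaw) + a_cmd[1] * np.sin(yaw)
    ay = -a_cmd[0] * np.sin(yaw) + a_cmd[1] * np.cos(yaw)
    pitch = np.arctan2(ax, a_cmd[2])
    roll = np.arctan2(-ay, a_cmd[2])
    return roll, pitch, thrust
```

At hover on the setpoint, the commanded attitude is level and the thrust equals the weight; a forward position error produces a positive pitch command. A fuzzy logic controller would replace the PD feedback term while keeping the same input/output signature.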

Perception Challenge

In this challenge, the task is to design 3D perception methods that can accurately estimate the location of a gate, shown in Figure 2, relative to a global frame under various lighting conditions and the motion blur caused by the drone's high speeds. The detection consists of determining the x and y position of the gate's center c, the distance d, and the orientation θ of the gate relative to the drone's body frame. Participants are provided with a limited training dataset consisting of gate images from the drone and the associated ground-truth poses of the gates and the drone in a global frame. They will be required to create a correct map of the gates in the testing dataset, given only the ground-truth pose of the drone.

Figure 2. Gate detection using the image from the racing drone.
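Since the map must be expressed in the global frame while the detection is made in the drone's body frame, each detection has to be transformed using the drone's ground-truth pose. The sketch below shows this transformation for the planar case (gate offset x_b, y_b and orientation θ_b in the body frame, drone pose X, Y and yaw ψ in the global frame); the function name and the planar assumption are illustrative.

```python
import math

# Hypothetical sketch: express a body-frame gate detection (offset x_b, y_b
# to the gate center c, gate orientation theta_b) in the global frame, given
# the drone's planar pose (X, Y, yaw psi). Assumes planar geometry.
def gate_to_global(x_b, y_b, theta_b, X, Y, psi):
    """Rotate the body-frame detection by psi, then translate by (X, Y)."""
    x_g = X + x_b * math.cos(psi) - y_b * math.sin(psi)
    y_g = Y + x_b * math.sin(psi) + y_b * math.cos(psi)
    # Gate heading in the global frame, wrapped to (-pi, pi].
    theta_g = math.atan2(math.sin(psi + theta_b), math.cos(psi + theta_b))
    return x_g, y_g, theta_g
```

For example, a gate detected 3 m straight ahead of a drone at (1, 2) facing along the global y-axis (ψ = π/2) maps to the global position (1, 5).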

Environment

Control Challenge

The simulation environment of the racing track with racing gates is implemented in MATLAB and was tested in MATLAB R2020b. A sample environment is shown in Figure 3. The latest version of the simulation environment can be downloaded from the GitHub repository. The simulation package contains five files:

  1. main.m: main file for starting the simulation;
  2. trajectory.m: trajectory generation through the center of each gate;
  3. controller.m: pose controller which participants will have to implement (a sample PD controller is provided);
  4. uav.m: dynamical model of the drone and its low-level controller;
  5. environment.m: visualisation of the racing arena and performance evaluation.

Figure 3. Racing arena simulated in MATLAB.

Please note that only the content of controller.m and trajectory.m can be modified.
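The provided trajectory.m generates a fixed-velocity path through the gate centres by curve fitting; teams replacing it need to produce time-stamped waypoints with the same spacing property. The sketch below illustrates the idea in Python with a simple piecewise-linear interpolation standing in for the curve-fitting method; the function name and the gate coordinates in the example are made up for illustration.

```python
import numpy as np

# Illustrative stand-in for the idea behind trajectory.m: interpolate a path
# through the gate centres and sample it at a fixed velocity. A piecewise-
# linear path is used here instead of the competition's curve-fitting method.
def fixed_velocity_trajectory(gate_centres, speed, dt):
    """Return waypoints spaced speed*dt apart along the path."""
    pts = np.asarray(gate_centres, dtype=float)
    # Cumulative arc length along the piecewise-linear path.
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate(([0.0], np.cumsum(seg)))
    # Sample the path at constant speed: one waypoint every speed*dt metres.
    s_sampled = np.arange(0.0, s[-1], speed * dt)
    return np.column_stack([np.interp(s_sampled, s, pts[:, i])
                            for i in range(pts.shape[1])])
```

At a speed of 2 m/s and a 100 Hz control rate (dt = 0.01 s), consecutive waypoints are 2 cm apart along the path, so the controller receives one fresh setpoint per control step.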

Remark. The pose of the gates during the competition will be different from the scenario provided to the teams to develop their controllers.

Remark. For any technical issues with the simulation environment, please open a new issue on GitHub or send an email enquiry to the Organizers.

Perception Challenge

For this challenge, both a training dataset, illustrated in Figure 4, and a testing dataset will be provided. The datasets include drone images in PNG format and a JSON file containing the ground-truth information of the drone's pose. For the training data, the ground-truth information of the gates is also provided.

  1. PNG images containing the racing gates;
  2. JSON file containing the ground-truth information of the gates' position in the image.

Figure 4. The dataset containing images of the racing gates under various light intensities.
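A loader for such a dataset only needs to pair each PNG frame with the corresponding ground-truth entry from the JSON file. The sketch below assumes a particular layout and key names ("ground_truth.json", "images", "drone_pose", "gates"); the actual file names and schema are defined by the released dataset, so treat these as placeholders.

```python
import json
from pathlib import Path

# Hypothetical loader for a dataset laid out as described above: PNG images
# plus a JSON file with ground-truth poses. File names and JSON keys are
# assumptions for illustration; the released dataset defines the real schema.
def load_dataset(root):
    root = Path(root)
    with open(root / "ground_truth.json") as f:
        gt = json.load(f)
    samples = []
    for entry in gt["images"]:
        samples.append({
            "image_path": root / entry["file"],  # PNG frame from the drone
            "drone_pose": entry["drone_pose"],   # ground-truth drone pose
            "gates": entry.get("gates"),         # present only in training data
        })
    return samples
```

Because the testing dataset omits the gate ground truth, the "gates" field is read with `.get()` so the same loader works for both splits.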

Evaluation

The Evaluation Committee will evaluate the developed controllers on the testing racing track, as well as the submitted gate predictions. The teams will be notified of the acceptance of their results on January 20, 2025. The results will be announced during IEEE SSCI 2025 in March 2025.

Control Challenge

The controllers will be evaluated using a similar MATLAB environment, but the position of the gates will be different. The main prerequisite is a real-time implementation: the controller must run at a minimum rate of 100 Hz. Each team will be given 60 seconds to cross as many gates as possible in a predefined sequence. The evaluation rules will be as follows:

Perception Challenge

Participants will be ranked based on the average accuracy of their perception results compared to the ground-truth information of the gates in the test dataset.
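One plausible form such an accuracy score could take is the mean position error of the predicted gate centers together with the mean absolute heading error, averaged over all gates. The exact metric is defined by the Evaluation Committee; the sketch below is only an illustration of this kind of scoring.

```python
import math

# Illustrative scoring sketch (the official metric is defined by the
# Evaluation Committee): mean Euclidean error on gate centers and mean
# absolute wrapped heading error, over index-aligned prediction/truth pairs.
def mean_errors(predictions, ground_truth):
    """Each gate is a tuple (x, y, theta)."""
    pos_errs, ang_errs = [], []
    for (xp, yp, tp), (xg, yg, tg) in zip(predictions, ground_truth):
        pos_errs.append(math.hypot(xp - xg, yp - yg))
        # Wrap the heading difference into (-pi, pi] before taking |.|,
        # so that e.g. +179 deg vs -179 deg counts as a 2 deg error.
        d = math.atan2(math.sin(tp - tg), math.cos(tp - tg))
        ang_errs.append(abs(d))
    return sum(pos_errs) / len(pos_errs), sum(ang_errs) / len(ang_errs)
```

The angle wrapping matters: without it, a prediction that is nearly correct but on the other side of the ±π boundary would be penalised by almost a full turn.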

Submission

Submissions must be sent to the Organizers via email by January 10, 2025.

Control Challenge

The teams must submit the following MATLAB files:

  1. controller.m,
  2. (optional) trajectory.m.

It is also possible to submit a compressed folder containing support files for the controller and trajectory generator.

Perception Challenge

The teams must submit the JSON file containing the predictions of the gates in the test images.

Prizes

Winning teams will receive surprise prizes.