Autonomous Navigation in Duckietown with QuackCruiser

QuackCruiser: Autonomous Navigation with Dijkstra

Autonomous navigation in Duckietown with QuackCruiser - objectives and approach

The objective of this project is to implement autonomous navigation in Duckietown by integrating perception, Dijkstra-based path planning, and control on a Duckiebot (DB21J).

Localization at intersections relies on AprilTag detection, YOLO-ROS provides real-time obstacle recognition, and onboard sensors such as wheel encoders and the IMU supply odometry. These inputs provide both exteroceptive data (from the environment) and proprioceptive data (from the robot itself), which are fused to estimate pose and environment state.
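
As a concrete illustration of the odometry part of this pipeline, the minimal sketch below shows dead-reckoning pose updates from incremental wheel-encoder ticks on a differential-drive robot. The tick resolution, wheel radius, and baseline are placeholder values, not the DB21J's actual calibration constants.

```python
import math

# Placeholder calibration constants (illustrative, not the robot's real values).
TICKS_PER_REV = 135        # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.0318      # wheel radius in meters
BASELINE = 0.1             # distance between the two wheels in meters

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Dead-reckoning pose update from incremental encoder ticks."""
    # Convert tick increments to arc length traveled by each wheel.
    d_left = 2 * math.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV

    # Midpoint motion model for a differential drive.
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / BASELINE

    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi)
    return x, y, theta
```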

Planning is performed with Dijkstra's algorithm, a graph-search method that computes the shortest path on a grid-based map in which intersections are nodes and lanes are edges with associated costs.
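
A minimal sketch of Dijkstra's algorithm over such an intersection graph is shown below; the node names and edge costs are illustrative and do not correspond to an actual Duckietown map.

```python
import heapq

def dijkstra(graph, start, goal):
    """Shortest path on a weighted graph: graph[node] = [(neighbor, cost), ...]."""
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(queue, (nd, neighbor))
    # Reconstruct the path by walking predecessors back from the goal.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

# Toy map: intersections are nodes, lane segments are weighted edges.
city = {
    "A": [("B", 1.0), ("C", 2.5)],
    "B": [("D", 2.0)],
    "C": [("D", 1.0)],
    "D": [],
}
print(dijkstra(city, "A", "D"))  # ['A', 'B', 'D']
```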

Control is implemented through PID-based lane following and parameterized turning services, where each maneuver is defined by velocity, radius, and execution time. A ROS state machine coordinates perception inputs and planning outputs to trigger the correct control actions in real time.
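
The sketch below illustrates the idea with a scalar PID loop acting on the lane-pose error (lateral offset and heading error) and a turn parameterized by velocity, radius, and duration. The gains, error weighting, and nominal speed are assumptions for illustration, not the tuned values used in the project.

```python
class PID:
    """Simple PID controller on a scalar error signal."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical gains; real values come from on-robot tuning.
steering = PID(kp=3.0, ki=0.0, kd=0.2)

def lane_following_command(lateral_offset, heading_error, dt, v_nominal=0.2):
    """Map lane-pose error to a (linear velocity, angular velocity) command."""
    error = lateral_offset + 0.5 * heading_error   # weighted combination (illustrative)
    omega = steering.step(error, dt)
    return v_nominal, omega

def turn_command(velocity, radius, duration):
    """Open-loop turn parameterized by speed, curvature radius, and execution time."""
    omega = velocity / radius if radius != 0 else 0.0
    return velocity, omega, duration
```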

Autonomous navigation in Duckietown with QuackCruiser - highlights

The challenges

The principal challenges in implementing this agent emerge from hardware calibration, computational limitations, and cross-module synchronization.

Wheel encoder calibration directly influences odometric drift, while camera calibration governs the reliability of AprilTag-based localization and lane geometry estimation. The deployment of CUDA-accelerated YOLO models within the ROS ecosystem introduces compatibility constraints across GPU drivers, compiler toolchains, and real-time inference pipelines, which collectively impose significant computational overhead on limited embedded resources.

At the system integration level, temporal synchronization across perception modules (AprilTag detection, obstacle detection) and control modules (lane following, turning) constitutes a critical factor, as phase offsets and latencies propagate into localization uncertainty and trajectory deviation.
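
One common way to bound such offsets in ROS is to align message streams approximately by timestamp. The sketch below uses message_filters for this purpose; the topic names and the AprilTag message type are assumptions, not the project's actual interfaces.

```python
import rospy
import message_filters
from sensor_msgs.msg import CompressedImage
from apriltag_ros.msg import AprilTagDetectionArray  # assumed message type

def synced_callback(image_msg, tag_msg):
    # Both messages fall within the allowed time slop, so they can be
    # treated as describing the same instant for localization updates.
    rospy.logdebug("image %s / tags %s", image_msg.header.stamp, tag_msg.header.stamp)

rospy.init_node("perception_sync")
image_sub = message_filters.Subscriber("/duckiebot/camera_node/image/compressed", CompressedImage)
tag_sub = message_filters.Subscriber("/duckiebot/apriltag_detector_node/detections", AprilTagDetectionArray)

# Approximate time synchronizer: queue of 10 messages, 0.1 s slop (illustrative values).
sync = message_filters.ApproximateTimeSynchronizer([image_sub, tag_sub], queue_size=10, slop=0.1)
sync.registerCallback(synced_callback)
rospy.spin()
```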

Sensor fusion must accommodate inconsistencies between odometry estimates and visual updates, with conflict-resolution strategies directly shaping the stability of pose estimation.
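
As one simple example of such a strategy, the sketch below blends the dead-reckoned pose toward an AprilTag-derived pose with a fixed trust factor; the blend weight and pose representation are illustrative assumptions rather than the project's actual fusion scheme.

```python
import math

def fuse_pose(odom_pose, tag_pose, alpha=0.3):
    """Blend a dead-reckoned pose (x, y, theta) toward an AprilTag-derived pose.

    alpha is the trust placed in the visual update (0 = ignore it, 1 = snap to it);
    the heading is blended along the shortest angular difference to avoid wrap issues.
    """
    x = (1 - alpha) * odom_pose[0] + alpha * tag_pose[0]
    y = (1 - alpha) * odom_pose[1] + alpha * tag_pose[1]
    d_theta = math.atan2(math.sin(tag_pose[2] - odom_pose[2]),
                         math.cos(tag_pose[2] - odom_pose[2]))
    theta = odom_pose[2] + alpha * d_theta
    return (x, y, theta)

# Example: odometry has drifted about 5 cm and 6 degrees from the tag estimate.
print(fuse_pose((1.00, 0.50, 0.10), (1.05, 0.48, 0.00)))
```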

Furthermore, the tuning of control gains and maneuver execution parameters remains non-trivial, since cumulative deviations over extended trajectories amplify minor discrepancies in actuation dynamics and timing precision.

Autonomous navigation in Duckietown with QuackCruiser - authors

Yunwem Li is a computer engineering graduate with a master’s in robotics from ETH Zurich, Switzerland.

Farian Keck is currently working as an autonomy intern at Airbus Defence and Space, Switzerland.

Jiranyi has a master’s in robotics from ETH Zurich, Switzerland.

Learn more

Duckietown is a modular, customizable, and state-of-the-art platform for creating and disseminating robotics and AI learning experiences.

Duckietown is designed to teach, learn, and do research: from exploring the fundamentals of computer science and automation to pushing the boundaries of knowledge.

These spotlight projects are shared to exemplify Duckietown’s value for hands-on learning in robotics and AI, enabling students to apply theoretical concepts to practical challenges in autonomous robotics, boosting competence and job prospects.