Duckiebot Localization with Sensor Fusion in Duckietown
Project Resources
- Objective: Demonstrate accurate localization of Duckiebots using visual fiducial markers and sensor fusion in a controlled Duckietown environment.
- Approach: Fuse wheel encoder odometry with on-board AprilTag detections, managed through the ROS tf2 transform tree, to estimate and correct the Duckiebot's pose while lane following.
- Author: Samuel Neumann, University of Alberta, Canada
Localization with Sensor Fusion in Duckietown - the objectives
The advantage of having multiple sensors on a Duckiebot is that their data can be combined to increase precision and reduce uncertainty in derived results. This process is generally referred to as sensor fusion, and a typical application is localization, i.e., the problem of estimating the pose of the Duckiebot over time with respect to some reference frame. Redundant data is not a problem either: overlapping measurements can be cross-checked against each other or simply discarded.
In this project, the objective is to implement sensor fusion-based localization and lane-following on a DB21 Duckiebot, integrating odometry (using data from wheel encoders) with visual AprilTag detection for improved positional accuracy.
This process addresses the limitations of odometry, i.e., the open-loop reconstruction of the robot's trajectory from wheel encoder data alone, an approach known as "dead reckoning". Incorporating AprilTags as global reference landmarks enhances spatial awareness in environments where dead reckoning by itself is insufficient.
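To make the dead reckoning step concrete, here is a minimal sketch that integrates wheel encoder tick increments into a planar pose using differential-drive kinematics; the tick resolution, wheel radius, and baseline below are illustrative placeholders, not the project's calibration values.

```python
import numpy as np

# Illustrative differential-drive parameters (placeholders, not the project's calibration)
TICKS_PER_REV = 135      # encoder ticks per wheel revolution
WHEEL_RADIUS = 0.0318    # wheel radius [m]
BASELINE = 0.1           # distance between the wheels [m]

def dead_reckoning_step(pose, d_ticks_left, d_ticks_right):
    """Update (x, y, theta) from one pair of encoder tick increments."""
    x, y, theta = pose
    # Arc length traveled by each wheel since the last update
    d_left = 2 * np.pi * WHEEL_RADIUS * d_ticks_left / TICKS_PER_REV
    d_right = 2 * np.pi * WHEEL_RADIUS * d_ticks_right / TICKS_PER_REV
    # Body-frame displacement and rotation
    d_s = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / BASELINE
    # Integrate in the world frame; errors accumulate without external corrections
    x += d_s * np.cos(theta + d_theta / 2.0)
    y += d_s * np.sin(theta + d_theta / 2.0)
    theta += d_theta
    return (x, y, theta)

# Example: ten updates of 20 ticks per wheel (driving straight)
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckoning_step(pose, 20, 20)
print(pose)
```

Because each step only adds increments to the previous estimate, any bias in the parameters or slip at the wheels drifts the pose over time, which is exactly what the AprilTag landmarks are used to correct.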
Technical concepts include AprilTag-based localization, PID control for lane following, transform tree management in ROS (tf2), and coordinate frame transformations for pose estimation.
Sensor fusion - visual project highlights
The technical approach and challenges
This approach, at the technical level, involves:
- extending ROS-based packages to implement AprilTag detection using the dt-apriltags library (a minimal detection sketch follows this list),
- configuring static transformations for landmark localization in a unified world frame, and
- correcting odometry drift by broadcasting transforms from estimated AprilTag poses to the Duckiebot’s base frame.
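As referenced in the first item, below is a minimal sketch of AprilTag detection and pose estimation with the dt-apriltags library; the camera intrinsics, tag size, and the way the image is obtained are placeholder assumptions rather than the project's actual configuration.

```python
import cv2
from dt_apriltags import Detector

# Placeholder intrinsics (fx, fy, cx, cy) and tag size in meters; real values come
# from the Duckiebot's camera calibration and the printed tag specification.
CAMERA_PARAMS = (305.0, 305.0, 320.0, 240.0)
TAG_SIZE = 0.065

detector = Detector(families="tag36h11", nthreads=1, quad_decimate=1.0)

def detect_tags(bgr_image):
    """Return AprilTag detections with estimated 3D pose relative to the camera."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    detections = detector.detect(
        gray,
        estimate_tag_pose=True,
        camera_params=CAMERA_PARAMS,
        tag_size=TAG_SIZE,
    )
    for det in detections:
        # det.pose_R (3x3 rotation) and det.pose_t (3x1 translation) give the tag
        # pose in the camera frame; det.tag_id identifies which landmark was seen.
        print(det.tag_id, det.pose_t.ravel())
    return detections
```

Each detected tag pose, expressed in the camera frame, can then be chained with the known landmark poses to recover where the Duckiebot is in the world frame.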
In addition, a full PID controller was implemented, with tunable gains on lateral and heading deviation, and with derivative terms conditionally initialized for stability.
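A minimal sketch of such a controller is shown below, under assumed gains and an assumed error convention (a lateral offset and a heading error from the lane estimate); the conditional handling of the derivative term on the first update mirrors the idea described above.

```python
class LanePIDController:
    """PID on a weighted combination of lateral (d) and heading (phi) error.

    Gains and weights are illustrative; the project's tuned values differ.
    """

    def __init__(self, kp=5.0, ki=0.2, kd=0.1, k_d_err=1.0, k_phi_err=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.k_d_err, self.k_phi_err = k_d_err, k_phi_err
        self.integral = 0.0
        self.prev_error = None  # derivative only kicks in once a previous error exists

    def update(self, d_err, phi_err, dt):
        """Return an angular velocity command from the lane-pose errors."""
        error = self.k_d_err * d_err + self.k_phi_err * phi_err
        self.integral += error * dt
        # Conditionally initialize the derivative: skip it on the very first update
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: fixed forward speed, steering command from the PID output
controller = LanePIDController()
omega = controller.update(d_err=0.03, phi_err=-0.1, dt=0.05)
v = 0.2  # [m/s], constant linear speed while lane following
```

Skipping the derivative on the first update avoids a spurious spike when no previous error is available, which is one simple way to keep the controller stable at start-up.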
Challenges included:
- remapping ROS topics for motor command propagation,
- resolving frame connectivity in tf trees,
- configuring accurate static transforms for AprilTag landmarks,
- debugging quaternion misrepresentation during pose updates, and
- correctly applying transform compositions using lookup_transform_full to compute odometry corrections (see the sketch after this list).
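To illustrate the last two items, the sketch below publishes a static transform for a hypothetical AprilTag landmark in the world frame and then queries tf2's lookup_transform_full to relate the odometry frame to the world frame through that tag; all frame names, the landmark pose, and the node setup are assumptions made for this example, not the project's actual code.

```python
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node("apriltag_odometry_correction")

# Publish a static transform for one AprilTag landmark in the world frame.
# The tag id, frame names, and pose below are illustrative placeholders.
static_broadcaster = tf2_ros.StaticTransformBroadcaster()
tag_tf = TransformStamped()
tag_tf.header.stamp = rospy.Time.now()
tag_tf.header.frame_id = "world"
tag_tf.child_frame_id = "tag_13"
tag_tf.transform.translation.x = 1.20
tag_tf.transform.translation.y = 0.45
tag_tf.transform.translation.z = 0.08
tag_tf.transform.rotation.w = 1.0  # identity orientation; a real map would set this
static_broadcaster.sendTransform(tag_tf)

# Buffer and listener to query the tf tree built by the rest of the system.
tf_buffer = tf2_ros.Buffer()
tf_listener = tf2_ros.TransformListener(tf_buffer)

def odometry_correction():
    """Look up where the odometry frame sits in the world frame, composing the
    chain through the tag as the fixed frame, so accumulated drift can be corrected."""
    return tf_buffer.lookup_transform_full(
        target_frame="world", target_time=rospy.Time(0),
        source_frame="odom", source_time=rospy.Time(0),
        fixed_frame="tag_13", timeout=rospy.Duration(1.0),
    )
```

Keeping the landmarks as static transforms means the correction reduces to a single tf2 query, at the cost of having to get those static poses (and the quaternions in particular) exactly right, which is where several of the debugging challenges above came from.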
Looking for similar projects?
Check out other works on localization and path planning with Duckietown.
Localization with Sensor Fusion in Duckietown: Authors
Samuel Neumann is a Ph.D. student at the University of Alberta, Canada.
Learn more
Duckietown is a modular, customizable, and state-of-the-art platform for creating and disseminating robotics and AI learning experiences.
Duckietown is designed to teach, learn, and do research: from exploring the fundamentals of computer science and automation to pushing the boundaries of knowledge.
These spotlight projects are shared to exemplify Duckietown’s value for hands-on learning in robotics and AI, enabling students to apply theoretical concepts to practical challenges in autonomous robotics, boosting competence and job prospects.