Introduction to robotics at the University of Massachusetts Lowell using Duckietown
University of Massachusetts, Lowell, December 20, 2022: Paul Robinette, Assistant Professor at the University of Massachusetts Lowell (UML), shares with us his Duckietown teaching experience.
Thank you for finding the time to talk with us! Could you introduce yourself?
My name is Paul Robinette [Linkedin] and I’m a Professor of computer engineering at the University of Massachusetts Lowell [website].
When was your first experience with Duckietown?
My first experience with Duckietown would have been when I worked at MIT as a research scientist, just after Duckietown was run. I didn’t have a chance to see it live there, but I did talk with several of the postdocs who worked on it as it ran. I also saw it live for the first time at ICRA 2019.
Do you use Duckietown or did you use Duckietown in the past for some specific project or activity?
Sure! For the last three years, I’ve been using Duckietown robots in my class every semester. Primarily I use the Duckiebots to teach ROS and basic robot skills through the Duckietown system and infrastructure. I leverage the development infrastructure heavily and some of the course materials as well.
That sounds great! Can you tell us more about your ongoing class?
The class I teach every semester so far is called Fundamentals of Robotics [2022 class page], and we go over the basics of robotics, starting with multi-agent processing or multi-process systems, like most robots are these days, some basic networking problems, etc. The Duckiebots are perfect for that because they have Docker containers on board which have multiple different networks running. They have to work with the computer system, so it’s always at least interfacing with the laptop. The robots can be used with a laptop, with a router, you can have multiple robots out at once, and they give the students a really good sense of what moving real robots around feels like. We have students start by implementing some open loop control systems, then have them design their own lane detector, similar to the Duckietown [perception] demo, and then have them design their own lane controller again, similar to the Duckietown [lane following] demo.
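The lane-detection exercise described above can be sketched, in a much-simplified form, as a color threshold followed by a centroid-based steering error. This is a toy stand-in for the Duckietown perception pipeline, not the actual course solution; all function names and thresholds here are invented for illustration:

```python
import numpy as np

def steering_error(rgb):
    """Toy lane detector: threshold yellow pixels in an RGB image
    (H x W x 3, uint8) and return the normalized horizontal offset
    of their centroid from the image center, in [-1, 1]."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    # crude "yellow" mask: strong red and green, weak blue
    mask = (r > 150) & (g > 150) & (b < 100)
    if not mask.any():
        return 0.0  # no lane markings found; drive straight
    cols = np.nonzero(mask)[1]          # column indices of yellow pixels
    center = rgb.shape[1] / 2.0
    return float((cols.mean() - center) / center)

# Synthetic 100x100 frame with a yellow stripe on the right side
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:, 70:80] = (255, 255, 0)
print(round(steering_error(frame), 2))  # 0.49: marking is to the right
```

A lane controller of the kind students design next could then feed this error into a proportional steering command.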
Are your students appreciating using Duckietown? Would you consider it a success?
Yes, especially with the newer version. The DB21s are great robots for our class applications and infrastructure. The software infrastructure has made it pretty easy to set up our own Git repositories for the robots and run them, so students can work with them at home or on campus.
Would you suggest Duckietown to your students or colleagues?
Yes, I’d suggest Duckietown, especially if people want to run an introductory robotics class and have every student purchase their own robot, or have the University provide the robots for all. Duckietown is much more affordable than any other robot system that could be used for this same purpose.
"In my class we go over the basics of robotics, starting with multi-agent processing or multi-process systems, like most robots are these days, and the Duckiebots are perfect for that"
Prof. Paul Robinette
It is great to hear Duckietown addresses your needs so well. What would you say is the advantage that Duckietown has when compared to other systems?
I’d say the expense is probably the biggest advantage right now. It’s a nice platform and very capable for what we wanted to do. At this point, the fact that it’s affordable for students to purchase on their own or for us to purchase a bunch of them is definitely the biggest advantage for us. You guys also have a really quick response time if we have any problems. It’s nice to be able to talk directly with the development team and work with them to set up the systems so that I can run them in my class as I need to.
Thank you very much for your time!
Learn more about Duckietown
The Duckietown platform enables state-of-the-art robotics and AI learning experiences.
It is designed to teach, learn, and do research: from exploring the fundamentals of computer science and automation to pushing the boundaries of knowledge.
Tell us your story
Are you an instructor, learner, researcher or professional with a Duckietown story to tell? Reach out to us!
Join the new “Self-Driving Cars with Duckietown” MOOC
Over 7200 learners engaged in a robotics and AI learning adventure with “Self-Driving Cars with Duckietown”, the first massive open online course (MOOC) on robot autonomy with hardware, hosted on the edX platform.
Kicking off on November 29th, this new edition is a user-paced course with rich and engaging modules offering a grand tour of real-world robotics, from computer vision to perception, planning, modeling, control, and machine learning, released all at once!
With simulation and real-world learning activities, learners can experience firsthand the emergence of autonomy in their robotic agents through approaches of increasing complexity, from Braitenberg vehicles to deep learning applications.
We are thrilled to welcome you to the start of the second edition of Self-Driving Cars with Duckietown.
This is a new learning experience in many different ways, for both you and us. While the course is self-paced, the instructors and staff, as well as your peer learners and the community of those who came before you, are standing by, ready to support your efforts at any time.
Learn autonomy hands-on by making real robots take their own decisions and accomplish broadly defined tasks. Step by step from the theory, to the implementation, to the deployment in simulation as well as on Duckiebots.
Leverage the power of the NVIDIA Jetson Nano-powered Duckiebot to see your algorithms come to life!
MOOC Factsheet
- Name: Self-driving cars with Duckietown
- Platform: edX
- Cost: free to attend
- Instructors: Swiss Federal Institute of Technology in Zurich (ETHZ), Université de Montréal (UdM), Toyota Technological Institute at Chicago (TTIC)
Prerequisites
- Basic Linux, Python, Git
- Elements of linear algebra, probability, calculus
- Elements of kinematics, dynamics
- Computer with native Ubuntu installation
- Broadband internet connection
What you will learn
- Computer Vision
- Robot operations
- Object Detection
- Onboard localization
- Robot Control
- Planning
- Reinforcement Learning
The Duckietown robotic ecosystem was created at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) in 2016 and is now used in over 175 universities worldwide.
“The Duckietown educational platform provides a hands-on, scaled-down, accessible version of real-world autonomous systems.” said Emilio Frazzoli, Professor of Dynamic Systems and Control, ETH Zurich, “Integrating NVIDIA’s Jetson Nano power in Duckietown enables unprecedented access to state-of-the-art compute solutions for learning autonomy.”
Enroll now and don’t miss the chance to join the first vehicle autonomy MOOC with hands-on learning!
AI Driving Olympics 2021: Urban League Finalists
This edition of the AI Driving Olympics features three challenges: lane following (LF), lane following with vehicles (LFV), and lane following with intersections (LFI).
To account for differences between the real world and simulation, this edition’s finalists can make one additional submission to the real-world challenges to improve their scores.
Finalists are the authors of AI-DO 2021 submissions in the top 5 ranks for each challenge.
This year’s finalists are:
LF
- András Kalapos
- Bence Haromi
- Sampsa Ranta
- ETU-JBR Team
- Giulio Vaccari
LFV
- Sampsa Ranta
- Adrian Brucker
- Andras Beres
- David Bardos
LFI
- András Kalapos
- Sampsa Ranta
- Adrian Brucker
- Andras Beres
The deadline for submitting the “final” submissions is Dec. 9th, 2 pm CET. All submissions received after this time will count towards the next edition of AI-DO.
Don’t forget to join the #aido channel on the Duckietown Slack for updates!
Congratulations to all the participants, and best of luck to the finalists!
Congratulations to the winners of the second edition of the AI Driving Olympics!
Team JetBrains came out on top on all 3 challenges
It was a busy (and squeaky) few days at the International Conference on Robotics and Automation in Montreal for the organizers and competitors of the AI Driving Olympics.
The finals were kicked off by a semifinals round, where we evaluated the top 5 submissions from the Lane Following in Simulation leaderboard. The finalists (JBRRussia and MYF) moved forward to the more complicated challenges of Lane Following with Vehicles and Lane Following with Vehicles and Intersections.
If you couldn’t make it to the event and missed the live stream on Facebook, here’s a short video of the first run of the JetBrains Lane Following submission.
Thanks to everyone who competed, dropped in to say hello, and cheered on the finalists by sending the song of the Duckie down the corridors of the Palais des Congrès.
Don't know much about the AI Driving Olympics?
It is an accessible and reproducible autonomous car competition designed with straightforward standardized hardware, software and interfaces.
Get Started
Step 1: Build and test your agent with our available templates and baselines
Step 2: Submit to a challenge
Check out the leaderboard
View your submission in simulation
Step 3: Run your submission on a robot
in a Robotarium
AI-DO 2 Validation and Testing Registration
We are in the final countdown to AI-DO 2 at ICRA!
Now is the time to let us know if you will be using the validation and testing facilities at the Duckietown competition ground. Please register below!
AI-DO technical updates
Changes to platform model in simulations
We have replaced the purely kinematic model in the simulations with one that is closer to the real robots, obtained by system identification. You can find the model here. Properties:
- The inputs to the model are the two PWM signals to the wheels, left and right (not [speed, omega] as last year).
- The maximum velocity is ~2 m/s. The rise time is about 1 second.
- There is a simulated delay of 100 ms.
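The listed properties can be sketched as a first-order lag with a delayed input. This is a hedged illustration of the described behavior, not the simulator’s actual code; all names and the exact time constant are assumptions chosen to roughly match a ~1 s rise time:

```python
# Sketch of a drive model with the described properties:
# PWM inputs per wheel, ~2 m/s max speed, ~1 s rise time,
# and a 100 ms actuation delay.
DT = 0.01                    # integration step [s]
V_MAX = 2.0                  # steady-state speed at full PWM [m/s]
TAU = 1.0 / 3.0              # time constant; 10-90% rise ~ 2.2 * TAU
DELAY_STEPS = int(0.1 / DT)  # 100 ms input delay

def simulate(pwm_left, pwm_right, steps=300):
    """Integrate wheel speeds under delayed, constant PWM commands."""
    vl = vr = 0.0
    buf = [(0.0, 0.0)] * DELAY_STEPS  # delay line: last 100 ms of commands
    speeds = []
    for _ in range(steps):
        buf.append((pwm_left, pwm_right))
        ul, ur = buf.pop(0)           # command issued 100 ms ago
        # first-order lag toward the commanded steady-state speed
        vl += DT / TAU * (V_MAX * ul - vl)
        vr += DT / TAU * (V_MAX * ur - vr)
        speeds.append((vl, vr))
    return speeds

speeds = simulate(1.0, 1.0)         # full forward PWM on both wheels
print(round(speeds[-1][0], 2))      # approaches V_MAX = 2.0 after 3 s
```

Note that during the first 100 ms the robot does not move at all, which is the practical consequence of the simulated delay for controller design.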
[Figure: comparison of the purely kinematic platform model and the more realistic platform model]
Infrastructure changes
- We have updated the Duckietown Shell and commands several times to fix a number of reported bugs.
- We have started provisioning AWS cloud evaluators. There are still sporadic problems. If your job fails with the “host-error” code, the system assumes the problem is with the evaluator and will retry on another evaluator.
Open issues
- Some timeouts are a bit tight. Currently we allow 20 minutes, as we did for NeurIPS, but this year we have a much more realistic simulation and better visualization code, which take more time. If your submission fails after 20 minutes of evaluation, this is the reason.
- We are still working on the glue code for running the submissions on the real robots. It should be ready in a couple of days.
- Some of the changes to the models/protocol above are not in the docs yet.
Countdown to AI-DO 2!
Kicking off the Duckietown Donation program with Cali, Colombia
We’ve reached our Kickstarter goal!
This is great news because it means that we can kick off our donation program, with our first donation of a Class Kit, to students at the Universidad Autónoma de Occidente in Cali, Colombia.
Why a donation program?
Artificial Intelligence and Robotics are the sciences of the future, which is why we want everyone to have the chance to play and learn with Duckietown. While we design our robot platform to be as inexpensive as possible, we realize that cost might be an obstacle for educators or students with limited resources.
That is why we have designed a donation program where individuals, organizations or companies can make Duckietown truly accessible to all. Everybody can support STEM education by donating Duckiebots, or an entire Class Kit, to deserving individuals or educators.
Our first recipient
Our first recipient is Prof. Victor Romero Cano, a professor from the Universidad Autónoma de Occidente in Cali, Colombia.
Victor has a Ph.D. in field robotics obtained at the University of Sydney, Australia. He teaches two courses at his institution, and supervises over 40 undergraduate students who are working towards their final research projects.
Victor will teach two classes using the Duckietown platform. The first is an introductory class to robotics, covering kinematic analysis, teleoperation, control and autonomous navigation for wheeled robots. The second class is more specifically about robotic perception, and will go in detail about mapping and SLAM (simultaneous localization and mapping), covering lane detection as well as object detection, recognition and tracking.
Victor’s first Duckietown class starts in January 2019. We welcome him to the community and look forward to hearing about his journey!
You can help us fund more donations by backing our Kickstarter.