University of Nevada doing IoT with Duckietown

Here’s an extract from Nevada Today’s article “Integrating big data into robotics with Duckietown”, written by Kaeli Britt. 

For the third year, the University of Nevada, Reno’s Computer Science & Engineering (CSE) department conducted a Research Experience for Teachers (RET) program focused on “Integrating Big Data into Robotics.”

Through the six-week program, participants gained hands-on robotics experience that they can later apply in their classrooms in a fun, nontraditional way.

Duckietown, an engineering and robotics/artificial intelligence (AI) project, focuses on accessible and engaging styles of learning. The project started at the Massachusetts Institute of Technology in 2016 as a graduate class, which produced a video “Duckumentary” highlighting the background and purpose of the research project as well as its adaptability to varying age groups.

This year’s University project was taught by Ph.D. candidate and instructor Amirhesam Yazdi as well as CSE associate professor and principal investigator Lei Yang.

Participants learned how to assemble the robots, design and build the track, and program both the robots and the track.

“Duckietown is a freely available robotics platform and curricula for all levels of education. It is tangible, accessible, and fun. It has mobile robots and roads, constructed from exercise mats and tape,” Yang said. “The mobile robots are built from off-the-shelf parts and using open-source software and the curricula, such as lectures and exercises are provided on the Duckietown website. These unique features set Duckietown apart from other engineering, robotics and/or AI projects.”

Tell us your story

Are you an instructor, learner, researcher or professional with a Duckietown story to tell? Reach out to us!

Introduction to robotics at the University of Massachusetts Lowell using Duckietown

University of Massachusetts Lowell, December 20, 2022: Paul Robinette, Assistant Professor at the University of Massachusetts Lowell (UML), shares with us his Duckietown teaching experience.

Paul Robinette is an Assistant Professor in the Department of Electrical and Computer Engineering at the University of Massachusetts Lowell. He shares his experience, and that of his students, using Duckietown for teaching over the years. His “Fundamentals of Robotics” (EECE 4560/5560) course, built on the Duckietown platform, has been running since 2019.
Paul Robinette and Duckietown

Thank you for finding the time to talk with us! Could you introduce yourself?

My name is Paul Robinette [Linkedin] and I’m a Professor of computer engineering at the University of Massachusetts Lowell [website].

When was your first experience with Duckietown? 

My first experience with Duckietown would have been when I worked at MIT as a research scientist, just after Duckietown was run. I didn’t have a chance to see it live there, but I did talk with several of the postdocs who worked on it as it ran. I also saw it live for the first time at ICRA 2019.

Do you use Duckietown or did you use Duckietown in the past for some specific project or activity?

Sure! For the last three years, I’ve been using Duckietown robots in my class every semester. Primarily I use the Duckiebots to teach ROS and basic robot skills through the Duckietown system and infrastructure. I leverage the development infrastructure heavily and some of the course materials as well.

That sounds great! Can you tell us more about your ongoing class?

The class I teach every semester so far is called Fundamentals of Robotics [2022 class page], and we go over the basics of robotics, starting with multi-agent processing or multi-process systems, like most robots are these days, some basic networking problems, etc. The Duckiebots are perfect for that because they have Docker containers on board which have multiple different networks running. They have to work with the computer system, so it’s always at least interfacing with the laptop. The robots can be used with a laptop, with a router, you can have multiple robots out at once, and they give the students a really good sense of what moving real robots around feels like. We have students start by implementing some open loop control systems, then have them design their own lane detector, similar to the Duckietown [perception] demo, and then have them design their own lane controller again, similar to the Duckietown [lane following] demo.
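As a purely illustrative aside (this is not the course’s assignment code), a first cut at the lane-detection exercise mentioned above often amounts to thresholding the camera image in HSV color space for the white and yellow lane markings, in the spirit of the Duckietown perception demo. The function name and the color ranges below are assumptions and would need tuning for real lighting conditions.

```python
import cv2
import numpy as np

def detect_lane_markings(bgr_image: np.ndarray):
    """Illustrative sketch of a simple lane-marking detector.

    Returns edge maps restricted to white and yellow lane markings.
    The HSV ranges are rough guesses, not tuned values.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)

    # Rough HSV ranges for the white (outer) and yellow (center) markings.
    white_mask = cv2.inRange(hsv, (0, 0, 150), (180, 60, 255))
    yellow_mask = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))

    # Canny edges, masked by color, localize the lane boundaries that a
    # downstream lane controller can then track.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 200)
    white_edges = cv2.bitwise_and(edges, white_mask)
    yellow_edges = cv2.bitwise_and(edges, yellow_mask)
    return white_edges, yellow_edges
```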

Are your students appreciating using Duckietown? Would you consider it a success?

Yes, especially the newer version. The DB21s are great robots for use with the class applications and infrastructure. The software infrastructure has made it pretty easy to set up our own Git repositories for the robots and be able to run them. In this way, students can run them at home or on campus.

Would you suggest Duckietown to your students or colleagues?

Yes, I’d suggest Duckietown, especially if people want to run an introductory robotics class and have every student purchase their own robot, or have the University provide the robots for all. Duckietown is much more affordable than any other robot system that could be used for this same purpose.

"In my class we go over the basics of robotics, starting with multi-agent processing or multi-process systems, like most robots are these days, and the Duckiebots are perfect for that"

It is great to hear Duckietown addresses your needs so well. What would you say is the advantage that Duckietown has when compared to other systems?

I’d say the expense is probably the biggest advantage right now. It’s a nice platform and very capable for what we wanted to do. At this point, the fact that it’s affordable for students to purchase on their own or for us to purchase a bunch of them is definitely the biggest advantage for us. You guys also have a really quick response time if we have any problems. It’s nice to be able to talk directly with the development team and work with them to set up the systems so that I can run them in my class as I need to.

Thank you very much for your time!

Learn more about Duckietown

The Duckietown platform enables state-of-the-art robotics and AI learning experiences.

It is designed to teach, learn, and do research: from exploring the fundamentals of computer science and automation to pushing the boundaries of knowledge.

Join the new “Self-Driving Cars with Duckietown” MOOC

Join the self-driving cars with Duckietown MOOC user-paced edition

Over 7200 learners engaged in a robotics and AI learning adventure with “Self-Driving Cars with Duckietown”, the first massive open online course (MOOC) on robot autonomy with hardware, hosted on the edX platform.

Kicking off on November 29th, this new edition is a user-paced course with rich and engaging modules offering a grand tour of real-world robotics, from computer vision to perception, planning, modeling, control, and machine learning, released all at once!

With simulation and real-world learning activities, learners experience firsthand the emergence of autonomy in their robotic agents through approaches of increasing complexity, from Braitenberg vehicles to deep learning applications.
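As a rough illustration of that starting point (not the course’s exercise code), a Braitenberg-style agent maps simple image “sensors” directly to wheel commands. The brightness-based sensing, the gain parameter, and the function name below are assumptions made for the sketch.

```python
import numpy as np

def braitenberg_controller(image: np.ndarray, gain: float = 1.0):
    """Illustrative Braitenberg-style agent: steer toward brighter regions.

    `image` is assumed to be an H x W x 3 camera frame with values in [0, 255].
    This is a conceptual sketch, not the MOOC's reference implementation.
    """
    h, w, _ = image.shape
    left_half, right_half = image[:, : w // 2], image[:, w // 2 :]

    # Crude "sensors": mean brightness of each half of the image, in [0, 1].
    sense_left = float(left_half.mean()) / 255.0
    sense_right = float(right_half.mean()) / 255.0

    # Cross-coupled wiring: each wheel is driven by the opposite-side sensor,
    # so the vehicle turns toward the brighter side of the scene.
    pwm_left = float(np.clip(gain * sense_right, 0.0, 1.0))
    pwm_right = float(np.clip(gain * sense_left, 0.0, 1.0))
    return pwm_left, pwm_right
```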

We are thrilled to welcome you to the start of the second edition of Self-Driving Cars with Duckietown.

This is a new learning experience in many different ways, for both you and us. While the course is self-paced, the instructors and staff, as well as your peer learners and the community of those who came before you, are standing by, ready to step in and support your efforts at any time.

Learn autonomy hands-on by making real robots take their own decisions and accomplish broadly defined tasks: step by step, from theory, to implementation, to deployment in simulation as well as on Duckiebots.

Leverage the power of the NVIDIA Jetson Nano-powered Duckiebot to see your algorithms come to life!

Designed for university-level students and professionals, this course is brought to you by the Swiss Federal Institute of Technology in Zurich (ETHZ), in collaboration with the University of Montreal, the Duckietown Foundation, and the Toyota Technological Institute at Chicago.

Learning autonomy requires a fundamentally different approach when compared to other computer science and engineering disciplines. Autonomy is inherently multi-disciplinary, and mastering it requires expertise in domains ranging from fundamental mathematics to practical machine-learning skills.

Pedestrian detection: there are many obstacles in Duckietown - some move and some don't. Being able to detect pedestrians (duckies) is important to guarantee safe driving.

The Duckietown robotic ecosystem was created at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) in 2016 and is now used in over 175 universities worldwide.

“The Duckietown educational platform provides a hands-on, scaled-down, accessible version of real-world autonomous systems,” said Emilio Frazzoli, Professor of Dynamic Systems and Control, ETH Zurich. “Integrating NVIDIA’s Jetson Nano power in Duckietown enables unprecedented access to state-of-the-art compute solutions for learning autonomy.”

Enroll now and don’t miss the chance to join the first vehicle autonomy MOOC with hands-on learning!

AI Driving Olympics 2021: Urban League Finalists

AI Driving Olympics 2021 - Urban League Finalists

This year’s embodied urban league challenges were lane following (LF), lane following with vehicles (LFV), and lane following with intersections (LFI). To account for differences between the real world and simulation, this edition’s finalists can make one additional submission to the real-robot challenges to improve their scores. Finalists are the authors of AI-DO 2021 submissions ranked in the top 5 for each challenge. This year’s finalists are:

LF

  • András Kalapos
  • Bence Haromi
  • Sampsa Ranta
  • ETU-JBR Team
  • Giulio Vaccari

LFV

  • Sampsa Ranta
  • Adrian Brucker
  • Andras Beres
  • David Bardos

LFI

  • András Kalapos
  • Sampsa Ranta
  • Adrian Brucker
  • Andras Beres

The deadline for submitting the “final” submissions is Dec. 9th, 2 pm CET. All submissions received after this time will count towards the next edition of AI-DO.

Don’t forget to join the #aido channel on the Duckietown Slack for updates!

Congratulations to all the participants, and best of luck to the finalists!

Congratulations to the winners of the second edition of the AI Driving Olympics!

Team JetBrains came out on top in all 3 challenges

It was a busy (and squeaky) few days at the International Conference on Robotics and Automation in Montreal for the organizers and competitors of the AI Driving Olympics. 

The finals were kicked off by a semifinals round, in which we took the top 5 submissions from the Lane Following in Simulation leaderboard. The finalists (JBRRussia and MYF) moved forward to the more complicated challenges of Lane Following with Vehicles and Lane Following with Vehicles and Intersections.

Results from the AI-DO2 Finals event on May 22, 2019 at ICRA

If you couldn’t make it to the event and missed the live stream on Facebook, here’s a short video of the first run of the JetBrains Lane Following submission.

Thanks to everyone that competed, dropped in to say hello, and cheered on the finalists by sending the song of the Duckie down the corridors of the Palais des Congrès. 

A few pictures from the event

Don't know much about the AI Driving Olympics?

It is an accessible and reproducible autonomous car competition designed with straightforward standardized hardware, software and interfaces.

Get Started

Step 1: Build and test your agent with our available templates and baselines

Step 2: Submit to a challenge, check out the leaderboard, and view your submission in simulation

Step 3: Run your submission on a robot in a Robotarium

AI-DO technical updates

Here are some technical updates regarding the competition. Thanks for all the bug reports via GitHub and Slack!

Changes to platform model in simulations

We have replaced the purely kinematic model in the simulations with one that is more similar to the real robots, obtained by system identification. You can find the model here. Properties:
  • The inputs to the model are the two PWM signals to the wheels, left and right (not [speed, omega] like last year).
  • The maximum velocity is ~2 m/s. The rise time is about 1 second.
  • There is a simulated delay of 100 ms.
We will slightly perturb the parameters of the model in the future to account for robot-robot variations, but this is not implemented yet. All the submissions have been re-evaluated. You can see the difference between the two models below.

[Comparison: purely kinematic platform model vs. more realistic platform model]

The new model is much smoother. Overall we expect that the new model makes the competition easier, both in simulation and, obviously, in the transfer to the real robots.
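To make these properties concrete, here is a minimal sketch of a first-order, delayed wheel-speed model driven by PWM commands. It is not the actual simulator model linked above; the class name, time constant, and integration scheme are assumptions chosen only to roughly match the stated ~2 m/s top speed, ~1 s rise time, and 100 ms delay.

```python
import collections

class SimpleWheelDynamics:
    """Illustrative first-order, delayed wheel-speed model driven by PWM.

    NOT the official AI-DO simulator model; parameters are assumptions
    chosen to roughly match the properties listed above.
    """

    def __init__(self, dt=0.05, v_max=2.0, tau=0.45, delay=0.10):
        self.dt = dt            # integration step [s]
        self.v_max = v_max      # top speed at full PWM [m/s]
        self.tau = tau          # time constant [s]; 10-90% rise time ~ 2.2 * tau ~ 1 s
        self.n_delay = int(round(delay / dt))  # 100 ms actuation delay, in steps
        self._queue = collections.deque([(0.0, 0.0)] * self.n_delay,
                                        maxlen=max(self.n_delay, 1))
        self.v_left = 0.0
        self.v_right = 0.0

    def step(self, pwm_left, pwm_right):
        """Advance one step given PWM commands in [-1, 1]; returns wheel speeds."""
        # Emulate the actuation delay with a fixed-length command queue.
        if self.n_delay:
            delayed_l, delayed_r = self._queue.popleft()
            self._queue.append((pwm_left, pwm_right))
        else:
            delayed_l, delayed_r = pwm_left, pwm_right

        # Saturate the command and relax toward the target speed (first-order lag).
        target_l = self.v_max * max(-1.0, min(1.0, delayed_l))
        target_r = self.v_max * max(-1.0, min(1.0, delayed_r))
        alpha = self.dt / self.tau
        self.v_left += alpha * (target_l - self.v_left)
        self.v_right += alpha * (target_r - self.v_right)
        return self.v_left, self.v_right
```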

Infrastructure changes

  • We have updated the Duckietown Shell and commands several times to fix a few reported bugs.
  • We have started provisioning AWS cloud evaluators. There are still sporadic problems. You should know that if your job fails with the host-error code, the system considers it a problem with the evaluator and will retry on another evaluator.

Open issues

  • Some timeouts are a bit tight. Currently we allow 20 minutes, as for NeurIPS, but this year we have a much more realistic simulation and better visualization code, which take more time. If your submission fails after 20 minutes of evaluation, this is the reason.
  • We are still working on the glue code for running the submissions on the real robots. Should be a couple of days away.
  • Some of the changes to the models/protocol above are not in the docs yet.

Kicking off the Duckietown Donation program with Cali, Colombia

Our first donation of a class kit goes to Cali, Colombia.

We’ve reached our Kickstarter goal! 

This is great news because it means that we can kick off our donation program, with our first donation of a Class Kit, to students at the Universidad Autónoma de Occidente in Cali, Colombia.

Why a donation program?

Artificial Intelligence and Robotics are the sciences of the future, which is why we want everyone to have the chance to play and learn with Duckietown. While we design our robot platform to be as inexpensive as possible, we realize that cost might be an obstacle for educators or students with limited resources.

That is why we have designed a donation program where individuals, organizations or companies can make Duckietown truly accessible to all. Everybody can support STEM education by donating Duckiebots, or an entire Class Kit, to deserving individuals or educators. 

Our first recipient

Our first recipient is Prof. Victor Romero Cano, a professor from the Universidad Autónoma de Occidente in Cali, Colombia. 

Victor has a Ph.D. in field robotics from the University of Sydney, Australia. He teaches two courses at his institution and supervises over 40 undergraduate students who are working towards their final research projects.

Victor will teach two classes using the Duckietown platform. The first is an introductory robotics class, covering kinematic analysis, teleoperation, control and autonomous navigation for wheeled robots. The second class focuses more specifically on robotic perception, and will go into detail on mapping and SLAM (simultaneous localization and mapping), covering lane detection as well as object detection, recognition and tracking.

Victor’s first Duckietown class starts in January 2019. We welcome him to the community and look forward to hearing about his journey!

You can help us fund more donations by backing our Kickstarter.