A class being held this January at MIT centers on the Rapid Autonomous Complex-Environment Competing Ackermann-steering Robot, or RACECAR. I went to Boston and was fortunate enough to get a walk-through of the hardware used last year for the class. Looky here:
Here’s the RACECAR in action:
Background
In January 2015, the MIT RACECAR class was held. Here’s a course description:
We will design and implement perception and planning algorithms for cars that can quickly navigate through complex environments. The class will be divided into six teams. Each team will be given one RC race car, powered by an NVIDIA Jetson embedded supercomputer as well as an inertial measurement unit, a visual odometer, a laser scanner, and a camera. We will teach not only the basics of perception and planning algorithms, but we will also show the participants how to run the Robot Operating System (ROS) on the NVIDIA platform while interfacing with sensors and actuators. At the end of the course, we will race through the MIT tunnels to determine the winning team!
Here’s a link to the GitHub repositories for the design of the hardware and software for the car.
The class was taught by members of Lincoln Lab and Dr. Sertac Karaman (MIT AeroAstro).
For the January 2016 class, a new car is being built incorporating some of the lessons learned from the first class. The really exciting part is that the new RACECAR will be open source! This includes not only the vehicle hardware and software, but also the lectures and notes themselves! I thought it would be interesting to take a look at last year’s hardware and get a feel for what it takes to make an autonomous Ackermann-steered robot.
Hardware Overview
Overall, the design of the car is straightforward. Most of the hardware used is available off the shelf. Here’s a list of the major components:
The R/C Car – Traxxas Rally 7407
On board computer – NVIDIA Jetson TK1
2D LIDAR – Hokuyo UST-10LX
Camera – Point Grey Firefly MV
Battery for electronics – Energizer XP8000AB
A couple of the electronic components come from Sparkfun, specifically an opto-isolator board and a Razor 9DOF IMU.
The vehicle is augmented with acrylic platforms to mount the sensors and electronics, along with some 3D-printed parts for the structure itself.
An optical flow visual odometer, a PX4FLOW, is mounted on the top platform. However, in practice the device was not used very much because it did not provide sufficient resolution for the environments where the cars were operating.
There is one custom-built electronic part on the vehicle: a circuit board that connects to the Jetson J3 header and adds access to the Jetson GPIO signals, a real-time clock, and an opto-isolator. The GPIO access is used to send PWM signals to the vehicle’s servos and motors.
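To give a feel for what driving a servo from a GPIO line involves, here is a minimal sketch that bit-bangs a standard hobby-servo pulse through the Linux sysfs GPIO interface. The pin number and timing values are my assumptions for illustration, not taken from the RACECAR design; the actual board routes specific J3 signals through the opto-isolator, and userspace timing like this is jitter-prone compared to hardware PWM.

```python
#!/usr/bin/env python
# Sketch only: bit-banging a hobby-servo PWM pulse on a sysfs GPIO pin.
# GPIO_PIN and the pulse timings are hypothetical values for illustration.
import time

GPIO_PIN = 57  # hypothetical Jetson TK1 GPIO number


def gpio_write(path, value):
    with open(path, "w") as f:
        f.write(value)


# Export the pin (ignore the error if it is already exported) and set
# it as an output.
try:
    gpio_write("/sys/class/gpio/export", str(GPIO_PIN))
except IOError:
    pass
base = "/sys/class/gpio/gpio%d" % GPIO_PIN
gpio_write(base + "/direction", "out")


def servo_pulse(width_s, period_s=0.02):
    """Send one 50 Hz PWM frame: high for width_s, low for the remainder."""
    gpio_write(base + "/value", "1")
    time.sleep(width_s)
    gpio_write(base + "/value", "0")
    time.sleep(period_s - width_s)


# Hold the steering servo near center (~1.5 ms pulse) for about 2 seconds.
for _ in range(100):
    servo_pulse(0.0015)
```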
Software
The software is based on the Robot Operating System (ROS) running on the Jetson. The motors are controlled via PWM signals from the Jetson J3 header. ROS nodes for the Hokuyo LIDAR and other sensors are part of the software package given to students.
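As a taste of what working with those sensor nodes looks like, here is a minimal ROS sketch that listens to the laser scanner and reports the nearest obstacle. The topic name “scan” is the common default for the Hokuyo driver, and the node name is my own; the class’s actual launch files may remap or name things differently.

```python
#!/usr/bin/env python
# Minimal ROS sketch: subscribe to the LIDAR's LaserScan topic and log
# the closest obstacle. Topic and node names are assumptions.
import rospy
from sensor_msgs.msg import LaserScan


def scan_callback(msg):
    # Keep only returns inside the sensor's valid range window,
    # then report the nearest one.
    valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
    if valid:
        rospy.loginfo("Closest obstacle: %.2f m", min(valid))


if __name__ == "__main__":
    rospy.init_node("scan_monitor")
    rospy.Subscriber("scan", LaserScan, scan_callback)
    rospy.spin()
```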
Images
Here are some images of the class cars:
Conclusion
After the first class, the instructors noticed there was room for improvement in a couple of areas. This, along with the ever-changing landscape of better electronics and sensors, made a good case for redesigning the vehicle to be a better teaching tool and research platform.
The class is only a couple of months away, so I’m pretty excited to see what’s coming next.
5 Responses
I was wondering where the class resources are open sourced, since nothing is on the class site.
They have not been released yet; my understanding is that the release is still several weeks away.
They updated their GitHub page, https://github.com/mit-racecar, with hardware and software resources for the car, but they haven’t put up any of the class material like the lectures and notes.
I talked to the MIT guys at GTC, they’re a couple of weeks into the class.