Back in January of 2015, MIT offered a class called RACECAR (Rapid Autonomous Complex-Environment Competing Ackermann-steering Robot). The class outline:
We will design and implement perception and planning algorithms for cars that can quickly navigate through complex environments. The class will be divided into six teams. Each team will be given one RC race car, powered by an NVIDIA Jetson embedded supercomputer as well as an inertial measurement unit, a visual odometer, a laser scanner, and a camera.
In the class, 4 teams of 5 students each competed to complete a 515-foot course through the underground MIT tunnels in the shortest amount of time. Autonomously!
The hardware and software designs, along with the code, are available on the RACECAR GitHub repository.
Hardware
Each team was given a robotic platform built around an NVIDIA Jetson TK1 mounted on a Traxxas 7404 chassis, along with a few well-chosen sensors:
- SparkFun 9DOF Razor IMU (mounted flat)
- 3D Robotics PX4FLOW sensor (pointing up – used as a visual odometer)
- Hokuyo UST-10LX scanning lidar – a planar (2D) laser range finder with a 10 meter range
- Point Grey Firefly USB camera, 0.3 MP monochrome (facing forward)
An Energizer 8000 mAh battery powers the Jetson and the sensors.
Software
The software stack on the embedded Linux Jetson included the Robot Operating System (ROS) along with OpenCV. The participants used this platform to implement localization and planning algorithms for race day.
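To give a sense of what software on this stack looks like, here is a minimal sketch of a reactive ROS node in Python that reads the laser scanner and steers toward the most open direction it sees. The topic names (/scan, /drive), the steering limit, and the cruise speed are assumptions for illustration; the teams' actual localization and planning code is in the RACECAR repository linked above.

```python
#!/usr/bin/env python
# Minimal "steer toward open space" node, sketched for illustration.
# Topic names (/scan, /drive), the steering limit, and the cruise speed
# are assumptions, not values taken from the RACECAR repository.
import math

import rospy
from ackermann_msgs.msg import AckermannDriveStamped
from sensor_msgs.msg import LaserScan

MAX_STEER = 0.34    # radians, assumed steering limit for an RC car
CRUISE_SPEED = 1.0  # m/s, assumed


def scan_callback(scan, pub):
    # Treat inf/NaN returns as zero range so they are never chosen.
    ranges = [0.0 if math.isinf(r) or math.isnan(r) else r
              for r in scan.ranges]
    if not ranges:
        return
    # Steer toward the beam with the largest range: the most open direction.
    best = max(range(len(ranges)), key=lambda i: ranges[i])
    angle = scan.angle_min + best * scan.angle_increment

    cmd = AckermannDriveStamped()
    cmd.header.stamp = rospy.Time.now()
    cmd.drive.steering_angle = max(-MAX_STEER, min(MAX_STEER, angle))
    cmd.drive.speed = CRUISE_SPEED
    pub.publish(cmd)


if __name__ == '__main__':
    rospy.init_node('open_space_driver')
    pub = rospy.Publisher('/drive', AckermannDriveStamped, queue_size=1)
    rospy.Subscriber('/scan', LaserScan, scan_callback, callback_args=pub)
    rospy.spin()
```

The point here is the shape of a ROS node – a subscriber callback that turns sensor messages into drive commands – rather than a race-winning planner.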
Results
Here’s a video about the class and some actual race footage. I’m not going to give it away, but there may be crashes! Looky here: