Back in January of 2015, MIT offered a class called RACECAR (Rapid Autonomous Complex-Environment Competing Ackermann-steering Robot). The class outline:
We will design and implement perception and planning algorithms for cars that can quickly navigate through complex environments. The class will be divided into six teams. Each team will be given one RC race car, powered by an NVIDIA Jetson embedded supercomputer as well as an inertial measurement unit, a visual odometer, a laser scanner, and a camera.
In the class, four teams of five students each competed to complete a 515-foot course through the underground MIT tunnels in the shortest amount of time. Autonomously!
The hardware and software designs and code are available on the RACECAR GitHub repository.
Hardware
Each team was given a robotic platform: an NVIDIA Jetson TK1 placed on a Traxxas 7404 chassis, along with a few well-chosen sensors. The sensors include:
- SparkFun 9 DOF Razor IMU (mounted flat)
- 3DRobotics PX4Flow sensor (pointing up – used as a visual odometer)
- Hokuyo UST-10LX scanning lidar – a planar (2D) laser range finder with 10 meter range
- Point Grey Firefly USB camera, 0.3 MP monochrome (facing forward)
An Energizer 8000 mAh battery powers the Jetson and the sensors.
Software
The software stack on the Jetson's embedded Linux system included the Robot Operating System (ROS) along with OpenCV. The participants used this platform to implement localization and planning algorithms for race day.
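To give a flavor of what such a node looks like, here is a minimal reactive "steer toward the deepest gap" sketch in rospy. To be clear, this is not the class's actual algorithm; the topic names and the ackermann_msgs command type are my assumptions, chosen because they are a natural fit for an Ackermann-steering car.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import LaserScan
from ackermann_msgs.msg import AckermannDriveStamped

class GapFollower(object):
    """Steer toward the farthest valid laser return at a fixed speed."""
    def __init__(self):
        self.pub = rospy.Publisher('ackermann_cmd', AckermannDriveStamped,
                                   queue_size=1)
        rospy.Subscriber('scan', LaserScan, self.on_scan, queue_size=1)

    def on_scan(self, scan):
        # Pick the beam with the longest in-range reading
        def depth(i):
            r = scan.ranges[i]
            return r if scan.range_min < r < scan.range_max else 0.0
        best = max(range(len(scan.ranges)), key=depth)

        cmd = AckermannDriveStamped()
        cmd.header.stamp = rospy.Time.now()
        # That beam's bearing becomes the steering command, clipped to ~30 deg
        angle = scan.angle_min + best * scan.angle_increment
        cmd.drive.steering_angle = max(-0.52, min(0.52, angle))
        cmd.drive.speed = 1.0  # m/s, deliberately conservative
        self.pub.publish(cmd)

if __name__ == '__main__':
    rospy.init_node('gap_follower')
    GapFollower()
    rospy.spin()
```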
Results
Here’s a video about the class and some actual race footage. I’m not going to give it away, but there may be crashes! Looky here:
3 Responses
Is any more detail available on how to build one of these?
Specifically, how to connect the various components and R/C controls to the Jetson TK1?
Hi David,
My current understanding:
The Jetson TK1’s pulse width modulation (PWM) output signals drive the motor electronic speed controller (ESC) and steering servo on the Traxxas, bypassing the R/C receiver.
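For the curious, here's a rough sketch of what driving an RC-style ESC or servo looks like through Linux's standard sysfs PWM interface. Whether a given TK1 pin actually exposes a PWM channel depends on the kernel and pinmux configuration, so the chip/channel paths below are placeholders, not the class's actual wiring.

```python
import os

PWMCHIP = '/sys/class/pwm/pwmchip0'   # placeholder: depends on kernel/pinmux
PWM = PWMCHIP + '/pwm0'

def write(path, value):
    with open(path, 'w') as f:
        f.write(str(value))

def setup():
    # RC ESCs and servos expect a ~50 Hz frame: a 20 ms period with a
    # 1.0-2.0 ms high pulse (1.5 ms is neutral/center).
    if not os.path.isdir(PWM):
        write(PWMCHIP + '/export', 0)
    write(PWM + '/period', 20000000)      # 20 ms, in nanoseconds
    write(PWM + '/duty_cycle', 1500000)   # 1.5 ms pulse = neutral
    write(PWM + '/enable', 1)

def set_pulse_us(us):
    """Command a pulse width in microseconds, clamped to the RC range."""
    us = max(1000, min(2000, us))
    write(PWM + '/duty_cycle', us * 1000)

if __name__ == '__main__':
    setup()
    set_pulse_us(1500)  # center steering / neutral throttle
```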
They used the “Grinch” kernel.
They used the Robot Operating System (ROS) framework.
Existing ROS drivers (urg_node, razor_imu_9dof, pointgrey_camera_driver, and px4flow_node) receive data from the sensors.
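A quick way to confirm those drivers are alive is to wait for one message on each of their usual topics. The topic names below are common defaults and may differ depending on the launch files; I've left the PX4Flow out since its topic and message type vary by package version.

```python
#!/usr/bin/env python
import rospy
from sensor_msgs.msg import Image, Imu, LaserScan

rospy.init_node('sensor_check')
checks = [('/scan', LaserScan),            # urg_node
          ('/imu', Imu),                   # razor_imu_9dof
          ('/camera/image_raw', Image)]    # pointgrey_camera_driver
for topic, msg_type in checks:
    try:
        rospy.wait_for_message(topic, msg_type, timeout=5.0)
        print('%s: OK' % topic)
    except rospy.ROSException:
        print('%s: no data' % topic)
```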
The lidar scanner is Ethernet – it should plug right into the Jetson's Ethernet port.
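Before involving ROS at all, a bare TCP connect is a quick wiring check. The address and port here are Hokuyo's commonly documented defaults for the UST-10LX (urg_node's ip_address parameter would then point at the same address); adjust to your unit.

```python
import socket

LIDAR_ADDR = ('192.168.0.10', 10940)  # commonly documented UST-10LX defaults

try:
    sock = socket.create_connection(LIDAR_ADDR, timeout=2.0)
    print('lidar reachable at %s:%d' % LIDAR_ADDR)
    sock.close()
except socket.error as e:
    print('no route to lidar: %s' % e)
```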
The Point Grey camera is USB 3.0.
The IMU is I2C – this is an easy interface to the Jetson via the GPIO pins.
The PX4Flow – the optical flow/distance sensor – is also I2C; I'm not sure how they implemented that. In one of the pictures on the website, it looks like there's a little breakout board of some sort, but I can't be sure.
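If they did go straight to the I2C bus, something like the following would be the shape of it. This is a hedged sketch using python-smbus, assuming the PX4Flow's documented default 7-bit address (0x42) and its published 22-byte i2c_frame layout; the bus number depends on which Jetson I2C header the board is wired to. In practice the px4flow ROS node hides all of this.

```python
import struct
import smbus  # python-smbus

BUS = 1       # placeholder: depends on which Jetson I2C header is used
ADDR = 0x42   # PX4Flow default 7-bit I2C address, per its documentation

bus = smbus.SMBus(BUS)
# Register 0 returns the latest 22-byte i2c_frame (layout per the datasheet)
raw = bus.read_i2c_block_data(ADDR, 0x00, 22)
fields = struct.unpack('<HhhhhhhhhBBh', bytes(bytearray(raw)))
frame_count, flow_x_sum, flow_y_sum = fields[0], fields[1], fields[2]
quality, ground_distance_mm = fields[5], fields[11]
print(frame_count, flow_x_sum, flow_y_sum, quality, ground_distance_mm)
```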
There appears to be a USB hub mounted in front of the Jetson on the driver's side.
I looked up the SparkFun Razor IMU and noticed that its output is actually TX/RX serial, so they may have used an FTDI USB connection. In the photos, there are wires coming out of the Jetson GPIO connectors, so it is interfacing with something there – that could be a UART connection, or it could just be the PWM interface.
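If it is the serial route, the stock SparkFun AHRS firmware streams ASCII "#YPR=yaw,pitch,roll" lines at 57600 baud (which is also what the razor_imu_9dof driver expects), so reading it directly is straightforward. A small pyserial sketch, with the port name as a placeholder:

```python
import serial  # pyserial

port = serial.Serial('/dev/ttyUSB0', 57600, timeout=1.0)  # placeholder port
while True:
    line = port.readline().strip()
    if line.startswith(b'#YPR='):
        # Attitude in degrees from the stock Razor AHRS firmware
        yaw, pitch, roll = (float(v) for v in line[5:].split(b','))
        print(yaw, pitch, roll)
```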