Developing on NVIDIA® Jetson™ for AI on the Edge

Jetson RACECAR Restart!

The green flag is back out, and the Jetson RACECAR project is starting again! The Jetson RACECAR is a ROS-based autonomous vehicle built on a 1/10-scale R/C car platform. It is based on the MIT RACECAR project, an open-source hardware and software project.

Here’s a quick look at some of the changes that we’re making to the previous work. Looky here:

Project Intent

Autonomous vehicles are a very interesting area to study. Current research vehicles tend to be rather expensive. The idea here is to build a scale model with many of the same features and sensor types, so that any particular problem area can be broken down into its component parts. The idea here is play: we want to be able to play with the parts that interest us.

Component Selection

With that idea in mind, we should be able to look into any given component as deeply as desired while maintaining full control of the vehicle. As an example, the stock Electronic Speed Controller (ESC) on the TRAXXAS performs admirably, but it is difficult to control at slow speeds. After all, its intended use is off-road racing; there’s not much need for going at a snail’s pace!

However, under robotic control we should be able to drive the vehicle at any speed we choose. For that reason, the VESC controller is the choice: it can be controlled at slow speeds, and it is open source. In theory, that means if you’re interested in control theory, you can look ‘at the bottom of the hardware stack’.

The selection of sensors is also a major choice point. For example, the MIT RACECAR uses a Hokuyo UST-10LX laser range finder. MIT races in the tunnels underneath the campus, so the LIDAR is a great mapping aid. On the other hand, it’s not very well suited to outdoor racing; cameras seem to be the choice for the great outdoors. The Hokuyo is also a little on the pricey side, a major strike against it for some folks.

Things change

When working with consumer products, there are frequent changes. Sometimes products are discontinued, change significantly, or become difficult to acquire. If they are difficult to get hold of because of popularity, that’s a good thing; if it’s because of lack of demand, that can be bad.

In this particular project, several parts have changed or been replaced. The TRAXXAS car itself is difficult to acquire. The SparkFun IMU has been superseded. The original battery that drives the electronics has been superseded. The VESC (an open-source brushless motor controller), which originally had quite a long lead time, is now more easily available. The VESC is also now offered in a version with a considerably friendlier package.

It’s Go Time

In the video, we go over some of the hardware selections. Platforms laser cut from 1/4″ ABS mount on the TRAXXAS chassis to support the Jetson Dev Kit, sensors, and electronics.

We’ve chosen to follow the MIT RACECAR electronics selection. This includes the VESC (for the reason listed above), an Amazon USB 3.0 hub, and a Jetson TX1 Development Kit.

This means that for actual control of the vehicle, we’ll be able to use the MIT RACECAR ROS software without change. We’ll worry about sensor selection further on down the line.

As noted in the video, this is the second prototype. There is a third prototype which uses updated parts and some of the lessons learned from this particular build.

As a note, this doesn’t mean there aren’t other equally interesting component choices. You can read Daniel Tobias’ Cherry Autonomous Racecar article to see an alternate and absolutely amazing implementation.

Looking forward to getting this going again!


20 Responses

    1. I haven’t released a bill of materials for the car yet, and since I haven’t tested the VESC-X I can’t say if it meets the need, so I don’t post that information. You can do a Google search for it and find it easily enough if you’re motivated. Thanks for watching!

  1. Hello, I was wondering why are you not using a sensored BLDC motor for odometry?

    I have been trying to use a Roboteq SBL1360 to control a sensored BLDC motor (Hobbywing XeRun V10 G2) in a 1:10 R/C car, and as a beginner in ROS, I am having problems writing a serial communication protocol for the motor controller.

    After following your RACECAR project, I am thinking of using a matching ESC instead (XeRun XR10 PRO) to be able to control it with Arduino’s Servo library.

    Could you please give me any advice on whether I should keep trying to write a serial communication protocol for the motor controller in ROS, or should I buy an ESC?

    I apologize if this question is not related to Jetson RACECAR, I have been stuck for a while now. 🙁 My biggest problem is not having any experience with ROS apart from completing beginner’s tutorials.

    I found Roboteq ROS driver here but could not understand how to use it and ROS community was not very helpful.
    Also I found an Arduino Roboteq library here but it does not work.

    1. Lots of questions here; I’m not sure what the answer should be. For example, if you read this article: this is about as simple as a ROS setup gets. The Jetson is connected to an Arduino which controls the stock ESC on the TRAXXAS, as well as the steering servo, using PWM.
      If you’re not interested in the ROS part, you can just control the PWM pulses from the Arduino itself.
      ROS is complex, and can be difficult to understand when first starting out. If your goal is to become proficient in ROS, then you should consider getting the Roboteq to work. It is not going to get any simpler than serial communication, and once you understand what’s going on, you’ll have a lot better understanding of ROS.

      If you go with an ESC, then you just move over to a different set of problems. You still need to learn to interface with the ESC. Typically that’s done through PWM pulses, though a big ESC like the VESC provides a variety of inputs such as USB, UART, PWM, and CAN. PWM is certainly more complicated than straight serial. Again, you have to figure out how to interface with it through ROS.
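To make the PWM interface concrete, here is a minimal Python sketch of mapping a normalized command to an R/C-style pulse width. The helper name and the ±500 µs range around a 1500 µs neutral are assumptions for illustration, though they match typical hobby ESCs and steering servos; check your hardware’s specifications before using real numbers.

```python
# Hypothetical helper: map a normalized command in [-1.0, 1.0] to an
# R/C-style PWM pulse width in microseconds. Typical hobby ESCs and
# steering servos expect roughly 1000-2000 us, with 1500 us as neutral.

NEUTRAL_US = 1500
RANGE_US = 500  # assumed +/- swing around neutral

def command_to_pulse_us(command: float) -> int:
    """Clamp a normalized command and convert it to a pulse width in us."""
    command = max(-1.0, min(1.0, command))
    return int(round(NEUTRAL_US + command * RANGE_US))

# Full reverse/left, neutral, and half forward/right:
print(command_to_pulse_us(-1.0))  # 1000
print(command_to_pulse_us(0.0))   # 1500
print(command_to_pulse_us(0.5))   # 1750
```

On an Arduino, a value like this would be handed to the Servo library’s `writeMicroseconds()`; the mapping itself is the same regardless of which board generates the pulses.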

      If you don’t care about ROS, you can use an Arduino, Raspberry Pi, or something similar to get started and experiment. Those computers are well suited for lower-level interfacing. My advice would be to figure out what you’re trying to learn first, and then figure out how to learn it. There are several communities that can help you in your quest, such as: where they build 1/10 cars with various levels of complexity. Find a project that’s slightly above your ability level and get started learning. It sounds to me like this is a case of “you don’t know what you don’t know,” so it’s important to work through something that you know you can do, and then build on that knowledge to get to the next goal.
      Good luck!

      1. Thank you so much for such a fast and insightful reply!

        You are right, I don’t know what I don’t know. 🙂 I am an undergraduate student trying to set up an experiment for the autonomous car project in the research lab. My main goal is to have an accurate motion control of the car but the struggle with ROS is slowing me down. It is not strictly required to use ROS in this project but I am supposed to learn about it anyways.

        The Roboteq SBL1360 with the Arduino setup does not work because the controller’s baud rate is too fast (115200 bits/s) for proper serial communication with the Arduino.
        I was thinking of using the RS232 port on the Jetson TK1 to send commands to the controller directly instead, but the chatty log could be a big problem.

        After reading this article I thought maybe interfacing with the ESC would be easier but then again it means spending more money.

        Bottom line, I would really like to get Roboteq to work with ROS but I have no idea where to start. I have been following guidelines for UPenn and Jetson RACECAR projects but the approaches are very different from mine. There are plenty of “easy” tutorials about ROS online but they are not so helpful in my situation. How do people get better at it?

        1. A couple of things. Most of the time when confronted with these types of problems, I try to break them down to the simplest parts and accomplish some simple things first. To talk with the Roboteq, you’ll need a device which can physically interface with it.
          An Arduino should be able to do 115K baud (I’ve done it in the past), so that’s the first task. You have to ask yourself: is it a hardware issue, a serial-framing issue (there are things like parity bits, stop bits, and so on), or something else? Are you getting buffer overflow issues? Is it a genuine Arduino, or a clone? Could using a larger Arduino, like a Due, fix the issue?

          You need to pick a small part of the problem and set a realistic first goal. Can you actually get your laptop or desktop to talk to the Roboteq? It’s serial, so you should be able to run a terminal emulator of some sort and see it respond. Do you know what commands you need to actually use it? That in and of itself is usually a daunting task on a complicated device like that. There’s usually some type of programming manual that goes along with the device. Once you can actually talk to it and get it under your control, then you can worry about integrating it into a design. As a side note, the ROS library by ClearPath probably is close to working, you probably just need to give it a good talking to.
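As an illustration of how simple the command side can be once the link works, here is a hypothetical Python sketch that frames an ASCII “go” command in the plain-text style Roboteq controllers use. The `!G` command, the channel range, and the ±1000 value range are assumptions from memory of Roboteq’s documentation; verify the exact command set against your controller’s user manual before relying on it.

```python
# Hypothetical sketch: frame an ASCII motor command as raw bytes before
# sending it over serial. Roboteq controllers accept plain-text commands
# terminated by a carriage return; "!G" is shown as an illustrative "go"
# command (consult the controller's user manual for the real command set).

def frame_go_command(channel: int, value: int) -> bytes:
    """Build a carriage-return-terminated command string as ASCII bytes."""
    if not 1 <= channel <= 2:
        raise ValueError("channel must be 1 or 2")
    # Clamp to an assumed +/-1000 command range.
    value = max(-1000, min(1000, value))
    return f"!G {channel} {value}\r".encode("ascii")

print(frame_go_command(1, 500))  # b'!G 1 500\r'
```

From a PC or a Jetson you would write these bytes through a serial port opened at 115200 8N1 (for example, with pyserial), which mirrors the advice above: get it working from a terminal emulator first, then automate the same strings.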

          The only way people get better at any task is to work on getting better. The way to get better at playing piano? Play piano. Become better at a game like tennis? Play tennis. Get better at solving math problems? Solve math problems. So it is with ROS and programming. Write small programs and get them to work, then scaffold your understanding and knowledge to climb your way up. You can get better faster if you have coaching, of course; find someone there who knows a thing or two and steal their knowledge. You’re at school, and I’d guess there’s someone there who knows something and wants to show how smart they are. Good luck!

          1. I did get my PC to talk to the Roboteq and I researched the commands I need to feed it. So I have been worrying about integrating it into my car for a while now.
            As for my Arduino Uno/Nano, I also tried to talk to it with my PC like here, and it works perfectly with slower baud rates, but at 115200 bits/s, even after experimenting with different delays (though it should be around 8 microseconds), it does not work. I thought of buying an Arduino Mega or Due but am not sure if hardware is really the issue.
            Once again, thank you for answering my questions!

          2. Oh, and it works with the Arduino Serial and SoftwareSerial libraries to communicate with the local PC, but not with the Roboteq.
            It really got off topic, I just wanted to explain my issue properly but did not do a good job. I really wish Roboteq, Arduino and ROS communities were as responsive as you so I would not have to flood the comments here.

          3. Hi Alena,
            It sounds like you’ve narrowed it down to two choices. You can get a faster microcontroller to talk with the Roboteq, or get a different motor controller/ESC. Each has its advantages and disadvantages. I would think getting a small PC-compatible type of board might be an option (like an UP board or something). Obviously, because I don’t have the hardware you’re using, I’m not a good resource. But the first step is to find a setup where you get motor and servo control.

  2. Great! I was waiting for the update for a long time. Hope this will see the light at the end of the tunnel and won’t be stuck. I am very excited. Thanks for coming back to this project.

  3. Hey this is a very cool project!
    I already bought an NVIDIA Jetson TX1 but had no time to start on a project. But your project and video are so cool that I have to update my robot with the Jetson now. Thank you so much for sharing all the information and How To guides.

  4. Hello, what a cool project! I have read a lot about building a remote-controlled car with autonomous driving functions and would like to build one myself. I have an old remote-controlled car and an NVIDIA Jetson TX2 as a base to start building. I would like to follow your project, but I have some different hardware.
    My programming skills are not that good, but I’m learning fast, so that will not be a big problem.
    I’m looking forward to the rest of your project.
    Greetings, Ton

  5. Hi~ This is good news!! You came back to make the Jetson RACECAR.
    There have been some changes. Is there information about how to make the lower and upper platforms where the sensors and various devices are attached?
    I wish to know how to make it. 🙂

  6. Hello!
    I was watching your videos and wanted to ask something:
    1. How do you get the data about the turning angle and throttle? Can you obtain those messages, do some software stuff (deep learning, sensor fusion, localization, etc.), and then send back the required throttle and steering angle? If yes, how do you obtain those pieces of information based on your current build (JSON maybe?)
    2. I am planning to use Python for the deep learning (incl. OpenCV for lane detection and pavement detection using SegNet) and mix it with C++ through the ROS platform (for sensor fusion, localization, and path planning). Is that something doable with your configuration?
    3. And finally, I have read about the ZED camera and it is amazing. Why do you need the lidar in the end, if you can detect objects as far as 20 meters away?

    Thank you. I am curious because I have ordered TX2 and want to build the delivery autonomous robot for my research project. Of course it will not be perfect but still.

    1. The RACECAR software is built on ROS; see the GitHub repository: for questions pertaining to controlling the motor. The nodes for throttle interpolation and steering servo control are under the ackermann_cmd_mux nodes.
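For a feel of what throttle interpolation means in practice, here is a hypothetical Python sketch, not the actual RACECAR node: the function name and step size are made up for illustration. The idea is to step the commanded speed toward a target by a bounded amount each control cycle, so the ESC never sees an abrupt jump.

```python
# Hypothetical sketch of throttle interpolation: move the current command
# toward the target, changing by at most max_step per control cycle.

def interpolate_throttle(current: float, target: float, max_step: float) -> float:
    """Return the next throttle command, bounded to max_step of change."""
    delta = target - current
    if abs(delta) <= max_step:
        return target
    return current + max_step if delta > 0 else current - max_step

# Ramp from rest toward full throttle in bounded steps:
v = 0.0
for _ in range(3):
    v = interpolate_throttle(v, 1.0, 0.25)
print(v)  # 0.75
```

In a real ROS node this function would sit between the incoming command topic and the outgoing ESC/servo message, called once per timer tick.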

      Some people use only cameras for their sensors and don’t use lidars. However, there’s a bit of work that has to be done for cameras, such as handling lighting changes (shadows on the road, abrupt changes in lighting when entering a tunnel or parking garage, night time making things a bit iffy, etc.). Generally people use multiple sensors (IMU, ultrasonic, radar, lidar, cameras, and so on) and then perform what is known as “sensor fusion” to try to get at the truth of what is going on. That’s what the RACECAR does; it provides a platform for simple sensor fusion and experimentation. Thanks for reading!

    1. The Racecar/J shop will not be shipping internationally anytime soon. The cutting files are indeed from version 2.0, they have not been updated yet. This should happen soon, but it is done through MIT. Thanks for reading!



