JetsonHacks

Developing on NVIDIA® Jetson™ for AI on the Edge

Anthony Virtuoso’s Jetson TX1 Rover Project

Anthony Virtuoso from New York, New York just finished building a hardware stack for a rover robot based on the Jetson TX1. Now he’s getting ready to write the software that allows the robot to run autonomously. Best of all, Anthony has graciously agreed to share the robot build with us here on JetsonHacks!

Background

By day Anthony Virtuoso is a Senior Software Development Engineer at Amazon in NYC, where he works on building a massive data analytics platform for several thousand users. The platform is built on Amazon Web Services and the associated ecosystem. Big metal software. Anthony feels that the rapidly maturing Machine Learning and Deep Learning fields can deliver innovative features for his group's customers.

Anthony is well qualified to make such assessments, as he is just finishing up a Master's Degree in Computer Science (Machine Learning) at Columbia University. Currently, Machine Learning is all about utilizing the GPU. Money quote from Anthony:

I personally learn by doing, so I needed a project where I could use this technology to solve a real-world problem. I needed a way to see, first hand, what Nvidia's CUDA or OpenCV could really do when pitted against a top-of-the-line CPU in an intensive task. So, I did what any bored engineer would do: I fabricated a complex problem to answer a simple question: "How difficult is it to use a GPU to speed up a largely compute-bound operation?"

But why build a robot?

I'm a software engineer by trade, but I've never really had the opportunity to work on/with hardware that enables my software to interact with the physical world in a meaningful way. For that reason, and because the Jetson seemed like such an amazing platform, I set out to build an autonomous rover… but to do so a bit differently. I had read up on ROS and its navigation stack, but before handing control over to these seasoned frameworks I wanted to understand how far a naive implementation could go… basically, "Why is SLAM and navigation such a hard problem to solve?"

Hardware Build

Here's what the completed hardware looks like. The project is in the ros_hercules repository on GitHub.

External View of the Hercules Rover
Internal View of the Hercules Rover

The robot uses a couple of the usual suspects for sensors: a Stereolabs ZED stereo camera and an RP-LIDAR unit, a cost-effective 2D LIDAR for robotic applications.

Software

With the hardware base well underway, Anthony is starting to turn his attention to the more interesting part of the project: the robot software. The ros_hercules README.md includes several great tips and tricks for interfacing with the rover's sensor hardware and microcontrollers.

It promises to be very interesting (and fun!) to watch an experienced machine learning expert apply and explore their craft here.


8 Responses

  1. Is there a way for people interested in this to get in contact with Anthony? I’d love to talk with him about his project. I am working on something very similar.

  2. Hi

    I saw that you had to compile a kernel to use the USB-UART converter for the lidar. Would it be possible to elaborate in detail on how you did this? I'm working on a similar project. I plan to use the RPLIDAR A2 (a newer version of the one you used), which includes a (probably) similar/identical adapter, in combination with either a Jetson TX1 or a Jetson TX2. I'm not sure if compiling the kernel is necessary with the TX2, since it comes with a newer version of L4T where the driver is hopefully already included.

    Best Regards

    1. Hi Michael,
      The CP210x UART driver is not built into the default Jetson TX1 or TX2 kernel; you have to compile the module yourself. Since it's a development kit, most of the drivers are not built, so that developers can pick and choose the ones they need. Compiling a kernel onboard the device isn't terribly difficult, it's just not 'plug and play'. There are articles here on JetsonHacks that will show you the process.
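
      For a rough idea of what that involves, here is a hedged sketch of building just the CP210x module on the Jetson itself. The kernel source path, the config menu location, and the module install path are assumptions that vary by L4T release; the JetsonHacks articles mentioned above cover the exact procedure for each version.

```shell
# Sketch only: assumes kernel sources matching the running L4T release
# are already installed under /usr/src/kernel (path varies by release).
cd /usr/src/kernel
zcat /proc/config.gz > .config    # start from the running kernel's config
# Enable the driver as a module in menuconfig:
#   Device Drivers -> USB support -> USB Serial Converter support
#   -> "USB CP210x family of UART Bridge Controllers"  (CONFIG_USB_SERIAL_CP210X=m)
make menuconfig
make modules_prepare
make M=drivers/usb/serial         # build only the usb-serial modules
sudo cp drivers/usb/serial/cp210x.ko /lib/modules/$(uname -r)/kernel/
sudo depmod -a
sudo modprobe cp210x              # the RPLIDAR should then show up as /dev/ttyUSB*
```

      The nice part of an out-of-tree module build like this is that you don't have to recompile or reflash the whole kernel, only the one driver you need.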

      This is an article about Anthony, you can contact Anthony through his Github account. Thanks for reading!

      1. Hi

        Thanks for the fast answer. I apologize for the probably quite naive question; this is my first time working in a Linux environment. I'll have a look around here for articles, and if I don't get it working I'll consider contacting Anthony. For the moment I'm happy to know that somebody got the RPLIDAR working on a Jetson!

        Michael

        1. Most people coming from desktop environments are usually a little surprised that there isn't a wide selection of drivers already installed. In the embedded world, space is considered a scarce commodity, so the general wisdom is to offer just bare-bones support for the development board and let people add their own peripherals as they choose. That way, others don't have the overhead of a ton of installed drivers they don't use. Have a look around, there's a lot of stuff here!

