
Intel RealSense Camera

The Intel RealSense Camera is an RGBD (Red, Green, Blue, Depth) camera which fits a large amount of imaging technology into a small package. Looky here:

Background

Over the last couple of years, we’ve had several articles about RGBD cameras. RGBD cameras provide a color image stream (the RGB part) and a depth stream (the D part) which can be used for a variety of imaging and robotic applications. In the video you saw how the mechanical packaging of these types of devices has changed over the years.

A couple of months ago Intel announced the Intel® RealSense™ Robotic Development Kit which includes a RealSense R200 camera, along with an Intel Atom x5 processor board. I thought that was interesting enough to order the kit from Intel. After all it had the word “robot” in the name!

At the same time, I was intrigued by the form factor of the R200 camera. It uses the same technology as the ‘Project Tango’ tablet (now called ‘Tango’), though without the fisheye camera. The R200 is available separately, and at a price of $99 USD it is certainly worth investigating. Will it run on the Jetson?

Please, you’re on this website. You know it will!

Technology

From the librealsense documentation:

R200 Component Layout

The R200 is an active stereo camera with a 70mm baseline. Indoors, the R200 uses a class-1 laser device to project additional texture into a scene for better stereo performance. The R200 works in disparity space and has a maximum search range of 63 pixels horizontally, the result of which is a 72cm minimum depth distance at the nominal 628×468 resolution. At 320×240, the minimum depth distance reduces to 32cm. The laser texture from multiple R200 devices produces constructive interference, resulting in the feature that R200s can be colocated in the same environment. The dual IR cameras are global shutter, while 1080p RGB imager is rolling shutter. An internal clock triggers all 3 image sensors as a group and this library provides matched frame sets.

Outdoors, the laser has no effect over ambient infrared from the sun. Furthermore, at default settings, IR sensors can become oversaturated in a fully sunlit environment so gain/exposure/fps tuning might be required. The recommended indoor depth range is around 3.5m.
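
As a rough sanity check on those numbers, remember that stereo depth comes from disparity: z = f·B/d, where f is the focal length in pixels, B the baseline, and d the disparity. Here is a little back-of-the-envelope sketch; the focal length value is my own assumption, picked to roughly reproduce the quoted figures, not an official R200 specification.

```cpp
// Back-of-the-envelope stereo depth check (not part of librealsense).
// Depth from disparity: z = f * B / d, where f is the focal length in
// pixels, B the baseline in meters, and d the disparity in pixels.
// The 70mm baseline and 63 pixel search range come from the quote above;
// the focal length is an assumption chosen to illustrate the math.
#include <cstdio>

int main() {
    const double baseline_m   = 0.070;  // R200 baseline (70mm)
    const double max_disp_px  = 63.0;   // maximum disparity search range
    const double focal_px_628 = 650.0;  // assumed focal length at 628-wide depth

    // Minimum measurable depth occurs at the maximum disparity.
    double z_min_628 = focal_px_628 * baseline_m / max_disp_px;
    printf("approx. min depth at 628x468: %.2f m\n", z_min_628);  // ~0.72 m

    // Halving the horizontal resolution roughly halves the focal length in
    // pixels, which roughly halves the minimum depth (quoted as 32cm).
    double z_min_320 = (focal_px_628 * 320.0 / 628.0) * baseline_m / max_disp_px;
    printf("approx. min depth at 320x240: %.2f m\n", z_min_320);  // ~0.37 m
    return 0;
}
```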

In a little less techy terms, the camera appears pretty capable. The camera works outdoors (a big drawback of the indoors-only Kinect/infrared type of devices), and multiple R200s can be used in the same physical indoor space without having to worry about infrared pattern interference. All of the depth processing and image registration is done in hardware on the camera, so there isn’t a computational drag on the host computer. That is one of the drawbacks of the Stereolabs ZED camera, where the host processor builds the depth maps from the gathered camera images, which takes a large number of compute cycles on a small processor like a Jetson TK1. The biggest thing of all? The R200 is small. For the amount of horsepower, the $100 price is a bargain in the current marketplace.

Software

Intel has made an open source library, librealsense, available on Github. librealsense is a cross-platform library which allows developers to interface with the RealSense family of cameras, including the R200. Support is provided for Windows, Macintosh, and Linux.
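
To give a flavor of what programming against the library looks like, here is a minimal depth-grab sketch written against the legacy librealsense 1.x C++ API (rs.hpp), which is what the R200 shipped with at the time. Treat it as a sketch; the build line is an assumption and the exact calls can differ between releases.

```cpp
// Minimal depth-grab sketch using the legacy librealsense 1.x C++ API.
// Assumed build line: g++ grab.cpp -o grab -lrealsense
#include <librealsense/rs.hpp>
#include <cstdint>
#include <cstdio>

int main() try {
    rs::context ctx;
    if (ctx.get_device_count() == 0) {
        printf("No RealSense device found\n");
        return 1;
    }

    rs::device * dev = ctx.get_device(0);
    printf("Using device: %s\n", dev->get_name());

    // Ask the camera for depth and color streams at sensible defaults.
    dev->enable_stream(rs::stream::depth, rs::preset::best_quality);
    dev->enable_stream(rs::stream::color, rs::preset::best_quality);
    dev->start();

    // Depth values are 16-bit units; multiply by the scale to get meters.
    const float depth_scale = dev->get_depth_scale();
    rs::intrinsics depth_intrin = dev->get_stream_intrinsics(rs::stream::depth);

    dev->wait_for_frames();
    const uint16_t * depth =
        reinterpret_cast<const uint16_t *>(dev->get_frame_data(rs::stream::depth));

    // Report the distance at the center of the depth image.
    int cx = depth_intrin.width / 2, cy = depth_intrin.height / 2;
    float meters = depth[cy * depth_intrin.width + cx] * depth_scale;
    printf("Depth at image center: %.3f m\n", meters);
    return 0;
} catch (const rs::error & e) {
    printf("librealsense error: %s\n", e.what());
    return 1;
}
```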

The next couple of articles on JetsonHacks will cover how to install librealsense on the Jetson TK1. This includes building a kernel with support for the cameras, along with installing the librealsense library. At this point, let’s say that this is a project for non-noobs.

Intel also has a ROS interface for the R200. There will be an upcoming article about installing the ROS interface too!
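
As a preview of the ROS side, here is a hypothetical roscpp node that listens to the depth images the RealSense driver publishes. The topic name below is an assumption (drivers commonly use /camera/depth/image_raw); check it against whatever the driver actually advertises with rostopic list once it is running.

```cpp
// Hypothetical ROS (roscpp) node that subscribes to a RealSense depth topic.
// The topic name is an assumption; adjust it to match the running driver.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>

void depthCallback(const sensor_msgs::Image::ConstPtr & msg) {
    ROS_INFO("Depth frame: %ux%u, encoding %s",
             msg->width, msg->height, msg->encoding.c_str());
}

int main(int argc, char ** argv) {
    ros::init(argc, argv, "r200_depth_listener");
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("/camera/depth/image_raw", 1, depthCallback);
    ros::spin();
    return 0;
}
```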

Conclusion

The Intel RealSense R200 is an interesting entry into the 3D imaging field. If sophisticated imaging and robotics applications are to reach the personal developer market (as opposed to the corporate one), this device will be one of the key enablers.


8 Responses

  1. It is good to see you back posting again.
    This device has a laser in it… where is the plush shark!?

    A day/night outdoor test would be great for a future video.
    The spec sheet mentions 13 feet as the outside range of measurement.
    10 feet would seem reasonable for the day/night outdoor comparison.

    This looks like Intel did not try to rush it out the door. Very polished for a first gen product. It may be time to retire my trusty Kinect 360.

    Does the module connector allow multiple cameras to trigger from one clock source?

    1. You are absolutely correct, I forgot the shark! This is a sad day, indeed.
      From what I understand, 10 meters is the most that can be expected outdoors. Outdoors the laser is off, so it’s just acting as a stereo camera, though in the infrared light range. I don’t know what to expect outdoors at night; I would guess it’s the same as indoors if you turn the laser projector on, but I’m not sure.

      Once I mount the camera on a robot platform with an external power supply, I should be able to make some comparisons. I’m hoping to substitute the RealSense camera(s) for the Stereolabs ZED + Occipital Structure cameras on the RACECAR. Intel makes a short range camera, the F200, which is a little more money ($129) and covers the 0.2 to 1 meter range.

      I think this is the second generation of the RealSense products. Intel is spending a lot of money on this product line as they think it represents a direction of value to the company. Intel has invested large ($50M+) amounts in several different drone manufacturers, including Yuneec, which offers a RealSense camera option for its Typhoon drones. Robotics is another area Intel would like to target. There has also been a push for this technology in the tablet/phone and webcam markets. Google Tango tablets will have this technology built in; the first products have just been announced.

      In answer to your last question, I believe the answer is no. The camera plugs into USB.

  2. Hi,

    It was interesting to see what you’ve done with the R200.

    One thing I don’t understand: you tested two R200s together? I thought that when the IR patterns overlap, depth sensing quality decreases significantly when multiple RGB-D sensors are used together.

    In your article, you said
    “The laser texture from multiple R200 devices produces constructive interference, resulting in the feature that R200s can be colocated in the same environment. ”

    Is that possible? I have doubts, since two laser projectors cannot be placed at exactly the same position (there is at least an inch of difference between two R200s, even if you place one R200 right on top of the other).

    1. I would point out that in the video two cameras are being used on the same subject.

      The interference issue depends on the sensing method being used by the camera. Earlier RealSense cameras, and the Prime Sense based cameras such as the Kinect V1, use a structured light technique. In that case when the patterns overlap you may get destructive interference, meaning that the cameras have issues sensing depth.

      On the other hand, the R200 is an active stereo camera, so it uses the patterns in a different manner. I seem to recall reading that the R200 is a time of flight camera. Intel states that you get “constructive interference” with multiple R200s in the same space. You’ll need to find some technical papers if you want a more in-depth explanation.

  3. Hi Sir, could you show how to install the driver and SDK for a Point Grey camera? I have tried a lot but still have not succeeded. Thank you, Sir.

