The Intel RealSense T265 Tracking Camera solves a fundamental problem in interfacing with the real world by helpfully answering “Where am I?” Looky here:
One of the most important tasks in interfacing with the real world from a computer is to calculate your position in relation to a map of the surrounding environment. When you do this dynamically, it is known as Simultaneous Localization And Mapping, or SLAM.
If you’ve been around the mobile robotics world at all (rovers, drones, cars), you probably have heard of this term. There are other applications too, such as Augmented Reality (AR) where a computing system must place the user precisely in the surrounding environment. Suffice it to say, it’s a foundational problem.
SLAM is a computational problem. How does a device construct or update a map of an unknown environment while simultaneously keeping track of its own location within that environment? People do this naturally in small places such as a house. At a larger scale, people have been clever enough to use visual navigational aids, such as the stars, to help build their maps.
This V-SLAM solution does something very similar. Two fisheye cameras combine with information from an Inertial Measurement Unit (IMU) to track visual features and navigate accurately, even through unknown environments.
Let’s just say that this is a non-trivial problem. If you have tried to implement this yourself, you know that it can be expensive and time consuming. The Intel RealSense T265 Tracking Camera provides precise and robust tracking that has been extensively tested in a variety of conditions and environments.
The T265 is a self-contained tracking system that plugs into a USB port. Install the librealsense SDK, and you can start streaming pose data right away.
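Once the SDK is installed, a few lines of Python get pose data flowing. Below is a minimal sketch using pyrealsense2, the librealsense Python bindings; it assumes a T265 is connected over USB and that the bindings were built during installation.

```python
# Minimal T265 pose-streaming sketch using the librealsense Python bindings.
# Requires a T265 connected over USB.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.pose)   # the T265's 6-DoF pose stream
pipe.start(cfg)

try:
    for _ in range(50):
        frames = pipe.wait_for_frames()
        pose = frames.get_pose_frame()
        if pose:
            data = pose.get_pose_data()
            # Translation is in meters, relative to the starting pose
            print(f"Frame {pose.frame_number}: "
                  f"x={data.translation.x:.3f} "
                  f"y={data.translation.y:.3f} "
                  f"z={data.translation.z:.3f}")
finally:
    pipe.stop()
```

The pose data also includes velocity, acceleration, rotation (as a quaternion), and tracker confidence, all available from the same `get_pose_data()` call.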
Here are some tech specs:

Cameras (2)
- Global Shutter, Fisheye Field of View = 163 degrees
- Fixed Focus, Infrared Cut Filter
- 848 x 800 resolution
- 30 frames per second

Inertial Measurement Unit (IMU)
- 6 Degrees of Freedom (6 DoF)

Visual Processing Unit (VPU)
- Movidius MA215x ASIC (Application Specific Integrated Circuit)
The power requirement is a remarkably modest 300 mA at 5V. The package measures 108mm wide x 24.5mm high x 12.5mm deep, and the camera weighs 60 grams.
To interface with the camera, Intel provides the open source library librealsense. On the JetsonHacksNano account on GitHub, there is a repository named installLibrealsense, which contains convenience scripts to install librealsense.
Note: Starting with L4T 32.2.1/JetPack 4.2.2 a swap file is now part of the default install. You do not need to create a swap file if you are using this release or later. Skip the following step if using 32.2.1 or above.
In order to use the install script, you will either need to create a swapfile to ease an out-of-memory issue, or modify the install script to run fewer jobs during the make process. In the video, we chose the swapfile route. To install the swapfile:
$ git clone https://github.com/jetsonhacksnano/installSwapfile
$ cd installSwapfile
$ ./installSwapfile.sh
$ cd ..
You’re now ready to install librealsense.
$ git clone https://github.com/jetsonhacksnano/installLibrealsense
$ cd installLibrealsense
$ ./installLibrealsense.sh
While the installLibrealsense.sh script has the option to compile librealsense with CUDA support, we do not select that option. If you are using the T265 alone, there is no advantage in using CUDA, as the librealsense CUDA routines only convert images from the RealSense depth cameras (D415, D435, and so on).
The location of librealsense SDK products:
- The library is installed in /usr/local/lib
- The header files are in /usr/local/include
- The demos and tools are located in /usr/local/bin
Go to the demos and tools directory, and check out the realsense-viewer application and all of the different demonstrations!
The Intel RealSense T265 is a powerful tool for use in robotics and augmented/virtual reality. Well worth checking out!
- Tested on Jetson Nano L4T 32.1.0
- If you have a mobile robot, you can send wheel odometry to the RealSense T265 through the librealsense SDK for better accuracy. The details are still being worked out.
Thanks, this looks great. Would it be possible to update the Xavier repo for JP 4.2? I’d imagine it would be very similar to the Jetson Nano
Update: There’s now a PR on the Xavier repo that works fine.
Thank you! I can always find great information from your blog.
By the way, just curious, do you have any insights of using ZED Mini camera in this case?
You are welcome. I do not have any experience with the ZED Mini and the Nano yet. Thanks for reading!
These two look like a dandy pair of devices for small robots! Will be interesting to see the machines made with them.
FYI: The software for the ZED cameras has been going through a big update this past year.
Thank you for ALL the information! It’s super helpful! I just got my T265 up and running on my Jetson Nano and have a D435i that I’ll be testing later. After setting everything up and getting the camera working through the RealSense Viewer, I’m having trouble exporting a .PLY file using the GUI. Have you had any luck with this? Thanks again! I’m fairly new to this world and watching your videos and reading your site has been a huge help!
Thank you for the kind words. I have not tried exporting .PLY files. Thanks for reading!
Great tutorial, really easy to follow. However, I used the additional patches and such from the GitHub page in order to get the D435i working on the Nano as well, but the device does not get recognized. Any ideas?
It is difficult to tell from your description what the issue may be. Did you build and install everything on a fresh SD card? Or did you use a USB drive?
I used a fresh 64GB SD card, followed the exact instructions from GitHub, watched the other video you posted about using the D435i, and it still won’t work. I saw a note in the GitHub comments that the patches don’t work for the “i” camera, and that they only work for the D435. Is this true?
It is difficult to provide much help without knowing what “don’t work” might mean.
The patches work for both the D435i and the D435.
My apologies, I should clarify, whenever the camera is plugged in the nano does not recognize it. When I run rs-enumerate-devices it says “No device detected. Is it plugged in?” and I receive the “DS5 group_devices is empty” warning when trying to run realsense-viewer.
I added what I am hoping is a fix for your issue. v0.8 (the current master) addresses the problem. Please try it out and let us know how it goes!
Thank you for the post. I set everything up and it all looked to be working until I opened realsense-viewer. The system powers off when I try to open realsense-viewer. Do I have to use 5V-4A instead of 5V-2A if I power a mouse/keyboard/T265 at the same time?
It depends on how you intend to use it. If you are running in 10W mode (2A) on the Jetson Nano, the module alone uses the full 2A. If you want to run peripherals in 10W mode, you will need more current. Thanks for reading!
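To put rough numbers on the power budget above (the peripheral figure here is an assumption for illustration, not a measurement):

```python
# Rough power budget for a Jetson Nano in 10 W mode with a T265 attached.
# The peripheral current is an assumed figure for a keyboard and mouse.
supply_voltage = 5.0      # volts
module_current = 2.0      # amps drawn by the Nano module in 10 W mode
t265_current = 0.3        # amps, per the T265 spec above (300 mA)
peripheral_current = 0.5  # amps, rough allowance for keyboard + mouse

total_current = module_current + t265_current + peripheral_current
total_watts = total_current * supply_voltage
print(f"Total draw: {total_current:.1f} A at {supply_voltage:.0f} V "
      f"({total_watts:.1f} W)")
```

Even with conservative numbers, the total exceeds what a 5V-2A supply can deliver, which is consistent with the power-off symptom described above.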
Is there any solution to get the camera working with the realsense2_camera ros package on the Jetson Nano?
I am interested in installing the RealSense T265 tracking camera on RacecarJ, which has a Jetson TX2. Is the installation process the same on the Jetson TX2 as on the Nano?
It is similar, but not the same. It also depends on which version of L4T you are running. If you are running L4T 28.2.1 for example, you can use: https://github.com/jetsonhacks/buildLibrealsense2TX as a template. installLibrealsense.sh needs to be modified to set the LIBREALSENSE_VERSION=v2.21.0 (I think it is now v2.23.0). For L4T 31.X, there is a separate set of kernel mods that you have to do. You can read the comments in: https://github.com/jetsonhacks/buildLibrealsense2Xavier which is similar. Basically you have to sign the kernel image, so you can’t just build it on the TX2 and copy it over. Instead, you would probably build it on a PC with the kernel changes, sign it, and then flash it over to the TX2.
Thanks for your post; I followed the instructions step by step. The binary files work well. I am having trouble with the ROS installation on a Jetson Nano with the 4.9.140-tegra kernel. It cannot find the RealSense SDK.
I don’t have anything more to share than: https://github.com/JetsonHacksNano/installRealSenseROS
Did you figure that out eventually? I am also having problems like that. Any suggestion is highly appreciated!
What accuracy can be achieved, for example, if you walk around your house?
I am getting an error:
[ 83%] Building CXX object tools/realsense-viewer/CMakeFiles/realsense-viewer.dir/__/__/common/fw-update-helper.cpp.o
/home/lcs/librealsense/common/fw-update-helper.cpp:15:10: fatal error: common/fw/D4XX_FW_Image.h: No such file or directory
I have not found anyone with a clear fix for this.
You appear to be having issues downloading the firmware binaries. Please check the output of the cmake connection test.
Have you tried running D435 and T265 simultaneously?
I have not tried yet, but wondering whether it would work… (btw: on rpi 4 it does not)
I did not have any issues with running both, your mileage may vary. Thanks for reading!