JetsonHacks

Developing on NVIDIA® Jetson™ for AI on the Edge

Jetson Orin Nano Tutorial: SSD Install, Boot, and JetPack Setup

The NVIDIA Jetson Orin Nano Developer Kit can boot from a Solid State Drive (SSD). You can set up your Orin Nano in under 45 minutes from the command line. Looky here:

Background

The Jetson Orin Nano Developer Kit has two M.2 Key M slots which allow installation of PCIe SSDs. You can boot the system from the SSD, bypassing the SD card. An SSD is much faster than the default storage medium of an SD card (5-10X in practice).
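
If you want to check the difference yourself once the system is running, hdparm gives a rough sequential read number. This is a minimal sketch; the device names here are assumptions and may differ on your system:

$ sudo hdparm -t /dev/mmcblk1   # SD card (device name may vary)

$ sudo hdparm -t /dev/nvme0n1   # NVMe SSD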

Previous Jetson production modules in the Nano/NX form factor use eMMC for their main drive storage. eMMC is a type of flash memory. The new Orin Nano and Orin NX forego the eMMC and boot directly from an SSD attached via an M.2 Key M slot. This brings along a couple of changes. First, there is a small amount of flash memory, called QSPI, on the Jetson module which holds the bootloader and hardware configuration details. Here’s an article which speaks more in depth about that. QSPI is also what the SD card versions of the previous-generation Jetson Dev Kits use. QSPI is present on all Orin Nano/NX modules.

The second change is the lack of eMMC on the production modules. With ever-increasing storage demands for Deep Learning tasks, many users were running out of space on their eMMC drives. Rather than offering ever larger eMMC, NVIDIA decided to simply use an external SSD. eMMC is expensive (and may be difficult to source); SSDs are less so. Another advantage is that SSDs are available in a large range of sizes; drives from 128GB to 1TB are currently available at consumer prices under $60.

Selecting an SSD

Here are some things to keep in mind when buying an SSD for your Jetson Orin Nano Developer Kit. The first is that you must use a PCIe SSD. M.2 SSDs are available in two flavors, PCIe and SATA. Many PCs use SATA drives, so you have to be careful when buying. Typically SATA drives have two notches on their connector edge, while PCIe drives have only one. The SATA drives no-worky on the Jetson.

The second thing to know is that the Orin Nano Devkit NVMe slot is PCIe Gen 3, which means that you won’t get Gen 4 speeds even if you are using a Gen 4 drive. Most Gen 4 drives are compatible with the Orin Nano, but you usually pay a premium for no added benefit.
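
If you want to confirm the negotiated link speed on a running Jetson, lspci reports it; 8 GT/s corresponds to Gen 3. A quick sketch (the output format varies a little between lspci versions):

$ sudo lspci -vv | grep -i "lnksta:"

Expect something along the lines of: LnkSta: Speed 8GT/s, Width x4.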

The third thing to consider is how much power the drive draws. In the video, for example, we’re using a 1TB drive with a power draw of 2.5A at 3.3V, which is 8.25W. The Orin Nano Devkit board can provide 36 watts total to the system. Unlike a traditional desktop, you need to keep track of your power budget. Remember that the Orin Nano SoC itself can use 15W (add a little more for a safety buffer). Put another 8W toward the drive, and you have roughly 8-10 watts left for other peripherals, give or take. Smaller capacity drives tend to use less power.
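
As a back-of-the-envelope sketch, here is the arithmetic above in a few shell lines (the numbers are the ones from the video; substitute your own drive’s rating):

$ BOARD=36 ; SOC=15 ; BUFFER=3 ; SSD=8.25

$ echo "$BOARD - $SOC - $BUFFER - $SSD" | bc

That prints 9.75, roughly the wattage left over for other peripherals.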

Also, remember you’re going to need backups. Right? So a larger drive is not always a blessing.

Some SSD Recommendations

Here are a couple that have worked well over the years. This is one area where we’ve seen price decreases!

The drives are available in different sizes; find one that fits your needs. While you’re shopping, you may also want to pick up a jumper to put the Jetson into force recovery mode. You’ll also need a data-capable USB-C cable to connect your Jetson to your host PC.

The Flashing Process

NVIDIA provides different ways of flashing the Jetson. You must have an x86 host machine (desktop or laptop) running Linux. The distribution requirements differ depending on which method you choose. One way to set everything up is to use the official NVIDIA SDK Manager. This is the recommended way if you are new to the Jetson ecosystem. SDK Manager runs on different distributions and under Docker. There’s a video on the JetsonHacks YouTube channel that goes through the process here.

If you are a little more hard core, you can flash from the command line. I’ve written some convenience scripts to help with this. These scripts are derived from instructions directly from the NVIDIA Jetson Linux site and Jetson Linux Developer Guide Quick Start section.

To be clear, flashing from the command line is serious business. If you are a professional developer, you will need to learn the different flashing options. These options have to do with security, redundancy, and flashing multiple Jetsons at once. All important in a production environment.

These scripts, on the other hand, take the most common use case and automate it.

The process takes ~45 minutes, depending on your host computer and network connection speeds. The JetsonHacks scripts provide guard code around the NVIDIA-supplied flashing scripts. They verify that they are operating on the right machine and that a suitable Jetson is in recovery mode. Things like that.

In overview, the scripts do the following:

  • Download the Board Support Package (BSP, flashing tools, bootloader, hardware definition, etc)
  • Download the rootfs (These are the system files that you usually see on a Linux system)
  • Assemble the BSP and rootfs
  • Copy NVIDIA user space libraries to the rootfs (apply_binaries)
  • Install prerequisites on the host used for flashing
  • Flash the Jetson (see the sketch of the underlying NVIDIA command after this list)
    • Update the QSPI
    • Flash the SSD
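
For reference, the flashing step wraps NVIDIA’s initrd flashing tool from the BSP. Here is a hedged sketch of what the underlying invocation looks like for the Orin Nano Developer Kit, based on the Jetson Linux Quick Start (the exact flags vary by release, so treat the scripts as the source of truth):

$ cd Linux_for_Tegra

$ sudo ./tools/kernel_flash/l4t_initrd_flash.sh --external-device nvme0n1p1 -c tools/kernel_flash/flash_l4t_external.xml -p "-c bootloader/t186ref/cfg/flash_t234_qspi.cfg" --showlogs --network usb jetson-orin-nano-devkit internal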

To download the helper scripts and setup the flashing environment:

$ git clone https://github.com/jetsonhacks/bootFromExternalStorage.git

$ cd bootFromExternalStorage

$ ./get_jetson_files.sh

Flashing the Jetson

Once the archives are expanded and put in the correct place, put the Jetson into Force Recovery mode. Follow the procedure in the video. Make sure the power is off, jumper pins 9 and 10 on the button header together, and apply power. With the USB-C cable attached to the Jetson and host, make sure that you can see the Jetson using the lsusb command.
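
For example (the exact product ID varies by module; an Orin Nano module in recovery mode typically reports an ID along these lines):

$ lsusb | grep -i nvidia

Look for a line similar to: Bus 001 Device 005: ID 0955:7523 NVIDIA Corp.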

You can then use the convenience script in the repository to flash the Jetson:

$ ./flash_jetson_external_storage.sh

It takes ~20 minutes to install Jetson Linux. After flashing, the Jetson will be in ‘oem-config’ mode, ready to be set up. At this point, we’re done on the host side.

Installing JetPack

Remove the power from the Jetson. Make sure that the jumper is removed from the button header. Connect your Jetson to the Internet. You can set up the Jetson headless or, as shown in the video, connected to a monitor, keyboard, and mouse. After going through the oem-config sequence, you will have a basic install of Jetson Linux. You can then install JetPack.
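
If you go the headless route, one common approach is to run through oem-config over the serial device the Jetson exposes to the host across the same USB-C cable. A minimal sketch; the device name on the host is an assumption and may differ:

$ sudo screen /dev/ttyACM0 115200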

JetPack is in the NVIDIA repositories and can be installed via APT.

$ sudo apt update

$ sudo apt upgrade

$ sudo apt install nvidia-jetpack

After about 20 minutes (depending on your connection speed), all of the NVIDIA JetPack goodness will be installed.
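
To sanity-check the install, you can query the meta-package and the CUDA toolkit it pulled in (the path assumes a standard JetPack 5.x layout):

$ apt show nvidia-jetpack

$ /usr/local/cuda/bin/nvcc --version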

Notes

  • In the video, the host is running Ubuntu 20.04
  • In the video, Jetson Linux version 35.3.1 is installed
  • JetPack 5.1.1 is shown being installed on the Jetson
  • jtop is shown in use in the video (see the install sketch below)
  • In the video, the SSD is unformatted before flashing
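
Since jtop appears in the video, here is a minimal install sketch. jtop comes from the third-party jetson-stats package; this assumes pip3 is present, and the jtop service wants a reboot (or re-login) before first use:

$ sudo pip3 install -U jetson-stats

$ sudo reboot

$ jtop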

29 Responses

  1. These “helper scripts” are dangerous, and I don’t think they should be used by most users. They are hard-coded to flash the first NVMe partition on the machine, which may very well be the internal NVMe on your machine. The scripts also use root privileges to install software and enable systemd services, without prompting for confirmation or mentioning it in the help text. I was shocked to see that such scripts are being distributed to potentially novice users without any warnings about their contents.

    1. Thank you for sharing your concerns. Since you are an expert, have read the scripts, and have a full understanding of the intended use case, you must have a much different development methodology. Perhaps you can share?
      For the rest of us:
      * The flash script checks to see if you’re on an x86 machine running Ubuntu, and that a Jetson is attached over USB.
      * You must physically place the Jetson into force recovery mode by going through a process which places a physical jumper over force recovery pins and powering the Jetson on.
      * systemd is running on the target device, not the host. The host NVMe cannot be reset using these scripts.

      For me, if a user unwittingly puts their Jetson target machine into force recovery mode using physical jumpers, plugs into their host machine with a USB cable, and then runs scripts on that host which flash the Jetson with an operating system during a 45 minute process, then there’s no amount of ‘help text’ that’s going to rectify that situation. In fact, the scripts mentioned here add extra guard code to ensure that you are running this in the correct environment. The flashing scripts themselves are provided by NVIDIA, these just wrap their process.

      Maybe that’s just me though, and it warrants not ever flashing the Jetson at all because of some perceived danger.

  2. I gave up trying to get Jetson Linux/Jetpack installed on NVMe via the SDK Manager. I could not get the device to flash successfully. The scripts / process flow here worked first time without any quirks. Nice job!

  3. I was SSD shopping for my Jetson Orin and almost blindly bought a Gen 4 NVMe SSD …hello…… Thank you, Jim, for saving me $50!!

  4. Excellent video and notes. I am working up the procedure to set up the Jetson Orin Nano for a graduate class. I noticed after I ran the procedure, OpenCV 4.5.4 with CUDA was not installed. I saw on your video that it should have been in the nvidia-jetpack-dev install, but in your video it also was not installed. Do you know how to correct this?

    1. Thank you for the kind words. OpenCV from the NVIDIA repository does not have CUDA enabled. Here’s a video about how to build OpenCV with CUDA support on the Jetson: https://youtu.be/art0-99fFa8
      You should check to see if the algorithms you are using actually take advantage of CUDA before you start building. Thanks for watching!

  5. Hi, I got the NVIDIA Orin Nano, and I have installed the SD card, flashing it from Windows 10. Now I want to install an SSD drive; actually, I have installed it and I can see it from Ubuntu within the Jetson. Now I want to boot from the SSD, remove the SD card, and work only with the SSD. I have followed the instructions, but when I run the flash jetson external storage script I’m receiving an error related to a directory; dtc is missing. I have searched for it and I see that directory within the usr/bin directory.

    Do I need to add something to the PATH?

    Thank you for sharing your knowledge, very helpful

  6. At the end of the first paragraph of “The Flashing Process” section above I would advise adding something like,

    “Flashing with the sdkmanager is covered in our NVIDIA SDK Manager Tutorial: Installing Jetson Software Explained video.”

    My success with sdkmanager is largely due to that video. Highly recommended.

  7. Please delete my last message! I’ve found a few missing pre-requisites for the host that the scripts weren’t picking up. All is good in the world again 🙂

    Thanks for the great videos and content.

  8. It’s turned out more problematic than I first thought, mostly because of biological input error 🙂 I use 22.04 on my x86 host machine; I fought with it hard before realising it isn’t supported 😔

    The symptom was that I got all the way to the reboot of the Orin and, waiting for SSH to come live, it connects and then refuses to write. The missing packages are a red herring, caused by my host OS. I also wasted some time questioning my SSD compatibility (SN570 1TB WD Blue); the original that came with the Seeed Studio dev kit is a low-quality cheap item (labelled FSB0C128G-C4C7299). My final debug step was to repeat the same process with the original 128GB SSD, exact same symptom 😂 It’s at this point I noticed the clearly documented supported hosts.

    I’m preparing a live USB of 20.04 to go again.

    1. The only way I could get my Orin nano to flash was to build a physical x86 20.04 host. The latest versions of VirtualBox and VMware fail the flashing process when running 20.04 virtualised on Monterey. They seem unable to enumerate the USB connection fast enough/correctly between the host and target as it switches state. The SN570 1TB SSD worked and is a noticeable improvement.

  9. Great review, I appreciate it!

    On my Orin Nano, I’m having issues installing nvidia-jetpack; there seems to be a circle of broken dependencies preventing it from installing. I’m still digging through the error messages to figure out exactly what the problem is, but did you see anything like this when setting it up? I’ve pored through NVIDIA’s documentation and have run their troubleshooting commands, no dice.

    I think one of the packages updated with ‘apt update’ broke a dependency in jetpack, and trying to install jetpack fails because it’s refusing to downgrade that package.

    Unrelated, or maybe related, I installed the NVIDIA CUDA 11-8 compatibility package but the software I’m running isn’t seeing it (the NVIDIA CUDA 11-8 container). Maybe I need to use the CUDA 11-4 container and put the compatibility package on that? I’m just frustrated and feel at a dead end with this; any troubleshooting ideas are greatly appreciated.

    1. I haven’t seen any error messages on a normal install.
      When you talk about containers, what are you talking about? My current understanding is that the CUDA 11-8 compatibility package is for x86 machines, not arm64 machines like the Jetson.

      Each version of JetPack has an associated version of CUDA with it. On a direct install, the nvidia-jetpack repository should install the associated version of CUDA. You can use
      $ apt info
      to explore.
      For example:
      $ apt info nvidia-jetpack

      If you follow the dependencies, you should get to nvidia-cuda, which will report the cuda-runtime version. It’s probably 11.4.
      You should flash the system, update and then apt install nvidia-jetpack.
      Hope this helps.
      Jim

      1. Jim,
        There are cuda-compat packages on the aarch64 apt repo; I installed one using apt install.

        By containers I mean I installed aarch64 docker and the nvidia container runtime. This allows for containerized docker workloads (such as the automatic1111 webui) but getting it to work with the required cuda 11-8 is tricky. I ended up finding a fantastic resource and was able to deploy gpu accelerated ML workloads via Docker on my Jetson Orin Nano using https://github.com/dusty-nv/jetson-containers/tree/master/packages/diffusion/stable-diffusion-webui
        Dusty-nv has several Jetson compatible containers available.
        My end goal is a 4 node Jetson Orin Nano cluster running K3S (probably with K3D as the k8s distro) with hw accelerated workloads. I’m almost there, I think.

  10. Jim, When I run ./flash_jetson_external_storage.sh on an Orin, the error message says: Error: Reading board information failed. Command tegrarcm_v2 --new_session --chip 0x23 0 --uid --download bct_br br_bct_BR.bct --download mb1 mb1_t234_prod_aligned_sigheader.bin.encrypt --download psc_bl1 psc_bl1_t234_prod_aligned_sigheader.bin.encrypt --download bct_mb1 mb1_bct_MB1_sigheader.bct.encrypt
    ERROR: Unsupported device.
    This method currently only works for the Jetson Xavier or Jetson Orin
    I’ve made sure the Orin is in Recovery mode and the device becomes pretty hot when I leave it powered on in the recovery mode. Any suggestions what went wrong?
    Thanks,
    Rosa

    1. A couple of questions. Which Orin are you using, AGX or Nano?
      What version of Ubuntu are you using on the host?
      Is this on a VM, or a native Ubuntu installation?

      NVIDIA has been trying to find this issue for quite some time. They think it has to do with instability of the USB signals. They suggest, in order:

      $ sudo -s
      $ echo -1 > /sys/module/usbcore/parameters/autosuspend

      * Using a different USB cable
      * Using a different USB port
      * Finding a different host machine to flash
      * Ubuntu 18.04 on the host seems more reliable; 20.04 seems to work many times; 22.04 is not supported

      I would try doing this using their SDK Manager instead of these scripts. These were written before the SDK Manager supported this functionality.

  11. Jim,
    Thanks for the info. The first two commands solve the problem.

    To answer your questions, the host computer is a native Linux (Ubuntu 20.04) machine, a Dell Precision. The device is a Jetson Orin Nano 8GB development kit. The SSD is a Crucial P3 PCIe 3.0 M.2 with 512 GB. I had previously tried SDK Manager and the error log showed similar error messages.

    On a side note, I think the physical USB-C plug may cause some problems. 1. it needs a big push to plug in securely on the Orin Nano. 2. I had prior experience that the USB-C to USB cable may need to rotate the orientation of the USB-C plug. This happened with the RealSense Stereo camera. If the computer can’t detect the camera, then rotate the USB-C 180 degrees on the RealSense and relaunch the program again. That would solve the problem. I did try switching cables, orientations, and ports yesterday and could not solve the problem. Anyway, the commands helped today.
    Thanks again for your wonderful work and videos.
    Rosa

    1. Thank you for the kind words. It is very disconcerting when you can’t trust your connections! I am glad you were able to get it to work, and thanks for reading!

  12. Hi! I have a problem that also occurred when I used sdkmanager: even though I have a 128GB NVMe SSD, it creates a 14GB partition + a set of other partitions 🙁 Don’t know how to solve this; tried different things but no success.

      1. hey! sorry for the late answer, I didn’t receive an update on this comment, but answering your question: no, there wasn’t such a step. Not sure if it should be a part of the nvsdkmanager_flash script?


Disclaimer

Some links here are affiliate links. If you purchase through these links I will receive a small commission at no additional cost to you. As an Amazon Associate, I earn from qualifying purchases.
