What is NVIDIA® Jetson AGX Orin™ Developer Kit Emulation Mode?

NVIDIA has released the Jetson AGX Orin Developer Kit for evaluating the NVIDIA® Jetson AGX Orin™ module, which shares the form factor of the NVIDIA® Jetson AGX Xavier™ and delivers tower-PC-class performance. In addition to evaluating the Jetson AGX Orin module itself, the developer kit can emulate the NVIDIA® Jetson Orin™ NX and NVIDIA® Jetson Orin™ Nano.

In this article, we answer two questions: what is emulation in the first place, and what can you do in emulation mode?

What is emulation?

"Emulation" means "mimicry"; in the world of computers, it refers to executing software that was written for a specific device in a different environment.

All Jetson Orin modules share a single SoC architecture, which allows the developer kit to mimic (emulate) the performance of any module in the series. In practice, by changing only the software on the Jetson AGX Orin Developer Kit, you can create a pseudo operating environment and evaluate each Jetson Orin series configuration on a single piece of hardware.

Procedure for using emulation mode

  1. Hardware preparation
  2. Software preparation
  3. Emulation mode setting

   

*Supplement:

This article uses the following software versions on the Jetson AGX Orin Developer Kit.

・L4T (Linux): Jetson Linux 35.1

・JetPack: JetPack 5.0.2

・DeepStream SDK: DeepStream 6.1.1

1: Hardware preparation

1. Connect the Jetson AGX Orin Developer Kit to the host PC (Linux: Ubuntu 20.04) with a USB cable.


2. Connect the AC adapter that comes with the Jetson AGX Orin developer kit.

3. Follow the steps below, as described on the manufacturer's website, to boot into Force Recovery Mode.

Step 1: Power on the carrier board and press and hold the RECOVERY button.

Step 2: Press the RESET button.

Step 3: Confirm that the power light is on but the display is blank.

   

4. Enter the following command on the host PC to confirm that the USB interface of the Jetson AGX Orin developer kit is recognized.

host$ lsusb

Confirm that "NVIDIA Corp." appears in the output, as shown below.
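If you want to script this check, here is a minimal sketch. The sample line and the 0955:7023 product ID are assumptions about how the Orin appears in Force Recovery Mode (0955 is NVIDIA's USB vendor ID); on a real host, feed it the actual output of lsusb instead.

```shell
# Hedged sketch: scan (sample) lsusb output for NVIDIA's USB vendor ID 0955.
# Replace "$sample" with "$(lsusb)" on an actual host PC.
sample="Bus 001 Device 005: ID 0955:7023 NVIDIA Corp. APX"
if printf '%s\n' "$sample" | grep -q "ID 0955:.*NVIDIA Corp"; then
  result="Jetson detected in Force Recovery Mode"
else
  result="Jetson not detected"
fi
echo "$result"
```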

2: Software preparation

Prepare the software required for the emulation settings by referring to the guide on the manufacturer's website.

As an example, we will walk through configuring the kit as a Jetson Orin Nano (8GB), following the procedure published by the manufacturer.

1. Download Orin_Nano_Overlay_35.1.tbz2.

2. Referring to the guide on the manufacturer's site, execute the following commands on the host PC to prepare the required software.

host$ cd ~/Downloads
host$ wget https://developer.nvidia.com/embedded/l4t/r35_release_v1.0/release/jetson_linux_r35.1.0_aarch64.tbz2
host$ wget https://developer.nvidia.com/embedded/l4t/r35_release_v1.0/release/tegra_linux_sample-root-filesystem_r35.1.0_aarch64.tbz2
host$ sudo tar -xpf jetson_linux_r35.1.0_aarch64.tbz2
host$ cd Linux_for_Tegra/rootfs
host$ sudo tar -xpf ~/Downloads/tegra_linux_sample-root-filesystem_r35.1.0_aarch64.tbz2
host$ cd ../
host$ sudo apt-get install qemu-user-static
host$ sudo ./apply_binaries.sh
host$ cd ../
host$ sudo tar -xpf ~/Downloads/Orin_Nano_Overlay_35.1.tbz2
host$ cd Linux_for_Tegra

3: Emulation mode setting

Run the following command to set the Jetson AGX Orin Developer Kit to the Jetson Orin Nano (8GB) configuration.

(Required time: about 30 minutes)

host$ sudo ./flash.sh jetson-agx-orin-devkit-as-nano8gb mmcblk0p1
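The first argument to flash.sh selects the board configuration (a .conf file in the Linux_for_Tegra directory) and the second the root device; mmcblk0p1 is the internal eMMC. Other emulation targets use analogous config names. The following is a hedged sketch (the as-nx16gb name is taken from the L4T 35.1 overlay documentation; verify the .conf files shipped with your BSP before flashing):

```shell
# List the emulation configs available in your BSP (run inside Linux_for_Tegra)
host$ ls jetson-agx-orin-devkit-as-*.conf
# Example: flash as Jetson Orin NX 16GB instead of Orin Nano 8GB
host$ sudo ./flash.sh jetson-agx-orin-devkit-as-nx16gb mmcblk0p1
# Restore the native Jetson AGX Orin configuration
host$ sudo ./flash.sh jetson-agx-orin-devkit mmcblk0p1
```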

After a message similar to the following appears on the console of the host PC, the Orin will reboot automatically.

The display connected to the Orin will show the initial setup screen; complete the settings as desired.

After completing the settings, the start screen is displayed.

Left-click "Power display" in the upper right and select "Run Jetson Power GUI" to check the configuration of the NVIDIA Jetson AGX Orin Developer Kit.

You can confirm that it is running with the Orin Nano 8GB settings, as shown below.

Operation confirmation

Now that the kit is configured as an Orin Nano 8GB, let's run an application and check how it performs.

We will use the PeopleNet inference sample, proceeding in the following steps.

・Install JetPack

・Install DeepStream SDK

・Check the operation of the PeopleNet inference sample

Install JetPack

Referring to the guide on the manufacturer's site, execute the following commands to install JetPack.

(Required time: about 30 minutes)

$ sudo apt update
$ sudo apt dist-upgrade
$ sudo reboot
$ sudo apt install nvidia-jetpack
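To confirm the installation, you can query the meta-package version (output format varies by release; here it should correspond to JetPack 5.0.2):

```shell
# Show the installed JetPack meta-package version
$ apt show nvidia-jetpack
```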

Install DeepStream SDK

Referring to the guide on the manufacturer's site, execute the commands below to install the DeepStream SDK.

(Required time: about 30 minutes)

<execution example>

$ sudo apt install \
    libssl1.1 \
    libgstreamer1.0-0 \
    gstreamer1.0-tools \
    gstreamer1.0-plugins-good \
    gstreamer1.0-plugins-bad \
    gstreamer1.0-plugins-ugly \
    gstreamer1.0-libav \
    libgstreamer-plugins-base1.0-dev \
    libgstrtspserver-1.0-0 \
    libjansson4 \
    libyaml-cpp-dev
$ cd ~/Downloads
$ git clone https://github.com/edenhill/librdkafka.git
$ cd librdkafka
$ git reset --hard 7101c2310341ab3f4675fc565f64f0967e135a6a
$ ./configure
$ make
$ sudo make install
$ sudo mkdir -p /opt/nvidia/deepstream/deepstream-6.1/lib
$ sudo cp /usr/local/lib/librdkafka* /opt/nvidia/deepstream/deepstream-6.1/lib

Then download and install the DeepStream SDK itself according to the guide on the manufacturer's site.

Check the operation of the PeopleNet inference sample

Let's check the operation by executing the PeopleNet inference sample, following the procedure in the README bundled with the DeepStream SDK:

/opt/nvidia/deepstream/deepstream-6.1/samples/configs/tao_pretrained_models/README

First, download the files necessary for execution and complete the setup as shown in the execution example below.

<execution example>

$ cd /opt/nvidia/deepstream/deepstream-6.1
$ cd sources/apps/sample_apps
$ sudo git clone https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps.git
$ cd ../../../
$ sudo cp ./sources/apps/sample_apps/deepstream_reference_apps/deepstream_app_tao_configs/* samples/configs/tao_pretrained_models/
$ cd /opt/nvidia/deepstream/deepstream-6.1/samples/configs/
$ sudo apt-get install git-svn
$ sudo git svn clone https://github.com/NVIDIA-AI-IOT/deepstream_reference_apps/trunk/deepstream_app_tao_configs
$ sudo cp deepstream_app_tao_configs/* tao_pretrained_models/
$ cd /opt/nvidia/deepstream/deepstream-6.1/samples/configs/tao_pretrained_models/
$ sudo ./download_models.sh

Once setup is complete, run the following command to launch the PeopleNet sample.

$ sudo deepstream-app -c deepstream_app_source1_peoplenet.txt

You should see the video output of the inference results, as shown below.

In this run, the result was about 30 FPS.
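The FPS figure comes from deepstream-app's built-in performance measurement, which prints periodic **PERF: lines to the console. It is controlled by keys in the [application] section of the app config file (the interval value below is illustrative):

```ini
[application]
enable-perf-measurement=1
# interval, in seconds, between printed FPS summaries
perf-measurement-interval-sec=5
```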

Next, we introduce an example of improving processing performance, primarily by removing the processing related to video display.

First, set the operating clock to MAX with the following command.

$ sudo nvpmodel -m 0
$ sudo jetson_clocks

You can confirm that the module is now operating at its maximum frequencies, as shown below.
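To double-check from the terminal, you can query the active power mode and the pinned clocks (nvpmodel -q and jetson_clocks --show are standard JetPack commands; the exact output differs by module):

```shell
# Print the current power mode (mode 0 is MAXN)
$ sudo nvpmodel -q
# Show the clock frequencies that jetson_clocks has set
$ sudo jetson_clocks --show
```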

Next, let's modify the following file to speed up the inference processing, mainly by removing the processing related to image display.

deepstream_app_source1_peoplenet.txt

Disable the processing related to image display as follows. (Corrections are shown in red.)

Disabling processing related to image display
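As a hedged sketch of what this change typically looks like in deepstream_app_source1_peoplenet.txt: switch the sink from the EGL renderer (type=2) to a fakesink (type=1) and disable the on-screen display. Check the exact section names and values against your copy of the file.

```ini
[osd]
enable=0

[sink0]
enable=1
# type=1 selects FakeSink (no rendering); the stock file uses type=2 (EGL renderer)
type=1
sync=0
```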

Modify the tracker configuration to use the max-performance config file as follows. (Corrections are shown in red.)

Modified max-performance config file for the tracker
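As a hedged sketch of the tracker change (the library and file names below are taken from the DeepStream 6.1 layout; verify the paths in your installation), the max-performance setup points the tracker at the NvDCF max_perf low-level config:

```ini
[tracker]
enable=1
ll-lib-file=/opt/nvidia/deepstream/deepstream-6.1/lib/libnvds_nvmultiobjecttracker.so
ll-config-file=config_tracker_NvDCF_max_perf.yml
```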

Let's run it again and see how it works.

You should be able to confirm the speedup as follows. (This run achieved about 96 FPS.)

In closing

This article introduced the Jetson AGX Orin developer kit emulation mode.

This time, we introduced the procedure for the Jetson Orin Nano 8GB configuration as an example, but emulation with other configurations is also possible.

The execution results for other configurations are listed below for reference.

Configuration          With image display   No image display
Jetson AGX Orin 64GB   30                   354
Jetson Orin NX 16GB    30                   140
Jetson Orin Nano 8GB   30                   96

Unit: [FPS]