Jetbot Tools with Jetson Inference DNN Vision Library and NanoLLM Container for NAV2 ROS2 Robot - Version 2.0

Jetbot Tools is a collection of ROS2 nodes that leverage the Jetson Inference DNN Vision Library and the Jetson NanoLLM Docker container for NVIDIA Jetson. With Jetbot Tools, you can build a cost-effective, two-wheel robot equipped with a camera and a lidar sensor, enabling it to perform the following impressive tasks:

  • Voice-Activated Copilot: Unleash the power of voice control for your ROS2 robot with Jetbot Voice-Activated Copilot Tools.
  • Large Language Model (LLM) Chat: Empower your Jetbot to respond using LLM chat. By default, it utilizes the meta-llama/Llama-2-7b-chat-hf model hosted in a ROS2 node.
  • Vision-Language Model (VLM) Robot Camera Image Description: Enable your Jetbot to describe images captured by its camera. By default, it employs the Efficient-Large-Model/VILA1.5-3b model hosted in a ROS2 node.
  • Lidar-Assisted Object Avoidance Self-Driving: Enable your robot to navigate autonomously and avoid obstacles using the lidar sensor.
  • Real-Time Object Detection and Tracking: Allow your robot to detect objects using the SSD Mobilenet V2 model. You can also make your robot follow a specific object that it detects.
  • Real-Time Object Detection and Distance Measurement: Enable your robot to detect and measure the distance of objects using the SSD Mobilenet V2 model and the lidar sensor. You can also make your robot follow a specific object and stop when it is too close.
  • NAV2 TF2 Position Tracking and Following: Allow your robot to track its own position and follow another Jetbot robot using the NAV2 TF2 framework.

Here is a brief overview of the Jetbot Tools design diagram/architecture:

Setup

Jetbot Tools source code and video demos:


  • Empower your robot with Voice-Activated Copilot Tool:
  • Lidar-assisted object avoidance self-driving:
    • Code logic explanation (see the Python sketch below):
      • Use the LIDAR sensor to collect data from all directions and divide it into 12 segments of 30 degrees each
      • Compare the distances of the objects in the first three segments (front 90 degrees) and select the segment with the farthest open area
      • If the object in the selected segment is closer to the robot than a threshold distance
        • Repeat the comparison for the first six segments (front 180 degrees) and select the segment with the farthest object
        • If the object in the selected segment is still closer than the threshold distance
          • Repeat the comparison for all 12 segments (360 degrees) and select the segment with the farthest open area
          • Rotate the robot to face the selected segment
      • Publish a ROS2 Twist message to move the robot towards the open area
    • Source code:
    • Usage:
      • ros2 launch jetbot_tools laser_avoidance.launch.py param_file:=./jetbot_tools/param/laser_avoidance_params.yaml
      • ros2 param get /laser_avoidance start
      • ros2 param set /laser_avoidance start true
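
The sketch below illustrates the segment-selection logic in Python with rclpy. It is not the repository's laser_avoidance node: the /scan and /cmd_vel topic names, the 0.5 m threshold, the speeds, and the assumption that segment 0 faces forward with indices increasing counter-clockwise are all illustrative.

```python
import math

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

SEGMENTS = 12      # 12 segments of 30 degrees each
THRESHOLD = 0.5    # assumed clearance threshold in meters


class LaserAvoidanceSketch(Node):
    def __init__(self):
        super().__init__('laser_avoidance_sketch')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(LaserScan, '/scan', self.on_scan, 10)

    def segment_distances(self, scan):
        """Average the valid ranges falling into each 30-degree segment."""
        per_seg = len(scan.ranges) // SEGMENTS
        dists = []
        for s in range(SEGMENTS):
            rays = [r for r in scan.ranges[s * per_seg:(s + 1) * per_seg]
                    if math.isfinite(r)]
            dists.append(sum(rays) / len(rays) if rays else 0.0)
        return dists

    def pick_segment(self, dists):
        """Widen the search (front 90 -> front 180 -> full 360 degrees)
        until the most open segment clears the threshold."""
        best = 0
        for candidates in (dists[:3], dists[:6], dists):
            best = max(range(len(candidates)), key=lambda i: candidates[i])
            if candidates[best] > THRESHOLD:
                break
        return best

    def on_scan(self, scan):
        seg = self.pick_segment(self.segment_distances(scan))
        cmd = Twist()
        if seg == 0:
            cmd.linear.x = 0.2                 # open area straight ahead
        else:
            # Rotate toward the selected segment; the sign depends on
            # whether it lies to the left or right of the current heading.
            cmd.angular.z = 0.5 if seg <= SEGMENTS // 2 else -0.5
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(LaserAvoidanceSketch())


if __name__ == '__main__':
    main()
```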
  • Real-time object detection and tracking:
    • Code logic explanation (see the Python sketch below):
      • Use the Jetson DNN inference ROS2 detectnet node to detect the target object's position in the image captured by the camera
      • Calculate the angle between the image center and the target's position
      • Use the size of the detected bounding box to estimate the distance from the robot to the target
      • Publish a ROS2 Twist message to move the robot toward the detected object
      • Stop the robot if it is too close to the target
    • Source code:
    • Usage:
      • ros2 launch jetbot_tools DNN_SSD_source.launch.py model_path:=/home/jetbot/dev_ws/pytorch-ssd/models/toy/ssd-mobilenet.onnx class_labels_path:=/home/jetbot/dev_ws/pytorch-ssd/models/toy/labels.txt launch_video_source:=false topic:=/video_source/raw
      • ros2 launch jetbot_tools detect_copilot.launch.py param_file:=./jetbot_tools/param/detect_toys_copilot_params.yaml
      • ros2 param get /detect_copilot follow_detect
      • ros2 param set /detect_copilot follow_detect true
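
A minimal sketch of this follow-the-detection loop, assuming the detectnet node publishes vision_msgs/Detection2DArray on /detectnet/detections and the Foxy-era vision_msgs layout where bbox.center.x is a plain field (newer releases use bbox.center.position.x). The image width, target area, and gains are illustrative, not the repository's values.

```python
import rclpy
from rclpy.node import Node
from vision_msgs.msg import Detection2DArray
from geometry_msgs.msg import Twist

IMAGE_WIDTH = 1280.0    # assumed camera resolution
TARGET_AREA = 90000.0   # assumed bbox area at the "too close" distance


class DetectFollowSketch(Node):
    def __init__(self):
        super().__init__('detect_follow_sketch')
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(Detection2DArray, '/detectnet/detections',
                                 self.on_detections, 10)

    def on_detections(self, msg):
        if not msg.detections:
            self.pub.publish(Twist())   # nothing detected: stop
            return
        bbox = msg.detections[0].bbox   # follow the first detection
        # Angle proxy: normalized horizontal offset from the image center.
        offset = (bbox.center.x - IMAGE_WIDTH / 2.0) / (IMAGE_WIDTH / 2.0)
        # Distance proxy: a larger bounding box means a closer target.
        area = bbox.size_x * bbox.size_y
        cmd = Twist()
        cmd.angular.z = -0.8 * offset   # steer toward the target
        if area < TARGET_AREA:
            cmd.linear.x = 0.15         # advance until the target looks close
        self.pub.publish(cmd)           # otherwise rotate in place only


def main():
    rclpy.init()
    rclpy.spin(DetectFollowSketch())


if __name__ == '__main__':
    main()
```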
  • Real-time object detection and distance measurement:
    • Code logic explanation (see the Python sketch below):
      • Use the Jetson DNN inference ROS2 detectnet node to detect the target object's position in the image captured by the camera
      • Collect 360-degree raw ROS2 LaserScan data from the lidar sensor
      • Calculate the rotation angle from the camera's field of view (FOV) and the detection's position in the image
      • Use the lidar range at that rotation angle to measure the object's distance
      • Publish a ROS2 Twist message to move the robot toward the detected object
      • Stop the robot if it is too close to the target
    • Source code:
    • Usage:
      • ros2 launch jetbot_tools DNN_SSD_source.launch.py model_name:=ssd-mobilenet-v2 launch_video_source:=false topic:=/video_source/raw
      • ros2 launch jetbot_tools follow_copilot.launch.py param_file:=./jetbot_tools/param/follow_copilot_params.yaml
      • ros2 param get /follow_copilot follow_detect
      • ros2 param set /follow_copilot follow_detect true
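
The distance-measurement step reduces to two small calculations: map the detection's pixel offset to an angle using the camera FOV, then read the lidar ray nearest that angle. In the sketch below, the 62.2-degree FOV, the image width, and the assumption that the lidar's zero angle is aligned with the camera's optical axis are all illustrative.

```python
import math

CAMERA_FOV_DEG = 62.2   # assumed horizontal field of view
IMAGE_WIDTH = 1280.0    # assumed camera resolution


def detection_angle(center_x):
    """Horizontal angle (radians) from the optical axis to the detection."""
    offset = (center_x - IMAGE_WIDTH / 2.0) / (IMAGE_WIDTH / 2.0)  # -1..1
    return math.radians(offset * CAMERA_FOV_DEG / 2.0)


def distance_at_angle(scan, angle):
    """Return the LaserScan range closest to the requested angle,
    assuming the lidar's zero angle points along the camera axis."""
    index = round((angle - scan.angle_min) / scan.angle_increment)
    return scan.ranges[index % len(scan.ranges)]
```

The follow behavior is then the same as in the tracking example: steer by the computed angle and drive forward while the measured distance stays above the stop threshold.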
  • NAV2 TF2 position tracking and following:
    • Code logic explanation (see the Python sketch below):
      • To run this tf2_follow_copilot program, you need two robots that use tf2 broadcasters to publish their coordinate frames.
      • The tf2_follow_copilot program uses a tf2 listener to compute the transform between the two robot frames and determine the direction and distance to follow.
      • The program publishes a ROS2 Twist message to control the GoPiGo3 robot's speed and steering so that it can follow the Jetbot robot.
    • Source code:
    • Usage:
      • Prerequisite: run ros2 launch with follow_copilot.launch.py or detect_copilot.launch.py first (see above)
      • ros2 launch jetbot_tools tf2_follow_copilot.launch.py param_file:=./jetbot_tools/param/tf2_follow_copilot_params.yaml
      • ros2 param set /tf2_follow start_follow true
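
A minimal sketch of the listener side, following the standard ROS2 tf2 listener pattern; the frame names jetbot/base_link and gopigo3/base_link, the 0.5 m following gap, and the gains are illustrative assumptions.

```python
import math

import rclpy
from rclpy.node import Node
from rclpy.time import Time
from geometry_msgs.msg import Twist
from tf2_ros import TransformException
from tf2_ros.buffer import Buffer
from tf2_ros.transform_listener import TransformListener


class Tf2FollowSketch(Node):
    def __init__(self):
        super().__init__('tf2_follow_sketch')
        self.buffer = Buffer()
        self.listener = TransformListener(self.buffer, self)
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_timer(0.1, self.on_timer)

    def on_timer(self):
        try:
            # Pose of the leader (Jetbot) expressed in the follower's frame.
            t = self.buffer.lookup_transform('gopigo3/base_link',
                                             'jetbot/base_link', Time())
        except TransformException:
            return                       # frames not available yet
        dx = t.transform.translation.x
        dy = t.transform.translation.y
        dist = math.hypot(dx, dy)
        cmd = Twist()
        cmd.angular.z = 1.0 * math.atan2(dy, dx)   # steer toward the leader
        if dist > 0.5:                             # keep a 0.5 m gap
            cmd.linear.x = min(0.5 * dist, 0.3)
        self.pub.publish(cmd)


def main():
    rclpy.init()
    rclpy.spin(Tf2FollowSketch())


if __name__ == '__main__':
    main()
```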

Requirements

References
