/r/ROS


This subreddit is for discussions around the Robot Operating System, or ROS.



Note that ROS (aka ROS 1) and ROS 2 are different. Please mention which one you're talking about when asking for help or starting a discussion.


See also:

  • Both ROS (aka ROS 1) and ROS 2
  • ROS 1
  • ROS 2


If you're looking for help, please read the support guidelines before asking your question. Following those guidelines really helps in getting your question answered.

It is also recommended to post your question on Robotics Stack Exchange, since that is meant to be the central place for questions about ROS. You can of course post here and link to your Robotics Stack Exchange question.


Rules

  1. Be civil and respectful
    • Do not harass or insult others, and avoid poisoning the mood.
  2. No spam or plain advertisements
    • No posts/comments that are just plain ads. You can, for example, link to a specific blog post on your website or a video on your YouTube channel, as long as it is relevant, not low-effort, and not spammed.
  3. ROS-related posts only; stay on topic
    • Posts have to be ROS-related. Comments should not substantially derail the conversation.
  4. Memes are allowed; make sure to also follow rules 2 and 3
    • As long as you follow rules 2 and 3, you can post memes. If /r/ROS turns into /r/ROSmemes, we will reconsider this rule as a community.
  5. No vague "plz help me" requests
    • If you need help with something, ask specific questions and provide as much information as possible, including the error message (as text) if applicable. Follow the ROS support guidelines (link is in the sidebar).

    Also see reddit's rules and the ROS etiquette.



    1

    Message Package not Found (ROS2)

    I have two packages, gps_driver and gps_msg.

Directory layout:

Driver: gps_driver/gps_driver/python/gps_driver.py
Msg: gps_msg/msg/GpsMsg.msg

In the relevant package.xml, gps_msg is listed as a dependency of gps_driver. gps_driver.py contains the following line:

    from gps_msg.msg import GpsMsg

    which causes the error: ModuleNotFoundError: No module named gps_msg

Both packages are in the src folder of the same workspace, and everything is built.
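For a custom message package in ROS 2, a ModuleNotFoundError at import time usually comes down to the interface package not generating Python code, or the overlay not being sourced in the shell that runs the node. A minimal checklist sketch using the package names from the post (the exact file contents are assumptions about a typical ament_cmake interface package):

    # gps_msg/CMakeLists.txt -- the message package must generate the Python module
    find_package(rosidl_default_generators REQUIRED)
    rosidl_generate_interfaces(${PROJECT_NAME}
      "msg/GpsMsg.msg"
    )

    <!-- gps_driver/package.xml -- declare the runtime dependency -->
    <exec_depend>gps_msg</exec_depend>

    # rebuild both packages, then re-source the overlay in the shell that runs the node
    colcon build --packages-select gps_msg gps_driver
    source install/setup.bash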

    1 Comment
    2025/02/03
    22:22 UTC

    3

    Mesh not showing up in rviz

Hi, I tried a line-follower robot in ROS 2 Humble with a box geometry and two cylinders, gave it mass, inertia, etc. and a camera, and it worked nicely. I then made a model of an AGV and saved the STL file, but the model is not showing up in RViz, even though it loads correctly. I intentionally gave a wrong STL path in the mesh tag and it gave an error that it cannot load, so the STL path is correct, and the STL file is also proper. It's just not visible and I have found no fix for it. Please help.
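A common culprit with STL exports is units: many CAD tools export in millimetres, so the mesh ends up 1000x too large and sits far outside the camera view; an STL also carries no colour, so an explicit material helps in RViz. A minimal visual block for comparison (package path and scale values are placeholders, not taken from the post):

    <visual>
      <geometry>
        <!-- if the STL was exported in millimetres, scale="0.001 0.001 0.001" is usually needed -->
        <mesh filename="package://my_agv_description/meshes/agv_body.stl" scale="0.001 0.001 0.001"/>
      </geometry>
      <material name="grey">
        <color rgba="0.6 0.6 0.6 1.0"/>
      </material>
    </visual>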

    0 Comments
    2025/02/03
    19:36 UTC

    0

    Build keeps failing for ros2_canopen

So I'm working on a robot for a school project, and I have motors that run on CANopen. I found the ros2_canopen repository on GitHub to use with ros2_control for this, but whenever I go to build it, there is always a failure when building the canopen_core section of the repo. I am very much a beginner at this and I have no idea how to fix this issue or what other alternatives I could use for control. The robot uses a Jetson Orin Nano dev board with ROS 2 Humble.
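Without the actual compiler output it is hard to say more, but canopen_core pulls in several dependencies (including the vendored Lely CANopen stack), so a first step that often resolves build failures is installing dependencies with rosdep before building. A sketch, assuming the repository is already cloned into ~/ros2_ws/src:

    cd ~/ros2_ws
    rosdep update
    rosdep install --from-paths src --ignore-src -r -y
    colcon build --packages-up-to canopen_core --symlink-install
    source install/setup.bash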

    2 Comments
    2025/02/03
    19:35 UTC

    1

    EAI X2L LiDAR returning zeros in the ranges

My lidar appeared to be working correctly, but then I wrote a ROS 2 node that subscribes to the /scan topic, and suddenly the /scan message returned all zeros in the "ranges" list. See the inserted image.

    Has anyone experienced this? Any ideas as to why that could be the case?

    Thanks

    https://preview.redd.it/9awpqbozgwge1.png?width=3106&format=png&auto=webp&s=a165b3d972180475f8d8fd311f5ffb3cf2746b31
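One way to narrow this down is to rule out the subscriber itself: a bare-bones node that uses the sensor-data QoS profile and just counts the non-zero returns will show whether the driver is still publishing real data. A minimal sketch (topic name /scan taken from the post):

    # scan_check.py -- diagnostic subscriber that counts non-zero range readings
    import rclpy
    from rclpy.node import Node
    from rclpy.qos import qos_profile_sensor_data
    from sensor_msgs.msg import LaserScan

    class ScanCheck(Node):
        def __init__(self):
            super().__init__('scan_check')
            self.create_subscription(LaserScan, '/scan', self.cb, qos_profile_sensor_data)

        def cb(self, msg):
            nonzero = sum(1 for r in msg.ranges if r > 0.0)
            self.get_logger().info(f'{nonzero}/{len(msg.ranges)} non-zero ranges')

    def main():
        rclpy.init()
        rclpy.spin(ScanCheck())

    if __name__ == '__main__':
        main()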

    0 Comments
    2025/02/03
    10:20 UTC

    11

Can someone tell me a little bit about what makes ROS so great for robotics? Specifically, what can ROS do that can’t be done in other programming languages?

    16 Comments
    2025/02/03
    09:16 UTC

    2

Trying to find code compatible with ROS 2 Jazzy

I've been looking for a diffdrive_arduino similar to the Articulated Robotics one that is compatible with ROS 2 Jazzy. Can someone help me? Sorry for asking, but I'm just new to this.

    1 Comment
    2025/02/03
    06:00 UTC

    5

ROS beginner feels lost

I'm new to ROS and a beginner in programming as well. I'm writing my first few ROS publishers and subscribers in Python; I faced a few problems but got them right after some help from the documentation. But when I proceeded to write my launch files, it felt like a big puzzle: I was stuck for two days, asked ChatGPT for help, and it imported many libraries and used many weird methods. Fast forward, I tried to understand every line, but my question is: how can I, as a developer, get a good idea of which libraries to use and how to use them? How do I know what's missing in my program when it's not working because it's missing a library, without any hint at it?
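For reference, a bare-bones launch file only needs two imports, and everything else gets added as you need it. A minimal sketch with placeholder package and executable names:

    # launch/talker.launch.py -- minimal ROS 2 Python launch file (names are placeholders)
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(
                package='my_py_pkg',   # your package name
                executable='talker',   # entry point declared in setup.py
                output='screen',
            ),
        ])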

    7 Comments
    2025/02/03
    00:12 UTC

    2

    Beginner-friendly Guided Projects

    Hello! I have been exploring ROS and Gazebo for almost a month now. The basics were pretty simple and I got a grasp of them quite quickly. After that I started doing Articulated Robotics' "Building a mobile robot" and was able to somehow survive till the control part but now I am completely lost. If anyone knows of a simple step-by-step project with a detailed explanation, I would really appreciate it.

    0 Comments
    2025/02/02
    18:32 UTC

    8

    Ackermann steering

I'm building a robot with Ackermann steering using ROS 2 Humble, but I'm running into problems with the controller. There are DiffDrive controllers, but I'm not able to find something similar for Ackermann steering in ROS 2, and as a result I'm not able to drive it around in Gazebo using keyboard teleop or a joystick.

    I can write a controller by myself but it will take a lot of time which I don't have at this point, so I'm looking for existing controllers that I can use.

    Thanks!
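ros2_controllers has been gaining steering controllers over time (ackermann_steering_controller, bicycle_steering_controller, tricycle_controller), but which of them ship for Humble depends on the release, so it is worth checking what your install actually provides before writing one from scratch. A quick check, assuming Humble and a running controller_manager:

    sudo apt install ros-humble-ros2-controllers
    ros2 control list_controller_types | grep -i -E 'ackermann|bicycle|tricycle'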

    4 Comments
    2025/02/02
    14:43 UTC

    1

    Lidar compatibility with raspberry pi 5

I'm building an autonomous mobile robot, I have a Raspberry Pi 5, and I'm willing to buy this lidar:

    https://uk.rs-online.com/web/p/sensor-development-tools/2037609

but I'm not sure if it is compatible with ROS 2 Jazzy and the Raspberry Pi 5. I'm a beginner at this, so excuse me if it's a dumb question.

    3 Comments
    2025/02/02
    12:32 UTC

    4

    Autonomous forklift project

Hey guys, I am working on an automated forklift project for my graduation project that:

  • detects boxes
  • goes to the nearest one
  • inserts the fork in the pallet correctly
  • reads the QR code via a normal QR scanner and knows the location in the warehouse it's supposed to go to
  • sorts boxes beside each other

I am also a beginner in ROS and have only done simulations. Any advice on the steps I need to finish this project, or on whether I should use a Jetson Nano or a Raspberry Pi? If anyone has tried a similar project, please contact me.

    8 Comments
    2025/02/02
    12:13 UTC

    3

    Simulators for Underwater Robotics that support ROS2 natively

I am a software team member on an underwater robotics team. We have to participate in a robotics contest where we are supposed to provide a simulation of an ROV that performs a specific task. I was thinking about using Gazebo for the simulation, but I really don't know where to get started, plus most tutorials for Gazebo are for wheeled robots.

I was thinking about simulating our model in something like Gazebo and then adding plugins, but I heard that simulating with something like Unity or Unreal (the HoloOcean simulator) gives better results for vision-based tasks.

Also, what would be the estimated time this process might take? Our competition is due in 3-4 weeks, and we have to build our model and have our simulation working without major flaws.

    2 Comments
    2025/02/02
    11:58 UTC

    3

    Remapping Turtlebot4 for Logitech F710 USB Wireless Game Controller

It is just wonderful that the TurtleBot4 game controller has left and right wall following and dock and undock functions, except that my "TurtleBot4 clone" TB5-WaLI uses the much less expensive Logitech F710 USB wireless game controller.

    Luckily, I figured out how to configure the Turtlebot4 code to listen to my controller!

    https://preview.redd.it/piomhhiy4lge1.jpg?width=4032&format=pjpg&auto=webp&s=d769cf0c9826462cc73cbe51cb2d6eefca83c931

    Using the Undock, Dock button:

    2025-02-01 12:00|wali_node.py| ** WaLI Noticed Undocking: success at battery 95%, docked for 2.0 hrs **
    2025-02-01 12:23|wali_node.py| ** WaLI Noticed Docking: success at battery 72% after 0.4 hrs playtime **

    0 Comments
    2025/02/01
    20:17 UTC

    1

Problem running a node with ros-carla-bridge and the CARLA simulator

    ~/carla-ros-bridge/catkin_ws$ ros2 run carla_spawn_objects carla_spawn_objects -n ego_vehicle -m vehicle.tesla.model3

    [FATAL] [1738424917.141950174] [default]: Exception caught: Could not read object definitions from

    [INFO] [1738424917.142442756] [carla_spawn_objects]: Destroying spawned objects...

    https://preview.redd.it/t8m3vn44yjge1.png?width=1920&format=png&auto=webp&s=c8504dfe474ecf3d2966e8b234d1be020425d9af
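The empty path at the end of the FATAL line suggests the node never received the path to its objects-definition JSON file. If memory serves, carla_spawn_objects reads that path from a ROS parameter rather than from -n/-m style flags, so something along these lines may be what is missing (parameter name and file path are assumptions; check the carla_spawn_objects launch files in your checkout):

    ros2 run carla_spawn_objects carla_spawn_objects --ros-args \
      -p objects_definition_file:=/path/to/objects.json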

    2 Comments
    2025/02/01
    16:12 UTC

    12

The ros2_utils_tool, a GUI/CLI-based toolkit for everyday ROS2 utility handling!

    Hey everybody,

I'd like to present to you a toolset I've been working on during the past few months: the ros2_utils_tool!
This application provides a full GUI-based toolset for all sorts of ROS2-based utilities to simplify various tasks with ROS at work. Just a few of the tool's features:

    • Edit an existing ROS bag into a new one with the function to remove, rename or crop tasks
    • Extract videos or image sequences out of ROS bags
    • Create ROS bags out of videos or just using dummy data.
    • Publish videos and image sequences as ROS topics.

    For most of these options, additional CLI functionality is also implemented if you want to stick to your terminal.
The ros2_utils_tool is very simple to use and aims to be as lightweight as possible, but it still supports many advanced options, for example different formats or custom fps values for videos, switching colorspaces, and more. I've also heavily optimized the tool to use multithreading, or in some cases even hardware acceleration, to run as fast as possible.
As of now, the ros2_utils_tool supports ROS2 Humble and Jazzy.
The application is still in an alpha phase, which means I want to add many more features in the future, for example GUI-based ROS bag merging, republishing of topics under different names, or more advanced options such as cropping videos for publishing or bag extraction.
The ros2_utils_tool requires an installed ROS2 distribution, as well as Qt (both versions 6 and 5 are supported), cv_bridge for converting images to ROS and vice versa, and finally catch_ros2 for unit testing. You can install all dependencies (except for the ROS2 distribution itself) with the following command:

    sudo apt install libopencv-dev ros-humble-cv-bridge qt6-base-dev ros-humble-catch-ros2

    For ROS2 Jazzy:

    sudo apt install libopencv-dev ros-jazzy-cv-bridge qt6-base-dev ros-jazzy-catch-ros2

    Install the UI with the following steps:
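Assuming the usual colcon workflow (the repository URL below is a placeholder; substitute the actual ros2_utils_tool repository):

    cd ~/ros2_ws/src
    git clone https://github.com/<user>/ros2_utils_tool.git
    cd ~/ros2_ws
    colcon build --packages-select ros2_utils_tool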

    Then run it with the following commands:

    • source install/setup.bash
    • ros2 run ros2_utils_tool tool_ui

    I'd love to get some feedback or even more ideas on tasks which might be useful or helpful to implement.
    Thanks!

    2 Comments
    2025/02/01
    11:03 UTC

    5

    Question about transforms

    Hi, I'm new to ROS and I have a question about transforms (tf). Let's say I have a simple two-wheeled robot and there is a controller that publishes odometry to /odom. Let's say I also have an IMU in the robot and a node that publishes to /imu/raw_data

    Then let's say I use the ekf_node from the robot_localization package to fuse /odom and /imu/raw_data and there is a resulting topic, /odometry/filtered.

Let's also say I have a lidar that publishes to a /scan topic.

If I then go to use slam_toolbox to do some mapping and localization, I assume that for the odom topic in the slam_toolbox config file I want to put in /odometry/filtered and not just /odom, right? And if this is the case, do I then need to make sure I have a transform from /odometry/filtered to /base_link?

    Thanks for any help or insights.
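Two things may help untangle this: slam_toolbox is configured with frame names (odom_frame, base_frame) rather than an odometry topic, and transforms live between frames, not topics, so there is no "/odometry/filtered to /base_link" transform as such. robot_localization's ekf_node can publish the odom -> base_link transform itself when publish_tf is set. A rough sketch of an ekf config under those assumptions (topic names taken from the post, values illustrative; the 15 flags per sensor are [x y z roll pitch yaw vx vy vz vroll vpitch vyaw ax ay az]):

    ekf_filter_node:              # must match the ekf_node's node name
      ros__parameters:
        frequency: 30.0
        two_d_mode: true
        publish_tf: true          # ekf_node then provides odom -> base_link
        map_frame: map
        odom_frame: odom
        base_link_frame: base_link
        world_frame: odom
        odom0: /odom
        odom0_config: [false, false, false,
                       false, false, false,
                       true,  true,  false,
                       false, false, true,
                       false, false, false]
        imu0: /imu/raw_data
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      true,  false, false]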

    2 Comments
    2025/01/31
    20:36 UTC

    3

    Testing library for robots

I was curious whether there are any libraries that let you record known data from, say, a sensor and then use it for unit testing robot controllers, i.e. essentially playing back data to the controller to make sure the final state is within a tolerated deviation from the setpoint. Maybe this is easily doable with rosbag, but I was curious if there is anything without that heavy ROS dependency, because if not, I think I would develop something myself. If the idea is not clear, please tell me: it's essentially replaying known state data that were manually recorded as a "successful" run and checking that the correct controller inputs are generated by your code. For example, with a walking robot, play back joint positions and velocities and check whether the generated torques are correct, as well as whether the final state the robot arrives at is within a specified tolerance of the "correct" one.
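If plain Python turns out to be enough, the pattern is essentially a table-driven test: load the recorded states, feed them through the controller, and compare the outputs and final state against tolerances. A minimal sketch (the CSV layout and the controller interface are assumptions for illustration):

    # replay_test.py -- replay recorded states through a controller and check its outputs
    import csv
    import math

    def replay(controller, recording_path, torque_tol=0.5, final_state_tol=1e-2):
        last_state = None
        with open(recording_path) as f:
            for row in csv.DictReader(f):
                state = {k: float(v) for k, v in row.items() if k != 'expected_torque'}
                expected = float(row['expected_torque'])
                actual = controller.update(state)   # controller under test
                assert math.isclose(actual, expected, abs_tol=torque_tol), \
                    f'torque {actual} deviates from recorded {expected}'
                last_state = state
        # final-state check against the controller's setpoint
        assert abs(last_state['position'] - controller.setpoint) < final_state_tol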

    3 Comments
    2025/01/31
    15:19 UTC

    0

Does ROS 2 Humble code work in ROS 2 Jazzy?

    Does it work?

    6 Comments
    2025/01/31
    14:49 UTC

    2

    Rviz2 showing gray image only

    Hello guys,

As discussed in my last post, I am unable to see any image in rviz2. It just shows a gray image when I try to visualize the camera in rviz2, even when I place an object in front of the camera. Can somebody help me with that? I am posting my URDF file and the launch file here for information. It would be very kind if someone could help.

    URDF:

    <link name="camera_link">
      <visual name="camera">
        <origin xyz="0 0 0" rpy="0 0 0"/>
        <geometry>
          <mesh filename="package://robotiq_description/meshes/visual/d455.stl" scale="0.0005 0.0005 0.0005"/>
        </geometry>
      </visual>
      <collision name="camera">
        <origin xyz="${d455_zero_depth_to_glass-d455_cam_depth/2} ${-d455_cam_depth_py} 0" rpy="0 0 0"/>
        <geometry>
          <box size="${d455_cam_depth} ${d455_cam_width} ${d455_cam_height}"/>
        </geometry>
      </collision>
      <inertial>
        <mass value="0.072"/>
        <origin xyz="0 0 0"/>
        <inertia ixx="0.003881243" ixy="0.0" ixz="0.0" iyy="0.000498940" iyz="0.0" izz="0.003879257"/>
      </inertial>
    </link>

    <link name="camera_frame_link"></link>

    <joint name="robotiq_85_base_link_to_camera_link" type="fixed">
      <parent link="robotiq_85_base_link"/>
      <child link="camera_link"/>
      <origin xyz="0.052 -0.0001 -0.020" rpy="${pi/2} ${pi} ${pi/2}"/>
    </joint>

    <joint name="camera_link_to_camera_frame_link" type="fixed">
      <parent link="camera_link"/>
      <child link="camera_frame_link"/>
      <origin xyz="0 0 0" rpy="${-pi/2} 0 ${-pi/2}"/>
    </joint>

Launch file:

    gz_ros2_bridge = Node(
        package="ros_gz_bridge",
        executable="parameter_bridge",
        arguments=[
            '/clock@rosgraph_msgs/msg/Clock[gz.msgs.Clock',
            "/image_raw@sensor_msgs/msg/Image[gz.msgs.Image",
            "/camera_info@sensor_msgs/msg/CameraInfo[gz.msgs.CameraInfo",
        ],
        output='screen',
    )

    tf2_ros_bridge = Node(
        package='tf2_ros',
        namespace='base_to_wrist_3',
        executable='static_transform_publisher',
        arguments=["0", "0", "0", "0", "0", "0", "base_link", "ur5_robot/wrist_3_link/camera"]
    )

    https://preview.redd.it/0fsuphagdcge1.jpg?width=1918&format=pjpg&auto=webp&s=157726ee4b939036a93e32bafeccff3554be8717

    0 Comments
    2025/01/31
    14:43 UTC

    2

Best setup for ROS and ROS 2

Hi guys! I'm starting a ROS course that requires having both ROS and ROS 2 installed. The thing is, I don't know which kind of setup I should go for. I want to keep Windows, so I'll have a dual boot with an Ubuntu system. What kind of setup do you recommend (Ubuntu version, ROS 2 version, using containers or not)? I'm kind of a noob at this stuff, so I would appreciate any help!

    8 Comments
    2025/01/31
    12:03 UTC

    5

    Underwater Simulation Plugin of Gazebo with Ros2 Humble

    0 Comments
    2025/01/31
    06:41 UTC

    2

    Teleop twist keyboard doesn't work on real robot

    ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args --remap cmd_vel:=/diff_drive_controller/cmd_vel -p stamped:=true

It works in Gazebo but doesn't work on the real robot. I'm using ROS 2 Jazzy. Can someone help me figure out how to move the real robot?
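A few checks that usually narrow this down, assuming ros2_control is running on the real robot (commands are standard CLI tools, not taken from the post):

    # is teleop actually publishing where the controller listens?
    ros2 topic info /diff_drive_controller/cmd_vel --verbose
    ros2 topic echo /diff_drive_controller/cmd_vel
    # is the controller loaded and active, and are the hardware interfaces claimed?
    ros2 control list_controllers
    ros2 control list_hardware_interfaces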

    11 Comments
    2025/01/31
    06:07 UTC

    3

    Gazebo install on macOS can't import gz into python

    Hey all,

I've been trying for a while to set up the Gazebo simulator on my M3 Mac. I can't seem to get gz to install for my Homebrew Python 3.11 install when I use:

brew install gz-ionic

I have 3.12 and 3.13 installed via Homebrew and it works fine for those. Has anyone experienced something similar?

    0 Comments
    2025/01/30
    20:48 UTC

    21

    Best Practices for Deploying a Production-Ready ROS2 Robot

    Hi,

    (Jazzy, Ubuntu 24.04, Nvidia Jetson, Docker)

    I am currently deploying my ROS2-based robot using a launch file inside a Docker container on a Jetson device. I manually start the system by running ros2 launch within the container. However, I want to take it to the next level and make my robot truly production-ready by ensuring all necessary nodes and processes start automatically upon boot.

    What are the best practices for achieving this? Specifically, I’d like to know:

1. How to automatically launch all required ROS2 nodes after the robot boots up (is systemd the best way to do so?).
2. How do I make the code "invisible" inside the Docker container I will build?
3. How to handle error recovery and ensure robustness in case a node or process crashes?
4. Any other considerations for a production-level deployment of a ROS2 robot.

    I’d really appreciate insights from those who have deployed ROS2 robots in real-world applications.

    Thank you !
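For point 1, a common pattern is a systemd unit that (re)starts the container at boot with a restart policy, so crashes of the container or daemon are handled by systemd while node crashes inside the container stay the job of your launch setup. A sketch, with unit, image, container, and launch names as placeholders:

    # /etc/systemd/system/robot.service -- all names below are placeholders
    [Unit]
    Description=Robot bringup (ROS 2 in Docker)
    After=docker.service network-online.target
    Requires=docker.service

    [Service]
    Restart=always
    RestartSec=5
    ExecStartPre=-/usr/bin/docker rm -f robot
    ExecStart=/usr/bin/docker run --name robot --rm --net=host --runtime nvidia \
        my_registry/my_robot:latest ros2 launch my_bringup bringup.launch.py
    ExecStop=/usr/bin/docker stop robot

    [Install]
    WantedBy=multi-user.target

Enable it with sudo systemctl enable --now robot.service; --runtime nvidia applies only if the nodes need the Jetson's GPU.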

    18 Comments
    2025/01/30
    19:52 UTC

    1

    Problem running localization on ROS humble

I used SLAM to map the surrounding environment, then switched to localization and gave the path to the serialized map with no extension, but the SLAM node still can't open the file. Does anybody have an idea why?
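Assuming this is slam_toolbox: localization mode deserializes the .posegraph/.data pair written by the map serialization service, not the .pgm/.yaml produced by a plain map save, and the path is usually given as an absolute path without the extension. A config sketch (parameter names as in slam_toolbox's sample localization config, path is a placeholder; verify against your installed version):

    slam_toolbox:
      ros__parameters:
        mode: localization
        # absolute path to the serialized map, without the .posegraph/.data extension
        map_file_name: /home/user/maps/my_map
        map_start_at_dock: true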

    0 Comments
    2025/01/30
    14:58 UTC

    3

    discarding message because the queue is full

    Hey,
Hope you are doing well. I am a beginner in robotics and ROS 2. Recently I've been trying to attach a depth camera to the gripper base link of my robot. When I try to visualize the PointCloud2, I keep getting messages like:

    [rviz2-3] [INFO] [1738243875.620376965] [rviz2]: Message Filter dropping message: frame 'ur5_robot/wrist_3_link/camera' at time 23.400 for reason 'discarding message because the queue is full'

Can someone please help me with it?
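This RViz warning usually means the transform for the message's frame is not available (or not available at the message's timestamp), so the message filter queue fills up. A couple of checks, assuming the frame name from the log and a Gazebo simulation (the RViz node name /rviz2 is an assumption):

    # does the camera frame exist in the TF tree, and is it connected to base_link?
    ros2 run tf2_tools view_frames
    ros2 run tf2_ros tf2_echo base_link ur5_robot/wrist_3_link/camera
    # when running against Gazebo, RViz usually needs simulated time too
    ros2 param set /rviz2 use_sim_time true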

    11 Comments
    2025/01/30
    13:35 UTC
