/r/robotics


How to get started in robotics

See the Wiki for frequently asked questions, career advice, resources, and previous AMAs

Official Discord Server

For any questions, please post on our sister subreddit /r/AskRobotics.


This subreddit is a place for

  • News, research articles & discussion about developments in Robotics (NOT wild, far-fetched speculation).
  • Showcasing your robot, from seasoned roboticists to hobbyists

Questions

  • Questions about your robotics projects (Try our sister subreddit, /r/AskRobotics!)
  • AMAs Are you a professional roboticist? Do you have a really impressive robot to talk about? An expert in your field? Why not message the mods to host an AMA?

Related subreddits

/r/robotics

237,506 Subscribers

1

Servo motor and motherboard

What would happen if you just attached a servo motor to a completely new motherboard? Or an onboard computer to a new, unprogrammed motherboard?

0 Comments
2024/07/15
21:02 UTC

128

Bilateral teleoperation for 2 DOF arms

9 Comments
2024/07/15
15:15 UTC

7

Why don't coffee shops utilize robots to get a RobotVendor-HumanCustomer experience?

I'm sorry if this isn't in the scope of this subreddit, but I expect the reason to be technical, since it seems like it should have been done long ago with the technical capabilities at hand.

25 Comments
2024/07/15
12:56 UTC

3

[Academic] Survey on Smart Planting Device Development (All Ages, Worldwide)

Hey, everyone! I’m Jimmy from the UK. We’re a group of students working on a startup to develop an innovative smart planting device, and we are looking for some real data and opinions. It only takes 3-6 minutes to finish. We need your insights to make it the best it can be. 🌟

Why Take the Survey?

• Influence the next big thing in green tech!

• Help us understand your preferences and needs.

• Be a part of our journey to revolutionize gardening.

Ready to dig in? 🪴

🔗 [Survey Link]

Thanks a bunch! 🍃

0 Comments
2024/07/15
12:34 UTC

38

Does anyone here know the name of this program?

I've been trying to find this for hours using Google Search, Yandex, and even asking AI.

I really liked this program and that's exactly what I need.

8 Comments
2024/07/15
06:14 UTC

5

What to use for scaffolding?

Hello,
I want to build a custom robot. I have the electronics but I don't know what people use for scaffolding prototypes these days.
I have a 3d printer but I want to use metal parts considering the strength/volume ratio, for example on the chassis which should be thin but strong.
Are there widely used metal parts for this purpose and what name should I look for while searching ?
I am thinking like a metal rectangle piece with multiple holes and matching L pieces ect ..
Thanks.

8 Comments
2024/07/15
05:22 UTC

31

Automatic coffee-making by 6-axis robot arms

3 Comments
2024/07/15
03:25 UTC

1

RoboDK and ABB

Hello,

I'm thinking about buying the RoboDK software for programming/simulating ABB robots.

The reason is that, given RobotStudio's high yearly price, RoboDK would be cheaper.

But what pros and cons do you see when using the RoboDK software?

What kind of issues have you encountered?

There is a lot to read on the RoboDK page, but I want your honest opinion about the software.

Thank you.

1 Comment
2024/07/14
20:11 UTC

39

How are industrial 6-axis robots manufactured - tolerances and stackup at the TCP

I work with 6-axis industrial robots and, especially on the large ones, wonder how they are manufactured and calibrated to achieve pretty good accuracy over such a large work volume, specifically the tolerance stackup of the bearing positions on each link. As the radius of each axis' arm can be quite long, very small deviations can add up to considerable displacement at the TCP. My thoughts on the potential avenues are:

  1. They are simply held to a very tight GD&T true-position tolerance.
  2. They are measured with something like a CMM after machining, and the very precise measurement is calibrated into the controller.
  3. They are calibrated after assembly, with the specifics input into the controller.

I could understand those processes if each arm were $100k-$500k, but many are priced in the $20k-$50k range (at least the ones in the 10-150 kg size I use from an unnamed worldwide brand).

If there is something else I haven't considered please let me know!

17 Comments
2024/07/14
19:35 UTC

13

Soft Robotics HELP

My team of three and I are doing a project related to soft robotics for college, and it would be helpful if you could provide some ideas and suggestions. My ideas include a jellyfish-like robot whose tentacles help with locomotion in water as well as grabbing things; another idea is to make an exoskeleton to assist spacesuit gloves.

Another question: we need to 3D print the molds, which we can do, but what type of silicone should we use (something flexible and not permeable to air), and how do we provide the air supply? Something cheap yet effective, as we are low on budget. Any suggestions and help will be great, thank you.

8 Comments
2024/07/14
15:11 UTC

0

How does the odometer act in mobile robots like Limo?

For mobile robots, there are three basic questions: Where am I? Where am I going? How do I get there? The first question describes the robot positioning problem, which can be explained in more detail as follows: the mobile robot determines its position and orientation in the world frame (global or local) in real time, based on its own state and sensor information.

In this project, we will discuss the details of the odometer in a mobile base such as the Limo.

https://preview.redd.it/tvvwe8elphcd1.jpg?width=680&format=pjpg&auto=webp&s=ff108646ac5cfa904b2ac07ff6edba9470bcb658

https://global.agilex.ai/

Introduction of robot wheel odometer and calibration test

The main positioning solutions for driverless cars with Ackermann steering include: wheel odometry, visual odometry, laser odometry, inertial navigation (IMU + GPS), and multi-sensor fusion. The wheel odometer is the simplest and lowest-cost method. Like other positioning solutions, the wheel odometer requires sensors to perceive information, but the motor speed-measurement module it uses is a very low-cost sensor. The speed module is shown in the figure below.

https://preview.redd.it/9gon4fdsphcd1.png?width=569&format=png&auto=webp&s=f23260ef75bffe95776e9c6ba4c634e361c8c24c

The pose model of a mobile robot is the state of the robot in the world coordinate system. The random variable Xt = (xt, yt, θt) is often used to describe the state of the robot in the world coordinate system at time t, referred to as pose. Among them, (xt, yt) represents the position of the robot in the world coordinate system at time t, and θt represents the direction of the robot. The positive X-axis of the world coordinate system is assumed to be the positive direction, and the counterclockwise rotation is the positive direction of rotation.

At the initial moment, the robot coordinate system and the world coordinate system coincide. The pose description of the robot at a certain time t is shown in the figure.

https://preview.redd.it/jv657g4wphcd1.png?width=699&format=png&auto=webp&s=1e550cf0990220e38fcdd8f8f78220e2ac5ddf17

The rotational angular velocities of the two wheels can be obtained from the wheel-speed odometer. The wheels' angular velocities are therefore used to express the x displacement, y displacement, and heading computed by the odometer.

The quantities we need to calibrate are the wheel spacing and the wheel radius. The mathematical model expresses the angular velocity and linear velocity of the vehicle body in terms of the wheel spacing and wheel radius. The wheel-spacing diagram is shown below.

https://preview.redd.it/h18pmiuzphcd1.png?width=607&format=png&auto=webp&s=cd5380c1741eaaa0ef8887bc339db279e89dea9c

The angular velocity of the chassis center relative to the body’s rotation center is equal to the angular velocity of the two wheels relative to the body’s rotation center. That is:

https://preview.redd.it/ur7pghv5qhcd1.png?width=255&format=png&auto=webp&s=3f0910f20e090bd0b7cdfc90904a73f727e63f74

Through the relationship between linear velocity and angular velocity, d is introduced:

https://preview.redd.it/un0lnjjcqhcd1.png?width=213&format=png&auto=webp&s=bc54662c96e68590a934051d9b8644f02112c783

So we can get r:

https://preview.redd.it/ghusfv8pqhcd1.png?width=202&format=png&auto=webp&s=a9440c6c96e2413005f69b5978416dcf5129c78b

The motion solution solves for w. Substituting r back, we can find w as:

https://preview.redd.it/yauebggrqhcd1.png?width=418&format=png&auto=webp&s=1c450d7d2229c87397634d301e705e1ad0cb41ea

Solving for v in the same way, by simplifying w*r we get v as:

https://preview.redd.it/aztjge1wqhcd1.png?width=428&format=png&auto=webp&s=ee3fbde01ef4494fc524d634fb78e370c0f2ff0a

The calculation of the odometer refers to the cumulative calculation of the robot’s position and posture in the world coordinate system at any time, starting from the moment the robot is powered on (the robot’s heading angle is the positive direction of the world coordinate system X).

The usual method for calculating the odometer is speed integral calculation: the speeds VL and VR of the left and right wheels of the robot are measured by the encoders of the left and right motors. In a short moment △t, the robot is considered to be moving at a uniform speed, and the increments of the X and Y axes of the robot in the world coordinate system at that moment are calculated based on the heading angle of the robot at the previous moment. The increments are then accumulated, and the yaw value of the IMU is used for the heading angle θ. Then the robot’s odometer can be obtained based on the above description.
The specific calculation is shown in the figure below:

https://preview.redd.it/n0jw8hyxqhcd1.png?width=929&format=png&auto=webp&s=6d7b6b849bacb1c3d44c86026a244b9d05f0a28b

Wheel odometer calibration

The three main sources of odometer system errors are “the deviation between the actual diameter of the left and right wheels and the nominal diameter”, “the deviation between the actual spacing between the left and right wheels and the nominal spacing” and “the actual average of the diameters of the two wheels is not equal to the nominal average”.

“The deviation between the actual diameter of the left and right wheels and the nominal diameter” will cause the distance error of linear motion. “The deviation between the actual spacing between the left and right wheels and the nominal spacing” will cause the direction error of rotational motion. “The actual average of the diameters of the two wheels is not equal to the nominal average” will affect both linear motion and rotational motion.

We usually assume that the actual position is linearly related to the wheel odometer. By recording the actual position ourselves along with the odometer's x and y positions, we can use the least-squares method to obtain a linear equation y = ax + b. The coefficients of the equation can then be applied when computing the odometer to correct it.
The code can be viewed in the driver package scout_base/src/scout_messenger.cpp of the robot.

First, data needs to be collected: the actual distance moved by the car and the distance reported by its odometer.
Running the fit in MATLAB, the result is:
p = [1.0482 -0.0778]
That is, a = 1.0482 and b = -0.0778, the calibration parameters in the x direction. The calibration parameters in the y direction and for the yaw angle can be calculated in the same way. This calibration is applied where indicated in the code review below.

Detailed explanation of the wheel odometer code released by ROS

  1. Create the package:

catkin_create_pkg pub_odom roscpp tf nav_msgs

  2. Create the pub_odom_node.cpp file in the src folder of the pub_odom package and add the following code:

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <nav_msgs/Odometry.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "odometry_publisher");  // Initialize the ROS node
    ros::NodeHandle n;                            // Create a node handle

    // Create a publisher object to publish the odometry message
    ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("odom", 50);

    // Create a TransformBroadcaster object to publish the transform
    tf::TransformBroadcaster odom_broadcaster;

    // Initial state of the robot
    double x = 0.0;
    double y = 0.0;
    double th = 0.0;
    double vx = 0.1;
    double vy = -0.1;
    double vth = 0.1;

    // Initialize time
    ros::Time current_time, last_time;
    current_time = ros::Time::now();
    last_time = ros::Time::now();

    // Set the loop rate to 1 Hz
    ros::Rate r(1.0);

    // Enter the main loop
    while (n.ok())
    {
        ros::spinOnce();

        current_time = ros::Time::now();  // Get the current time

        // Compute the robot's displacement
        double dt = (current_time - last_time).toSec();       // Time difference
        double delta_x = (vx * cos(th) - vy * sin(th)) * dt;  // x-direction displacement
        double delta_y = (vx * sin(th) + vy * cos(th)) * dt;  // y-direction displacement
        double delta_th = vth * dt;                           // Change in heading

        // Update the robot's position and heading
        x += delta_x;
        y += delta_y;
        th += delta_th;

        // Publish the robot's transform
        geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);  // Convert yaw to quaternion
        geometry_msgs::TransformStamped odom_trans;
        odom_trans.header.stamp = current_time;
        odom_trans.header.frame_id = "odom";
        odom_trans.child_frame_id = "base_link";
        odom_trans.transform.translation.x = x;
        odom_trans.transform.translation.y = y;
        odom_trans.transform.translation.z = 0.0;
        odom_trans.transform.rotation = odom_quat;
        odom_broadcaster.sendTransform(odom_trans);

        // Publish the odometry message
        nav_msgs::Odometry odom;
        odom.header.stamp = current_time;
        odom.header.frame_id = "odom";
        odom.pose.pose.position.x = x;
        odom.pose.pose.position.y = y;
        odom.pose.pose.position.z = 0.0;
        odom.pose.pose.orientation = odom_quat;
        odom.child_frame_id = "base_link";
        odom.twist.twist.linear.x = vx;
        odom.twist.twist.linear.y = vy;
        odom.twist.twist.angular.z = vth;
        odom_pub.publish(odom);

        last_time = current_time;  // Update the timestamp
        r.sleep();
    }
}

Code Review

ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("odom", 50);
tf::TransformBroadcaster odom_broadcaster;

We need to create a ros::Publisher and a tf::TransformBroadcaster to send messages using ROS and tf respectively.

double x = 0.0;
double y = 0.0;
double th = 0.0;

We assume that the robot starts at the origin of the “odom” coordinate system.

double vx = 0.1;
double vy = -0.1;
double vth = 0.1;

Here we will set some velocities which will cause the “base_link” frame to move in the “odom” frame at 0.1m/s in the x direction, -0.1m/s in the y direction, and 0.1rad/s in the th direction. This will more or less cause our simulated robot to go in a circle.

ros::Rate r(1.0);

In this example, we publish the odometry information at a rate of 1 Hz to keep the display concise; most systems publish odometry at a much higher rate.

//compute odometry in a typical way given the velocities of the robot
double dt = (current_time - last_time).toSec();
double delta_x = (vx * cos(th) - vy * sin(th)) * dt;
double delta_y = (vx * sin(th) + vy * cos(th)) * dt;
double delta_th = vth * dt;
x += delta_x;
// apply the x calibration x = a * x + b:
x = 1.0482 * x - 0.0778;
y += delta_y;
// the y calibration y = m * y + n is applied the same way,
th += delta_th;
// as is the yaw calibration th = q * th + p

Here we update our odometry information based on the constant velocities we set. Of course, a real odometry system would use measured wheel speeds in its calculations.

//since all odometry is 6DOF we'll need a quaternion created from yaw
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);

We generally try to use 3D versions of all messages in our system to allow 2D and 3D components to work together where appropriate and to keep the number of messages to a minimum. Therefore, it is necessary to convert our yaw values to quaternions. tf provides functions that allow quaternions to be easily created from yaw, and yaw values to be easily obtained from quaternions.

//first, we'll publish the transform over tf
geometry_msgs::TransformStamped odom_trans;
odom_trans.header.stamp = current_time;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "base_link";

Here, we’ll create a TransformStamped message to send over tf. We want to publish the transform from the “odom” coordinate system to the “base_link” coordinate system at current_time. So, we’ll set the message header and child_frame_id accordingly, making sure to use “odom” as the parent coordinate system and “base_link” as the child coordinate system.

odom_trans.transform.translation.x = x;
odom_trans.transform.translation.y = y;
odom_trans.transform.translation.z = 0.0;
odom_trans.transform.rotation = odom_quat;
//send the transform
odom_broadcaster.sendTransform(odom_trans);

Stuff our odometry data into the transform message and send the transform using the TransformBroadcaster.

//next, we'll publish the odometry message over ROS
nav_msgs::Odometry odom;
odom.header.stamp = current_time;
odom.header.frame_id = "odom";

We also need to publish a nav_msgs/Odometry message type so the navigation package can get velocity information from it. We set the header of the message to the current_time and the “odom” frame.

//set the position
odom.pose.pose.position.x = x;
odom.pose.pose.position.y = y;
odom.pose.pose.position.z = 0.0;
odom.pose.pose.orientation = odom_quat;
//set the velocity
odom.child_frame_id = "base_link";
odom.twist.twist.linear.x = vx;
odom.twist.twist.linear.y = vy;
odom.twist.twist.angular.z = vth;

This populates the message with the odometry data and sends it off. We set the child_frame_id of the message to the "base_link" frame, since that is the frame in which the velocity information is expressed.

  3. Add the following two lines of code to the CMakeLists.txt file:

add_executable(pub_odom_node src/pub_odom_node.cpp)
target_link_libraries(pub_odom_node
${catkin_LIBRARIES}
)
  4. Compile using catkin_make.
  5. Run the code: first open roscore, then run the node we wrote:

rosrun pub_odom pub_odom_node
  6. After the code runs successfully, use rostopic echo to view the published odom information:

rostopic echo /odom

Test result

https://preview.redd.it/n2jk0a42rhcd1.png?width=905&format=png&auto=webp&s=67f30ae95164118d0f4c7c261e9ee465c4ee338e

After-class Quiz
● In ROS, how to use the robot’s wheel odometer data to realize the robot’s pose estimation? Please write a ROS node, subscribe to the robot’s wheel odometer data, use the odometer data to realize the robot’s pose estimation, and publish the estimated pose information.

● How to calibrate the robot’s wheel odometer? Please write a ROS node, let the robot move on a specific trajectory, record the robot’s wheel odometer data and real pose information, and use the calibration algorithm to calibrate the wheel odometer, and finally save the calibration results in the ROS parameter server.

About Limo

If you are interested in the Limo or have technical questions about it, feel free to reach out to AgileX Robotics. Let's talk about it!

0 Comments
2024/07/14
14:04 UTC

1

CoppeliaSim Edu Multiple Object Property editing

Does anyone know how to edit a property of multiple objects in CoppeliaSim?
I have to increase the z parameter of the objects by 1, and there are around 100 objects.

0 Comments
2024/07/14
12:22 UTC

1

Suggestion on how to build navigation system for home robot

I am trying to build a home assistant robot like Astro from Amazon. So I am wondering what software and hardware are needed to allow the robot to navigate around the house just like Astro?

Thanks in advance!

2 Comments
2024/07/14
04:14 UTC

1

Purethermal mini Pro JST-SR mounting / connections help

Hi all,

I recently purchased a Purethermal mini Pro JST-SR with a FLIR Lepton 3.1R and wanted to ask if anyone had any insight on accurately mounting it and working with it in general. It currently has mounting holes w/ a diameter of 1.1mm, so I was planning on attaching it using 1mm screws I found on McMaster-Carr with some threaded inserts, then 3D printing a mount for the board w/ insertion points for threaded inserts. Even with the accuracies of 3d printing the mount, I feel that the tolerance stack is pretty high considering how small the parts are.

In addition, GroupGets sells the corresponding cable that works with the board. Has anyone been able to find a similar cable that's significantly shorter (I need one that's like 6 inches max, not 3 feet)? They sell the header for the cable on Digi-Key, but I know nothing about soldering up a cable like that.

1 Comment
2024/07/13
22:30 UTC

3

Recommendations for sensor fusion algorithms for IMU data in a balancing robot

I'm working on a 2-wheeled balancing platform using an MPU9250. Looking for others' experience with accel/gyro fusion methods, and whether anyone has particular recommendations for this hardware/application.

0 Comments
2024/07/13
19:51 UTC

7

Am i screwed?

So I am planning on applying for a robotics MSc in the UK (wherever I get the chance). I saw some places let CS undergraduates apply, but my problem is that my programme barely taught any calculus and no kinematics & dynamics. Will I be okay in the MSc? If not, how do these universities expect computing students to survive it?

17 Comments
2024/07/13
18:23 UTC

144

Halloween dummy animatronic - best way to power for longer use?

Hey there! I’m fairly new to the world of robotics and while I’ve gotten fairly comfortable with building mechanisms with servos, I am the real dummy when it comes to power supply.

For this dummy, I have two HS-645MG servos controlling the eyes and the mouth, and one HS-53 servo controlling the eyelids. These are connected to a Pololu Mini Maestro 12-Channel USB Servo Controller.

I am using a NiMH 6V 2000mAh battery pack to power the servos, and the maestro controller is powered by the USB plugged into my laptop.

I would like to be able to provide power to the servos and controller for longer use (at least 4 hours) for a Halloween party, but when reading the documentation for supplying power I am absolutely lost. And terrified of killing my motors and/or controllers.

Last year I made an animatronic raven with a similar motor configuration, for which I used two 6V battery packs to power the servos and controller, and it stopped working within half an hour. Total bummer, and I'd like to avoid that this time.

What is the best way to power this for longer use? Are there ways to use a wall plug to power my dummy?

Thanks in advance!

36 Comments
2024/07/13
16:35 UTC

74

BB1-Zero Update . Arms field test 🦾

BB1 seems stoked about having arms 😂 First “field test” with added weight. His tread motors are definitely too underpowered for how much this robot has grown 🦾

10 Comments
2024/07/13
15:56 UTC

135

It's alive!!!

21 Comments
2024/07/13
13:27 UTC

88

How would I control this

I am making a walking robot that, in theory, would use my arms as input for the legs, but it isn't working, so I need some more ideas for how to control it.

21 Comments
2024/07/13
13:00 UTC

7

What's your reference industrial robotics newsletter/magazine?

Just looking for some content about industrial solutions, hardware, software related to robotics in the industry. Also interested in startups in the sector and new technologies.

Something like: https://www.roboticstomorrow.com/main/factory-automation

I wonder what's more popular in the industry, it's very hard to tell.

0 Comments
2024/07/13
08:57 UTC

0

help me make an InMoov

Hi, I would like it if somebody could help me: https://www.gofundme.com/f/help-me-make-a-inmoov-robot

If you help me, you are making a better future.

1 Comment
2024/07/12
20:07 UTC

9

Looking for person following robots

As part of a school project I am building a person-following robot. I am looking for other robots that exist which fulfill their roles by following a person.

For example the Airwheel SR5 is a smart suitcase that follows the user. Have you seen or heard of other commercial or in development products that have this feature? If so I would love a link or the name of the product. Thank you very much!

10 Comments
2024/07/12
17:51 UTC

72

BB1-Zero Update. “I know kung fu”

Day 3 of having arms! Smoothed out the motions a bit and tightened stuff up. Can't wait to tie the arms into the rest of the behaviors. Working on figuring out how to move both arms at the same time, all slick-like … currently my attempts punch him in the face 😂. This robot is evolving so fast!

7 Comments
2024/07/12
15:32 UTC

14

HumanPlus is an open-source humanoid robot developed by researchers at Stanford University. Take a look at this; I believe that after AI, robotics will be the next trend.

It is capable of learning and accurately reproducing complex sequences of human movements, including in real-time.

Open-source Link: HumanPlus | SmilingRobo

2 Comments
2024/07/12
15:29 UTC

Back To Top