/r/robotics


How to get started in robotics

See the Wiki for frequently asked questions, career advice, resources, and previous AMAs

Official Discord Server

For any questions, please post on our sister subreddit, /r/AskRobotics.


This subreddit is a place for

  • News, research articles & discussion about developments in Robotics (NOT wild, far-fetched speculation).
  • Showcasing your robot, from seasoned roboticists to hobbyists

Questions

  • Questions about your robotics projects (Try our sister subreddit, /r/AskRobotics!)
  • AMAs: Are you a professional roboticist? Do you have a really impressive robot to talk about? An expert in your field? Why not message the mods to host an AMA?

Related subreddits

/r/robotics

238,452 Subscribers

1

Exploring the Infinite Possibilities of AI

In today's rapidly evolving digital age, artificial intelligence (AI) is transforming our lives, work, and learning at an astonishing pace. Whether it's smart assistants in our daily lives or data analysis in businesses, the applications of AI are profoundly impacting various fields. I'm very curious about the fascinating changes that occur when AI is integrated into robots 🌟. For example, could they play table tennis with us? 😝

https://reddit.com/link/1ecjzcf/video/y67pr8bm2ued1/player

0 Comments
2024/07/26
09:38 UTC

3

droid_arm, a 5-DOF, 3D-printed robotic arm with a gripper

I am building a pick-and-place robotic arm using servo motors and 3D-printed parts. The goal is to make it autonomous so it can identify and move objects on its own; for now, it is joystick-operated.

https://reddit.com/link/1ecfn8u/video/dlv684y8nsed1/player

I have created a Hackaday page for it, which contains the STL files for 3D printing and the component list; I'll soon be adding build instructions and project logs. If you want to know more about my projects, drop your email here, where I share monthly updates on what I am working on.

0 Comments
2024/07/26
04:51 UTC

7

Amber robotics

I have come across the company Amber Robotics, and their products look good and well priced for what they are. However, on the Kickstarter for their new robot they haven't shipped products to backers yet and keep giving reasons, so some people believe it's a scam: https://www.kickstarter.com/projects/amberobotics/lucid-1-zero-code-and-ai-planning-7-axis-portable-robotic-arm/comments . I wanted to buy the Lucid One Pro from their site, https://shop.amberobotics.com/collections/all?page=2 , but I'm wondering if anyone knows whether they're legit, or if anyone has bought the Lucid One from the site and actually had it delivered. For their first robot, I know people who bought it, and it arrived exactly as described with no issues. Does anyone think it's safe to buy, or should I avoid them and find a different option? Sorry, I'm just worried about spending this amount of money, and I can't find any information from anyone who bought a Lucid One from their site.

3 Comments
2024/07/26
00:30 UTC

3

AI Robot Dog Bittle Collects an Acorn with a YOLO model running on Raspberry Pi

0 Comments
2024/07/25
18:37 UTC

0

From Marketing Major to Roboticist and Neuroscientist: Building a Brain for a Robot

Coming straight out of college with a marketing degree, I never thought in a million years that I’d be building a brain for a robot, yet here I am. The world of programming is a daunting place for anyone, especially someone without any coding experience, but I’ve been exploring this fascinating field thanks to a user-friendly platform called Neurorobotics Studio.

Neurorobotics Studio allows you to create custom behaviors for robots based on biological neuroscience principles, using both physical and simulated robots. Recently, I worked on a project to create a game of Pong that can be controlled both autonomously and by body motion tracking. Here's how I approached it:

  1. Creating Autonomous Behavior:
    • I introduced new cortical areas into the brain’s default genome.
    • The brain activated an output response according to a stimulus input.
    • Visually tracking the Pong ball moved the paddle to the corresponding side to avoid dropping the ball.
  2. Developing Motion-Tracking Behavior:
    • The pong paddle moved in response to movements detected by the computer’s camera.
    • Neurons fired in the motion-control cortical area that drives left-to-right movement.

These projects have shown me the incredible potential of neurorobotics. Imagine a future where robots perform precise tasks to assist in healthcare, making ethical decisions that mimic human learning. Neurorobotics Studio’s open-source format encourages community collaboration, which is crucial for rapid progress in this field.

I’m excited to see what others can innovate and how we can collectively advance this technology. Don’t just take my word for it; explore this field yourself and share your creations!

Technical Details and Challenges:

  • Algorithm/Software: I used the built-in functions and tutorials of Neurorobotics Studio to design and simulate the behaviors.
  • Hardware: Although I primarily used simulated robots, the platform supports integration with physical robots.
  • Challenges: Learning the intricacies of neuronal responses and integrating them into the robot's behavior was initially challenging but immensely rewarding.

Potential Use Cases:

  • Healthcare: Robots could assist in mundane tasks in nursing homes or perform delicate surgeries with precision.
  • Ethical AI: Developing AI that makes ethically sound decisions by mimicking human brain processes.

Feel free to ask any questions or provide feedback on my approach and the use of Neurorobotics Studio. I’m eager to engage with the community and learn from your insights.

https://reddit.com/link/1ebydtn/video/evooxlwssoed1/player

2 Comments
2024/07/25
15:55 UTC

2

Machine Vision Camera

Hello, I am creating a robot with the intention of having it mow my lawn. I am using an Arduino and want to learn about machine vision, so I plan to add a camera for obstacle avoidance. Does anyone have recommendations for good cameras to use, or any good methods of obstacle avoidance?

5 Comments
2024/07/25
14:44 UTC

57

What is going on here?

I have two servo motors that I want to control using a joystick. I followed a tutorial on how to do it and followed each step exactly, but the servos are really jittery and move without any input from me. I'll put a pic of the code in the comments.

35 Comments
2024/07/25
13:39 UTC

3

Can someone help with robotics research topics?

I'm currently in senior high and trying to do robotics research, but I'm unable to come up with a topic; any suggestions would be very helpful. Our budget is about $500. Thanks.

5 Comments
2024/07/25
12:55 UTC

1

RVIZ2 Graph Plugin

Is there an rviz2 graph plugin? I found something similar in the jsk_visualization repository, and also graph_rviz_plugin, but both of them are written for ROS Noetic. Is there something similar for ROS 2? Thanks!

0 Comments
2024/07/25
11:04 UTC

0

How does an IMU work in Limo?

What is an IMU?

An IMU (inertial measurement unit) is composed of three single-axis accelerometers and three single-axis gyroscopes. The accelerometers measure the acceleration of the object along three independent axes of the carrier coordinate system, while the gyroscopes measure the angular velocity of the carrier relative to the navigation coordinate system. After processing these signals, the attitude of the object can be calculated.
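As a toy illustration of that last step (not Limo-specific; the axis convention, sample readings, and blend factor below are made-up assumptions), a complementary filter blends the integrated gyroscope rate with the gravity direction seen by the accelerometer to estimate an attitude angle:

import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    # Short term: integrate the gyroscope's angular rate.
    pitch_gyro = pitch_prev + gyro_rate * dt
    # Long term: use gravity, as seen by the accelerometer, as an absolute reference.
    pitch_accel = math.atan2(-accel_x, accel_z)
    # Blend the two; alpha close to 1 trusts the gyro over short time scales.
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel

pitch = 0.0
for gyro_y, ax, az in [(0.05, 0.10, 9.8), (0.04, 0.15, 9.8), (0.03, 0.15, 9.8)]:
    pitch = complementary_filter(pitch, gyro_y, ax, az, dt=0.01)
print("estimated pitch (deg): %.2f" % math.degrees(pitch))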

https://preview.redd.it/c4jglwqp1ned1.png?width=557&format=png&auto=webp&s=289e311e30958799a9a8bc0fe15660177891e386

It is worth noting that an IMU provides relative positioning information: it measures the path travelled relative to the starting point, so it cannot tell you your absolute location. It is therefore often used together with GPS. Where the GPS signal is weak, the IMU can take over, allowing the vehicle to keep estimating its position so it does not get lost.

In fact, the mobile phones we use every day, the cars and airplanes we ride in, and even missiles and spacecraft all use IMUs; however, their cost and accuracy vary.

Different scenarios place different accuracy requirements on an IMU, and higher accuracy also means higher cost.

https://preview.redd.it/h8rwbcrr1ned1.png?width=891&format=png&auto=webp&s=f98c1859f232f014480127302a25c43cfd51e2c8

Low-precision IMU: used in ordinary consumer electronics. This kind of IMU is very cheap and is common in mobile phones and sports watches, where it is often used to count steps.

Medium-precision IMU: used in autonomous driving. The price ranges from a few hundred to tens of thousands of yuan, depending on the positioning accuracy the vehicle requires.

High-precision IMU: used in missiles or spacecraft. Take a missile as an example: from launch to hitting the target, an aerospace-grade IMU can achieve extremely high-precision navigation, with an error of even less than one meter.

Besides accuracy and cost, an IMU has two other critical characteristics. The first is a high update rate: it can run at more than 100 Hz. The second is high short-term accuracy, accumulating little error over short periods.

IMU message under ROS

The IMU message under ROS (sensor_msgs/Imu) looks like this:

std_msgs/Header header
  uint32 seq
  time stamp                                  # timestamp
  string frame_id                             # reference frame
geometry_msgs/Quaternion orientation          # orientation
  float64 x
  float64 y
  float64 z
  float64 w
float64[9] orientation_covariance             # orientation covariance
geometry_msgs/Vector3 angular_velocity        # angular velocity
  float64 x
  float64 y
  float64 z
float64[9] angular_velocity_covariance        # angular velocity covariance
geometry_msgs/Vector3 linear_acceleration     # linear acceleration
  float64 x
  float64 y
  float64 z
float64[9] linear_acceleration_covariance     # linear acceleration covariance

This message type provides IMU data, including orientation, angular velocity, and linear acceleration. Here’s a detailed explanation of each part:

  1. Header
  • seq: Sequence number of the message.
  • stamp: Timestamp indicating when the message was generated.
  • frame_id: Identifier of the reference coordinate frame.
  2. Orientation
  • x, y, z, w: Components of the quaternion representing the IMU's current orientation.
  • orientation_covariance: A 9-element array representing the covariance matrix of the orientation, indicating the uncertainty of the orientation measurement.
  3. Angular Velocity
  • x, y, z: Components of the angular velocity about the three axes.
  • angular_velocity_covariance: A 9-element array representing the covariance matrix of the angular velocity, indicating the uncertainty of the angular velocity measurement.
  4. Linear Acceleration
  • x, y, z: Components of the linear acceleration along the three axes.
  • linear_acceleration_covariance: A 9-element array representing the covariance matrix of the linear acceleration, indicating the uncertainty of the linear acceleration measurement.
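As a rough sketch of how these fields are read in code (assuming ROS 1 with rospy, and that the IMU is published on a topic named /imu, which may differ on your Limo):

import rospy
from sensor_msgs.msg import Imu

def imu_callback(msg):
    # Each attribute below matches the message layout described above.
    o = msg.orientation
    w = msg.angular_velocity
    a = msg.linear_acceleration
    rospy.loginfo("orientation (quaternion): %.3f %.3f %.3f %.3f", o.x, o.y, o.z, o.w)
    rospy.loginfo("angular velocity (rad/s): %.3f %.3f %.3f", w.x, w.y, w.z)
    rospy.loginfo("linear acceleration (m/s^2): %.3f %.3f %.3f", a.x, a.y, a.z)

if __name__ == "__main__":
    rospy.init_node("imu_listener")
    rospy.Subscriber("/imu", Imu, imu_callback)  # the topic name is an assumption
    rospy.spin()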

How to record IMU data in Limo

  1. Open a new terminal and start the Limo driver:

roslaunch limo_bringup limo_start.launch

  2. Record the IMU data. rosbag records the specific topics you name (replace bag_name and the topic names with your own):

rosbag record -O bag_name.bag /topic1_name /topic2_name /xxx

Press Ctrl+C to end the recording. The bag file is saved automatically in the current directory under the given name (here, imu.bag).

  3. Play back the data at 0.5 times the speed:

rosbag play -r 0.5 imu.bag

The terminal will display:

https://preview.redd.it/hcbmtn2v1ned1.png?width=930&format=png&auto=webp&s=84e4e141131bc203e4b097bc26d28974cab390cb

Visualize the data with rqt_plot

Replay the recorded IMU data, then open a terminal and input:

rqt_plot

Close the LIMO driver.
In the interface, choose the IMU topic data and click '+'. You will then see the three angular-velocity components of the IMU. Click on the right side of the interface, and you will see the IMU data changing.

https://preview.redd.it/z7knkygx1ned1.png?width=930&format=png&auto=webp&s=81652c462109a3551fcebaefc32ab3267f3bd682

Quiz

● Use ROS and rviz to visualize IMU sensor data.
Requirements:

  1. Subscribe to the IMU data topic and parse the data.
  2. Publish the parsed data to rviz.
  3. Use the IMU plugin in rviz to visualize the data in the form of a 3D model.

● Tips:

  1. You can use an IMU processing package such as imu_filter_madgwick, which is available for ROS.
  2. You can refer to the IMU display tutorial on the ROS Wiki.
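For the first quiz item, one possible approach (a sketch only; the topic names are assumptions, and this sidesteps a dedicated IMU plugin) is to re-publish the IMU orientation as a geometry_msgs/PoseStamped, which rviz can show with its built-in Pose display:

import rospy
from sensor_msgs.msg import Imu
from geometry_msgs.msg import PoseStamped

def imu_to_pose(msg):
    pose = PoseStamped()
    pose.header = msg.header                  # keep the IMU's frame_id and timestamp
    pose.pose.orientation = msg.orientation   # position stays at the origin
    pub.publish(pose)

if __name__ == "__main__":
    rospy.init_node("imu_pose_visualizer")
    pub = rospy.Publisher("/imu_pose", PoseStamped, queue_size=10)
    rospy.Subscriber("/imu", Imu, imu_to_pose)  # assumed IMU topic name
    rospy.spin()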

● Use IMU sensors to implement robot posture control.
Requirements:

  1. Subscribe to the IMU data topic and parse the data.
  2. Calculate the robot’s posture based on the IMU data.
  3. Implement robot posture control, such as keeping the robot stable, adjusting the robot’s posture, etc.

● Tips:

  1. You can use the robot control library in ROS, such as ROS Control.
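For the posture-control quiz, the orientation quaternion typically has to be converted to roll/pitch/yaw first. A minimal sketch using tf.transformations (the sample quaternion is a made-up value, and the tf package must be installed):

from math import degrees
from tf.transformations import euler_from_quaternion

# Example quaternion: a 30-degree rotation about the x axis (made-up sample value).
qx, qy, qz, qw = 0.2588, 0.0, 0.0, 0.9659
roll, pitch, yaw = euler_from_quaternion([qx, qy, qz, qw])
print("roll=%.1f pitch=%.1f yaw=%.1f (deg)" % (degrees(roll), degrees(pitch), degrees(yaw)))
# In a node, you would feed msg.orientation from the IMU topic into this conversion
# and drive your posture controller (e.g. via ros_control) from the resulting angles.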

About Limo

Limo is a smart educational robot made by AgileX Robotics. For more details, please visit: https://global.agilex.ai/

https://preview.redd.it/nj6pos502ned1.png?width=680&format=png&auto=webp&s=66db03d9996744d1f5ca372ec3ed9274760cbf6c

If you are interested in Limo or have technical questions about it, feel free to join the AgileX Robotics community. Let's talk about it!

0 Comments
2024/07/25
10:03 UTC

1

Motor driver (Cytron) blows out; unable to figure out the reason.

Hi

I am using a Cytron MD30C R2 to drive my rover via a Pixhawk, with an Arduino providing PWM (and direction). The driver is rated for 30 A continuous and 80 A peak. I am using 24 V, 75 kg·cm, 100 RPM brushed DC motors, two per driver with their terminals connected in parallel, so I use two motor drivers for four motors.

The motor driver works perfectly fine under no load, but after driving for a while and changing directions, one of the motor drivers stops working: its light dims and the MOSFETs start heating up. The driver then becomes unusable.

Any help would be appreciated. Thanks !!

0 Comments
2024/07/25
09:47 UTC

1

RC crawler compatibility check

2 Comments
2024/07/25
07:55 UTC

0

Power supply

Does anyone know anything about AI, tri-redundant processors, and armored power? I'm almost an amateur at this stuff. I know nothing about servos or how much voltage and current to feed them. My guess is 48 V and 50 A. That's a very high current that today's batteries can't supply at light weight. My guess is lithium polymer. It would have to be distributed, and I don't see how to supply that voltage and current without big batteries.

10 Comments
2024/07/25
07:37 UTC

4

Looking for a robotic arm

Hi, I'm looking to buy a robotic arm that uses brushless motors, ideally with at least a 1 kg payload (not a deal breaker). I can spend around £8,000/$10,300. I was hoping for one that is more like a straight robot arm, because I'd like to be able to make human-like movements. I've seen the Ufactory arms and they look very good, but their shape wouldn't work for exactly what I want to do with it. Any suggestions are really appreciated, thanks :)

3 Comments
2024/07/25
07:02 UTC

8

What level of accuracy should I expect from SLAM?

Hi, not too new to robotics, but new to SLAM here. Practically speaking, what level of accuracy can I expect from running visual+IMU (inertial) SLAM? For example, if I feed a 720p video to ORB-SLAM3 with well-calibrated intrinsics, is it accurate to 10 cm? 1 cm?

I'm working on a project where trajectories are computed from videos shot by cameras equipped with an IMU, hence the question. Thank you.

18 Comments
2024/07/24
17:18 UTC

102

BB1-zero update! Beefier arms, egg test passed! Great success!

The bigger, beefier arms are working great so far! He lost his jitters!

12 Comments
2024/07/24
17:09 UTC

0

Robotic solutions for deploying an ultra high-pressure water jet in highly radioactive and confined environments

Post Operational Clean Out (POCO) of the Sellafield site requires the internal cleaning of large, highly radioactive vessels with complex internal configurations. Sellafield Ltd is seeking innovative solutions for deploying vessel cleaning heads and ultra-high/high-pressure water jetting or spray nozzles to apply decontaminating chemicals and foams inside these structures.

The vessels' internal structures can include cooling/heating coils; fluid mixing and transfer devices; instrument dip-legs; internal vessels; and pipe support brackets. All of these internal structures require full surface decontamination. Any tool deployed to clean these internal structures would need to be able to navigate around them whilst cleaning them.

Sellafield Ltd is looking for technology that can deploy an ultra-high pressure water jetting (UHPWJ) head or other tools into a vessel that can:

  • Cover the full internal surface of the vessel including internal pipework and structures.
  • Operate at a nozzle-to-metal-surface distance of 50 mm to 100 mm.

The aim is to decontaminate the vessel by removing the entire internal surface layer of metal.

Access into the vessel is achieved via 150mm inspection ports that pass through the 1.5m thick concrete cell wall, then through a 150mm diameter access port on the metal wall of the vessel itself.

Technology is being developed to cut an access port into a vessel and then fit a removable access plug through which any technological solution could be deployed.

Some vessels have open tops or engineered access ports, but many don’t and would need access to be created. Therefore, the smaller and lighter the solution the better.

Aggressive chemicals can be deployed by filling the entire vessel undergoing POCO; however, this produces unacceptable volumes of effluent. The ability to spray chemicals onto the internal surfaces of the vessel would use less of the reagent and reduce the effluent challenge. Ultra-high pressure water jetting (UHPWJ) or electrochemical methods would remove the need for reagents, if the deployment challenge can be solved.

FIND OUT MORE

See linked the full challenge statement: https://www.gamechangers.technology/static/u/Challenge%20statement%20-%20Directional%20decontamination%20head%20deployment%20into%20large,%20congested,%20highly%20radioactive%20vessels.pdf

Visit www.Gamechangers.technology and head to the challenges section to apply.

Successful solutions will receive £10,000 for a 12-week feasibility study, which may then lead to further funding for a proof of concept.

An interactive webinar will take place at 10:30am on Wednesday 7th August 2024 where delegates will have the chance to hear directly from the challenge owners and ask any questions. Attendance is free - register here.

The deadline for applications for this challenge is 2pm on Wednesday 21 August 2024.

0 Comments
2024/07/24
14:22 UTC

406

Programming my AR4 to be a cameraman

It's still a bit slow, but I am working on increasing the acceleration while keeping things smooth.

30 Comments
2024/07/24
12:27 UTC

1

Parrot Minidrone Competition

Hey folks, I was trying to create a path plan for the Parrot minidrone to follow a red line. I've tried some algorithms, but I'm facing a signal issue getting my vision-based data into my control system: I'm not able to use a Bus Selector to route the signal from the vision-based data to the input of my Stateflow chart, because the input I need doesn't appear in the bus signal data.

Is anyone out there who can help me out!?

0 Comments
2024/07/24
11:41 UTC

2

Difference between Unitree Go2 versions

Hi! I am considering buying the Unitree Go2 robot, but I still don't understand the differences between the three versions (AIR, PRO, EDU). I know that the EDU version allows for secondary development, but I am not sure what the benefits of that are. I want to add different sensors to the robot and automate its movement across different places (without manual control). Is the EDU version required for this? Thanks in advance.

3 Comments
2024/07/24
06:33 UTC

1

Help with Raspberry Pi for Robot Project (hobby not homework)

I am using the very basic test code provided at the end of the video linked below (I'm basically trying to rebuild her robot with a few extra mods, but I haven't added the mods yet).

https://www.youtube.com/watch?v=Bp9r9TGpWOk

I'll also copy the code here.

I keep getting this error:

RuntimeError: The GPIO channel has not been set up as an OUTPUT

It marks the error at the first line of my forward function.

What am I doing wrong?

# Imports needed by this snippet
import time
import RPi.GPIO as GPIO
from gpiozero import DistanceSensor

# GPIO settings
GPIO.setwarnings(False)
GPIO.setmode(GPIO.BOARD)

# labeling pins
GPIO.setup(29, GPIO.OUT)
GPIO.setup(31, GPIO.OUT)
# 30 GND
GPIO.setup(32, GPIO.OUT)
GPIO.setup(33, GPIO.OUT)

# ultrasonic setup
ultrasonic = DistanceSensor(echo=17, trigger=4)

# Wheel control
class Robot:

    def __init__(self, name, rwheel, lwheel):
        self.name = name
        self.rwheel = tuple(rwheel)
        self.lwheel = tuple(lwheel)
        self.rwheel_f = int(rwheel[0])   # right wheel, forward pin
        self.rwheel_b = int(rwheel[1])   # right wheel, backward pin
        self.lwheel_f = int(lwheel[0])   # left wheel, forward pin
        self.lwheel_b = int(lwheel[1])   # left wheel, backward pin

    # methods
    def forward(self, sec):
        GPIO.output(self.rwheel_f, True)
        GPIO.output(self.lwheel_f, True)
        # stop
        time.sleep(sec)
        GPIO.output(self.rwheel_f, False)
        GPIO.output(self.lwheel_f, False)

    def backward(self, sec):
        GPIO.output(self.rwheel_b, True)
        GPIO.output(self.lwheel_b, True)
        # stop
        time.sleep(sec)
        GPIO.output(self.rwheel_b, False)
        GPIO.output(self.lwheel_b, False)

    def lturn(self, sec):
        GPIO.output(self.rwheel_f, True)
        # stop
        time.sleep(sec)
        GPIO.output(self.rwheel_f, False)

    def rturn(self, sec):
        GPIO.output(self.lwheel_f, True)
        # stop
        time.sleep(sec)
        GPIO.output(self.lwheel_f, False)

# establishing object
smelly = Robot("smelly", (29, 31), (32, 33))

# test run
smelly.forward(3)
smelly.backward(3)
smelly.lturn(3)
smelly.rturn(3)

11 Comments
2024/07/24
04:32 UTC

1

Supporting Animations with Arduino

Hello everyone,

I am working with the Arduino Mega on a water enrichment project and need help. Our objective is to create an HMI system for our piping and tank system prototype. The HMI should display temperature, pressure, and O2/CO2 concentrations in water; the sensors and motors are connected to a control system via the Arduino Mega. It should also be able to display an animation of the tank levels rising and falling, as well as the piping filling up with gas and water.

The issue is that our current touchscreen, the Nextion Basic 7'' HMI LCD Touch Display, only supports images, not animations. We are looking for a touchscreen on which we can create and run the animation ourselves, while remaining compatible with the Arduino Mega (or at least attachable through a module that is compatible). Unfortunately, my team and I are under a one-month deadline, so we cannot purchase screens from outside Canada. I would appreciate some guidance on how to resolve this issue.

Thank you so much for your help, I appreciate any advice on our issue.

Hamna

3 Comments
2024/07/24
02:33 UTC

2

Best Software for this UAV and Sensor Simulation Project?

Hi everyone!

I'm starting a project to simulate a UAV and its sensors for two purposes:

  1. Testing navigation

  2. Testing the computer vision system

For example, if I make a drone to count cars in a parking lot, I need to test its navigation and counting accuracy.

The simulation software should support various sensors and have great graphics. I'm considering Blender for making the model and Unreal Engine or Unity for making the environment. I need to simulate plant growth using a mathematical model.

Currently, I'm thinking of creating models in Blender, using them to build the environment in Unreal Engine 5, and then simulating in AirSim.

If you have any suggestions, even those not directly related to the question, please don't hesitate to comment! Thank you for reading!

2 Comments
2024/07/23
23:30 UTC

21

Made a Harmonic Drive simulator, any feedback?

2 Comments
2024/07/23
20:21 UTC

14

BB1-zero update - pi4robot - bigger beefier arms v3.0 WIP and Anti deer / raccoon “BBBB turret”

Hello!

I have been working on figuring out the arms all week. I've been running into problems, but I'm tackling them as best I can.

I'm currently waiting on a servo from Amazon to finish the work. But I figured I'd show you BB1's "Anti Deer / Raccoon BBBB turret" 🙏🏽 (it is functional! video coming soon).

The arms also mean changing how the power is handled, so I'll probably have to add another voltage regulator and some other stuff 🤔. WIP 🙏🏽

0 Comments
2024/07/23
20:15 UTC

83

What’s a robot?

Roboticist Ali Ahmed, Co-founder & CEO of Robomart, defines what criteria must be met for something to be considered an autonomous robot.

Btw, I'm the host, and I'm from the XR space. Ali is my guest; I thought I'd post it here, though it might be very basic, haha. They're doing some cool stuff, so I thought I'd share.

Full interview

33 Comments
2024/07/23
19:45 UTC

37

What advice do you have for someone who wants to start a robotics company in a garage?

What advice would you give to a young entrepreneur building a self-funded robotics company in a garage?

(in terms of early marketing, fundraising, manufacturing, team building, or anything else)

48 Comments
2024/07/23
16:51 UTC
