/r/robotics
My googling (and Ali Express'ing) is not finding a wheel that could fit one of these motors:
Any ideas on such a tiny little wheel?
I'm looking for some external method of measuring the position of a robotic arm in real time. Ideally I would be able to load in a skeleton of the arm, and the measurement tool would respond with joint angles and end effector position.
Has anyone encountered something like this? From what I've seen, normal 3D scanners just return a full model with no concept of this "skeleton". And the robotic tracking tools I've found (e.g., OptiTrack) seem more geared toward drone positioning than toward the details of a single arm's orientation.
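To clarify what I'm after: if a tracker can return 3D positions of markers placed on consecutive links, a revolute joint angle should be recoverable with basic geometry. A minimal sketch of that step (the marker positions below are made up for illustration):

```python
import numpy as np

def joint_angle(p_prev, p_joint, p_next):
    """Angle at p_joint between links (p_prev -> p_joint) and (p_joint -> p_next)."""
    u = p_joint - p_prev
    v = p_next - p_joint
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))

# Made-up marker positions along a two-link chain (metres)
base  = np.array([0.0, 0.0, 0.0])
elbow = np.array([0.3, 0.0, 0.0])
tip   = np.array([0.5, 0.2, 0.0])
print(np.degrees(joint_angle(base, elbow, tip)))  # bend angle at the elbow joint
```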
I am currently working on a project that aims at controlling a robotic arm (UR10e) using an RGB camera and hand gestures.
Currently I have managed to build the hand detection and gesture recognition pipeline using MediaPipe, and I defined some semantics for the hand gestures in order to control the robot. To send commands to the robot I use ROS Noetic and the official UR package.
Right now I can control the robot in 3 ways: joint space, Cartesian space in the world reference frame, and Cartesian space in the end effector reference frame. With my left hand I select the joint/axis, while with the right hand I select positive or negative movement. For example, assuming I am doing control in joint space, if my left hand has three fingers up and my right hand is pointing up, the robot will start moving the third joint counterclockwise. This is not a particularly fancy motion planning framework, just "go left" or "go right" kinds of commands.
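Roughly, the gesture-to-command mapping looks like the sketch below (a simplified illustration, not my exact pipeline; the topic name, message type, and jog speed are placeholders):

```python
import cv2
import mediapipe as mp
import rospy
from std_msgs.msg import Float64MultiArray

TIPS, PIPS = [8, 12, 16, 20], [6, 10, 14, 18]  # fingertip / PIP landmark ids

def fingers_up(hand):
    # Image y grows downward, so a fingertip above its PIP joint counts as "up"
    return sum(hand.landmark[t].y < hand.landmark[p].y for t, p in zip(TIPS, PIPS))

rospy.init_node("gesture_jog")
pub = rospy.Publisher("/gesture_jog/joint_velocity",  # placeholder topic
                      Float64MultiArray, queue_size=1)
hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while not rospy.is_shutdown():
    ok, frame = cap.read()
    if not ok:
        continue
    res = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    joint, sign = None, 0.0
    if res.multi_hand_landmarks:
        for lm, handed in zip(res.multi_hand_landmarks, res.multi_handedness):
            if handed.classification[0].label == "Left":
                joint = fingers_up(lm) - 1  # e.g. three fingers up -> joint index 2
            else:
                # Right index fingertip above its PIP -> positive jog, else negative
                sign = 1.0 if lm.landmark[8].y < lm.landmark[6].y else -1.0
    if joint is not None and 0 <= joint < 6:
        cmd = [0.0] * 6
        cmd[joint] = 0.1 * sign  # placeholder jog speed in rad/s
        pub.publish(Float64MultiArray(data=cmd))
```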
I think this is cool to see, but it is not particularly useful. So I was wondering: what could be an application of this completely touch-free control (in the sense that you do not need to touch anything in order to control the robot)? What other things could I do to make my tool useful in a real-world scenario?
Hi there, I am doing research on mobile robots. I am going to integrate DRL (which I'm newly exploring) on a TurtleBot3 and am planning to build a multi-robot system. But I can't decide whether to use a Gymnasium environment or Gazebo. Any suggestions?
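For context, here is roughly what the Gymnasium side would involve: a minimal env skeleton where the TurtleBot3 hookup is left as a hypothetical stub (in practice it would wrap Gazebo or a lighter 2D simulator):

```python
import gymnasium as gym
import numpy as np

class TurtleBot3NavEnv(gym.Env):
    """Skeleton DRL environment; the simulator hookup is left as a stub."""

    def __init__(self):
        # Action: [linear velocity m/s, angular velocity rad/s]
        self.action_space = gym.spaces.Box(
            low=np.array([0.0, -1.5], dtype=np.float32),
            high=np.array([0.22, 1.5], dtype=np.float32))
        # Observation: e.g. a downsampled laser scan plus a goal vector
        self.observation_space = gym.spaces.Box(
            low=-np.inf, high=np.inf, shape=(26,), dtype=np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        obs = np.zeros(26, dtype=np.float32)  # would come from the simulator
        return obs, {}

    def step(self, action):
        # Forward `action` to Gazebo (or another simulator) here,
        # then read back sensors and compute the reward.
        obs = np.zeros(26, dtype=np.float32)
        reward, terminated, truncated = 0.0, False, False
        return obs, reward, terminated, truncated, {}
```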
https://docs.zeroth.bot/build/bom
I've been stressed out. I have ZERO electronics experience but I want to build this.
It says "12V to 5V, 3 amp capacity (may need connectors)." Does it or does it not need connectors? And if so, which connector do I buy? Are the Molex connectors listed compatible with this?
So I've been chasing codes. A tech switched wires to troubleshoot the original issue: wires 3E01 and 5H57 were swapped with 3E03 and 5H63. Powered back on, the same original issue remained, but the machine ran fine. At the end of the day they shut the machine down and switched the wires back to how the print represents them, and now we're getting these alarms on the pendant:
ALARM 1051 TASK#0 SETUP INITIALIZE ERROR(MOTION) [1]
ALARM 1050 SET-UP PROCESS ERROR(SYSCON) [2]
ALARM 4109 DC 24V POWER SUPPLY FAILURE(I/O) [1111_1111_1111_1111]
ALARM 0020 CPU COMMUNICATION ERROR [50]
ALARM 0011 CPU BOARD INSERTION ERR.(SAFETY) [0000_0001]
Also found a broken fan fuse.
Hi, I’m looking for a robotic arm that can paint with a brush.
- Payload: under 500 g
- Reach: ideally 0.5 to 1 m
- 5 or 6 DOF
- Budget: up to 5,000 USD
- Precision of 1 mm would be good enough
- Human collision detection would be great
I would like to control the arm using code; I'm not interested in teaching it by recording human-generated movements.
I’m considering the AR4, but I also see ads for second-hand industrial arms like a KUKA KR210.
The Z1 Air is over budget.
A SCARA robot could also work.
Thank you in advance
Hey y'all! I'm laid off now, so I've had some time to work on fleshing this lil guy out. Still a learning work in progress. Everything from scratch. 🙏🏽
Utilizing TensorFlow Lite for image recognition.
Pi 5 robot controlling 4 ESP32 chips.
I've developed an open-source motion planner that achieves sub-millisecond planning times for moderately complex problems with articulated robots. https://github.com/HiroIshida/plainmp
I initially developed it as a general robotics programming framework including IK and trajectory optimization in addition to motion planning, but I ended up focusing on optimizing the motion planning component just for fun. The performance improvements came through persistent tuning using perf profiling. While it doesn't match VAMP's performance (undisputedly the world's fastest motion planner as of 2024), I think it's interesting that persistent tuning, without any groundbreaking innovations, still achieved sub-millisecond planning times.
The planner achieves median planning times of 0.17 ms for dual-bars scenarios and remains under 1 ms for more complex setups like ceiled dual bars (0.65 ms) and fetch table scenarios (0.62 ms), as shown in the attached figure.
I still have several ideas to make this even faster (but I'm currently writing my PhD thesis, so this will have to wait until after next April). Please look forward to future updates!
Hello, robot enthusiasts!
This is an announcement and call for participation in the League of Robot Runners 2024, a multi-season 🚀 competition and research initiative 🚀 tackling one of the most challenging problems in industrial optimisation: Multi-Robot Path Planning (sometimes also called Multi-Agent Path Finding).
The competition is inspired by current and emerging applications that rely on mobile robotics 🦾🤖, for example Amazon's automated warehouses, where thousands of robots work together to ensure safe and efficient package delivery 🧸📦 🚚 ❤️.
Now in its second season, the competition focuses on two core challenges:
Both setups are online and real-time, which means the clock ticks while you compute. Complete as many tasks as possible before time runs out!
We think the competition is especially interesting for robotics researchers and practitioners:
Participating in this competition is a great way to showcase your 💡 ideas and implementations 💡 to a global audience of academic and industry experts. After the competition, problem instances and submissions are open-sourced, which increases your visibility, lowers entry barriers for others and helps the community to grow and learn 👩🏫 🤔 📚 🎓.
There is a $10,000 USD prize pool for 🌟 outstanding performances 🌟 across three different categories. We’re also offering training awards in the form of $1,000 USD AWS credits to help participants reduce their offline computational costs 😻.
Submissions are open anytime, and evaluation results are available immediately on our live leaderboard. The competition runs until 📅 February 16, 2025 📅, with results announced in March 2025.
It’s easy to get started! We provide you with a simulator and code harness (the “start kit”), many example problems, and a visualiser to explore generated solutions. You also have access to last year’s best-performing planner as a baseline. Visit our website for all the details (www.leagueofrobotrunners.org), or post here if you have questions!
I'm creating a fairly basic robotic arm and just have the pieces connected to the servos directly, but it seems like it's stressing them out, and I feel that may not be wise in the long run. How would I go about taking stress off of the servo itself and directing it elsewhere, maybe with a bearing or something?
Just for context, I'm using MG90S micro servos.
I was wanting to get my daughter a Lego Mindstorms kit for Christmas but was sad to see they are retired. There are a ton of programmable robots on Amazon, and I was curious if anyone here has experience with them and could recommend one. I would love to stay around $150 or less if possible. Oh, and if we could upload custom sounds for it to play, it would make her day. Thank you!
Hello everyone,
This is my first time building a wheeled robot from scratch, and I’m looking for feedback or suggestions on the parts list I’ve put together. The goal is to build a 4WD robot capable of basic navigation and high-level processing.
PS: let me know if I should post it in a different subreddit.
Here’s what I’ve got so far:
This might be a brilliant or a very stupid idea, but is there anything like a lidar that uses both laser and ultrasonic sensing? (A rapidly rotating 360° device.)
Many people know that most humanoids nowadays are teleoperated. But how do we evaluate their intelligence? I put together these questions to ask when evaluating a humanoid's capabilities. I would like to hear your feedback:
Can someone please tell me what this sensor actually does? And is it mostly for robots? https://www.waveshare.com/10-dof-ros-imu-a.htm
There didn't seem to be enough information on the page to understand what it does, and any help would be great. Thank you!
Hi, I'm a beginner, and I couldn’t find the answer in forums or the manual. If you have a manual or something where I can check this info, please let me know.
When setting the mass in kilograms after defining the tool in ABB on the teach pendant, I was taught to use mass=1 (ABB Set Tool Mass). From what I understood about payload configuration on KUKA robots, the value '1' is an absolute value, meaning the maximum load capacity of the robot (for example, 6 kg on a KUKA Agilus sixx). But if you want to set a specific weight, like 2 kg, you'd input '2'. Also, I know it's not recommended to run the robot at its max payload.
So, with ABB, what does the value '1' mean? I’ve seen it a lot in tutorials, and that’s how I was taught, but nobody explained why.
Thanks for your help!
Does anyone have suggestions for a microphone array that can be used with ROS on a Jetson Orin for sound source localization?
I'm looking to add a microphone array so the system can listen and detect what direction a sound came from.
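For reference, my understanding is that direction-of-arrival from a mic pair boils down to estimating the time delay between channels, e.g. with GCC-PHAT. A minimal sketch with a synthetic two-channel signal (the mic spacing and sample rate are example values, not tied to any specific array):

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau):
    """Estimate the delay of `sig` relative to `ref` via phase-transform GCC."""
    n = sig.size + ref.size
    R = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
    cc = np.fft.irfft(R / (np.abs(R) + 1e-15), n=n)
    max_shift = int(fs * max_tau)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    return (np.argmax(np.abs(cc)) - max_shift) / fs

fs, d, c = 16000, 0.05, 343.0       # sample rate, 5 cm mic spacing, speed of sound
rng = np.random.default_rng(0)
mic1 = rng.standard_normal(fs)      # 1 s of noise as a stand-in test signal
mic2 = np.roll(mic1, 2)             # simulate a 2-sample inter-mic delay
tau = gcc_phat(mic2, mic1, fs, max_tau=d / c)
angle = np.degrees(np.arcsin(np.clip(tau * c / d, -1.0, 1.0)))
print(f"delay {tau * 1e6:.0f} us -> bearing ~{angle:.0f} deg off broadside")
```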