/r/robotics
In today's rapidly evolving digital age, artificial intelligence (AI) is transforming how we live, work, and learn at an astonishing pace. Whether it's smart assistants in daily life or data analysis in business, AI applications are profoundly impacting many fields. I'm very curious about the fascinating changes that occur when AI is integrated into robots 🌟. For example, could they play table tennis with us? 😝
I am building a pick-and-place robotic arm using servo motors and 3D-printed parts. The goal is to make it autonomous, so it can identify and move objects on its own. As of now, it is joystick-operated.
https://reddit.com/link/1ecfn8u/video/dlv684y8nsed1/player
I have created a Hackaday page for it which contains the STL files for 3D printing and the components list, and I'll soon be adding build instructions & project logs. If you want to know more about my projects, drop in your email here, where I share monthly updates on what I am working on.
I have seen this company Amber Robotics, and their products look good and well priced for what they are. However, on the Kickstarter for their new robot, they haven't shipped products to backers yet and keep giving reasons, and some people believe it's a scam because of this: https://www.kickstarter.com/projects/amberobotics/lucid-1-zero-code-and-ai-planning-7-axis-portable-robotic-arm/comments . I wanted to buy the Lucid One Pro from their site https://shop.amberobotics.com/collections/all?page=2 , but I'm wondering if anyone knows whether they're legit, or has bought the Lucid One from the site and actually had it delivered. I know people who bought their first robot; it arrived and was exactly as described, with no issues. So does anyone think it's safe to buy, or should I avoid them and find a different option? Sorry, I'm just worried about spending this amount of money, and I can't find any information from anyone who bought a Lucid One from their site.
Coming straight out of college with a marketing degree, I never thought in a million years that I’d be building a brain for a robot, yet here I am. The world of programming is a daunting place for anyone, especially someone without any coding experience, but I’ve been exploring this fascinating field thanks to a user-friendly platform called Neurorobotics Studio.
Neurorobotics Studio allows you to create custom behaviors for robots based on biological neuroscience principles, using both physical and simulated robots. Recently, I worked on a project to create both an autonomous and a body-motion-tracking game of Pong. Here’s how I approached it:
These projects have shown me the incredible potential of neurorobotics. Imagine a future where robots perform precise tasks to assist in healthcare, making ethical decisions that mimic human learning. Neurorobotics Studio’s open-source format encourages community collaboration, which is crucial for rapid progress in this field.
I’m excited to see what others can innovate and how we can collectively advance this technology. Don’t just take my word for it; explore this field yourself and share your creations!
Technical Details and Challenges:
Potential Use Cases:
Feel free to ask any questions or provide feedback on my approach and the use of Neurorobotics Studio. I’m eager to engage with the community and learn from your insights.
Hello, I am creating a robot with the intention of having it mow my lawn. I am using an Arduino and want to learn about machine vision, so I plan to add a camera for obstacle avoidance. Does anyone have recommendations for good cameras to use, or any good methods of obstacle avoidance?
I have two servo motors that I want to control using a joystick. I followed a tutorial on how to do it and followed each step exactly, but the servos are really jittery and move without any input from me. I'll put a pic of the code in the comments.
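For jitter like this, besides hardware causes (a floating analog pin, or servos browning out a shared 5 V supply), a software deadband plus smoothing on the joystick reading often helps. Here is a minimal sketch of the idea in Python; the function name `smooth_joystick`, the 10-bit ADC range, and all thresholds are illustrative assumptions, not taken from the tutorial:

```python
def smooth_joystick(raw, prev_angle, center=512, deadband=30, alpha=0.2,
                    raw_max=1023, angle_range=180):
    """Map a raw 10-bit joystick reading to a servo angle.

    Readings within `deadband` of center are ignored (hold position),
    and the output is low-pass filtered so noise does not twitch the servo.
    """
    if abs(raw - center) < deadband:
        target = prev_angle                    # inside deadband: hold position
    else:
        target = raw * angle_range / raw_max   # linear map 0..1023 -> 0..180
    # exponential smoothing: blend the new target with the previous angle
    return prev_angle + alpha * (target - prev_angle)
```

The same structure ports directly to an Arduino sketch; the key ideas are the deadband around the stick's rest position and the smoothing factor `alpha`.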
I'm currently in senior high and trying to do robotics research, but I'm unable to think of a topic; any suggestions are very helpful. Our budget is about $500. Thanks.
Is there an rviz2 graph plugin? I found something similar in the jsk_visualization repository, and also graph_rviz_plugin, but both of them are written for ROS Noetic. Is there something similar for ROS 2? Thanks!
IMU stands for inertial measurement unit, a device composed of three single-axis accelerometers and three single-axis gyroscopes. The accelerometers measure the object's acceleration along three independent axes of the body (carrier) frame, while the gyroscopes measure the body's angular velocity relative to the navigation frame. After processing these signals, the attitude of the object can be calculated.
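As a concrete sketch of that processing step, a complementary filter is one common way to fuse the two signals into an attitude estimate: the integrated gyro tracks fast motion, while the accelerometer's gravity reading corrects the gyro's slow drift. A minimal single-axis version in Python (the function name and the 0.98 blend factor are illustrative assumptions, not a specific product's algorithm):

```python
import math

def complementary_pitch(pitch_prev, gyro_rate, ax, az, dt, alpha=0.98):
    """One update of a complementary filter for pitch (radians).

    gyro_rate : pitch angular velocity from the gyroscope (rad/s)
    ax, az    : accelerometer readings along the body x and z axes (g)
    The gyro term tracks fast motion; the accelerometer term, which
    senses gravity, slowly corrects the drift of the integrated gyro.
    """
    pitch_gyro = pitch_prev + gyro_rate * dt   # integrate angular rate
    pitch_accel = math.atan2(ax, az)           # tilt implied by gravity
    return alpha * pitch_gyro + (1 - alpha) * pitch_accel
```

Run at the IMU's update rate (e.g. 100 Hz), this converges to the accelerometer's tilt when stationary while still responding instantly to rotation.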
It is worth noting that an IMU provides relative positioning information: it measures the path traveled relative to the starting point, so it cannot tell you your absolute location by itself. It is therefore often used together with GPS. Where the GPS signal is weak, the IMU takes over, allowing the car to keep estimating its absolute position by dead reckoning from the last good fix, so it does not get lost.
In fact, the mobile phones we use every day, the cars and airplanes we take, and even missiles and spacecraft all use IMU. However, the cost and accuracy vary.
According to different scenarios, IMU has different requirements for accuracy. High accuracy also means high cost.
Low-precision IMU: used in ordinary consumer electronic products. This low-precision IMU is very cheap and is commonly used in mobile phones and sports watches. It is often used to record the number of steps.
Medium-precision IMU: used in unmanned driving. The price ranges from a few hundred to tens of thousands of yuan, depending on the positioning accuracy requirements of the unmanned vehicle.
High-precision IMU: used in missiles or spacecraft. Take missiles as an example: from launch to impact, an aerospace-grade IMU can maintain extremely high-precision navigation, with errors even below one meter.
Beyond accuracy and cost, an IMU has two other critical characteristics. The first is a high update frequency: operating rates can exceed 100 Hz. The second is high short-term accuracy: over brief intervals, the computed motion accumulates little error.
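The flip side of this short-term accuracy is drift: any constant sensor bias gets integrated into an ever-growing error, which is why IMU-only positioning is trusted only over short windows between GPS fixes. A small Python sketch (the helper name and bias value are made up for illustration) double-integrates a constant accelerometer bias, showing the position error growing roughly with the square of elapsed time:

```python
def dead_reckoning_error(accel_bias, rate_hz, seconds):
    """Position error from double-integrating a constant accelerometer
    bias (m/s^2) at a given IMU update rate, as in pure dead reckoning."""
    dt = 1.0 / rate_hz
    vel_err = 0.0   # accumulated velocity error (m/s)
    pos_err = 0.0   # accumulated position error (m)
    for _ in range(int(seconds * rate_hz)):
        vel_err += accel_bias * dt   # first integration: bias -> velocity
        pos_err += vel_err * dt      # second integration: velocity -> position
    return pos_err
```

With a 0.02 m/s² bias at 100 Hz, the error is about a centimeter after one second but roughly a meter after ten, which is why fusion with an absolute reference like GPS matters.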
The IMU message under ROS (sensor_msgs/Imu) looks like:
std_msgs/Header header
  uint32 seq
  time stamp
  string frame_id
geometry_msgs/Quaternion orientation
  float64 x
  float64 y
  float64 z
  float64 w
float64[9] orientation_covariance
geometry_msgs/Vector3 angular_velocity
  float64 x
  float64 y
  float64 z
float64[9] angular_velocity_covariance
geometry_msgs/Vector3 linear_acceleration
  float64 x
  float64 y
  float64 z
float64[9] linear_acceleration_covariance
This message type provides IMU data, including orientation, angular velocity, and linear acceleration. Here’s a detailed explanation of each part:
seq: sequence number of the message.
stamp: timestamp indicating when the message was generated.
frame_id: identifier of the reference coordinate frame.
orientation (x, y, z, w): components of the quaternion representing the IMU’s current orientation.
orientation_covariance: a 9-element array holding the row-major covariance matrix of the orientation, indicating the uncertainty of the orientation measurement.
angular_velocity (x, y, z): components of the angular velocity about the three axes.
angular_velocity_covariance: a 9-element array holding the covariance matrix of the angular velocity, indicating the uncertainty of the angular velocity measurement.
linear_acceleration (x, y, z): components of the linear acceleration along the three axes.
linear_acceleration_covariance: a 9-element array holding the covariance matrix of the linear acceleration, indicating the uncertainty of the linear acceleration measurement.
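To make the orientation field concrete, the quaternion (x, y, z, w) can be converted to roll/pitch/yaw angles for plotting or debugging. A minimal Python sketch, assuming the ZYX (yaw-pitch-roll) convention commonly used with ROS data (the helper name is illustrative):

```python
import math

def quaternion_to_euler(x, y, z, w):
    """Convert an orientation quaternion (as in sensor_msgs/Imu)
    to (roll, pitch, yaw) in radians, ZYX convention."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # clamp guards against tiny numerical overshoot outside [-1, 1]
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```

In a ROS node you would feed it `msg.orientation.x`, `.y`, `.z`, `.w` from the incoming Imu message.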
roslaunch limo_bringup limo_start.launch
rosbag record -O bag_name.bag /topic1_name /topic2_name /xxx
Press Ctrl+C to end the recording. The file is automatically saved in the current directory under the name given after -O (imu.bag in the examples below).
To record only specific topics, list their names after the bag file name:
rosbag record -O bag_name.bag /topic1_name /topic2_name /xxx
rosbag play -r 0.5 imu.bag
The terminal will display:
Replay the recorded IMU data, then open a new terminal and input:
rqt_plot
Close the LIMO driver.
In the rqt_plot interface, choose the IMU topic data and click ‘+’. You will then see the three angular velocity components of the IMU plotted on the right side of the interface, and you can watch the IMU data changing.
● Use ROS and rviz to visualize IMU sensor data.
Requirements:
● Tips:
● Use IMU sensors to implement robot posture control.
Requirements:
● Tips:
Limo is a smart educational robot published by AgileX Robotics. For more details, please visit: https://global.agilex.ai/
If you are interested in Limo or have technical questions about it, feel free to join the AgileX Robotics community. Let’s talk about it!
Hi
I am using a Cytron MD30C R2 to drive my rover via Pixhawk, using an Arduino for PWM (and direction). My driver is rated for 30 A continuous and 80 A peak. I am using 24 V, 75 kg·cm torque, 100 RPM brushed DC motors (two on one motor driver, with the terminals shorted together), and two motor drivers for the four motors.
The motor drivers work perfectly fine under no load, but after driving for a while and changing directions, one of them stops working: the light dims and the MOSFETs start heating. The driver becomes unusable.
Any help would be appreciated. Thanks !!
Checking if all of this is compatible for an RC crawler build, or if I need to add any more parts. Also, could you give a link to a professional rock crawler?
Do you know anything about AI, triple-redundant processors, and armoured power? I'm almost an amateur at this stuff. I know nothing about servos and how much voltage and current to feed them. My guess is 48 V and 50 A. That's a very high current that current batteries can't supply at light weight; my guess is lithium polymer. It would have to be distributed, and I can only get that voltage and current with big batteries.
Hi, I'm looking to buy a robotic arm that uses brushless motors, ideally with at least a 1 kg payload (not a deal breaker). I can spend around £8,000/$10,300. I was hoping for one shaped more like a straight robot arm, because I'd like to be able to make human-like movements. I've seen UFactory arms and they look very good, but their shape wouldn't work for exactly what I'm wanting to do. Any suggestions are really appreciated, thanks :)
Hi, not too new to robotics but new to SLAM here. Practically speaking, what level of accuracy can I expect from running visual+IMU (inertial) SLAM? For example, if I feed 720p video to ORB-SLAM3, with well-calibrated intrinsics, is it accurate to 10 cm? 1 cm?
I'm working on a project where trajectories are computed from videos shot by cameras equipped with imu, hence the question. Thank you.
Bigger, beefier arms are working great so far! He lost his jitters!
Post Operational Clean Out (POCO) of the Sellafield site requires the internal cleaning of large, highly radioactive vessels with complex internal configurations. Sellafield Ltd is seeking innovative solutions for deploying vessel cleaning heads and ultra-high/high-pressure water jetting or spray nozzles to apply decontaminating chemicals and foams inside these structures.
The vessels' internal structures can include cooling/heating coils; fluid mixing and transfer devices; instrument dip-legs; internal vessels; and pipe support brackets. All of these internal structures require full surface decontamination. Any tool deployed to clean these internal structures would need to be able to navigate around them whilst cleaning them.
Sellafield Ltd is looking for technology that can deploy an ultra-high pressure water jetting (UHPWJ) head or other tools into a vessel that can:
The aim is to decontaminate the vessel by removing the entire internal surface layer of metal.
Access into the vessel is achieved via 150mm inspection ports that pass through the 1.5m thick concrete cell wall, then through a 150mm diameter access port on the metal wall of the vessel itself.
Technology is being developed to cut an access port into a vessel and then fit a removable access plug through which any technological solution could be deployed.
Some vessels have open tops or engineered access ports, but many don’t and would need access to be created. Therefore, the smaller and lighter the solution the better.
Aggressive chemicals can be deployed by filling the entire vessel undergoing POCO; however, this produces unacceptable volumes of effluent. The ability to spray chemicals onto the internal surfaces of the vessel would use less of the reagent and reduce the effluent challenge. Ultra-high pressure water jetting (UHPWJ) or electrochemical methods would remove the need for reagents, if the deployment challenge can be solved.
FIND OUT MORE
See the full challenge statement here: https://www.gamechangers.technology/static/u/Challenge%20statement%20-%20Directional%20decontamination%20head%20deployment%20into%20large,%20congested,%20highly%20radioactive%20vessels.pdf
Visit www.Gamechangers.technology and head to the challenges section to apply.
Successful solutions will receive £10,000 for a 12 week feasibility study which may then lead to further funding for a proof of concept.
An interactive webinar will take place at 10:30am on Wednesday 7th August 2024 where delegates will have the chance to hear directly from the challenge owners and ask any questions. Attendance is free - register here.
The deadline for applications for this challenge is 2pm on Wednesday 21 August 2024.
It’s still a bit slow but I am working on increasing the acceleration while keeping things smooth
Hey folks, I was trying to create a path plan for a Parrot mini drone to follow a red line, and even tried some algorithms, but I'm facing a signal issue between my vision-based data and my control system (I'm not able to use a bus selector to route the signal from the vision data to the input of my Stateflow chart, because the input I want is not shown in the bus signal data).
Is there anyone out there who can help me out!?
Hi! I am considering buying the Unitree Go2 robot, but I still don't understand the limitations between the three versions (AIR, PRO, EDU). I know that the EDU version allows for secondary development, but I am not sure what the benefits of that are. I want to implement different sensors on the robot and automate its movement across different places (without manual control). Is the EDU version required for this? Thanks in advance.
I am using the very basic test code provided at the end of the video linked below (I'm basically trying to rebuild her robot with a few extra mods, but I haven't even added the mods yet).
https://www.youtube.com/watch?v=Bp9r9TGpWOk
I'll also copy the code here.
I keep getting this error:
It marks the error at the first line of my forward function.
What am I doing wrong?
# missing imports (RPi.GPIO, time, and gpiozero's DistanceSensor)
import time
import RPi.GPIO as GPIO
from gpiozero import DistanceSensor

#GPIO Settings
GPIO.setwarnings(False)
GPIO.setmode(GPIO.BOARD)

#labeling pins (note: GPIO.setup takes parentheses, not square brackets)
GPIO.setup(29, GPIO.OUT)
GPIO.setup(31, GPIO.OUT)
#30 GND
GPIO.setup(32, GPIO.OUT)
GPIO.setup(33, GPIO.OUT)

#ultrasonic setup (gpiozero uses BCM numbering for these pin numbers)
ultrasonic = DistanceSensor(echo=17, trigger=4)

#Wheel control
class Robot:
    def __init__(self, name, rwheel, lwheel):
        self.name = name
        self.rwheel = tuple(rwheel)
        self.lwheel = tuple(lwheel)
        self.rwheel_f = int(rwheel[0])
        self.rwheel_b = int(rwheel[1])
        self.lwheel_f = int(lwheel[0])
        self.lwheel_b = int(lwheel[1])

    #methods
    def forward(self, sec):
        GPIO.output(self.rwheel_f, True)
        GPIO.output(self.lwheel_f, True)
        #stop
        time.sleep(sec)
        GPIO.output(self.rwheel_f, False)
        GPIO.output(self.lwheel_f, False)

    def backward(self, sec):
        GPIO.output(self.rwheel_b, True)
        GPIO.output(self.lwheel_b, True)
        #stop
        time.sleep(sec)
        GPIO.output(self.rwheel_b, False)
        GPIO.output(self.lwheel_b, False)

    def lturn(self, sec):
        GPIO.output(self.rwheel_f, True)
        #stop
        time.sleep(sec)
        GPIO.output(self.rwheel_f, False)

    def rturn(self, sec):
        GPIO.output(self.lwheel_f, True)
        #stop
        time.sleep(sec)
        GPIO.output(self.lwheel_f, False)

#establishing object
smelly = Robot("smelly", (29, 31), (32, 33))

#test run
smelly.forward(3)
smelly.backward(3)
smelly.lturn(3)
smelly.rturn(3)
Hello everyone,
I am working with the Arduino Mega on a water enrichment project and need help.

The project objective is as follows: to create an HMI system for our piping and tank system prototype. This HMI system should display temperature, pressure, and O2/CO2 concentrations in the water; the sensors and motors are connected to a control system via the Arduino Mega. It should also be able to display an animation of the tank levels rising and falling, as well as of the piping system filling up with gas and water.

The issue is as follows: our current touchscreen is the Nextion Basic 7'' HMI LCD Touch Display, which only supports images, not animations. For our project, we are looking for a touchscreen on which we can create and run the animations ourselves, while remaining compatible with the Arduino Mega (if not directly compatible, then attachable through a module that is). Unfortunately, my team and I are under a one-month deadline, so we cannot purchase screens from outside Canada. I would appreciate some guidance on how to resolve this issue.
Thank you so much for your help, I appreciate any advice on our issue.
Hamna
Hi everyone!
I'm starting a project to simulate a UAV and its sensors for two purposes:
Testing navigation
Testing the computer vision system
For example, if I make a drone to count cars in a parking lot, I need to test its navigation and counting accuracy.
The simulation software should support various sensors and have great graphics. I'm considering Blender for making the model and Unreal Engine or Unity for making the environment. I need to simulate plant growth using a mathematical model.
Currently, I'm thinking of creating models in Blender, using them to build the environment in Unreal Engine 5, and then simulating in AirSim.
If you have any suggestions, even those not directly related to the question, please don't hesitate to comment! Thank you for reading!
Hello!
I have been working on trying to figure out the arms all week… I have been running into problems, but tackling them as best I can.
Currently waiting on a servo to come in from Amazon to finish the work. But I figured I’d show you BB1’s “Anti Deer Raccoon BBBB turret” 🙏🏽 (it is functional! Video coming soon.)
The arms also mean having to change how the power is handled, so I will probably have to add another voltage regulator and some other parts 🤔. WIP 🙏🏽
Roboticist Ali Ahmed, Co-founder & CEO of Robomart, defines what factors must be met for something to be considered an autonomous robot.
By the way, I’m the host, and I’m from the XR space. Ali is my guest, and I thought I’d post it here. It might be very basic, haha, but they’re doing some cool stuff I thought was worth sharing.
What advice would you give to a young entrepreneur building a self-funded robotics company in a garage?
(in terms of early marketing, fundraising, manufacturing, team building, or anything)