NVIDIA and Open Robotics are enabling ROS 2 to run on Jetson processors for perception, modeling, and training to ease robotics development.
The project aims to take bipedal robots to a new level, equipping them to adapt on the fly to treacherous ground, dodge obstacles, or decide whether a given area is safe for walking.
New support allows full integration of vision-based 3D mapping in ROS 2 designs.
MOV.ai and Velodyne deliver a complete navigation package for ROS-based AMR solutions.
A senior robotics engineer at Boston Dynamics explains how perception and adaptability enable Atlas to perform varied, high-energy behaviors like parkour.
Scythe Robotics shares tips about sensor selection, ruggedized hardware, and software optimization for its commercial mowers.
A computer model relates a finger’s desired position to the corresponding pressure a pump would have to apply to achieve that position. Using this model, the team developed a controller that directs the pneumatic system to inflate the fingers.
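The described controller can be sketched as an inverse model plus a simple feedback correction. The linear pressure-angle relationship, its coefficients, and the proportional gain below are hypothetical placeholders for illustration, not values from the team's model.

```python
# Sketch of an inverse-model pneumatic finger controller.
# All numeric constants here are illustrative assumptions.

def inverse_model(target_angle_deg, gain_kpa_per_deg=1.8, offset_kpa=5.0):
    """Invert a hypothetical linear bend model:
    angle = (pressure - offset) / gain  =>  pressure = gain * angle + offset.
    Returns the pump pressure (kPa) predicted to reach the target angle."""
    return gain_kpa_per_deg * target_angle_deg + offset_kpa

def control_step(target_angle_deg, measured_angle_deg, kp=0.5):
    """One control update: feedforward pressure from the inverse model,
    plus a proportional correction on the remaining angle error."""
    feedforward = inverse_model(target_angle_deg)
    correction = kp * (target_angle_deg - measured_angle_deg)
    return feedforward + correction
```

With zero tracking error the controller reduces to the pure inverse model; the proportional term only trims residual error between commanded and measured bend.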
By training Atlas to maneuver through complex parkour courses, Boston Dynamics developed new movements inspired by human behaviors and pushed the humanoid to new limits.
University of Michigan researchers have enabled humanoid robots to use their hands in a similar way, so the robots can better travel across rough terrain, such as disaster areas or construction sites.
Unity is supporting ROS 2 due to “significant advancements and support of more hardware drivers, networking modules, communication architecture, and several robot algorithms.”
MIT developed a high-speed flight-planning algorithm that combines simulations and experiments, in a way that minimizes the number of experiments required to identify fast and safe flight paths.
Sarcos Robotics and T-Mobile announce a collaboration to integrate T-Mobile 5G into the Sarcos Guardian XT robot.
GE’s project was one of eight funded by the U.S. Army to advance autonomous, off-road navigation capabilities for military ground vehicles.
Adaptive locomotive modality, or the ability to move in different ways, is integral to extreme terrain exploration on the Moon and Mars.
Droidlet enables researchers to build robots that accomplish tasks either in the real world or in simulated environments like Minecraft or Habitat.
Cassie completed the 5K at the Oregon State University campus in just over 53 minutes on a single battery charge.
The Covariant Brain is a “universal AI that enables robots to see, reason, and act autonomously in the real world.”
Guided by a set program that autonomously switched between off, low, medium, and high pressures, the robotic hand was able to press the buttons on the controller to successfully complete the first level of Super Mario Bros. in less than 90 seconds.
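A preset program like the one described can be sketched as a timed schedule over discrete pressure levels. The level-to-pressure mapping and the step timings below are made-up illustrations, not the researchers' actual values.

```python
# Hypothetical pressure schedule for a pre-programmed button-press sequence.
# Level pressures (kPa) and step durations (ms) are illustrative only.
LEVELS_KPA = {"off": 0, "low": 30, "medium": 60, "high": 90}

def run_schedule(schedule):
    """Yield (time_ms, pressure_kpa) commands for a fixed sequence
    of (level, duration_ms) steps."""
    t = 0
    for level, duration_ms in schedule:
        yield t, LEVELS_KPA[level]
        t += duration_ms

demo = [("low", 500), ("high", 200), ("off", 300), ("medium", 1000)]
commands = list(run_schedule(demo))
```

Each tuple is a setpoint the pump holds until the next step begins, which matches the blurb's description of autonomously switching between off, low, medium, and high pressures.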
The tool lets users select one or two 3D sensors to instantly analyze and compare minimum range, maximum range, horizontal field of view, vertical field of view, and diagonal field of view. You can also click and drag for different viewpoints.
OpenAI disbands its robotics research team, citing the lack of data sets large enough to effectively train reinforcement learning models.
Engineers used a few different techniques, employing both Spot's Choreographer software and its API, to make Spot dance.
To demo the algorithm, the robot helped put a jacket on a human, a capability that could prove to be a powerful tool in expanding assistance for people with disabilities or limited mobility.