How humanoids learn to read the room

By Sana Kazilbash | May 13, 2026

Image courtesy of ADI

The closer robots get to people, the more vital it is for them to see, hear and react without missing a beat.

Designing a humanoid robot is one of the most complicated engineering challenges in robotics today. All on its own, a humanoid system has to manage movement, balance, vision and reactivity across a complex web of joints, sensors and data processing. The stakes rise further when a humanoid robot operates in an environment that includes people.

As humans, we do most things automatically. We judge our center of gravity and adjust accordingly to run, jump or move to avoid a collision. We process vast amounts of audio and visual information simultaneously, in real time, then use that information to decide how to respond. All of this happens within fractions of a second.

A humanoid robot has to approximate this kind of environmental awareness through an array of sensors, then interpret those inputs quickly enough to establish a safe working zone and act appropriately to avoid harming the people in its vicinity.

“There needs to be trust between humans and robots to ensure safe interaction. Any robot working with or around humans needs to be able to deal with our natural unpredictability. The robot must also be able to express its intentions to the humans around it to prevent unsafe human behavior due to misunderstandings,” says Geir Ostrem, Analog Devices Fellow with the Automotive Business Unit at ADI.

And as labor shortages deepen and more robots move into shared spaces to increase efficiency, a key question arises: what is needed for a humanoid robot to operate safely and efficiently side by side with humans?

Vision

Situational awareness with humanoid robots starts with vision, especially in environments where people and equipment are constantly moving. A humanoid robot needs to see and understand its surroundings in order to react quickly and appropriately, whether that’s to pick up an object or move away from a person. Standard human vision can be approximated with RGB image sensors, along with depth perception achieved through time-of-flight, structured light or stereo vision methods.
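
To make the stereo case concrete, depth follows from the pinhole relation Z = f·B/d, where f is the focal length in pixels, B is the baseline between the two cameras and d is the measured disparity. The short sketch below applies that relation; the focal length and baseline are illustrative assumptions, not figures from any particular robot or ADI product.

# Minimal sketch: convert stereo disparity to metric depth (illustrative camera values).
import numpy as np

FOCAL_LENGTH_PX = 800.0   # focal length in pixels (assumed)
BASELINE_M = 0.06         # spacing between the two cameras in meters (assumed)

def disparity_to_depth(disparity_px: np.ndarray) -> np.ndarray:
    """Depth Z = f * B / d; zero disparity is treated as effectively at infinity."""
    with np.errstate(divide="ignore"):
        depth = FOCAL_LENGTH_PX * BASELINE_M / disparity_px
    return np.where(disparity_px > 0, depth, np.inf)

# A person about 1.6 m away produces a disparity of roughly 30 px with these assumed parameters.
print(disparity_to_depth(np.array([30.0, 10.0, 0.0])))   # -> [1.6  4.8  inf]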

Having visual input alone isn’t enough; processing that data quickly and accurately is essential. Cameras and visual sensors are unlikely to be located near the central computer, which means all that visual data has to travel through the robot over long cables. Cables add weight and constrain flexibility of movement, so it is important to get the most use out of each cable.

In a humanoid robot, a main processor serves as the brain, with vision sensors in the head or torso connected directly to the central computer. Where lower latency is needed for fast control loops — such as controlling a motor that drives fast movements — dedicated smaller processors can sit closer to the sensor or actuator, handling local processing, reducing wiring harnesses and ensuring functional safety, while also transferring data to the main processor.
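
As a rough illustration of that split, the sketch below runs a fast local joint loop that tracks setpoints sent by a slower central planner. The loop rates, class names and simple proportional control law are assumptions chosen for clarity, not a description of any vendor’s actual control stack.

# Hedged sketch of a split control architecture: a fast local joint loop plus a
# slower central planner. The rates and the proportional control law are assumptions.
LOCAL_RATE_HZ = 1000   # local loop near the actuator (assumed)
CENTRAL_RATE_HZ = 100  # central "brain" update rate (assumed)

class LocalJointController:
    def __init__(self, kp: float = 5.0):
        self.kp = kp
        self.target = 0.0
        self.position = 0.0

    def set_target(self, target: float) -> None:
        self.target = target                     # updated by the central planner

    def step(self, dt: float) -> None:
        error = self.target - self.position
        self.position += self.kp * error * dt    # simple proportional move toward the setpoint

controller = LocalJointController()
for tick in range(LOCAL_RATE_HZ):                # one simulated second of the fast loop
    if tick % (LOCAL_RATE_HZ // CENTRAL_RATE_HZ) == 0:
        controller.set_target(0.5)               # central planner sends a setpoint at its slower rate
    controller.step(1.0 / LOCAL_RATE_HZ)
print(round(controller.position, 3))             # converges toward 0.5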

Already in wide use throughout the automotive industry, ADI’s Gigabit Multimedia Serial Link (GMSL) technology transports video data in real time over a single link capable of carrying many gigabits per second. In humanoid robots, this supports redundancy and fast, local processing of visual data, letting these systems identify and understand their surroundings with physical AI running in the robot itself rather than in the cloud.
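
A back-of-envelope calculation shows why multi-gigabit links matter: a single uncompressed 1080p color camera at 30 frames per second already produces roughly 1.5 Gbit/s, and a humanoid typically carries several cameras. The figures below are illustrative, not a GMSL specification.

# Rough bandwidth estimate for uncompressed camera streams (illustrative numbers).
def raw_video_bandwidth_gbps(width: int, height: int, fps: int, bits_per_pixel: int) -> float:
    return width * height * fps * bits_per_pixel / 1e9

per_camera = raw_video_bandwidth_gbps(1920, 1080, 30, 24)     # ~1.49 Gbit/s per camera
print(f"one camera:   {per_camera:.2f} Gbit/s")
print(f"four cameras: {4 * per_camera:.2f} Gbit/s")           # ~5.97 Gbit/s aggregate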

Audio

Vision alone is not enough, however; if a robot is going to work collaboratively with humans, it needs intelligent hearing, too. “Being able to speak in natural language and having a conversational user interface is a very powerful way to communicate with a robot,” says Ostrem.

A humanoid robot must also be able to understand acoustic events in its environment. If something crashes to the floor behind the robot, it must be able to identify the source of the sound, as well as understand what that sound means, Ostrem explains. Classifying acoustic events is a task ideally suited to local, physical AI.

Just like visual data, audio inputs must travel from multiple microphones to the robot’s central computer for processing, which means latency is a concern.

“When it comes to sound events, localization and detection, having deterministic latency from the microphone to the computer is very critical,” says Ostrem. “Now you’re talking about beamforming and the acoustic field, and it requires that you know the relative delays between the different microphones with high accuracy.”

“This is something that ADI’s A2B audio bus does superbly, because the time needed to get a signal from a microphone to a computer using A2B is completely deterministic.”
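
The reason those relative delays matter is easiest to see in a time-difference-of-arrival estimate: with two microphones a known distance apart, the delay recovered by cross-correlating their signals maps directly to a bearing toward the sound source. The sketch below illustrates the idea with synthetic signals and an assumed sample rate and microphone spacing; it is a generic method, not ADI’s A2B software.

# Minimal sketch: estimate the delay between two microphones by cross-correlation,
# then convert it to a bearing. Sample rate, spacing and the signal are assumptions.
import numpy as np

FS = 48000            # sample rate in Hz (assumed)
MIC_SPACING_M = 0.2   # distance between the two microphones in meters (assumed)
SPEED_OF_SOUND = 343.0

rng = np.random.default_rng(0)
source = rng.standard_normal(4096)            # synthetic broadband "crash" sound
true_delay_samples = 12                       # arrival offset at the second microphone
mic_a = source
mic_b = np.roll(source, true_delay_samples)

# Cross-correlate and find the lag with the strongest match.
corr = np.correlate(mic_b, mic_a, mode="full")
lag = np.argmax(corr) - (len(mic_a) - 1)      # recovered delay in samples

tdoa = lag / FS                               # time difference of arrival in seconds
angle = np.degrees(np.arcsin(np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING_M, -1, 1)))
print(lag, round(angle, 1))                   # ~12 samples, bearing ~25 degrees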

Also well established in automotive, A2B is a low-latency audio transport that supports sound source localization and streamlines audio connectivity across the system by allowing many microphones to be daisy-chained on a single bus — carrying power, audio and control over just two wires.

“If you look at a robot, the amount of wiring needed for all the sensors is one of the biggest problems,” says Ostrem. “A2B allows you to put in advanced audio functionality with very few wires.”

Image courtesy of ADI

Battery/Power

All of these sensors, processors and connectivity devices need power to operate. Humanoid robots carry their own energy supply in the form of battery packs. Most humanoid robots run on lithium-ion batteries in the range of 48, 60 or 72 volts — smaller than those used in automotive, but carrying many of the same risks, such as overheating or thermal runaway.

ADI offers technologies such as electrochemical impedance spectroscopy (EIS) for detecting unsafe changes in battery chemistry early, so batteries can be swapped out before failure.

“EIS allows you to look deeply into what is happening in the chemistry of the battery,” Ostrem explains. “If something goes wrong with the battery, or if it turns out to be unsafe somehow, you can detect this ahead of time — before it becomes a hazard. When a robot is going to operate around humans, if that battery is going into thermal runaway, you definitely want to make sure that battery is far away from a human when the problem actually happens.”
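
In its simplest form, EIS injects a small sinusoidal excitation at a chosen frequency, measures the cell’s response, and reports the complex impedance Z(ω) = V(ω)/I(ω); shifts in that impedance across frequency and over time are what reveal changes in the chemistry. The sketch below extracts the impedance at a single frequency from sampled waveforms; the signal values are synthetic and the method is generic, not ADI’s implementation.

# Hedged sketch: single-frequency impedance from sampled current and voltage,
# using a single-bin DFT at the excitation frequency. All signal values are synthetic.
import numpy as np

FS = 1000.0        # sample rate in Hz (assumed)
F_EXCITE = 10.0    # excitation frequency in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / FS)

# Synthetic measurement: 50 mA excitation; the cell responds like ~20 mOhm with a small phase lag.
current = 0.05 * np.sin(2 * np.pi * F_EXCITE * t)
voltage = 0.001 * np.sin(2 * np.pi * F_EXCITE * t - 0.1)

def phasor(signal: np.ndarray, freq: float, fs: float) -> complex:
    """Complex amplitude of `signal` at `freq` (a lock-in style single-bin DFT)."""
    n = np.arange(len(signal))
    return 2.0 * np.mean(signal * np.exp(-2j * np.pi * freq * n / fs))

z = phasor(voltage, F_EXCITE, FS) / phasor(current, F_EXCITE, FS)
print(f"|Z| = {abs(z) * 1000:.1f} mOhm, phase = {np.degrees(np.angle(z)):.1f} deg")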

And all of that depends on the audio, visual and other internal sensors being able to identify problems and communicate them quickly, so the robot can act to mitigate any safety risks to nearby humans.

Conclusion

As humanoids move into more complex roles, the demands on safety, sensing and interaction will only grow. Ostrem believes the future lies in better AI at the edge, in terms of reaction time, safety and battery life.

ADI has already proven its sensing and perception, connectivity and battery management technologies in the automotive sphere. The natural next step is bringing those technologies to emerging applications like humanoids.

“In some ways, humanoid robots are where cars were many years ago,” says Ostrem. “The architecture isn’t fully set, and there is significant room for industry collaboration around standardizing interfaces to stimulate the ecosystem around humanoid robots.”

To learn more, visit https://www.analog.com/en/solutions/industrial-automation/industrial-robotics/humanoid-robotics.html

Sponsored content by Analog Devices
