The Robot Report

Brain-Controlled Robots Becoming a Reality at MIT

By Steve Crowe | March 6, 2017

Robots aren’t supposed to make mistakes. But if they do, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Boston University has developed a way to correct the robot’s actions using a person’s electroencephalography (EEG) brain signals.

The human needs to wear an EEG cap that measures their brain signals. The system looks for brain signals called “error-related potentials” (ErrPs) that are generated when the brain notices a mistake has been made. As the robot indicates which choice it plans to make, the system uses ErrPs to determine if the human agrees with the decision.

The system works in real time, classifying brain waves in 10 to 30 milliseconds. Being tested on Rethink Robotics’ Baxter, it even makes the robot appear embarrassed when it makes a mistake. The idea is to make robots a more natural extension of humans.
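
To make the closed loop concrete, here is a minimal sketch of how such a supervision cycle could be structured, assuming a hypothetical trained ErrP classifier and simple robot/EEG interfaces. None of this is the authors’ published code; the window length, threshold, and method names are illustrative assumptions only.

import numpy as np

WINDOW_MS = 500          # assumed length of the EEG window captured after the robot signals its choice
DECISION_BUDGET_MS = 30  # the article reports classification in 10 to 30 milliseconds

class ErrPClassifier:
    """Hypothetical stand-in for a trained error-related-potential detector."""

    def __init__(self, weights, threshold=0.5):
        self.weights = weights      # 1-D array matching the flattened EEG window
        self.threshold = threshold  # probability above which an ErrP is declared

    def probability_of_error(self, eeg_window):
        # Linear score over the flattened channels-by-samples window, squashed to [0, 1].
        score = float(self.weights @ np.asarray(eeg_window).ravel())
        return 1.0 / (1.0 + np.exp(-score))

def supervise_choice(robot, eeg_stream, clf):
    """One supervision cycle: the robot proposes, the human's EEG implicitly approves or vetoes."""
    proposed = robot.propose_choice()            # e.g. which of two bins to sort an object into
    window = eeg_stream.read_window(WINDOW_MS)   # channels-by-samples array from the EEG cap
    if clf.probability_of_error(window) > clf.threshold:
        # An ErrP was detected: the human disagreed, so flip the binary choice.
        robot.execute(robot.alternative_to(proposed))
    else:
        robot.execute(proposed)

Because the demonstrated task is a binary choice, detecting an ErrP is enough to flip the decision outright.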

Paper: Correcting Robot Mistakes in Real Time Using EEG Signals

“Imagine being able to instantaneously tell a robot to do a certain action, without needing to type a command, push a button or even say a word,” says CSAIL director Daniela Rus, who won the 2017 Engelberger Robotics Award for Education. “A streamlined approach like that would improve our abilities to supervise factory robots, driverless cars and other technologies we haven’t even invented yet.”

The team’s next step is to refine the system so it can handle multiple-choice and more complex tasks, rather than only the simple binary-choice activity shown in the video atop this page.

“As you watch the robot, all you have to do is mentally agree or disagree with what it is doing,” says Rus. “You don’t have to train yourself to think in a certain way – the machine adapts to you, and not the other way around.”

In addition to monitoring ErrPs, the team also detects “secondary errors” that occur when the system doesn’t notice the human’s original correction. “If the robot’s not sure about its decision, it can trigger a human response to get a more accurate answer,” the team says. “These signals can dramatically improve accuracy, creating a continuous dialogue between human and robot in communicating their choices.”
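
Continuing the rough sketch above, the quoted idea of an unsure robot prompting the human for a clearer answer might be structured as a second pass over the EEG. The uncertainty band, the indicate() call, and the other interfaces are assumptions for illustration, not the published method.

def supervise_with_secondary_check(robot, eeg_stream, clf, uncertain_band=(0.4, 0.6)):
    """Two-stage version: an ambiguous first reading triggers a second look at the human's EEG."""
    proposed = robot.propose_choice()
    p_error = clf.probability_of_error(eeg_stream.read_window(WINDOW_MS))

    if uncertain_band[0] < p_error < uncertain_band[1]:
        # The first reading is ambiguous: have the robot re-indicate its intended choice
        # and look for a clearer (secondary) error response from the human.
        robot.indicate(proposed)
        p_error = clf.probability_of_error(eeg_stream.read_window(WINDOW_MS))

    if p_error > clf.threshold:
        robot.execute(robot.alternative_to(proposed))
    else:
        robot.execute(proposed)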

“This work brings us closer to developing effective tools for brain-controlled robots and prostheses,” says Wolfram Burgard, a professor of computer science at the University of Freiburg who was not involved in the research. “Given how difficult it can be to translate human language into a meaningful signal for robots, work in this area could have a truly profound impact on the future of human-robot collaboration.”

About The Author

Steve Crowe

Steve Crowe is Executive Editor, Robotics, WTWH Media, and chair of the Robotics Summit & Expo and RoboBusiness. He is also co-host of The Robot Report Podcast, the top-rated podcast for the robotics industry. He joined WTWH Media in January 2018 after spending four-plus years as Managing Editor of Robotics Trends Media. He can be reached at scrowe@wtwhmedia.com.
