The Robot Report


Seoul Robotics launches 3D perception software with deep learning

By Mike Oitzman | September 1, 2021

Seoul Robotics' LiDAR perception system. | Image credit: Seoul Robotics

In the latest release of its SENSR software, Seoul Robotics introduces what it claims are its most advanced 3D perception features yet. SENSR 2.2 can detect objects that are partially obstructed, fast-moving, or clustered together, and classifies bicycles, vehicles, and pedestrians.

The Seoul Robotics software product line consists of two solutions: SENSR M, designed for perception on moving platforms such as autonomous vehicles and autonomous mobile robots (AMRs), and SENSR 2.2, designed for fixed sensing applications such as cities, public spaces, logistics, manufacturing and retail spaces.

Deep learning (DL) is at the heart of the SENSR software design. By leveraging DL, Seoul Robotics says it has improved perception accuracy: the company claims the software can track more than 500 objects simultaneously with positional accuracy to within 10 centimeters.
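SENSR's tracking pipeline is proprietary and the article does not describe its internals, but the general idea of frame-to-frame multi-object tracking on point-cloud detections can be sketched with a minimal greedy nearest-neighbor association step. The function name, the centroid representation, and the 10 cm gate (echoing the claimed accuracy figure) are illustrative assumptions, not Seoul Robotics' implementation:

```python
import numpy as np

def associate_tracks(prev_centroids, curr_centroids, max_dist=0.10):
    """Greedily match current detections to previous tracks by nearest
    centroid. Pairs farther apart than max_dist (10 cm here, mirroring
    the article's accuracy claim) are left unmatched and would be
    treated as new objects by a full tracker.

    NOTE: hypothetical sketch; real systems typically use optimal
    assignment (e.g. Hungarian algorithm) plus motion prediction.
    """
    matches = {}          # index in curr -> index in prev
    used = set()          # prev tracks already claimed this frame
    for i, c in enumerate(curr_centroids):
        if len(prev_centroids) == 0:
            continue
        d = np.linalg.norm(prev_centroids - c, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist and j not in used:
            matches[i] = j
            used.add(j)
    return matches
```

For example, a detection that moved 5 cm between frames stays associated with its track, while a detection 10 m away starts a new one.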

SENSR 2.2 also includes weather-filtering AI, allowing the software to track and detect objects in severe weather conditions, including heavy rain and snow. SENSR 2.2 is currently deployed by Seoul Robotics across the United States as well as in Japan, Korea, and numerous other countries.
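The article does not explain how SENSR's weather-filtering AI works. For context, a classical (non-AI) baseline for the same problem is statistical outlier removal: rain and snow tend to produce sparse, isolated LiDAR returns, so points whose mean distance to their nearest neighbors is unusually large get discarded. The sketch below is that classical technique, not Seoul Robotics' method:

```python
import numpy as np

def remove_weather_noise(points, k=4, std_ratio=2.0):
    """Statistical outlier removal on an N x 3 point cloud.

    Computes each point's mean distance to its k nearest neighbors and
    drops points whose value exceeds mean + std_ratio * std over the
    cloud. Brute-force pairwise distances are fine for small clouds;
    a KD-tree would be used at real LiDAR scale.
    """
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)  # column 0 is self-distance
    thresh = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= thresh]
```

A learned filter like the one SENSR advertises can go further, e.g. distinguishing a snowflake return from a small real object, which a purely statistical threshold cannot.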

“The introduction of deep learning into 3D perception software may be one of the last show-stopping enhancements in the LiDAR industry. Historically, the focus has been on advancing the LiDAR sensors themselves, but that’s changing. Moving forward, there will be heavy investments in 3D perception software that interprets the data into actionable solutions,” said HanBin Lee, CEO of Seoul Robotics. “The introduction of SENSR 2.2 is accelerating the adoption of solutions that will fuel autonomy across the globe.”

SENSR 2.2 is sensor-agnostic and compatible with more than 75 types of 3D sensors currently on the market, including LiDAR, 3D cameras, and imaging radar. SENSR 2.2 brings heightened accuracy to a range of solutions, such as smart intersections, wrong-way detection, speeding, smart railroad crossings, crowd management, and smart retail. Seoul Robotics is rapidly expanding globally and has current partnerships with several top-tier organizations including BMW, Mercedes-Benz, the Chattanooga Department of Transportation, Emart, and many others.
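"Sensor-agnostic" typically means the perception pipeline consumes a common point-cloud representation and per-sensor adapters translate each device's native output into it. The driver classes and the trivial `perceive` function below are hypothetical, purely to illustrate the pattern; SENSR's actual interfaces are not public:

```python
import numpy as np

class LidarDriver:
    """Hypothetical driver: already delivers Cartesian XYZ points."""
    def read(self):
        return np.array([[1.0, 0.0, 0.5], [2.0, 1.0, 0.5]])

class RadarDriver:
    """Hypothetical driver: measures range/azimuth/elevation and the
    adapter converts to the same XYZ convention the pipeline expects."""
    def read(self):
        r, az, el = 2.0, 0.0, 0.0
        return np.array([[r * np.cos(el) * np.cos(az),
                          r * np.cos(el) * np.sin(az),
                          r * np.sin(el)]])

def perceive(driver):
    """The pipeline only ever sees an N x 3 point cloud, so any sensor
    with a conforming adapter can be plugged in."""
    cloud = driver.read()
    assert cloud.ndim == 2 and cloud.shape[1] == 3
    return cloud.mean(axis=0)  # stand-in for real detection logic
```

The payoff of this design is that supporting a new sensor means writing one adapter, not touching the detection, classification, or tracking stages.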

“Since we deployed Seoul Robotics’ technology into our smart city solutions we have seen an increase in our operational efficiencies and improvements in overall safety of our community,” said Kevin Comstock, smart city director for the City of Chattanooga. “Seoul Robotics has specifically helped the City of Chattanooga seamlessly monitor pedestrian traffic, and we are currently gathering data that will inform future capabilities of wrong-way detection. These efforts are saving money for the city, travel time for local residents, and–most importantly–lives.”

About The Author

Mike Oitzman

Mike Oitzman is Editor of WTWH's Robotics Group and founder of the Mobile Robot Guide. Oitzman is a robotics industry veteran with 25-plus years of experience at various high-tech companies in the roles of marketing, sales and product management. He can be reached at [email protected]
