The Robot Report

Light launches Clarity camera system for accurate autonomous vehicle vision

By Eugene Demaitre | October 27, 2020

Light Clarity reference design

Multiple cameras provide 3D vision. Source: Light

Unlike other mobile robots, autonomous vehicles need to see far ahead to react safely to traffic conditions at highway speeds. Light today launched its Clarity perception platform. The company said Clarity is able to see any 3D structures in the road from 10 cm to 1 km (3.9 in. to 0.62 mi.) away — three times the distance of current lidar sensors — using passive cameras.

Founded in 2013, Light said it combines breakthroughs in computational imaging with multi-camera calibration and advanced machine learning. Co-founder and CEO Dave Grannan is a serial entrepreneur who developed speech-recognition technology later acquired by Nuance Communications, and co-founder and Chief Technology Officer Rajiv Laroia helped develop the foundation of LTE 4G wireless communications. The Redwood City, Calif.-based company said its technology provides accurate depth at both near and far distances in real time.

Clarity uses multiple cameras for high-def 3D vision

Unlike conventional lidar or radar, which sends out signals and defines objects based on their reflections, or stereoscopic vision, which relies on two cameras that may be close together, Light’s Clarity uses multiple cameras and derives depth and distance information for each pixel.
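Light has not published its algorithms, but the geometry behind deriving depth from camera pairs is standard triangulation: for two calibrated cameras, a pixel's depth follows from the disparity between its matched positions in the two images, scaled by the focal length and the distance between the cameras. A minimal sketch, with hypothetical focal-length and baseline values that are not Light's parameters:

```python
# Illustrative stereo triangulation: Z = f * B / d, where
# f = focal length in pixels, B = baseline between cameras in meters,
# d = disparity in pixels. All values below are hypothetical.

def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth (m) of a point matched between two calibrated cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A wider baseline (cameras mounted farther apart) pushes the same
# 1-pixel disparity out to a much farther depth plane, which is why
# widely separated cameras can resolve depth at long range.
print(depth_from_disparity(1400.0, 0.12, 1.0))  # narrow baseline: 168 m
print(depth_from_disparity(1400.0, 1.2, 1.0))   # wide baseline: 1680 m
```

This also illustrates why per-frame recalibration matters: the computed depth is only as good as the known baseline and camera geometry, which vibration constantly perturbs.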

“The processing power for recalibrating every frame 30 times per second has been missing from autonomy stacks since 2004. We founded Light to address the problem of calibration with computational imaging,” Grannan told The Robot Report. “Light uses multiple cameras to perceive the world and objects for human-like vision.”

A test van used four cameras, which resulted in a pixel depth superior to that of 32-channel lidar costing $8,000 to $10,000, he said.

“The best lidar today has a range of 250 meters, not 1,000 meters. Lidars provide 72,000 points, and our system provides 1.8 million points,” Grannan said. “Our depth is continuous.”

Clarity can generate up to 95 million data points per second, 20 times more than any perception system currently available, Light claimed. Its depth is also domain-independent, meaning Clarity does not need to be trained to recognize the specific objects it may encounter on the road in order to derive 3D structure.

“Normally, you have to fuse the data, overlaying the lidar and picture, plus machine learning,” Grannan said. “In our case, we are domain-agnostic. We leave object identification and classification to the next steps in machine vision. With Tesla and Mobileye, the machine learning in the stack would have to go through a neural network to identify an object.”

“We recognize that some of the algorithms have to be done on the hardware. Our solution includes our own dedicated silicon,” he added. “It is part of our roadmap to eventually build our own object identification capability.”

Clarity Light depth

Clarity obtains depth information for each pixel. Source: Light

Light system also has potential for ADAS

Light said Clarity’s range and level of detail will contribute to safer advanced driver-assist systems (ADAS) and autonomous vehicles by enabling them to detect and react to potential obstacles more quickly. This is also useful for adaptive suspension systems, Grannan said.

For every 100 meters of added perception, a vehicle gains an additional four seconds of time to slow down, change lanes, or alert the driver to take over, which is important for hand-offs from autonomous systems and for heavier vehicles such as fully loaded Class 8 trucks. “A half-loaded truck doesn’t have the friction and needs more than 250 m to stop,” he noted.
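The four-second figure corresponds to highway speed: at roughly 25 m/s (about 90 km/h, or 56 mph), each 100 m of added sensing range buys four extra seconds. A quick check, with speeds chosen for illustration:

```python
# Extra reaction time gained from additional perception range at a
# given speed. Speed values are illustrative, not from Light.

def extra_reaction_time(added_range_m: float, speed_mps: float) -> float:
    """Seconds of additional warning from extra sensing range."""
    return added_range_m / speed_mps

HIGHWAY_SPEED = 25.0  # m/s, about 90 km/h

print(extra_reaction_time(100.0, HIGHWAY_SPEED))          # 4.0 s
# Going from a 250 m lidar horizon to a 1 km camera horizon:
print(extra_reaction_time(1000.0 - 250.0, HIGHWAY_SPEED))  # 30.0 s
```

That 30-second margin at a 1 km horizon is the context for the heavy-truck example: a loaded Class 8 truck's stopping distance can exceed what a 250 m sensor horizon allows for.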

“Automakers have invested billions of dollars and decades of research to make safe, reliable ADAS and self-driving cars a reality. But so far, even the best perception systems on the market miss objects and obstructions in the road — some as big as a semi-truck,” Grannan stated. “Any system that powers a vehicle needs to be equipped with comprehensive, measured depth alongside the type of visual information that cameras provide, in order to make smart decisions that make driving safer.”

“There is nothing else like the Clarity platform with its combination of depth range, accuracy, and density per second. It enables a new generation of vehicles that can be made safer, without having to compromise on cost, quality, or reliability,” said Prashant Velagaleti, chief product officer of Light. “Rather than only minimizing the severity of a collision, having high-fidelity depth allows any vehicle powered by Clarity to make decisions that can avoid accidents, keeping occupants safe as well as comfortable.”

In addition, Clarity’s ability to spot open parking spaces more than 100 meters away could help drivers save time, fuel, and frustration, said Light.

Clarity Light depth

Clarity provides continuous depth information. Source: Light

Clarity intended for sensor fusion, competitive on cost

Clarity is meant to supplement rather than replace lidar, radar, or ultrasonic sensors, Grannan said. “We’re not religious around the sensor suite,” he said. “Combining data sources makes sense for fault tolerance and redundancy, particularly as we get to Level 5 autonomy.”

How does Clarity handle inclement weather? “Our system works well in low light, and our algorithms work the same when we use near-IR [infrared],” replied Grannan. “To deal with rain and heavier fog, we’re starting to look at shortwave IR.”

The Clarity platform uses off-the-shelf cameras sourced from existing automotive supply chains, keeping costs low, he said. Light said it can take advantage of constant innovations in camera technology and algorithms.

“Our clear advantage is that we provide lidar-like accuracy for the cost of cameras,” Grannan said. “It costs tens of thousands of dollars for lidar, while our cameras and ASIC [application-specific integrated circuit] cost OEMs $250 to $260. Since ADAS like lane keeping already uses cameras, they’d need to calibrate them to our specifications but may not need to spend much more.”

“For a 360-degree view, it may cost only $1,000 to the automaker,” he said. “We’re in discussions with design groups, which see camera placement as a simpler problem than for lidar. In the front, two cameras could be placed in the A pillars [around the windshield] and one behind the rearview mirror. We’re also looking at the B and C pillars in the back or in the bumpers.”

Light is in talks with full-stack autonomous vehicle providers, as well as Tier 1 suppliers and the automotive OEMs themselves, Grannan said.

“The automakers say the tolerances for vibration and XYZ cameras are well within their capabilities,” he said. “There’s also applicability for affordable platforms for lower levels of driving assistance.

“We want to be in the next-generation Level 4 and Level 5 platforms that will be tested next year,” said Grannan. “Perception problems are not solved. There’s nothing that solves misidentification problems. … The industry needs to get out of small ring-fenced trials to scale to market. We have the safety to make people comfortable. If we get to regulations for perception, it would be a failure of the industry, which will probably come up with a framework on its own.”

“There’s a place for lidar, radar, and ultrasonic sensors, but there was a missing piece,” he said. “We’ve got a unique offering — three-dimensionality, with size, distance, and velocity — for safety in the self-driving stack.”

Light is hiring and expects to conduct a fundraising round later this year or early next, said Grannan.

About The Author

Eugene Demaitre

Eugene Demaitre is editorial director of the robotics group at WTWH Media. He was senior editor of The Robot Report from 2019 to 2020 and editorial director of Robotics 24/7 from 2020 to 2023. Prior to working at WTWH Media, Demaitre was an editor at BNA (now part of Bloomberg), Computerworld, TechTarget, and Robotics Business Review.

Demaitre has participated in robotics webcasts, podcasts, and conferences worldwide. He has a master's from the George Washington University and lives in the Boston area.

Copyright © 2025 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media