
Source: Sense Photonics
Whether they’re called lidar sensors or 3D time-of-flight cameras, accurate and affordable perception is essential to the development of new mobile robots, autonomous vehicles, and industrial applications such as inspection or security. Sense Photonics Inc. makes a solid-state camera that it says can provide long-range sensing, distinguish intensity, and work in sunlight.
The Durham, N.C.-based company announced its Osprey modular FLASH lidar in January 2020 and was an exhibitor at MODEX 2020. Sense Photonics was also a 2020 RBR50 innovation award winner (Robotics Business Review is a sibling publication to The Robot Report).
Sense Photonics named Shauna McIntyre, a 25-year automotive industry veteran, its new CEO in April. She recently spoke with The Robot Report about her experience and approach to sensing and robotics.
Coming from Google and Lithia Motors, how did you see sensing?
McIntyre: At Google Maps, we were always trying to develop better maps for consumers, but also for objects to have maps. For example, both inside and outside buildings, there's a woeful inadequacy of maps for moving through factories, warehouses, etc. I knew the need was there, and I was one of the first at Ford to do lean manufacturing and automate a plant with programmable closed-loop robots.
In the world of objects, where things need to move around a space for health, productivity, or environmental reasons, I was exposed to the huge need for better maps.
How did you come to Sense Photonics?
McIntyre: I was approached by a recruiter and found out that there are tons of lidar companies out there. Sense Photonics has made good decisions to be economically viable. I have not seen another company like it in the intelligent vision space. Most sensors make detecting moving objects on vehicles too complex and expensive to do well. I wanted to take a swing at changing the world.
How has the COVID-19 pandemic affected you and the company?
McIntyre: Its effect on the automotive industry affected everybody, especially in sensing.
One surprising silver lining of the pandemic is that it’s forcing businesses to rethink how the world works and understand the need for object-based sensing. I’m glad I made the choice to come over; there’s a stronger sense of urgency.
At Sense Photonics, we’ve found ways to still be productive, and the company was granted “essential” status. Our high-performing team has been so busy that you wouldn’t know there was a pandemic. We’ve had two shifts in the clean room and are firing on all cylinders.
A lot of lab calibration and light machinery work could be done remotely, and we did drive testing in a van. Our team has multiple locations in the [San Francisco] Bay Area, as well as in North Carolina and Scotland.
The terminology around sensors for robots and self-driving cars has been shifting. Are we talking about 3D cameras, lidar, or machine vision? Is it just marketing hype?
McIntyre: What Sense Photonics is doing is unique in that it’s 3D vision systems. We characterize it that way because it’s not exclusive to lidar. Osprey has capabilities beyond human eyes, but it’s camera-like and can fuse with RGB data and with FLASH single-light systems.
Our system gathers an order of magnitude more data than competing systems to build a better picture. Better data helps populate the autonomy stack.
There is a lot of confusion around vision systems. Sense Photonics' lidar is solid-state. Some might claim to be solid-state even though they have moving parts; we have no moving parts. That's important for reliability and precision, since a system without moving parts doesn't have to compensate for them on top of any bouncing around on a moving forklift or an AGV [automated guided vehicle].
Speaking of vehicles, is Sense Photonics focusing on any one type?
McIntyre: The duty cycle of vehicles is inherently variable, so the current wave of automation has to be variable. Programmable robots had use cases, but we need more intelligence now. We need that flexibility in warehouses, and we need reliability to ensure that we can trust the intelligence coming from sensors.
As for mobile robots versus vehicles, I give the company credit for not basing itself on one sector. I know the time it takes to validate products for vehicle safety, and trends change. I appreciate that Sense Photonics has a broader view and has embraced industrial applications.
We build 3D intelligent vision systems that are industry-agnostic. We have great proof points in stationary applications like security or feeding into fleet-management systems. We’re a platform, and use cases today are very much in industrial applications.
For example, we’ve worked with Honeywell on stationary sensing. To validate a product on moving technology is very difficult, and we’ve already got a lot of good data from stationary uses, such as to eliminate truck idling in bays or to provide data on warehouse efficiency.
Autonomous vehicles will take a while — and co-development with automakers. There are also potential consumer and medical applications, and our platform can apply to a lot of them.
Sense Photonics won a 2020 RBR50 innovation award for Osprey. What sorts of use cases have you seen for it so far?
McIntyre: They include stationary security, as well as picking and inventory counting. The camera has provided an 80% improvement in the level of specificity for counting parts in bins passing through a warehouse.
We have agreements and relationships in the last-mile robotic delivery space, and Osprey integrates nicely into autonomous vehicles. Nobody wants a coffee can on top of a robot or car. Design is really important, and we've separated the emitter and receiver to integrate better with others' designs. Our understanding from working with designers is another inherent advantage.

The Osprey solid-state lidar sensor. Source: Sense Photonics
How much is Sense Photonics working on the software and data side?
McIntyre: Hardware is easily commoditized, and moving up the stack is a high priority for us. We have a team and are actively recruiting and hiring. We're providing hardware, an SDK [software development kit], and object lists. Our vice president of software brings wonderful leadership from drone vision at NVIDIA.
We have an advantage with the amount of data. It’s then a matter of using artificial intelligence and machine learning to continuously improve systems and build a learning system. The more we can help customers use data, the better. It’s a big and exciting area.
What is Sense Photonics working on now?
McIntyre: We have products available now, and more are coming soon.
We also want to be the first true solid-state lidar for longer ranges, up to 200 meters. Although automotive companies are most interested, other robotics companies have expressed interest.
Safety-critical applications like aircraft or car brakes have to have redundancy. It will have to be the same in autonomous vehicles, which also have performance and economic demands.
Where are you with funding and plans for future growth?
McIntyre: We're still on our Series A and have a good runway. Given the uncertainty around the pandemic, we'll be kicking off Series B fundraising, for an amount yet to be determined, a bit early.
I’m impatient and want to scale as quickly as possible while we have an advantage. We’ve hired three executives in recent months, and the sooner we scale, the better.
To date, Sense Photonics has been very focused on R&D and is not as big a name as other sensor companies. We’ve been able to focus on building an outstanding product rather than building a huge organization. That’s one reason why we’ve been able to recruit from competitors.