The Robot Report


How Carbon Robotics built the large plant model for its laser weeding robot

By Brianna Wessling | November 7, 2025

Carbon Robotics founder and CEO Paul Mikesell with the LaserWeeder G2. | Source: Carbon Robotics

Carbon Robotics has laser-weeding robots running in 14 countries around the world, taking on a variety of crops, weather conditions, and weed types. Behind all of these robots is the “large plant model” that the company has been developing since its first days in operation.

Paul Mikesell, the CEO of Carbon Autonomous Robotic Systems Inc., started the company after speaking with a farmer about the challenges of agriculture today.

“I sold an airplane to a farmer, and we just started talking,” he told The Robot Report. “I realized that agriculture had big problems, and that nobody seemed to be focusing on it and trying to solve them. I could immediately see how to build some value there, based on everything I was hearing.”

Mikesell saw an opportunity for AI and robotics to help farmers with weeding, a process that can be time-consuming, costly, and ineffective with traditional methods. As he set out to solve the problem, he knew he would need a robot that could get to work quickly and provide value to farmers.

“I think having something working quickly and early is really important in any of these kinds of companies, versus starting off with a big vision and then spending years just trying to build it,” Mikesell said.

Carbon’s team spent years working on farms, on laptops out of trailers, gathering data about real-world field conditions. “The farmers were super welcoming, very insightful, and also open with us about what their problems were and where they could see ROI based on what we were building,” he said.

This data would become crucial for the Seattle-based company’s large plant model (LPM), which enables its current system to identify and target weeds.

Carbon Robotics gathered much of its early data itself

“So, what you need at the beginning of any of these AI projects is a good data set to train across this data. Ours were all curated data sets,” explained Mikesell. “We were literally taking the pictures ourselves. We were doing all the labeling ourselves.”

Creating an AI system that works in real-world conditions doesn’t just require a lot of data; it also requires high-quality data. To that end, Carbon Robotics spent a lot of time perfecting the lighting system around its cameras to ensure that each picture the robot collects is clear in varied outdoor lighting conditions.

“Early on, we developed a pretty incredible lighting system that gives us just beautiful pictures without shadow. So, not only do we have the largest dataset, but it’s probably also the best pictures, with high resolution and perfect lighting,” Mikesell said. “We have all the geo-tagged information, so we know where they were, what time of day, and all that kind of stuff.”

“Our lights flash five times as bright as the sun, but it doesn’t look like it, because the duty cycle is so light. It’s a 10% duty cycle, so when your eyes look at it, it just kind of looks medium bright,” he added.
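The arithmetic behind that “medium bright” impression is easy to check. A minimal sketch, normalizing steady sunlight to 1.0 (an arbitrary baseline chosen here for illustration; only the 5x peak and 10% duty-cycle figures come from the interview):

```python
# Back-of-the-envelope check of the strobe lighting trade-off described
# above. The 5x peak and 10% duty cycle are from the interview; the
# "sun = 1.0" normalization is an arbitrary baseline for this sketch.
SUN = 1.0            # normalized solar irradiance
PEAK = 5.0 * SUN     # flash peak: five times as bright as the sun
DUTY_CYCLE = 0.10    # lights are on 10% of each flash period

# Time-averaged brightness the eye integrates over a flash period
average = PEAK * DUTY_CYCLE
print(average)       # 0.5 -- half of steady sunlight, hence "medium bright"
```

The camera, synchronized to the flash, sees the full 5x peak; the eye, averaging over time, sees only half of sunlight.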

Mikesell noted that other outdoor robots use a skirt around the camera system to block ambient light. This works for much of the day, but when the sun is rising or setting and light comes in from the horizon line, performance can still suffer.

Starting a ‘data flywheel’ quickly is key

When the LaserWeeder kills weeds, the nutrients from the weeds go back into the soil to fertilize crops, said Mikesell. | Source: Carbon Robotics

Eventually, Carbon Robotics’ team was able to build its own labeling tools and create an AI model. “We were always concerned about how you get enough data quickly,” Mikesell said. “What happened was we got our AI to be good enough, quickly enough, that then we started deploying and selling machines.”

This opens the door to creating a data flywheel. As robots in the field collect more high-quality, real-world data every day, the model becomes more robust. To enable this, the team prioritized capturing data from deployed robots.

“Once you get that flywheel going, then you’ve got this nice dataset, and you can do some pretty amazing things,” according to Mikesell. “We’ve been able to convert our AI now into this new type of AI we announced this year, called a large plant model or the LPM. The point of the LPM is that it can generalize now about plant types without having to know anything ahead of time about what you’re trying to do.”

“That means we can go into a new crop that we’ve never seen before, and without any retraining at all, we can tell the AI, this is your crop and these are your weeds,” he continued.
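As a rough illustration of that zero-shot workflow, the sketch below shows what declaring the crop and weed classes at configuration time, rather than at training time, might look like. Every name and interface here is hypothetical; Carbon Robotics has not published its API, and this is not its code.

```python
# Hypothetical sketch of the zero-shot setup described above: instead of
# retraining for each new crop, the operator declares which detected class
# is the crop and which are weeds. All names are invented for illustration.
from dataclasses import dataclass

@dataclass
class FieldConfig:
    crop: str
    weeds: list

def decide(detection_label, config):
    """Route a generic plant detection to a weeding decision."""
    if detection_label == config.crop:
        return "spare"
    if detection_label in config.weeds:
        return "target"      # aim the laser
    return "review"          # unknown plant: flag it, don't fire

cfg = FieldConfig(crop="onion", weeds=["pigweed", "purslane"])
print(decide("pigweed", cfg))  # target
print(decide("onion", cfg))    # spare
```

The point of the generalization Mikesell describes is that the plant-recognition model itself needs no retraining; only this thin configuration layer changes per field.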

Will Carbon use generative AI in the future?

Mikesell noted that Carbon Robotics isn’t currently using generative AI in its robots, but he does see the potential for the technology to make an impact in agriculture.

“One place could be synthetic training data,” he said. “There may be a debate about whether or not you would call that generative AI, but that’s adding pieces of synthetic training data to make your AI models better. What you’re doing there is using a really big, slow, and expensive model to produce data to train what’s supposed to be your fast and light deployment.”
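That teacher-student pattern, a large, expensive model producing synthetic examples to help train a small, fast deployed model, can be sketched as follows. Everything here is a toy stand-in; none of these functions reflect Carbon Robotics’ actual pipeline.

```python
# Illustrative sketch of the pattern Mikesell describes: a big, slow
# "teacher" generates synthetic labeled examples that augment the real
# training set for a lightweight "student" deployed on the robot.
# All names are hypothetical; this is not Carbon Robotics' code.

def teacher_generate(n):
    """Stand-in for an expensive generative model emitting
    (image, label) pairs; faked here with toy tuples."""
    return [(f"synthetic_img_{i}", "weed" if i % 2 else "crop")
            for i in range(n)]

def train_student(real_data, synthetic_data):
    """Stand-in for training the fast deployment model on the
    combined real + synthetic dataset."""
    dataset = list(real_data) + list(synthetic_data)
    return {"num_examples": len(dataset)}

real = [("field_img_0", "crop"), ("field_img_1", "weed")]
synthetic = teacher_generate(8)
student = train_student(real, synthetic)
print(student["num_examples"])  # 10
```

The expensive generation step runs offline; only the small student model has to meet the robot’s real-time latency budget.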

“The other thing, of course, would be human interaction, because generative AI works really well for speech interaction with humans,” added Mikesell. “So we will do more of that in the future.”

Mikesell has also seen companies use generative AI to sift through farm data and present it to farmers in easy-to-understand ways. For example, it could enable a farmer to simply ask a system about their yields or crop health.

Carbon Robotics brings autonomy to tractors

Earlier this year, Carbon Robotics released the Carbon Autonomy Tractor Kit (ATK), a retrofit for existing tractors. Mikesell said the company had already developed an autonomy system for its LaserWeeder, which it eventually decided to sell as a pull-behind unit instead of an independent one.

It’s now using this technology to bring autonomy to farmers working with planted fields.

“We decided to tackle what we think was the harder but maybe more opportune task, which is these planted-field tasks,” said Mikesell. “So I’m talking about everything from laser weeding to spraying, cultivating, bed forming, tillage, and field setup.”

“These folks already have their tractors, and they typically want to be able to put somebody in the tractor at various parts of the day for different operations,” he said. Carbon said the kit provides farmers with more flexibility, so they can take advantage of autonomy when needed.



Carbon brings in fresh funding, teases new product line

Last month, Carbon Robotics raised $20 million in a Series D-2 extension round, led by Giant Ventures. Mikesell said the new funding will enable the company to finish developing its latest product line.

“It’s focused entirely on a third product, a new product line that we’re developing that we have not announced or spoken about yet, other than to say that it’s a reuse of our existing AI stack in some new ways, with some new robots that we’re building,” Mikesell said.

About The Author

Brianna Wessling

Brianna Wessling is an Associate Editor, Robotics, WTWH Media. She joined WTWH Media in November 2021, after graduating from the University of Kansas with degrees in Journalism and English. She covers a wide range of robotics topics, but specializes in women in robotics, robotics in healthcare, and space robotics.

She can be reached at [email protected]

Copyright © 2025 WTWH Media LLC. All Rights Reserved. The material on this site may not be reproduced, distributed, transmitted, cached or otherwise used, except with the prior written permission of WTWH Media