Building on its nuScenes autonomous driving dataset, Motional today open-sourced nuReality, a set of custom virtual reality (VR) environments designed to study interactions between autonomous vehicles (AVs) and pedestrians. Motional said open-sourcing the environments will help the broader research community study these interactions.
Motional said a key challenge to widespread acceptance and adoption of autonomous vehicles is effective communication between AVs and other road users. For example, how will an autonomous vehicle acknowledge that it sees a cyclist or pedestrian who is trying to cross a street?
To study this scenario and others, Motional created the VR environments with animation studio CHRLX. The 10 VR scenarios are modeled after an urban four-way intersection with no marked pedestrian crossing, stop signs, stoplights, or other elements indicating that the car would lawfully have to come to a complete stop. The scenarios include:
- A human driver stopping at an intersection
- An autonomous vehicle stopping at an intersection
- A human driver not stopping at an intersection
- An autonomous vehicle not stopping at an intersection
- An autonomous vehicle using expressive behavior, such as a light bar or sounds, to signal its intentions
Motional uses two vehicle models in the VR environments: a conventional, human-driven vehicle and an autonomous vehicle without a human operator. Both are modeled after a white 2019 Chrysler Pacifica, but the autonomous vehicle has some notable differences: it carries both side-mirror and roof-mounted LiDAR sensors and has no visible occupants. The human-driven model includes a male driver who looks straight ahead and remains motionless during the interaction.
In the clip below, the approaching Motional robotaxi uses an LED strip in the front windshield to indicate that the vehicle is stopping.
In this next clip, Motional said the approaching robotaxi’s nose dips to signal that the vehicle is stopping.
And in this third clip, the autonomous vehicle plays exaggerated brake and RPM-reduction sounds to alert other road users that it's coming to a stop.
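Motional hasn't published the software behind these cues, but conceptually an "expressive behavior" layer sits between the vehicle's motion planner and its outward-facing hardware. The sketch below is a minimal, hypothetical illustration of that idea; every name and value in it is an assumption made for this example, not part of nuReality or Motional's actual stack.

```python
# Hypothetical sketch (not Motional's code): one way an "expressive
# behavior" layer could map a planner's stopping intent to external
# cues like those shown in the clips above. All names are illustrative.

from dataclasses import dataclass
from enum import Enum, auto


class Intent(Enum):
    YIELDING_TO_PEDESTRIAN = auto()
    PROCEEDING = auto()


@dataclass
class ExpressiveCues:
    led_bar_pattern: str   # pattern shown on a windshield LED strip
    brake_profile: str     # e.g., an exaggerated nose-dip deceleration
    sound_effect: str      # e.g., amplified brake / RPM-reduction audio


def cues_for_intent(intent: Intent) -> ExpressiveCues:
    """Map the vehicle's current intent to outward-facing signals."""
    if intent is Intent.YIELDING_TO_PEDESTRIAN:
        return ExpressiveCues(
            led_bar_pattern="sweeping_stop",
            brake_profile="exaggerated_nose_dip",
            sound_effect="amplified_braking",
        )
    return ExpressiveCues(
        led_bar_pattern="off",
        brake_profile="normal",
        sound_effect="none",
    )


if __name__ == "__main__":
    # A pedestrian is waiting to cross: emit all three cues at once.
    print(cues_for_intent(Intent.YIELDING_TO_PEDESTRIAN))
```

The point of such a layer is that the same internal decision, "I am yielding," can drive several redundant external signals simultaneously, which is exactly what the nuReality scenarios are set up to evaluate.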
You can view more clips from Motional here. Motional said it is applying some of the lessons from these VR tests to the development of its SAE Level 4 autonomous vehicles.
“By making nuReality open source, we hope these VR files will accelerate research into pedestrian-AV interactions and Expressive Robotics,” said Paul Schmitt, principal engineer of planning, controls, and drive-by-wire architecture at Motional. “It’s like the old saying: If you want to go fast, go alone. But, if you want to go far, go together.”
Motional, the autonomous vehicle company that is a $4 billion joint venture between Hyundai and Aptiv, recently said it plans to launch a fully driverless robotaxi service in Las Vegas in 2023. It’s been testing robotaxis in Las Vegas for nearly four years, completing over 100,000 passenger trips, but with a human safety driver inside each car.
In November 2020, Motional received the go-ahead from the state of Nevada to test fully driverless vehicles on public roads. While Nevada doesn’t require a human safety driver behind the wheel, Motional has still kept a safety driver in the passenger seat as an extra precaution.
Motional CTO Laura Major recently joined The Robot Report Podcast to discuss the challenges of developing and deploying Motional’s technology, including the service in Las Vegas. She also discussed when the safety drivers could potentially be removed and when Motional’s service will be commercially available to fleet operators. You can listen to the conversation with Major below, starting at about the 48-minute mark.
Motional has also partnered with Derq to test how autonomous vehicles react when given a broader view of the road than their onboard sensors provide. At two intersections in Las Vegas, cameras mounted high above the road feed Derq’s AI system, which transmits data to Motional’s vehicles, giving them a different vantage on some of the toughest intersections they navigate.
Motional’s autonomous vehicles use a sensor suite of advanced LiDAR, cameras, and radar that sees up to 300 meters away and 360 degrees around the vehicle. Supplementing that onboard view with infrastructure data could help the vehicles navigate these challenging intersections.
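Neither Motional nor Derq has detailed how the infrastructure data is consumed onboard, but one simple way to picture it is as a merge step: the vehicle keeps its own detections and adds any infrastructure-reported objects its sensors haven't already seen. The sketch below is a hypothetical illustration under that assumption; the types and function names are invented for this example and are not Derq's or Motional's API.

```python
# Hypothetical sketch (not Derq's or Motional's API): merging detections
# from an infrastructure camera with onboard detections so the planner
# also sees objects the onboard sensors may miss. Names are illustrative.

from dataclasses import dataclass


@dataclass
class Detection:
    object_id: str
    x: float          # position in a shared map frame, meters
    y: float
    source: str       # "onboard" or "infrastructure"


def merge_detections(onboard, infrastructure, match_radius=2.0):
    """Keep all onboard detections; add infrastructure detections that
    don't match an onboard one within match_radius meters."""
    merged = list(onboard)
    for det in infrastructure:
        duplicate = any(
            (det.x - ob.x) ** 2 + (det.y - ob.y) ** 2 <= match_radius ** 2
            for ob in onboard
        )
        if not duplicate:
            merged.append(det)
    return merged


if __name__ == "__main__":
    onboard = [Detection("ped_1", 10.0, 3.0, "onboard")]
    overhead = [Detection("ped_2", 42.0, -7.5, "infrastructure")]
    for det in merge_detections(onboard, overhead):
        print(det)
```

In practice such a merge would also need consistent timestamps and coordinate transforms between the camera and the vehicle, but the basic value proposition is the same: the overhead cameras contribute objects that are occluded from street level.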