Cornell Tech researchers have developed a mixed reality (XR) driving simulator system that could reduce the cost of testing vehicle systems and interfaces, such as the turn signal and dashboard.
Using a commercially available headset, virtual objects and events are superimposed on the view of participants driving unmodified vehicles through low-speed test areas, enabling the collection of accurate and repeatable data on behavioral responses in real driving tasks.
Doctoral student David Goedicke is the lead author of “XR-OOM: Mixed Reality Driving Simulation with Real Cars for Research and Design,” which he will present at the Association for Computing Machinery’s CHI 2022 conference, held April 30 to May 5 in New Orleans.
The senior author is Wendy Ju, associate professor of information science at the Jacobs Technion-Cornell Institute at Cornell Tech and the Technion, and a member of the information science field at Cornell. Hiroshi Yasuda, a human-computer interaction researcher at the Toyota Research Institute, also contributed to the study.
This work is an offshoot of Ju’s lab research conducted in 2018, which resulted in VR-OOM, a virtual reality on-road driving simulation program. This current work goes one step further, Goedicke said, by combining real-world video — known as “video pass-through” — in real time, with virtual objects.
“What you’re trying to do is create scenarios,” he said. “You want to feel like you’re driving in a car, and the developer wants full control over what scenarios you want to show a participant. Ultimately, you want to use the real world as much as possible.”
The system was built using the Varjo XR-1 mixed reality headset together with the Unity simulation environment, which previous researchers have shown can be used for driving simulation in a moving vehicle. XR-OOM combines and validates these components as a usable driving simulation system that incorporates real-world variables in real time.
“One of the problems with traditional simulation testing is that it really only considers scenarios and situations that designers have thought about,” Ju said, “so a lot of important things happening in the world are not captured in these experiments. (XR-OOM) increases the ecological validity of our studies, to be able to understand how people will behave in really specific circumstances.”
A key challenge with XR, compared with VR, is rendering the outside world faithfully, Goedicke said. In mixed reality, what appears on the video screen must match the outside world precisely.
“In virtual reality, you can fool the brain very easily,” he said. “If the horizon doesn’t quite match, for example, it’s not a big problem. Or if you do a 90-degree turn but it was more like 80 degrees, your brain doesn’t care that much. But if you try to do that with mixed reality, where you bring in elements from the real world, it doesn’t work at all.”
To test the validity of their method, the researchers designed an experiment with three conditions: no headset (Condition A); headset with video pass-through only (Condition B); and headset with video pass-through plus virtual objects (Condition C).
Participants were asked to perform several stationary tasks, including starting the vehicle, adjusting the seat and mirrors, fastening the seat belt, and verbally describing the visible dashboard lights. They were also asked to complete several low-speed driving tasks, including left and right turns, slalom navigation, and stopping on a line. Drivers in Conditions A and B had to navigate around actual physical cones placed 8 feet apart; those in Condition C saw virtual cones superimposed in their headsets.
Most participants successfully completed all the cockpit tasks, with most unsuccessful attempts attributable to unfamiliarity with the vehicle. Most also passed the driving tasks, with slalom navigation proving the most challenging regardless of condition.
This success, Ju said, validates the technology’s potential as a low-cost alternative to elaborate facilities for testing certain in-vehicle technologies.
“This type of high-resolution mixed-reality headset is becoming more widely available, so now we’re thinking about how to use it for driving experiences,” Ju said. “More people will be able to take advantage of these things, which will be commercially available and inexpensive very soon.”
Other co-authors include Sam Lee, MS ’21; and PhD students Alexandra Bremers and Fanjun Bu.
This research was funded by the Toyota Research Institute.