Jet Fighter With a Steering Wheel: Inside the Augmented-Reality Car HUD
The 2022 Mercedes-Benz EQS, the first all-electric sedan from the company that essentially invented the automobile in 1885-1886, glides through Brooklyn. But this is definitely the 21st century: Blue directional arrows seem to paint the pavement ahead via an augmented-reality (AR) navigation system and color head-up display, or HUD. Digital street signs and other graphics are superimposed over a camera view on the EQS's much-hyped Hyperscreen, a 142-centimeter (56-inch), dash-spanning wonder that includes a 45-cm (17.7-inch) OLED center display. But here's my favorite bit: As I approach my destination, AR street numbers appear and then fade in front of buildings as I pass, like flipping through a virtual Rolodex; there's no more craning your neck and getting distracted while trying to locate a home or business. Finally, a graphical map pin floats over the real-time scene to mark the journey's end.
It's cool stuff, albeit for folks who can afford a showboating Mercedes flagship that starts above US $103,000 and topped $135,000 in my EQS 580 test car. At CES 2022 in Las Vegas, however, Panasonic unveiled a more affordable HUD that it says should reach a production car by 2024.
Head-up displays have become a familiar automotive feature, floating a speedometer, speed limit, engine rpm, or other information in the driver's view to help keep eyes on the road. Luxury cars from Mercedes, BMW, Genesis, and others have recently broadened HUD horizons with larger, crisper, more data-rich displays.
Video: Mercedes-Benz augmented-reality navigation [youtu.be]
Panasonic, powered by Qualcomm processing and AI navigation software from Phiar Technologies, hopes to push into the mainstream with its AR HUD 2.0. Its advances include an integrated eye-tracking camera to accurately match AR images to a driver's line of sight. Phiar's AI software lets it overlay crisply rendered navigation icons and spot or highlight objects including vehicles, pedestrians, cyclists, barriers, and lane markers. The infrared camera can monitor potential driver distraction, drowsiness, or impairment, with no need for a standalone camera as with GM's semiautonomous Super Cruise system.
Panasonic's AR HUD system includes eye tracking to match AR images to the driver's line of sight. Panasonic
Andrew Poliak, CTO of Panasonic Automotive Systems Company of America, said the eye tracker detects a driver's height and head movement to adjust images within the HUD's eyebox.
"We can improve fidelity in the driver's field of view by knowing precisely where the driver is looking, then matching and focusing AR images to the real world much more precisely," Poliak said.
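Conceptually, that correction is straightforward geometry: find where the line from the tracked eye position to a real-world target crosses the HUD's virtual image plane, and draw the graphic there. Here is a minimal Python sketch of the idea, with hypothetical names and made-up numbers rather than Panasonic's actual code:

```python
# Hypothetical illustration (not Panasonic's code): keep an AR marker
# registered with a real-world point as the driver's eye position moves
# within the HUD's eyebox.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float  # lateral (m), + = right
    y: float  # vertical (m), + = up
    z: float  # forward (m), + = ahead of the driver

def project_to_virtual_image(eye: Vec3, target: Vec3, image_plane_z: float) -> tuple[float, float]:
    """Intersect the eye-to-target ray with the HUD's virtual image plane.

    eye:           tracked eye position from the IR driver-monitoring camera
    target:        real-world point to annotate (e.g., a lane edge 30 m ahead)
    image_plane_z: distance of the HUD's virtual image plane (e.g., 10-40 m)

    Returns (x, y) on the virtual image plane where the graphic must be drawn
    so it appears to sit on the target from this eye position.
    """
    t = (image_plane_z - eye.z) / (target.z - eye.z)  # ray parameter at the plane
    x = eye.x + t * (target.x - eye.x)
    y = eye.y + t * (target.y - eye.y)
    return x, y

# A taller driver (higher eye point) needs a slightly different placement
# for the same road point, which is why the eye tracker matters.
low_eye = Vec3(0.0, 1.20, 0.0)
high_eye = Vec3(0.0, 1.35, 0.0)
lane_point = Vec3(1.5, 0.0, 30.0)  # point on the pavement, 30 m ahead

print(project_to_virtual_image(low_eye, lane_point, image_plane_z=20.0))
print(project_to_virtual_image(high_eye, lane_point, image_plane_z=20.0))
```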
For a demo on the Las Vegas Strip, with a Lincoln Aviator as test mule, Panasonic used its SkipGen infotainment system and a Qualcomm Snapdragon SA8155 processor. But AR HUD 2.0 could work with a range of in-car infotainment systems. That includes a new Snapdragon-powered generation of Android Automotive, an open-source infotainment ecosystem distinct from the Android Auto phone-mirroring app. The first-generation, Intel-based system made an impressive debut in the Polestar 2, from Volvo's electric brand. The uprated Android Automotive will run in 2022's lidar-equipped Polestar 3 SUV and potentially in millions of cars from General Motors, Stellantis, and the Renault-Nissan-Mitsubishi Alliance.
Gene Karshenboym helped develop Android Automotive for Volvo and Polestar as Google's head of hardware platforms. Now he's chief executive of Phiar, a software company in Redwood City, Calif. Karshenboym said AI-powered AR navigation can greatly reduce a driver's cognitive load, especially as modern cars put ever more information at drivers' eyes and fingertips. Current embedded navigation screens force drivers to look away from the road and translate 2D maps as they hurtle along.
"It's still too much like using a paper map, and you have to localize that information with your brain," Karshenboym says.
In contrast, following arrows and stripes displayed on the road itself (a digital yellow brick road, if you will) reduces fatigue and the notorious stress of map reading. It's something that many direction-dueling couples might give thanks for.
"You feel calmer," he says. "You're just looking forward, and you drive."
Video: Street testing Phiar's AI navigation engine [youtu.be]
The system classifies objects on a pixel-by-pixel basis at up to 120 frames per second. Potential hazards, like an upcoming crosswalk or a pedestrian about to dash across the road, can be highlighted by AR animations. Phiar trained its AI on synthetic models of snowstorms, poor lighting, and other conditions, teaching it to fill in the blanks and build a reliable picture of its environment. And the system doesn't require granular maps, monster computing power, or pricey sensors such as radar or lidar. Its AR tech runs off a single front-facing, roughly 720p camera, powered by a car's onboard infotainment system and CPU.
"There's no additional hardware necessary," Karshenboym says.
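A rough sketch of that single-camera pipeline, reading frames from one roughly 720p camera, running a per-pixel classifier, and tinting potential hazards for the overlay, might look like the following. The model is a stub and the helper names are hypothetical; this illustrates the approach, not Phiar's actual code.

```python
# Hypothetical sketch of a per-frame AR navigation pipeline running on the
# infotainment CPU: one ~720p front camera in, per-pixel class labels out,
# overlays drawn on top. Model and helper names are illustrative only.

import cv2          # OpenCV for camera capture and display
import numpy as np

CLASSES = ["road", "vehicle", "pedestrian", "cyclist", "barrier", "lane_marker"]

def segment_frame(frame: np.ndarray) -> np.ndarray:
    """Placeholder for the neural network: returns an HxW array of class IDs.

    A production system would run a compact segmentation model here, fast
    enough to keep up with the camera (Phiar cites up to 120 frames per second).
    """
    return np.zeros(frame.shape[:2], dtype=np.uint8)  # stub: everything is "road"

def draw_overlays(frame: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Tint potential hazards so they can be highlighted in the HUD."""
    out = frame.copy()
    hazard_mask = np.isin(labels, [CLASSES.index("pedestrian"), CLASSES.index("cyclist")])
    out[hazard_mask] = (0.5 * out[hazard_mask] + 0.5 * np.array([0, 0, 255])).astype(np.uint8)
    return out

cap = cv2.VideoCapture(0)                    # single front-facing camera
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)      # roughly 720p
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    labels = segment_frame(frame)            # pixel-by-pixel classification
    cv2.imshow("AR preview", draw_overlays(frame, labels))
    if cv2.waitKey(1) == 27:                 # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```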
The company is also making its AR markers appear more convincing by "occluding" them with elements from the real world. In Mercedes's system, for example, directional arrows can run atop cars, pedestrians, trees, or other objects, slightly spoiling the illusion. In Phiar's system, those objects can block off portions of a "magic carpet" guidance stripe, as though it were physically painted on the pavement.
"It brings an incredible sense of depth and realism to AR navigation," Karshenboym says.
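Here is a minimal sketch of how that occlusion could be composited, assuming a per-pixel segmentation mask is already available; the function and class IDs are illustrative, not Phiar's API.

```python
# Hypothetical illustration of "occluding" an AR guidance stripe with real-world
# objects: pixels that the segmentation network labels as foreground (vehicles,
# pedestrians, and so on) are excluded from the overlay, so the stripe appears
# to pass behind them as if painted on the pavement.

import numpy as np

def composite_guidance_stripe(frame: np.ndarray,
                              stripe: np.ndarray,
                              labels: np.ndarray,
                              foreground_ids: set[int],
                              alpha: float = 0.6) -> np.ndarray:
    """Blend the rendered stripe into the camera frame, except where occluded.

    frame:          HxWx3 camera image
    stripe:         HxWx3 rendered guidance graphic (zeros where no stripe)
    labels:         HxW per-pixel class IDs from the segmentation model
    foreground_ids: class IDs that should occlude the stripe
    """
    stripe_mask = stripe.any(axis=2)                   # where the stripe is drawn
    occluded = np.isin(labels, list(foreground_ids))   # real objects in front
    visible = stripe_mask & ~occluded                  # stripe minus occluders
    out = frame.astype(np.float32)
    out[visible] = (1 - alpha) * out[visible] + alpha * stripe[visible]
    return out.astype(np.uint8)
```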
Once visual data is captured, it can be processed and sent anywhere an automaker chooses, whether a center display, a HUD, or passenger entertainment screens. Those passenger screens could be ideal for Pokémon-style games, the metaverse, or other applications that combine real and virtual worlds.
Poliak said some current HUD units hog up to 14 liters of volume in a car. A goal is to reduce that to 7 liters or less, while simplifying the design and cutting costs. Panasonic says its single optical sensor can effectively mimic a 3D effect, taking a flat image and angling it to offer a generous 10- to 40-meter viewing range. The system also advances an industry trend by integrating display domains, including a HUD or driver's cluster, into a central, powerful infotainment module.
"You get smaller packaging and a lower price point to get into more entry-level vehicles, but with the HUD experience OEMs are clamoring for," Poliak said.
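To make the tilted-image idea above concrete, here is a simplified, illustrative mapping that assumes the virtual image's bottom edge sits near 10 meters and its top edge near 40 meters, so graphics for nearer objects are drawn lower in the image; the numbers and function are assumptions for illustration, not Panasonic's optical design.

```python
# Illustrative geometry only (assumed values, not Panasonic's design): if a flat
# virtual image is tilted so its bottom edge sits about 10 m from the driver's
# eyes and its top edge about 40 m, depth along the image varies linearly with
# height, and a road annotation can be drawn on the row whose virtual distance
# best matches the real object's distance.

NEAR_M, FAR_M = 10.0, 40.0   # virtual image distance at the bottom and top rows

def row_for_distance(distance_m: float, image_height_px: int = 720) -> int:
    """Pick the display row whose virtual depth is closest to the target distance.

    Row 0 is the top of the image (far edge); the last row is the bottom (near
    edge). Distances outside the 10-40 m span are clamped to the nearest edge.
    """
    d = min(max(distance_m, NEAR_M), FAR_M)
    frac_from_far = (FAR_M - d) / (FAR_M - NEAR_M)   # 0 at 40 m, 1 at 10 m
    return round(frac_from_far * (image_height_px - 1))

# Examples: a turn arrow 25 m ahead vs. a crosswalk warning 12 m ahead.
print(row_for_distance(25.0))   # lands mid-image
print(row_for_distance(12.0))   # lands near the bottom of the image
```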