Oculii looks to supercharge radar for autonomy with $55M B round
Autonomous vehicles rely on many sensors to perceive the world around them, and while cameras and lidar get a lot of the attention, good old radar is an important piece of the puzzle - though it has some fundamental limitations. Oculii, which just raised a $55M round, aims to minimize those limitations and make radar more capable with a smart software layer for existing devices - and sell its own as well.
Radar's advantages lie in its superior range, and in the fact that its radio frequency beams can pass through things like raindrops, snow, and fog - making it crucial for perceiving the environment during inclement weather. Lidar and ordinary visible light cameras can be totally flummoxed by these common events, so it's necessary to have a backup.
But radar's major disadvantage is that, due to the wavelengths and how the antennas work, it can't image things in detail the way lidar can. You tend to get very precisely located blobs rather than detailed shapes. It still provides invaluable capabilities in a suite of sensors, but if anyone could add a bit of extra fidelity to its scans, it would be that much better.
That's exactly what Oculii does - take an ordinary radar and supercharge it. The company claims a 100x improvement in spatial resolution, accomplished by handing over control of the system to its software. Co-founder and CEO Steven Hong explained in an email that a standard radar might have, for a 120-degree field of view, a 10-degree spatial resolution, so it can tell where something is with a precision of a few degrees on either side, and little or no ability to tell the object's elevation.
Some are better, some worse, but for the purposes of this example that amounts to an effective 12x1 resolution. Not great!
Handing over control to the Oculii system, however, which intelligently adjusts the transmissions based on what it's already perceiving, could raise that to a 0.5-degree horizontal x 1-degree vertical resolution, giving it an effective resolution of perhaps 240x10. (Again, these numbers are purely for explanatory purposes and aren't inherent to the system.)
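To make that arithmetic concrete, here's a quick back-of-the-envelope sketch using the illustrative figures above - the field-of-view and resolution numbers are the hypothetical example from this article, not Oculii's published specs:

```python
# Back-of-the-envelope resolution-cell math using the article's illustrative numbers.
# These are hypothetical example figures, not published Oculii specifications.

def resolution_cells(fov_deg: float, resolution_deg: float) -> int:
    """Number of distinguishable angular bins across a field of view."""
    return int(fov_deg / resolution_deg)

# Conventional radar: 120-degree horizontal FOV at 10-degree resolution, essentially no elevation data.
baseline = resolution_cells(120, 10) * 1                          # 12 x 1 = 12 cells

# Software-enhanced radar: 0.5-degree horizontal x 1-degree vertical (assuming ~10 degrees of vertical FOV).
enhanced = resolution_cells(120, 0.5) * resolution_cells(10, 1)   # 240 x 10 = 2,400 cells

print(f"Baseline cells: {baseline}")                  # 12
print(f"Enhanced cells: {enhanced}")                  # 2400
print(f"Improvement:    {enhanced / baseline:.0f}x")  # ~200x, the same order as the claimed 100x
```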
That's a huge improvement and results in the ability to see that something is, for example, two objects near each other and not one large one, or that an object is smaller than another near it, or - with additional computation - that it is moving one way or the other at such and such a speed relative to the radar unit.
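For context on that last point, radar infers relative speed from the Doppler shift of the returned signal. Here's a minimal sketch of the standard two-way Doppler relationship - this is generic radar math, not anything specific to Oculii, and the 77 GHz carrier and 10 kHz shift are made-up illustrative values:

```python
# Standard two-way Doppler relationship for radar: v_r = c * Δf / (2 * f_c).
# Generic radar math, not Oculii-specific; the carrier and shift values are illustrative only.

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Relative (radial) speed of a target from the measured Doppler shift."""
    return C * doppler_shift_hz / (2 * carrier_hz)

# Example: a 77 GHz automotive radar observing a 10 kHz Doppler shift.
v = radial_velocity(10_000, 77e9)
print(f"{v:.2f} m/s toward the radar")  # ≈ 19.47 m/s, roughly 70 km/h
```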
Here's a video demonstration of one of their own devices, showing considerably more detail than one would expect:
Exactly how this is done is part of Oculii's proprietary magic, and Hong did not elaborate much on the system's inner workings. "Oculii's sensor uses AI to adaptively generate an 'intelligent' waveform that adapts to the environment and embed information across time that can be leveraged to improve the resolution significantly," he said. (Integrating information over time is what gives it the "4D" moniker, by the way.)
Here's a little sizzle reel that gives a very general idea:
Autonomous vehicle manufacturers have not yet hit on any canonical set of sensors that AVs should have, but something like Oculii could give radar a more prominent place - its limitations sometimes mean it is relegated to emergency braking detection at the front or some such situation. With more detail and more data, radar could play a larger role in AV decision-making systems.
The company is definitely making deals - it's working with Tier-1s and OEMs, one of which (Hella) is an investor, which gives a sense of confidence in Oculii's approach. It's also working with radar makers and has some commercial contracts looking at a 2024-2025 timeline.
Image Credits: Oculii
It's also getting into making its own all-in-one radar units, doing the hardware-software synergy thing. It claims these are the world's highest-resolution radars, and I don't see any competitors out there contradicting this - the simple fact is radars don't compete much on "resolution," but more on the precision of their rangefinding and speed detection.
One exception might be Echodyne, which uses a metamaterial radar surface to direct a customizable radar beam anywhere in its field of view, examining objects in detail or scanning the whole area quickly. But even then its "resolution" isn't so easy to estimate.
At any rate, the company's new Eagle and Falcon radars might be tempting to manufacturers working on putting together cutting-edge sensing suites for their autonomous experiments or production driver-assist systems.
It's clear that with radar tipped as a major component of autonomous vehicles, robots, aircraft and other devices, it's worth investing seriously in the space. The $55M B round certainly demonstrates that well enough. It was, as Oculii's press release lists it, co-led by Catapult Ventures and Conductive Ventures, with participation from "Taiwania Capital, Susquehanna Investment Group (SIG), HELLA Ventures, PHI-Zoyi Capital, R7 Partners, VectoIQ, ACVC Partners, Mesh Ventures, Schox Ventures, and Signature Bank."
The money will allow for the expected scaling and hiring, and as Hong added, "continued investment of the technology to deliver higher resolution, longer range, more compact and cheaper sensors that will accelerate an autonomous future."