
Atomathic Launches AISIR For Radar — Physical AI Reasoning Technology for Safety-Critical Perception

By Ethan Lin

Dec 16, 2025

Atomathic, formerly Neural Propulsion Systems and a pioneer in physical AI-sensing technology, today announced AISIR for Radar™ (AI Signal Intelligence Reasoning), a physics-constrained generative reasoning engine designed to deliver stable, reliable radar perception in cluttered, High Dynamic Range (HDR) environments. These are the critical edge cases where conventional radar stacks degrade into phantom objects, target flicker, and poor object separation—rendering them unreliable for safety-critical decisions.

To accompany the launch, the company also published the new white paper “Physical AI Reasoning for Stable Radar Perception: Closing the Reliability Gap,” which details why conventional radar processing hits a stability ceiling—and how Atomathic’s dual-system architecture (Fast Response + Reasoned Response) resolves the underlying ambiguity of sparse-aperture sensing. 

●     WHITE PAPER: Download

●     WATCH: View the HDR stress test video

“The industry widely understands that ADAS and Autonomous Driving (AD) perception stacks have struggled to integrate radar effectively due to instability in cluttered scenes. With the introduction of AISIR, we are removing a long-standing roadblock to reliability,” said Dr. Behrooz Rezvani, Founder and CEO of Atomathic. “In our new white paper, we demonstrate why radar has historically failed to match its theoretical potential and how a physics-grounded approach can finally solve this reliability problem. By unifying sparse reconstruction with generative physics reasoning, we have created a stable perception stack capable of supporting ADAS and true autonomy.”

Recent research from NVIDIA and Waymo underscores the industry’s shift toward reasoning-centric, physics-inspired perception platforms.

“Software-defined radar has the potential to tackle one of the most persistent challenges in automated-driving development: achieving reliable, physics-grounded perception without increasing the cost or complexity of the sensor stack,” said Sam Abuelsamid, VP of Market Research at Telemetry. “Recent demonstrations show that radar, when processed with advanced reasoning layers, appears to match LiDAR performance in many safety-critical scenarios, including detecting pedestrians near large vehicles. If these early results hold up in production environments, this class of technology could influence how OEMs think about future ADAS and autonomous system designs.”

Why Radar’s Reliability Gap Persists

Radar remains essential for L2–L4 autonomy because it is the only sensor capable of operating in fog, rain, spray, glare, and low light—conditions where cameras and LiDAR inherently struggle. Yet in real-world driving, production radar systems often fail in HDR, clutter-rich, multipath environments, particularly when detecting Vulnerable Road Users (VRUs) near a truck, school bus, or roadside infrastructure. In these scenes, sidelobes, inconsistent millimeter-wave reflections, and sparse-aperture ambiguity create unstable detections and intermittent tracking.

The industry as a whole has struggled to deploy radar as a critical safety layer, leading to significant setbacks—most notably Tesla’s withdrawal of radar from its Full Self-Driving (FSD) stack, trading potential safety redundancy for driving consistency.

Atomathic’s approach reframes radar from a traditional “filtering and thresholding” pipeline into an inverse problem that requires rigorous reconstruction and physics-grounded inference, as described in detail in the white paper.
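As a rough illustration of that inverse-problem framing, the minimal Python sketch below poses single-snapshot angle estimation as an underdetermined linear system and recovers a sparse set of reflectors with orthogonal matching pursuit. The uniform linear array model, angle grid, scene, and solver are hypothetical simplifications chosen for exposition; they are not Atomathic’s AIDAR or AISIR implementation.

import numpy as np

# Hypothetical toy setup: a uniform linear array with M antennas observes one
# snapshot y = A @ x + noise, where the columns of A are steering vectors on a
# fine angle grid (N >> M) and x is sparse (only a few true reflectors).
M = 8                       # physical antennas (sparse aperture)
N = 181                     # angle grid: -90..+90 degrees in 1-degree steps
angles = np.deg2rad(np.linspace(-90, 90, N))
d = 0.5                     # element spacing in wavelengths

m = np.arange(M)[:, None]
A = np.exp(2j * np.pi * d * m * np.sin(angles)[None, :])   # M x N dictionary

# Synthetic scene: one strong scatterer ("truck") and one weak scatterer
# ("pedestrian") a few degrees apart, plus measurement noise.
x_true = np.zeros(N, dtype=complex)
x_true[100] = 10.0
x_true[104] = 0.3
rng = np.random.default_rng(0)
y = A @ x_true + 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily select the atoms that best explain
    the residual, refitting amplitudes by least squares on the chosen support."""
    residual, support = y.copy(), []
    for _ in range(k):
        corr = np.abs(A.conj().T @ residual)
        corr[support] = 0.0
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return support, coef

support, coef = omp(A, y, k=2)
for idx, c in zip(support, coef):
    print(f"estimated reflector near {np.degrees(angles[idx]):+.0f} deg, amplitude {abs(c):.2f}")

The point of the sketch is the problem structure rather than the specific solver: with far more candidate reflector positions than antennas, the measurement alone is ambiguous, and some reconstruction prior is needed to resolve the scene.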

Atomathic Dual-System Architecture: Fast Response + Reasoned Response

AISIR for Radar works in tandem with AIDAR™ (AI Detection and Ranging) to form a cognitive, dual-system perception stack:

●     AIDAR (Fast Response / System 1): Performs rapid, per-frame sparse reconstruction—decomposing raw radar measurements into a compact set of physically meaningful “atoms” for hyper-resolution separation in clutter.

●     AISIR for Radar (Reasoned Response / System 2): Applies physics-constrained generative inference over time. It tests competing hypotheses using wave-consistent signal prediction, rejects physically inconsistent returns (ghosts), and stabilizes perception using adaptive compute.
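The sketch below shows, in simplified Python, how such a two-stage loop could be organized in principle: a fast per-frame stage that emits candidate “atoms,” and a slower reasoning stage that promotes only hypotheses that remain physically consistent across frames while letting unsupported returns decay. All class names, thresholds, and data structures here are illustrative assumptions, not Atomathic’s AIDAR or AISIR code.

from dataclasses import dataclass, field

@dataclass
class Atom:
    """One detection: position (m), radial velocity (m/s), amplitude."""
    x: float
    y: float
    v_r: float
    amp: float

@dataclass
class Track:
    atoms: list = field(default_factory=list)   # history of associated atoms
    confidence: float = 0.0

def fast_response(frame):
    """System 1 stand-in: per-frame sparse decomposition into atoms. Here the
    frame is assumed to already carry candidate (x, y, v_r, amp) tuples."""
    return [Atom(*c) for c in frame["candidates"]]

def reasoned_response(tracks, atoms, dt, gate=2.0):
    """System 2 stand-in: associate an atom with a hypothesis only when its
    range agrees with the range predicted from the previous Doppler velocity;
    hypotheses that stop receiving consistent support decay and are dropped."""
    for atom in atoms:
        matched = None
        for track in tracks:
            last = track.atoms[-1]
            predicted = (last.x**2 + last.y**2) ** 0.5 + last.v_r * dt
            observed = (atom.x**2 + atom.y**2) ** 0.5
            if abs(observed - predicted) < gate:
                matched = track
                break
        if matched is None:
            matched = Track()
            tracks.append(matched)
        matched.atoms.append(atom)
        matched.confidence = min(1.0, matched.confidence + 0.2)
    for track in tracks:                      # ghost suppression by decay
        if track.atoms[-1] not in atoms:
            track.confidence = max(0.0, track.confidence - 0.3)
    return [t for t in tracks if t.confidence > 0.0]

def perceive(frames, dt=0.05):
    tracks = []
    for frame in frames:
        atoms = fast_response(frame)                      # System 1
        tracks = reasoned_response(tracks, atoms, dt)     # System 2
        yield [t for t in tracks if t.confidence > 0.5]   # stable objects only

# Three synthetic frames: a consistent pedestrian-like return plus a one-off ghost.
frames = [
    {"candidates": [(2.0, 10.00, -1.4, 0.3), (2.5, 14.00, -8.0, 0.1)]},
    {"candidates": [(2.0, 9.93, -1.4, 0.3)]},
    {"candidates": [(2.0, 9.86, -1.4, 0.3)]},
]
for stable in perceive(frames):
    print([(t.atoms[-1].x, t.atoms[-1].y) for t in stable])
# Prints [] for the first two frames, then the persistent return once it has
# accumulated enough consistent evidence; the ghost never becomes reportable.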

Proof Point: HDR “Pedestrian Next to Truck” Stress Test

The white paper includes a canonical HDR clutter stress test: a pedestrian walking beside a large metallic truck—an environment where traditional radar processing will lose or intermittently suppress the pedestrian due to sidelobes and masking.

In the demonstration, Atomathic’s AISIR isolates and locates the pedestrian very close to the truck and then stabilizes the track over time using hierarchical reasoning, while reducing flicker and rejecting interference that does not remain physically self-consistent across the sequence.
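One concrete example of the kind of physical self-consistency such reasoning can demand is agreement between a return’s measured Doppler velocity and the rate of change of its measured range over a short window; sidelobe and multipath artifacts frequently violate that relationship. The toy Python check below illustrates the idea with synthetic values; the threshold and the numbers are assumptions made for exposition, not Atomathic’s actual test.

import numpy as np

def is_physically_consistent(ranges, dopplers, dt, tol=0.5):
    """ranges: measured range per frame (m); dopplers: measured radial velocity
    per frame (m/s, positive = receding); dt: frame period (s). Returns True
    when the numerical range rate matches the Doppler measurements within tol."""
    range_rate = np.diff(ranges) / dt
    doppler_mid = 0.5 * (dopplers[1:] + dopplers[:-1])
    return bool(np.all(np.abs(range_rate - doppler_mid) < tol))

dt = 0.05
t = np.arange(10) * dt
# Pedestrian approaching at ~1.4 m/s: range shrinks and the Doppler agrees.
ped_ranges = 12.0 - 1.4 * t
ped_dopplers = np.full_like(t, -1.4)
# Sidelobe ghost near the truck: range barely moves, Doppler copied from the truck.
ghost_ranges = 11.8 + 0.02 * np.sin(10 * t)
ghost_dopplers = np.full_like(t, -8.0)

print(is_physically_consistent(ped_ranges, ped_dopplers, dt))      # True
print(is_physically_consistent(ghost_ranges, ghost_dopplers, dt))  # False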

Key Findings in the White Paper

●     Solving the “Sparse Aperture” Deficit: How structured sparse reconstruction addresses the ill-posed reality that reflections often outnumber antennas in cluttered scenes.

●     Dual-System Stability: How fast per-frame reconstruction (AIDAR) paired with physics-based reasoning (AISIR) suppresses ghosts and stabilizes tracks.

●     HDR Clutter Resolution: Evidence of robust Vulnerable Road User (VRU) detection in sidelobe-heavy scenes.

 ###

All product and company names may be trademarks or registered trademarks of their respective holders. Use of them does not imply any affiliation with or endorsement by them.

Ethan Lin

One of the founding members of DMR, Ethan expertly juggles his dual roles as the chief editor and the tech guru. Since the inception of the site, he has been the driving force behind its technological advancement while ensuring editorial excellence. When he finally steps away from his trusty laptop, he spends his time on the badminton court polishing his not-so-impressive shuttlecock game.
