By: Dawn Zoldi
At CES 2026, Ambarella used an off‑site invitation-only exhibition to make a simple point about electric vehicle autonomy: the car is the shell, but the real value lies in the “guts and brains” of the compute and sensing stack. By tightly coupling a purpose‑built artificial intelligence system‑on‑a‑chip (AI SoC), a full autonomous driving (AD) software stack and adaptive 4D imaging radar, the company has positioned itself as a foundational supplier for electric vehicle (EV) makers that need range‑friendly autonomy, not gaming‑class power hogs.
Ambarella in the EV Autonomy Race
Ambarella started in video compression silicon in 2004 and evolved into a fabless edge AI semiconductor company focused on perception, fusion and planning at the edge. Its low‑power SoCs now sit inside advanced driver assistance systems (ADAS) cameras, electronic mirrors, telematics units, driver monitoring systems, fully autonomous driving stacks and a wide range of robotics and drone platforms. That evolution accelerated with the acquisition of VisLab in 2015, when the company brought in a team with decades of autonomous driving research and a mature AD stack that has since been optimized to run directly on Ambarella’s CV3‑AD AI domain controller family.
Rather than chasing broad, general‑purpose compute, Ambarella designs its SoCs “from the algorithm up.” It starts with the workloads needed for perception and planning and then builds custom accelerators around them. The result is a chip that cannot mine bitcoin or run high‑budget, triple‑A video games, but can deliver multiple times the performance, memory efficiency and power efficiency of a competing embedded graphics processing unit (GPU) on AD workloads, all within a tight power envelope that matters for EV range and thermal design.
Guts, Brains and the Full AD Stack

In the Ambarella CES exhibition’s automotive section, Pier Paolo Porta, Ambarella’s AD Stack Product Marketing Director and co‑founder of VisLab, walked through how the company’s stack has grown from simple “viewing” to full autonomy “guts and brains.” Ambarella’s chips began as video encoders, then added camera‑based sensing so the system could not only capture images but also understand what was in them. Now, those same SoCs perform planning as central domain controllers for autonomous driving.
Porta framed the car itself as “just half of the game,” with the other half being the invisible pipeline of data that turns recordings into 3D labeled data crucial for the training of the AD stack. In Ambarella’s architecture, the CV3‑AD SoC (CV as in computer vision) runs the complete AD stack, including perception, prediction and planning, on its dedicated CVflow AI engines, leaving most of the on‑chip Arm (Advanced RISC Machine) CPU resources free for OEM‑specific software and functional safety tasks.
The stack ingests data from cameras and Oculii 4D imaging radar and fuses them on a single CV3‑AD SoC to build a high‑fidelity model of the environment. Unlike approaches that rely on pre‑generated high-definition (HD) maps, Ambarella’s software uses standard‑definition (SD) maps and generates HD‑like detail on the fly from live sensor data. This reduces dependence on costly map maintenance while still supporting higher levels of autonomy.
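The fusion step described above can be sketched in simplified form. The Python below is purely illustrative: the class names, fields and nearest-neighbor association logic are assumptions for clarity, not Ambarella's actual stack. It pairs camera detections with radar returns by distance and merges them into a single object list, which is the basic idea behind combining a camera's class labels with a radar's velocity measurements.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class CameraDetection:   # camera detection lifted to the ground plane (illustrative)
    x: float             # meters, vehicle frame
    y: float
    label: str           # e.g. "car", "pedestrian"

@dataclass
class RadarReturn:       # 4D imaging radar point (illustrative)
    x: float
    y: float
    velocity: float      # radial velocity, m/s

def fuse(cameras, radars, max_dist=2.0):
    """Greedy nearest-neighbor association of camera and radar detections.

    Fused objects carry the camera's class label plus the matched radar
    return's velocity; unmatched camera detections keep velocity=None.
    """
    fused, used = [], set()
    for cam in cameras:
        best, best_d = None, max_dist
        for i, r in enumerate(radars):
            d = hypot(cam.x - r.x, cam.y - r.y)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            fused.append({"label": cam.label, "x": cam.x, "y": cam.y,
                          "velocity": radars[best].velocity})
        else:
            fused.append({"label": cam.label, "x": cam.x, "y": cam.y,
                          "velocity": None})
    return fused
```

A real stack would use tracked objects, uncertainty-aware association and temporal filtering, but the sketch shows why a single SoC holding both sensor streams simplifies the problem: association happens in one place, against one environmental model.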
Purpose‑Built GPU Alternative for EVs

A central theme of the Ambarella CES exhibition was efficiency. This is especially important for EVs, where every watt used for compute is a watt not available for range. In a side‑by‑side demo against a well‑known embedded GPU, Porta showed that Ambarella’s N1‑655‑class silicon delivers close to three times the AI performance, memory efficiency and power efficiency on the same workload. That headroom can mean the difference between requiring an expensive liquid‑cooled electronic control unit (ECU) and using a simpler air‑cooled, easily swappable one. This has obvious implications for both up‑front cost and maintenance.
Porta described three guiding principles for Ambarella’s SoCs: efficiency, flexibility and reusability. Efficiency is measured not only in tera‑operations per second, but in frames per second per watt and in how much data must be shuffled between the SoC and memory, areas where a carefully tuned accelerator can avoid the bandwidth bottlenecks that often limit GPU‑based systems. Flexibility and reusability come from a modular AD stack and scalable SoC family, allowing OEMs to select the right CV3‑AD variant and use all or part of the software stack alongside their own code, from L2+ driver assistance up through higher‑level autonomy.
For EV builders, this matters in three ways. First, lower power consumption helps preserve battery range while still running rich sensor suites. Second, simpler thermal management reduces engineering complexity and bill of materials. Third, a common hardware‑software platform across a fleet allows automakers to differentiate in software without redesigning compute for each vehicle class.
Radar: The Other Half of Perception

If the AI SoC provides the “brains,” Ambarella’s Oculii 4D imaging radar provides much of the “guts” that let EVs see through darkness, rain and fog. Ted Chua, Ambarella’s Director of Product Marketing and Radar Technology, focused on this radar layer. It uses AI‑driven waveform adaptation to dramatically increase angular resolution and sensitivity without requiring massive antenna arrays. By shaping the radar signal to the environment, the system can resolve more points per object. This improves both object detection and classification at long range.
Independent tests by a global automotive OEM showed Ambarella’s radar detecting small, low‑profile objects, including a stuffed toy dog, beyond 100 meters in both day and night scenarios. The same evaluations found that the system suppressed false positives better than other 3D and 4D radar systems by limiting the “ghosts and noise” that can cause nuisance braking or unsafe automatic emergency braking (AEB) events at highway speeds. That selectivity is crucial for EV‑focused autonomy stacks that must balance assertive driving with passenger comfort and safety, especially in dense traffic.
Ambarella’s centrally processed radar architecture routes raw radar data from multiple Oculii radar heads back to a CV3‑class SoC for centralized radar signal processing, where it is fused with camera feeds for a unified environmental model. This architecture gives OEMs the flexibility to scale advanced perception across an entire range of makes and models for higher automated driving levels, without changing the underlying software or compute platform.
Data Pipeline: Recording, Labeling and Selection
Behind the demos and ride‑alongs is a less glamorous but critical piece of Ambarella’s story: the data engine that trains and validates its EV autonomy stack. Porta described data as “the other 50%” of autonomy, with a closed loop of recording, labeling and selection needed to build robust neural networks. Ambarella provides a reference data‑recording design that can instrument a car in about five days. It combines multiple cameras with lidar for ground truth so development teams can rapidly collect synchronized sensor logs in real traffic.
Those logs feed an automated 3D annotation pipeline that uses lidar and camera fusion to generate 3D bounding boxes and trajectories for all road participants before projecting them back into the images. Compared with manual 2D labeling, which costs cents per frame, full 3D annotation can reach dollars per frame if done by hand. Automation is the only way to scale to the millions of frames needed for robust EV autonomy. Ambarella then analyzes dataset statistics (think: road types, weather conditions, object classes) to identify gaps, and uses that feedback to automatically trigger targeted data collection, for example in foggy conditions or complex urban merges, rather than brute‑forcing every possible scenario.
This disciplined approach means that even if two teams have similar network architectures, the team with the better, more balanced dataset wins on real‑world performance. For automakers, tapping into a partner that brings both the compute platform and a mature data pipeline can shorten development cycles and reduce the risk of edge‑case failure in EV deployments.
AV Trucking Puts Ambarella’s Stack to Work
Ambarella showcased a real‑world use case with Kodiak Robotics to illustrate what its “guts and brains” approach looks like when it leaves the demo track and hits real freight lanes. Jamie Half‑Acre, Kodiak’s VP of hardware, has been working with Ambarella silicon for nearly two decades, dating back to an Ambarella A6‑based encoder used by NBC during the 2008 Summer Olympics. He now uses Ambarella for the company’s Class 8 autonomous big‑rig platforms.
Kodiak operates a single integrated Level 4 software stack across long‑haul trucking, industrial trucking and defense, including a high‑profile deployment with Atlas Energy in West Texas. There, the company has been contracted to deliver 100 AI‑driven trucks that are already running fully autonomously in dust, rain and other brutal conditions.
To keep those trucks moving every day of the year, Kodiak needed what Half‑Acre calls the best camera SoC on the market, with robustness across a wide range of lighting and weather scenarios. Each of the AI‑driven West Texas trucks now runs four Ambarella CV2‑class devices, which provide best‑in‑class vision performance in low light and high dynamic range scenes, conditions that can overwhelm less capable sensors when headlights, work lights and harsh sun reflections collide.
Building on that success, Kodiak is now working with Ambarella’s CV3 platform, using its added compute not only for improved camera processing but also for on‑board radar and lidar processing, as well as time‑critical neural networks at the edge.
By moving core sensor processing out toward the sensor, close to the cameras and other front‑end hardware, Kodiak can avoid running individual high‑bandwidth cables all the way back to a central compute node. This reduces system complexity and overall power draw. That partitioning dovetails with Ambarella’s own philosophy of efficient, domain‑specific compute and gives Kodiak a way to scale its Level 4 “AI driver” without building a power‑hungry data center into the cab.
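The cabling argument can be illustrated with rough arithmetic. The figures below are generic assumptions (a 4K, 30 fps, 12-bit raw camera stream), not Kodiak's actual sensor configuration; they show why processing near the sensor changes what the wiring harness has to carry.

```python
def raw_stream_mbytes_per_s(width, height, fps, bits_per_pixel):
    """Uncompressed sensor bandwidth in megabytes per second."""
    return width * height * fps * bits_per_pixel / 8 / 1e6

# One hypothetical 4K camera at 30 fps with 12-bit raw output:
per_camera = raw_stream_mbytes_per_s(3840, 2160, 30, 12)   # ~373 MB/s

# Shipping compact object lists or compressed features instead of raw
# pixels is orders of magnitude smaller, so edge processing trades many
# thick per-camera links for a lighter aggregated connection to the
# central compute node.
```

Multiply that per-camera figure by a truck-scale sensor suite and the appeal of processing at the edge, both for cable count and for the power spent moving bits, becomes clear.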
Half‑Acre described Ambarella as a partner that delivers practical solutions and strong support, a combination that helps Kodiak keep its fleet on the cutting edge while still meeting the uptime demands of real‑world freight customers.
CES Exhibition and Developer Zone: Plugging in the Engineers

Ambarella’s CES week included a cross‑domain “mobile technology” showcase covering automotive, IP cameras, robotics, and consumer devices. The idea was to show that the same core SoC capabilities (high‑quality video processing, efficient neural inference and radar fusion) recur across markets, with automotive demos of domain controllers and AD stacks sitting alongside robotics and IoT applications.
Complementing the physical demonstrations, Ambarella announced its new online Developer Zone (DevZone) at CES, hosted at developer.ambarella.com. The DevZone aggregates optimized models, documentation, agent‑based workflows and sample code into a single portal aimed at developers, system integrators, ISVs, module makers and OEMs building on Ambarella’s edge AI SoCs. For EV autonomy developers, it offers a central place to access sample applications, explore the CV3‑AD stack, and work with toolchains for data collection, simulation and annotation that mirror Ambarella’s internal flows.
The company noted that more than 39 million edge AI SoCs have already shipped into endpoints such as smart cameras and ADAS systems, giving it a substantial installed base as it moves deeper into edge AI infrastructure and central-domain automotive compute. By tying that hardware footprint to an open, partner‑friendly DevZone and a CES Tech Zone that makes the “guts and brains” story tangible, Ambarella has signaled that EV autonomy is not a moonshot, but an extension of technologies already working at scale today.
The rides, briefings and tours all reinforced Ambarella’s proposition at CES 2026: it sells the compute “guts and brains,” not just chips. With an algorithm‑first SoC portfolio, a tightly integrated AD and radar stack, and a maturing data and developer ecosystem, Ambarella is giving EV makers a practical path to autonomy that protects range, simplifies thermal design and scales from today’s ADAS to tomorrow’s driverless fleets.