For the past two decades, Raquel Urtasun, founder and CEO of autonomous trucking startup Waabi, has been developing artificial intelligence systems that can reason like a human would.
The AI pioneer previously served as chief scientist at Uber ATG before launching Waabi in 2021 with an “AI-first approach” to accelerating the commercial development of autonomous vehicles, starting with long-haul trucks.
“If you can build systems that can actually do that, then suddenly you need a lot less data,” Urtasun told TechCrunch. “You need a lot less compute. If you’re able to do the reasoning in an efficient way, you don’t need to have fleets of vehicles deployed all over the world.”
Creating an AI-powered AV stack that perceives the world the way a human would and reacts in real time is what Tesla is trying to do with its vision-based approach to self-driving. The difference, aside from Waabi’s use of lidar sensors, is that Tesla’s Full Self-Driving system relies on “imitation learning” to learn how to drive. That requires Tesla to collect and analyze millions of videos of real driving situations to train its AI model.
Waabi Driver, on the other hand, has done most of its training, testing and validation in a closed-loop simulator called Waabi World, which automatically builds digital twins of the world from data, performs real-time sensor simulation, generates scenarios to stress-test the Waabi Driver, and teaches the Driver to learn from its mistakes without human intervention.
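To make the closed-loop idea concrete, here is a minimal, purely illustrative Python sketch. Everything in it is invented for illustration (the one-dimensional “road,” the ToyScenario and ToyDriver names, the single learnable parameter); it is not Waabi’s code or architecture, only the general pattern of simulating a scenario, feeding simulated sensor readings to a driving policy, and letting that policy update itself from its own failures without a human in the loop.

```python
# Illustrative sketch only: a toy closed-loop simulation, not Waabi World.
import random

class ToyScenario:
    """Stand-in for a stress-test scenario: an obstacle somewhere ahead on a 1-D road."""
    def __init__(self, obstacle_pos):
        self.obstacle_pos = obstacle_pos
        self.ego_pos = 0.0
        self.crashed = False

    def sensor_reading(self):
        # Stand-in for real-time sensor simulation: a noisy distance to the obstacle.
        return self.obstacle_pos - self.ego_pos + random.gauss(0, 0.1)

    def step(self, speed):
        # The simulated world reacts to the driver's decision.
        self.ego_pos += speed
        self.crashed = self.ego_pos >= self.obstacle_pos
        return self.crashed or self.ego_pos >= self.obstacle_pos - 0.5

class ToyDriver:
    """A 'driver' with one learnable parameter: how early to slow down."""
    def __init__(self):
        self.braking_distance = 0.5

    def act(self, distance):
        return 0.1 if distance < self.braking_distance else 1.0

    def learn_from(self, crashed):
        # Learn from mistakes without human intervention: brake earlier after a crash.
        if crashed:
            self.braking_distance += 0.5

driver = ToyDriver()
for episode in range(20):
    # Each episode is a freshly generated scenario (here, random; in the article's
    # description, digital twins built from real-world data).
    scenario = ToyScenario(obstacle_pos=random.uniform(3.0, 10.0))
    done = False
    while not done:
        done = scenario.step(driver.act(scenario.sensor_reading()))
    driver.learn_from(scenario.crashed)

print(f"Learned braking distance: {driver.braking_distance:.1f}")
```

In Waabi World, the scenarios are full digital twins with real-time sensor simulation rather than a single noisy distance reading, but the loop of simulate, drive, fail, learn and repeat is the same in spirit.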
In just a few years, this simulator has helped Waabi launch commercial pilots (with a human driver in the front seat) in Texas, much of that work happening through a partnership with Uber Freight. Waabi World is also what the startup is counting on to reach its planned fully driverless commercial launch in 2025.
But Waabi’s long-term mission is much grander than just trucks.
“This technology is extremely, extremely powerful,” said Urtasun, who spoke to TechCrunch via video interview, a whiteboard full of hieroglyph-like formulas behind her. “It has this amazing ability to generalize, it’s very flexible and it develops very quickly. And it’s something that we can expand to do a lot more than trucking in the future… That could be robotaxis. This could be humanoids or warehouse robotics. This technology can solve any of these use cases.”
The promise of Waabi’s technology — which will first be used to scale autonomous trucking — allowed the startup to close a $200 million Series B round led by existing investors Uber and Khosla Ventures. Strategic investors including Nvidia, Volvo Group Venture Capital, Porsche Automobil Holding SE, Scania Invest and Ingka Investments also participated. The round brings Waabi’s total funding to $283.5 million.
The size of the round, and the strength of its participants, is particularly remarkable given the hits the AV industry has taken in recent years. In the trucking space alone, Embark Trucks shut down, Waymo decided to end its autonomous freight operations, and TuSimple wound down its U.S. operations. Meanwhile, in the robotaxi space, Argo AI shut down, Cruise lost its permits in California after a major safety incident, Motional cut nearly half its workforce, and regulators are actively investigating Waymo and Zoox.
“You build the strongest companies when you pool resources in times that are really difficult, and the AV industry in particular has seen a lot of failures,” Urtasun said.
That said, AI-focused players in this second wave of autonomous vehicle startups have secured impressive capital raises this year. UK-based Wayve is also developing a self-learning rather than rules-based system for autonomous driving, and in May closed a $1.05 billion Series C led by SoftBank Group. And Applied Intuition in March raised a $250 million round at a $6 billion valuation to bring artificial intelligence to automotive, defense, manufacturing and agriculture.
“Within AV 1.0, it’s very clear today that it’s very capital intensive and very slow to make progress,” Urtasun said, noting that the robotics and autonomous driving industry has been held back by complex and fragile AI systems. “And investors, I would say, are not very excited about that approach.”
What investors are excited about today, however, is the promise of generative artificial intelligence, a term that wasn’t exactly in vogue when Waabi launched but that nonetheless describes the system Urtasun and her team have created. Urtasun says Waabi’s is a next-generation genAI system that can be deployed in the physical world. And unlike today’s popular language-based genAI models, such as OpenAI’s ChatGPT, Waabi has figured out how to build its system without relying on enormous datasets, large language models and all the computing power that comes with them.
The Waabi Driver, Urtasun says, has a remarkable ability to generalize. So instead of trying to train the system on every possible data point that has ever existed or could ever exist, it can learn from a few examples and handle the unknown in a safe way.
“That was the plan. We’ve built these systems that can perceive the world, create abstractions of the world, and then take those abstractions and think, ‘What might happen if I do this?’” Urtasun said.
This more human-like, reasoning-based approach is much more scalable and capital-efficient, Urtasun says. It is also vital for validating safety-critical systems that have to run at the edge: you don’t want a system that takes a few seconds to react, or you’ll crash the vehicle, she said. Waabi has announced a collaboration to bring Nvidia’s Drive Thor to its autonomous trucks, which will give the startup access to automotive-grade compute at scale.
On the road, the Waabi Driver understands that there is something solid in front of it and that it must drive carefully. It may not know exactly what the object is, but it knows to avoid it. Urtasun also said the Driver can anticipate how other road users will behave without having to be trained on each specific situation.
“It understands things without us telling the system the meaning of objects, how they move in the world, that different things move differently, that there is occlusion, that there is uncertainty, how to behave when it rains a lot,” Urtasun said. “All of these things it learns automatically. And because it’s being exposed to driving scenarios right now, it’s learning all of these capabilities.”
Waabi’s foundational genAI model also doesn’t fall victim to the black-box or hallucination problems prevalent in LLM-based genAI models today, Urtasun says. That’s because the model running the Waabi Driver is an interpretable, end-to-end trained system that can be validated and verified, and whose decisions can be traced.
This capability in a streamlined, unified architecture means it can be applied to other autonomy use cases, Urtasun says.
“If you expose it to interactions in a warehouse, picking things up and dropping them off, it can learn that, no problem,” she said. “You can expose it to multiple use cases and it can learn to do all of these skills together. There’s no limit to what it can do.”
This article has been updated with information from Waabi about how the AI model avoids hallucinations.