Last year, Hugging Face, the AI dev platform, launched LeRobot, a collection of open AI models, datasets, and tools for building real-world robotics systems. On Tuesday, Hugging Face teamed up with AI startup Yaak to extend LeRobot with a training set for robots and cars that can autonomously navigate environments such as city streets.
The new set, called Learning to Drive (L2D), is over a petabyte in size and contains data from sensors installed on cars in German driving schools. L2D captures camera, GPS, and vehicle dynamics data from driving instructors and students navigating roads with construction zones, intersections, highways, and more.
There are a number of self-driving training datasets out there from companies such as Alphabet's Waymo and Comma AI. But many of them focus on perception tasks such as object detection and tracking, which require high-quality annotations, according to L2D's creators, making them difficult to scale.
In contrast, L2D is designed to support the development of "end-to-end" learning, its creators claim, which predicts actions (e.g., when a pedestrian might cross the road) directly from sensor inputs (e.g., camera, GPS, and vehicle dynamics data).
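To make the distinction concrete: an end-to-end system learns one map from raw sensor inputs straight to control actions, rather than chaining hand-built detection and tracking modules. A minimal, purely illustrative sketch of this idea (behavior cloning on synthetic data — the episode generator, feature sizes, and linear model here are all toy assumptions, not anything from L2D or LeRobot):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_episode(n_steps=200):
    """Toy sensor stream: stand-in camera features + GPS + speed,
    paired with 'expert' driving actions (steering, throttle)."""
    camera = rng.normal(size=(n_steps, 16))  # placeholder for image features
    gps = rng.normal(size=(n_steps, 2))
    speed = rng.normal(size=(n_steps, 1))
    x = np.hstack([camera, gps, speed])
    # Hypothetical expert actions: a hidden linear rule plus small noise
    w_true = rng.normal(size=(x.shape[1], 2))
    y = x @ w_true + 0.01 * rng.normal(size=(n_steps, 2))
    return x, y

x, y = make_episode()

# "End-to-end" here means one learned function from sensors to actions;
# a linear least-squares fit stands in for the neural policy a real
# system would train on episodes like L2D's.
w, *_ = np.linalg.lstsq(x, y, rcond=None)
pred = x @ w
mse = float(np.mean((pred - y) ** 2))
print(f"imitation MSE on toy episode: {mse:.5f}")
```

Real end-to-end driving models replace the linear map with a deep network and the synthetic episodes with recorded sensor/action pairs, but the training loop has the same shape: imitate the expert's actions directly from the inputs.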
"The AI community can now build self-driving models," wrote Yaak co-founder Harsimrat Sandhawalia and Remi Cadene, a member of Hugging Face's robotics AI team, in a blog post. "L2D aims to be the largest open-source self-driving dataset, empowering the AI community with unique and diverse 'episodes' for end-to-end training."
Hugging Face and Yaak plan to conduct a first real-world "closed-loop" test of models trained using L2D and LeRobot this summer, deployed in a car with a safety driver. The companies are calling on the AI community to submit models, along with tasks they'd like those models evaluated on, such as navigating roundabouts and parking lots.