Latest news: Helm.ai
Yahoo · 8 hours ago · Automotive
Helm.ai advances self-driving tech with new vision system
Helm.ai, a startup backed by Honda, has unveiled a new camera-based system named Helm.ai Vision, designed to interpret complex urban environments for self-driving cars. The collaboration with Honda will see its system featured in the 2026 Honda Zero series of electric vehicles, promising a hands-free driving experience in which drivers can safely divert their attention from the road. The company said it is also negotiating with various automakers to integrate its technology into mass-market vehicles. CEO and founder Vladislav Voroninski was quoted by Reuters as saying: "We're definitely in talks with many OEMs and we're on track for deploying our technology in production. Our business model is essentially licensing this kind of software and also foundation model software to the automakers."

The California-based company's vision-first strategy is akin to that of Tesla, which also prioritises cameras over other sensors such as lidar and radar to avoid additional costs. Industry specialists, however, argue that supplementary sensors are essential for safety, providing redundancy for cameras that may falter in low-visibility conditions. Companies operating robotaxis, such as Waymo and May Mobility, employ a mix of radar, lidar, and cameras to navigate their environment.

Helm.ai, which has raised $102m in funding, counts Goodyear Ventures, Sungwoo HiTech, and Amplo among its investors. Vision synthesises imagery from multiple cameras to generate a comprehensive bird's-eye view map, enhancing the vehicle's decision-making and control mechanisms. The system is tailored for compatibility with various hardware platforms from companies such as Nvidia and Qualcomm. This adaptability allows car manufacturers to integrate Vision into their vehicles' existing systems, which are already equipped with proprietary technologies for predicting and planning vehicle movements.
"Helm.ai advances self-driving tech with new vision system" was originally created and published by Just Auto, a GlobalData owned brand.


Business Wire · a day ago · Automotive
Helm.ai Announces Level 3 Urban Perception System With ISO 26262 Components
REDWOOD CITY, Calif.--(BUSINESS WIRE)-- Helm.ai, a leading provider of advanced AI software for high-end ADAS, autonomous driving, and robotics automation, today announced Helm.ai Vision, a production-grade urban perception system designed for Level 2+ and Level 3 autonomous driving in mass-market vehicles. Helm.ai Vision delivers accurate, reliable, and comprehensive perception, providing automakers with a scalable and cost-effective solution for urban driving.

Assessed by UL Solutions, Helm.ai has achieved ASPICE Capability Level 2 for its engineering processes and has been certified to meet ISO 26262 ASIL-B(D) requirements for components of its perception system delivered as Software Safety Elements out of Context (SEooC) for Level 2+ systems. The ASIL-B(D) certification confirms that these SEooC components can be integrated into production-grade vehicle systems as outlined in the safety manual, while ASPICE Level 2 reflects structured and controlled software development practices.

Built using proprietary Deep Teaching™ technology, Helm.ai Vision delivers advanced surround-view perception that alleviates the need for HD maps and lidar sensors for up to Level 2+ systems, and enables up to Level 3 autonomous driving. Deep Teaching™ uses large-scale unsupervised learning from real-world driving data, reducing reliance on costly, manually labeled datasets.

The system handles the complexities of urban driving across several international regions, including dense traffic, varied road geometries, and complex pedestrian and vehicle behavior. It performs real-time 3D object detection, full-scene semantic segmentation, and multi-camera surround-view fusion, enabling the self-driving vehicle to interpret its surroundings with high precision. Additionally, Helm.ai Vision generates a bird's-eye view (BEV) representation by fusing multi-camera input into a unified spatial map. This BEV representation is critical for improving the downstream performance of the intent prediction and planning modules.
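The general idea behind multi-camera BEV fusion can be illustrated with a minimal sketch: detections from each camera's coordinate frame are transformed into the shared ego-vehicle frame via that camera's extrinsic matrix, then rasterised onto a single top-down grid. Helm.ai's actual system is proprietary; the function, grid parameters, and coordinate conventions below are hypothetical, chosen only to show the concept.

```python
def fuse_to_bev(detections_per_camera, extrinsics, grid_size=100, cell_m=0.5):
    """Fuse 3D detection centres from several cameras into one BEV grid.

    Illustrative sketch only (not Helm.ai's implementation).
    detections_per_camera: list (one entry per camera) of lists of
        (x, y, z) points in that camera's coordinate frame.
    extrinsics: list of 4x4 camera-to-ego homogeneous transforms
        (nested lists), one per camera.
    Returns a grid_size x grid_size occupancy grid centred on the ego
    vehicle; each cell covers cell_m metres per side.
    """
    bev = [[0] * grid_size for _ in range(grid_size)]
    half = grid_size // 2
    for points, T in zip(detections_per_camera, extrinsics):
        for (x, y, z) in points:
            # Apply the 4x4 camera-to-ego transform to the point.
            ex = T[0][0] * x + T[0][1] * y + T[0][2] * z + T[0][3]
            ey = T[1][0] * x + T[1][1] * y + T[1][2] * z + T[1][3]
            # Assumed convention: ego x is forward, ego y is left.
            row = half - int(ex / cell_m)  # forward -> towards row 0
            col = half - int(ey / cell_m)  # left    -> towards col 0
            if 0 <= row < grid_size and 0 <= col < grid_size:
                bev[row][col] = 1
    return bev
```

For example, a detection 10 m directly ahead of a camera whose extrinsic is the identity (camera frame aligned with the ego frame) lands 20 cells forward of the grid centre. The unified grid is what downstream prediction and planning modules would consume in place of per-camera outputs.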
Helm.ai Vision is modular by design and is optimized for deployment on leading automotive hardware platforms, including Nvidia, Qualcomm, Texas Instruments, and Ambarella. Importantly, since Helm.ai Vision has already been validated for mass production and is fully compatible with the end-to-end (E2E) Helm.ai Driver path planning stack, it enables reduced validation effort and increased interpretability to streamline production deployments of full-stack AI software.

"Robust urban perception, which culminates in the BEV fusion task, is the gatekeeper of advanced autonomy," said Vladislav Voroninski, CEO and founder of Helm.ai. "Helm.ai Vision addresses the full spectrum of perception tasks required for high-end Level 2+ and Level 3 autonomous driving on production-grade embedded systems, enabling automakers to deploy a vision-first solution with high accuracy and low latency. Starting with Helm.ai Vision, our modular approach to the autonomy stack substantially reduces validation effort and increases interpretability, making it uniquely suited for near-term mass-market production deployment in software-defined vehicles."

About Helm.ai

Helm.ai develops next-generation AI software for ADAS, autonomous driving, and robotics automation. Founded in 2016 and headquartered in Redwood City, CA, the company reimagines AI software development to make scalable autonomous driving a reality. Helm.ai offers full-stack real-time AI solutions, including deep neural networks for highway and urban driving, end-to-end autonomous systems, and development and validation tools powered by Deep Teaching™ and generative AI. The company collaborates with global automakers on production-bound projects. For more information on Helm.ai, including products, SDK, and career opportunities, visit