Lucid Chooses Mobileye As Partner For Autonomous Drive Technology
Lucid Motors and Mobileye N.V. announced a collaboration to enable autonomous driving capability on Lucid vehicles.
Lucid will launch its first car, the Lucid Air, with a complete sensor set for autonomous driving from day one, including camera, radar and lidar sensors. Mobileye was chosen to provide the primary compute platform, full 8-camera surround view processing, sensor fusion software, Road Experience Management (REM) crowd-based localization capability, and reinforcement learning algorithms for Driving Policy.
These technologies will enable a full Advanced Driver Assistance System (ADAS) suite at launch, and then enable a logical and safe transition to autonomous driving functionality through over-the-air software updates.
Mobileye is expected to provide a dual set of EyeQ®4 system-on-chips. The chipset will process a full 8-camera surround view system, providing full 360-degree visual perception. Consistent with other Mobileye programs, the camera set includes a forward-facing trifocal-lensed camera and an additional five cameras surrounding the vehicle.
In addition, Mobileye will offer sensor fusion software that incorporates data from radar and lidar sensors, along with the camera set, in order to build the critical environmental model necessary to facilitate autonomous driving.
To complete and strengthen the environmental model, Mobileye’s REM system is intended to provide the vehicle with highly accurate localization capability. Lucid vehicles will benefit from real-time updates to the collaborative, dynamic Roadbook, Mobileye’s global high-definition mapping system. Data generated from Lucid vehicles can be used to enhance the autonomous driving software and will also contribute to the aggregation of the Roadbook.
In related news, Mobileye has entered into a strategic partnership with intelligent mapping company Here.
Mobileye said it would provide its Roadbook technology, real-time road data collected by Mobileye sensors, as an additional layer to Here’s mapping data. In turn, Mobileye will use Here’s Open Location Platform, which analyzes and provides context to location data, to augment the information gathered through its own sensors.
The integration of both companies’ data is intended to be used by autonomous vehicles, which rely on real-time location information to make decisions.