As vehicles evolve and are likened to smartphones on wheels, advancements in semiconductor chips, in-vehicle connectivity, video display side mirrors, and advanced lidar and infrared cameras are spreading across the market.
Eyeris Technologies, a global in-cabin sensing artificial intelligence (AI) and sensor fusion company, announced last week that it will begin collaborating with indie Semiconductor on the integration of its advanced monocular 3D sensing AI software.
Eyeris’ software will be paired with indie’s intelligent vision processor system-on-chip (SoC) using 2D image sensors to enable automakers and Tier-1 manufacturers to bring to market a new generation of depth-aware in-cabin monitoring solutions. SoCs are single chips that integrate all of a system’s essential components.
Eyeris’ software and sensors work with cameras to gather image, thermal, and radar data. Depending on the make and model of the vehicle, cameras can be mounted on the dashboard, the B-pillar, the interior rearview mirror, or the center of the cabin. Data is gathered in OEM-specific camera coordinates, meaning sensor positions are measured relative to the camera’s mounting location. As with any camera and sensor combination, all of the components would need to be calibrated post-collision to ensure they function properly.
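The idea of OEM-specific camera coordinates can be illustrated with a simple sketch: a point detected in the vehicle’s frame of reference is re-expressed relative to wherever that OEM mounted the camera. The mounting positions and the occupant point below are made-up placeholders, not values from Eyeris or any OEM, and real calibration would also involve rotation, which is omitted here for simplicity.

```python
# Illustrative sketch: re-expressing a 3D point measured in the vehicle
# frame as an offset from the camera's mounting position. All numbers
# are hypothetical; real extrinsics come from calibration, which is why
# post-collision recalibration matters.

def to_camera_frame(point_vehicle, camera_position):
    """Translate a 3D point (x, y, z in meters) from the vehicle frame
    into a frame centered on the camera (rotation omitted)."""
    return tuple(p - c for p, c in zip(point_vehicle, camera_position))

# Hypothetical dashboard camera mounting position (meters).
dash_cam = (1.2, 0.0, 1.1)
# Hypothetical occupant head position in the vehicle frame.
head = (2.0, 0.4, 1.3)

print(to_camera_frame(head, dash_cam))  # offset of the head from the camera
```

Because each OEM mounts cameras differently, the same occupant position yields different camera-relative data from vehicle to vehicle, which is one reason the coordinates are described as OEM-specific.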
The integrated in-cabin monitoring system will have a safety domain controller to further enhance vehicle interior safety, comfort, and overall in-cabin experience, a joint news release states.
Eyeris technology provides in-cabin information from occupants’ face, body, and hands as well as from objects using a single automotive-grade 2D image sensor, like the latest RGB-IR sensors. AI-enabled processors enable enhanced depth-aware understanding of the location, size, and position of occupants and other objects to customize, for example, air bag deployment accordingly and reduce the risk of occupant injury when deployed, Eyeris said.
“The Eyeris software and the indie vision processor combine to satisfy the increasing requirements by OEMs for in-cabin 3D features using a single 2D image sensor,” said Modar Alaoui, Eyeris founder and CEO. “Through this collaboration with indie’s dedicated vision processor, adding this depth-aware third dimension enables Tier 1 and vehicle manufacturers to now create safer air bags and improve on the design of various vehicle restraint systems that enhance the safety and comfort of all occupants.”
Indie Semiconductor Executive Vice President and Vision Business Unit General Manager Abhay Rai added that the company’s vision processor SoC brings high-performance, ultra-low-power, low-latency embedded vision processing and a small form factor “to provide a richer suite of in-cabin monocular 3D safety software for OEM and Tier-1 customers that was not previously possible.”
“Our vision processor provides a unique dedicated safety domain controller and powerful CPU, facilitating in-cabin and exterior safety functionality within a single platform,” he said.
It was also recently announced that the planned Afeela electric vehicle (EV) from Sony Honda Mobility will use Qualcomm’s Snapdragon Digital Chassis — an advanced driver assistance system (ADAS) that harnesses the power of several hundred PCs for hands-free driving and combines connectivity, entertainment, safety, and customization systems into one platform.
Last year, General Motors announced that Snapdragon Ride would be used in its Ultra Cruise ADAS on the Cadillac CELESTIQ as the “brain” of the vehicle. Qualcomm has also partnered with Renault Group, Volkswagen, BMW, Mercedes-Benz, Hyundai, Volvo, Nio, and Stellantis to use its Snapdragon Digital Chassis, of which the Ride platform is part. Ride is a scalable, customizable platform built on SoCs, including an “industry-leading” 5-nanometer Snapdragon SoC and an artificial intelligence accelerator. Qualcomm touts the platform as having expanded software for vision perception, parking, and driver monitoring, and Ride also provides autonomous functionality.
The digital chassis has three other automotive systems — Snapdragon Auto Connectivity, Cockpit, and Car-to-Cloud. Auto Connectivity provides secure, connected, intelligent, and location-aware 4G and 5G cellular vehicle-to-everything (C-V2X), Wi-Fi, and Bluetooth connectivity. Cockpit supplies the digital instrument clusters and infotainment controls. Car-to-Cloud enables over-the-air (OTA) software updates, on-demand unlocking of features, and pay-as-you-use services, and gathers vehicle and usage analytics.
“There is a tremendous amount of opportunity to reinvent the car,” Qualcomm Senior Vice President and Automotive General Manager Nakul Duggal told WIRED. “And a tremendous amount of that reinvention is happening because the car is becoming a truly digital product. …If you think about the way the car architecture is being designed going forward, you have centralization of compute capabilities, larger processors in the car, built-in connectivity, safety features built-in. All of these require the car architecture to shift, and you need somebody who actually understands what it means to be able to build a platform.”
“For automakers, finding the right technology partner means simplifying a vehicle’s architecture and unlocking new revenue streams in the form of passenger entertainment and downloadable upgrades,” CleanTechnica’s Steve Hanley recently wrote.
Other innovations making it to market are video display side mirrors. While not widely available in the U.S. due to federal regulations, they are already on sale in China and Europe.
USA Today took a look earlier this year at Forvia products manufactured in Michigan. For sale in China is the eMirror, which writer Mark Phelan says is meant to replace or augment side mirrors with high-definition video. Forvia’s Valerie Zelko, who works on electronic mirrors and other ADAS products, told Phelan initial use in the U.S. will likely be on SUVs and pickup trucks to replace towing mirrors. Smaller mirrors would be installed as a fail-safe in case the video function quits working.
“The cameras can provide wide fields of view, eliminating blind spots without the distortion caused by optical wide-angle mirrors,” Phelan wrote. “They can also incorporate other sensors, lighting up to alert vehicles oncoming on either side.”
Without naming specific manufacturers, Phelan went on to write about “smart” LED headlights on driverless delivery vehicles. The soon-to-come headlights will also monitor the road ahead, selectively disable individual LEDs to avoid shining in other drivers’ eyes, and project light onto the road to follow the lane or directions from the vehicle’s navigation system. They are available now on Audi vehicles outside the U.S.
In February 2022, the National Highway Traffic Safety Administration (NHTSA) issued its final rule to allow OEMs to install adaptive driving beam (ADB) headlights on new vehicles to increase visibility and reduce headlight glare — a decision that likely means added repair complexity and cost post-collision.
NHTSA ruled that ADBs will improve safety for pedestrians and bicyclists by making them more visible at night and will help prevent crashes by better illuminating animals and objects in and along the road. However, NHTSA states in the rule the agency hasn’t measured or regulated “glare dosage,” or the amount of glare.
“While NHTSA agrees that a qualitative relationship exists, the agency has not established, and does not know of, a quantified relationship between glare dosage and crash risk,” the final rule document states.
Regarding recent and ongoing advancements in ADAS and autonomous vehicle technologies, SAE International will hold a two-day seminar on April 26-27 to cover how they “have disrupted the traditional automotive industry with their challenges and potential to increase safety while attempting to optimize the cost of car ownership.”
Specifically, lidar and infrared camera sensing will be explored since SAE says options for both are seeing rapid growth and adoption in the industry.
“However, the sensor requirements and system architecture options continue to evolve almost every six months,” SAE said. “This course will provide the foundation to build on for these two technologies in automotive applications.”
Lidar and infrared camera tech will be demoed during the seminar. Infrared basics will be covered, from the electromagnetic spectrum and spectral irradiance to night vision and eye safety, along with lidar basics: flash and scanning architectures, wavelengths, lasers, detectors, scanners, range and resolution calculations, optics, thermal design, challenges of automotive qualification, and sensor fusion.
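As a flavor of the “range and resolution calculations” the seminar covers, the most basic lidar relationship is time of flight: a laser pulse travels to the target and back, so range equals the speed of light times the round-trip time, divided by two. The pulse timing below is an illustrative value, not data from any particular sensor.

```python
# Basic lidar time-of-flight range calculation:
#   range = (speed of light * round-trip pulse time) / 2
# The division by two accounts for the pulse traveling out and back.

C = 299_792_458.0  # speed of light in a vacuum, m/s

def tof_range_m(round_trip_seconds):
    """Range in meters for a measured round-trip pulse time."""
    return C * round_trip_seconds / 2.0

# A hypothetical 400 ns round trip corresponds to roughly 60 m of range.
print(round(tof_range_m(400e-9), 2))
```

This is why lidar timing electronics must resolve nanoseconds: at these speeds, each nanosecond of round-trip time corresponds to only about 15 cm of range.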
The second day of the seminar will focus on driver monitoring using infrared cameras and machine vision for vehicle exteriors. Trends and challenges facing optical sensing in AVs will also be discussed.
Featured image credit: Jae Young Ju/iStock