InCabin USA 2024 Press Conference Recap

Two premier events in the automotive technical space, AutoSens and InCabin, converged in Detroit, Michigan, at the Huntington Place during the week of May 20th, 2024. This colocation of both events featured an extensive agenda across two tracks, expert-led roundtables, in-depth tutorials, and a full exhibition of the latest ADAS innovations. Ultimately, AutoSens and InCabin USA 2024 enhanced the dialogue around improving occupant safety and driver experiences in the domains of ADAS, autonomous vehicles, and interior sensing.

On Wednesday, May 22nd, a press briefing was held for both conferences. Eight companies made announcements as part of the AutoSens press briefing, while six companies made announcements as part of the InCabin press briefing. Here is an overview of what each company presented during InCabin USA 2024.

LightMetrics

LightMetrics is a provider of video telematics solutions for commercial vehicles, the most notable being its RideView platform for fleet applications. RideView is a hardware-agnostic platform that provides driver coaching workflows to increase safety and recorded video for accident exoneration, among other features. Now, LightMetrics has announced its strategic expansion into developing driver monitoring systems (DMS) for passenger cars.

With the rise of electric vehicles, LightMetrics aims to integrate DMS with cockpit systems, using advanced AI to improve computational and power efficiency, which the company describes as a significant step toward improving road safety.

“Expanding into the OEM market is a natural progression for LightMetrics,” said Krishna Govindarao, Co-founder and Head of Product at LightMetrics. “Our expertise in efficient AI and access to real-world data allows us to deliver superior DMS solutions that meet the evolving needs of the automotive industry. We are excited to bring our innovative approach to a new segment and continue our mission of making roads safer for everyone.”

During InCabin USA 2024, LightMetrics provided additional context on how RideView has helped the company develop an effective DMS solution for automotive OEMs, including:

  • Thanks to RideView, LightMetrics can access billions of miles of driving data spanning varied camera positions, diverse driver and occupant ethnicities, and a range of road and weather conditions.

  • Edge AI enables a complete DMS feature set, including drowsiness, distraction, and cell phone usage detection (a simplified eye-closure sketch follows this list).

  • Video telematics solutions from LightMetrics are already deployed in the United States, Canada, Mexico, Brazil, Australia, and India.
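
The press briefing did not detail how the edge models work internally. As a rough illustration of the kind of signal an edge DMS evaluates, the sketch below flags prolonged eye closure from a stream of eye-landmark coordinates using the eye aspect ratio, a widely used drowsiness cue; the landmark layout, threshold, and frame window are illustrative assumptions, not LightMetrics’ implementation.

```python
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """Eye aspect ratio (EAR) from six (x, y) eye landmarks p1..p6.

    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); the value drops toward
    zero as the eyelid closes.
    """
    v1 = np.linalg.norm(eye[1] - eye[5])
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])
    return (v1 + v2) / (2.0 * h)

def prolonged_closure(ear_series, threshold=0.21, min_frames=45) -> bool:
    """Flag a potential microsleep-style event when EAR stays below
    `threshold` for `min_frames` consecutive frames (about 1.5 s at
    30 fps with these placeholder values)."""
    run = 0
    for ear in ear_series:
        run = run + 1 if ear < threshold else 0
        if run >= min_frames:
            return True
    return False
```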

Learn More: Krishna Govindarao joined our AutoVision News LIVE broadcasts from the exhibition floor during InCabin USA 2024 to discuss the DMS market and why LightMetrics is expanding into passenger vehicles.

Anyverse

Anyverse took the opportunity to announce its technical collaboration with Sony Semiconductor Solutions Corporation (Sony) during InCabin USA 2024. The collaboration aims to integrate Sony’s Image Sensor Models into Anyverse’s synthetic data platform to enhance the development of ADAS and autonomous vehicle technologies.

As described in a press release, integrating Sony’s Image Sensor Models into Anyverse’s platform will streamline system design and validation cycles, ultimately allowing developers to evaluate sensor configurations and improve model performance before implementation.
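
Neither company has published the internals of the Image Sensor Models, but the general idea of physical sensor simulation can be sketched with a toy model: take a clean synthetic image and apply photon shot noise, read noise, saturation, and quantization before handing the result to a perception model. The parameters below are placeholders for illustration, not Sony’s models or Anyverse’s API.

```python
import numpy as np

def toy_sensor_model(irradiance, full_well=10_000, read_noise_e=2.5,
                     bit_depth=12, rng=None):
    """Apply a simplified image-sensor response to a clean synthetic frame.

    irradiance: float array scaled to [0, 1], the ideal scene signal.
    Returns quantized digital numbers as a (toy) sensor would output them.
    """
    rng = rng or np.random.default_rng(0)
    electrons = irradiance * full_well                    # ideal signal in electrons
    electrons = rng.poisson(electrons)                    # photon shot noise
    electrons = electrons + rng.normal(0.0, read_noise_e, electrons.shape)  # read noise
    electrons = np.clip(electrons, 0, full_well)          # saturation at full well
    dn = np.round(electrons / full_well * (2**bit_depth - 1))  # ADC quantization
    return dn.astype(np.uint16)

# Example: degrade a clean synthetic frame before training or evaluation.
clean = np.random.default_rng(1).random((480, 640))
raw = toy_sensor_model(clean)
```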

“The collaboration with Sony Semiconductor Solutions represents a tremendous leap forward in addressing one of the most critical challenges in dataset creation—the perception domain gap,” said Víctor González, CEO of Anyverse. “By bridging this gap through the combination of high-fidelity data and physical sensor simulation, we empower developers to create deep learning models that can effectively handle a wide spectrum of real-world scenarios.”

The collaboration between Anyverse and Sony aims to:

  • Reduce the automotive industry’s reliance on real-world data collection.

  • Minimize costs and fatigue associated with physical testing.

  • Provide high-quality data and sensor models through Anyverse’s platform (i.e., developers will not need to deploy sensors on multiple vehicles for testing).

  • Expedite the system’s evaluation process and expand the perception model’s coverage through the Anyverse platform.

“The collaboration between Anyverse and Sony Semiconductor Solutions will provide an automotive-grade end-to-end spectral simulation pipeline to the ADAS perception system developers,” said Tomoki Seita, General Manager, Automotive Business Division, Sony Semiconductor Solutions Corporation. “Sony has prepared Image Sensor Models based on the internal architecture of the image sensors used in camera systems to achieve automotive-grade fidelity.”

Learn More: Javier Salado, Product Manager at Anyverse, joined AutoVision News Radio host Carl Anthony as part of InCabin Insights. During the discussion, Salado spoke about the impact of data when developing interior monitoring systems and shared more about the emerging trends in the space.  

ams OSRAM

ams OSRAM showcased its diverse lighting and sensor projects during InCabin USA 2024 and further discussed the ICARUS 3D Driver Monitoring System at the press briefing. Highlights of ICARUS, as presented by Russell Willner, Senior Product Marketing Manager at ams OSRAM, include:

  • Monitoring the driver for drowsiness, distraction, and gaze, plus monitoring the cabin for children and pets.

  • 3D sensing capability to detect microsleeps, eye gaze direction, and blink duration.

  • The ability to support emerging features, including augmented reality head-up displays and face authentication.

“Driver monitoring systems are becoming essential to ensuring road safety and enhancing the driver experience,” Willner said. “The ICARUS system measures the position of the driver’s head in 3D to detect micro-sleeps and other advanced signs of drowsiness which can pose a serious risk to road safety.”

Furthermore, the ICARUS evaluation kit features a Vertical Cavity Surface Emitting Laser (VCSEL) or an IR LED for flood illumination for 2D near-infrared (NIR) sensing. It can be upgraded to include an NIR dot-pattern projector for accurate and cost-effective 3D sensing. ams OSRAM also showcased its AS8579, a capacitive sensor designed to determine whether the driver’s hands are on the steering wheel or not (a critical criterion to ensure safer operation of Level 2 and above vehicles).
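
ams OSRAM did not share AS8579 integration code, but the basic hands-on/hands-off decision can be illustrated with a simple debounce over capacitance readings: a sustained rise above a calibrated no-touch baseline suggests hand contact. The baseline, threshold, and sample counts below are illustrative assumptions, not the AS8579 interface.

```python
from collections import deque

class HandsOnDetector:
    """Toy hands-on-wheel classifier over capacitive sensor samples.

    A reading counts as contact when it exceeds the calibrated no-touch
    baseline by `delta`; the state only flips after `debounce` consecutive
    agreeing samples, which suppresses momentary noise.
    """

    def __init__(self, baseline: float, delta: float = 5.0, debounce: int = 10):
        self.baseline = baseline
        self.delta = delta
        self.window = deque(maxlen=debounce)
        self.hands_on = False

    def update(self, reading: float) -> bool:
        self.window.append(reading > self.baseline + self.delta)
        if len(self.window) == self.window.maxlen:
            if all(self.window):
                self.hands_on = True
            elif not any(self.window):
                self.hands_on = False
        return self.hands_on
```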

Learn More: Add the ams OSRAM podcast to your Spotify playlist.

IAV

IAV unveiled a SaaS (Software-as-a-Service) solution for its occupant monitoring systems (OMS). The SaaS solution will be available to IAV customers in 2025 and will provide occupant classification for safety and comfort functions. InCabin USA 2024 marked the first time the SaaS solution was shown publicly. Moreover, IAV announced that its AI-based occupant classification has already been implemented on the BlackBerry IVY platform and showcased other offerings for software-defined vehicles, including:

  • An AI-based vital sign algorithm to predict the heart and respiration rates of front-row occupants (a generic rate-estimation sketch follows this list).

  • Support for impairment recognition and sudden sickness detection.
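
IAV did not disclose how its vital-sign algorithm works. As a generic illustration of the underlying idea, the sketch below recovers a dominant rate in cycles per minute from a periodic in-cabin signal (for example, radar- or camera-derived chest motion) by locating the strongest FFT peak inside a physiological band; the sampling rate, band limits, and synthetic trace are assumptions, not IAV’s method.

```python
import numpy as np

def dominant_rate_bpm(signal, fs, band=(0.1, 0.5)):
    """Estimate the dominant rate of a 1-D signal in cycles per minute.

    Looks for the strongest FFT peak inside `band` (Hz): roughly
    0.1-0.5 Hz targets respiration, 0.8-3.0 Hz targets heart rate.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                                  # remove the DC offset
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mag = np.abs(np.fft.rfft(x))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    peak_hz = freqs[mask][np.argmax(mag[mask])]
    return peak_hz * 60.0

# Example with a synthetic 0.25 Hz (15 breaths per minute) motion trace.
fs = 20.0
t = np.arange(0, 32, 1.0 / fs)
trace = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(dominant_rate_bpm(trace, fs))                   # approximately 15
```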

Learn More: Follow IAV GmbH on LinkedIn for the latest updates.

Mindtech Global

Mindtech announced the addition of two new sensor types, lidar and near-infrared (NIR), to its Chameleon platform to aid the training and improve the accuracy of AI models. As described by Mindtech, Chameleon enables the creation of synthetic data matched to the real world, and adding lidar and NIR support will allow AI system developers to create training and test data for these two sensor types. Combining these sensors, which operate outside the visible spectrum, with RGB sensors allows platform users to build accurate, synchronized data with full annotations.

“Working directly with customers has led to the introduction of these technologies solving specific problems they were having,” explained Chris Longstaff, VP of Product Management at Mindtech. “Access to sufficient relevant and diverse data without infringing privacy, and recording dangerous situation use cases were key amongst those requirements.”

During InCabin USA 2024, Mindtech spoke more about how Chameleon allows users to place sensors into any scenario and provided more details on the Rotating Lidar simulator, including:

  • Accounting for different use cases, such as autonomous forklifts and trucks in warehouse environments.

  • The combination of lidar and visual data means computer vision models can be better trained to distinguish different elements in a complex scene, which may have many moving and interacting items.

Meanwhile, a typical use case for the new NIR camera capability is in-cabin monitoring of personal and commercial vehicles, including identifying tired, distracted, or intoxicated drivers.
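
Mindtech has not published Chameleon’s internals, but the value of pairing lidar with camera data rests on a standard geometric step: projecting lidar points into the image plane so both sensors can share one set of annotations. The sketch below shows that projection with a pinhole camera model; the intrinsics, extrinsics, and point cloud are placeholder values, not Chameleon parameters.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, T_cam_from_lidar):
    """Project Nx3 lidar points (lidar frame) into pixel coordinates.

    K: 3x3 camera intrinsic matrix.
    T_cam_from_lidar: 4x4 rigid transform from lidar to camera frame.
    Points behind the camera are dropped.
    """
    n = points_lidar.shape[0]
    homog = np.hstack([points_lidar, np.ones((n, 1))])   # Nx4 homogeneous points
    cam = (T_cam_from_lidar @ homog.T).T[:, :3]          # Nx3 in the camera frame
    in_front = cam[:, 2] > 0.1                           # keep points ahead of the lens
    cam = cam[in_front]
    pix = (K @ cam.T).T                                  # perspective projection
    pix = pix[:, :2] / pix[:, 2:3]                       # normalize by depth
    return pix, cam[:, 2]                                # pixel coordinates and depths

# Placeholder calibration for illustration only.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)                                            # lidar aligned with the camera
points = np.random.default_rng(0).uniform([-5.0, -2.0, 1.0], [5.0, 2.0, 30.0], (200, 3))
pixels, depths = project_lidar_to_image(points, K, T)
```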

Learn More: Chris Longstaff talks with Rich Nass, host of the Embedded Executive podcast, about the tools Mindtech offers that help create vision data sets for training neural networks.

DEEP IN SIGHT Co. Ltd.

DEEP IN SIGHT introduced its in-cabin monitoring system and face recognition unit technology named CAMOSYS and FRU, respectively.

“With its unparalleled capabilities in in-cabin monitoring, CAMOSYS sets a new benchmark for safety, security, convenience, and driver experience,” said Lucas Oh, CEO of DEEP IN SIGHT. “We are eager to become a key partner in the global ICMS market and are fully prepared to collaborate with leading OEMs and Tier-1 companies.”

Key features of CAMOSYS include:

  • Engineered to detect drowsiness, distraction, fatigue, phone usage, seatbelt compliance, and other unexpected driving events.

  • Supports personalized features like facial recognition and gesture control.

  • Euro NCAP and NHTSA compliant.

Meanwhile, FRU enables drivers and passengers to authenticate themselves via facial recognition. As described by DEEP IN SIGHT, its FRU technology can distinguish between real and fake images of the vehicle’s owner, allowing them to unlock the doors without a key in three seconds. “Our passion for innovation drives us to continually push the boundaries of what’s possible,” Oh added.

Learn More: DEEP IN SIGHT’s Technical Manager, Hoseung Choi, sat down with Carl Anthony as part of InCabin Insights to discuss AI-powered 3D sensing solutions, the company’s start-up strategy, and some of the latest industry trends.