AutoSens USA 2024 Press Conference Recap

For the first time in North America, AutoSens and InCabin USA were co-located at Huntington Place in downtown Detroit during the week of May 20th, 2024. The event featured an extensive agenda across two tracks, expert-led roundtables, in-depth tutorials, and a full exhibition floor of the latest ADAS innovations shaping the future of automotive technology.

On Wednesday, May 22nd, a press briefing was held for both conferences. Six companies made announcements as part of the InCabin press briefing, while eight companies made announcements as part of the AutoSens press briefing. Here is a recap of what each company presented during AutoSens USA 2024.

Claytex

During the AutoSens press conference, Claytex provided an overview of the Sim4CAMSens project, which aims to develop a sensor evaluation framework that spans modeling, simulation, and physical testing. Funded by the UK’s Centre for Connected and Autonomous Vehicles (CCAV) as part of its Commercialising CAM Supply Chain Competition, the project involves the creation of new sensor models, new material models, and new test methods to allow ADAS and sensor developers to accelerate their work.

Claytex leads the Sim4CAMSens project with support from a consortium of industry partners, including rFpro, Syselek, Oxford RF, WMG, National Physical Laboratory, Compound Semiconductor Applications Catapult, and AESIN.

“The Sim4CAMSens project addresses a pivotal gap in the journey towards autonomous vehicles. The perception sensors are the eyes of these vehicles, and ensuring their accurate representation in simulations is critical,” said Mike Dempsey, Managing Director of Claytex, a TECHNIA Company. “This collaboration of industry leaders will contribute to safer and more reliable autonomous mobility.”

As outlined by the Sim4CAMSens official website, key objectives include:

  • Quantify and simulate perception sensors in all conditions. Doing so enables sensor suppliers to more fully demonstrate their solutions.

  • Enhance synthetic training data by improving perception sensor models.

  • Propose a credible simulation framework and ensure that AVs are safe to deploy to the satisfaction of regulators.

  • Develop improved noise models through both laboratory and field tests that identify and quantify the noise factors affecting sensor performance (a simplified illustrative sketch follows below).
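
Sim4CAMSens has not published the internals of these models, so as a rough illustration only, the sketch below shows what a simple parametric sensor noise model can look like: it adds range-dependent Gaussian noise and a rain-attenuation term to an idealized lidar return. All function names, parameters, and coefficients are assumptions for illustration, not Sim4CAMSens outputs.

```python
import numpy as np

def noisy_lidar_return(true_range_m, reflectivity, rain_rate_mm_h=0.0, rng=None):
    """Toy parametric lidar noise model (illustrative only, not Sim4CAMSens)."""
    rng = rng or np.random.default_rng()

    # Assumed range noise: 2 cm base sigma plus 0.1% of range.
    sigma_m = 0.02 + 0.001 * true_range_m
    measured_range_m = true_range_m + rng.normal(0.0, sigma_m)

    # Returned intensity falls off with 1/R^2 and is attenuated by rain.
    # The two-way attenuation coefficient below is an assumed placeholder.
    alpha_db_per_m = 0.01 * rain_rate_mm_h
    attenuation_db = 2.0 * alpha_db_per_m * true_range_m
    intensity = reflectivity / true_range_m ** 2 * 10.0 ** (-attenuation_db / 10.0)

    # Below an assumed detection threshold, the sensor reports no return.
    if intensity < 1e-6:
        return None, None
    return measured_range_m, intensity
```

In a framework like the one the project describes, coefficients of this kind would be fitted from the laboratory and field measurements rather than assumed.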

Learn More: Mike Dempsey joined AutoVision News Radio host Carl Anthony ahead of AutoSens USA 2024 to discuss the latest news from Claytex, including the Sim4CAMSens project.

Dexerials Corporation

Dexerials Corporation showcased its photonic solution based on micro- and nanofabrication technology developed for consumer electronics, automotive displays, and HUD applications.

During the press event, Dexerials explained that moth-eye nanofabrication typically relies on a precise technique such as nanoimprint lithography or photolithography to create nanostructures on the surface of different substrates. These nanostructures are composed of polymers with unique properties that minimize reflection and maximize transmittance.
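
For context on why such structures help: at an abrupt air-to-glass boundary, normal-incidence Fresnel reflectance is R = ((n2 − n1)/(n2 + n1))², roughly 4 percent per surface for typical glass with n ≈ 1.5. Because moth-eye features are smaller than the wavelength of visible light, they act as a gradual transition in effective refractive index rather than an abrupt step, which is why such surfaces can push reflectance well below that baseline; the exact figures depend on the structure and are not Dexerials-specific claims.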

“Depending on customer requirements, the nanostructures can be deposited on flexible substrate in R2R process with high uniformity and scalability or directly deposited on a rigid free-shape surface with high resolution and precision,” Dexerials said in a statement.

At AutoSens USA 2024, the Dexerials team presented optical solutions including:

  • Ghost image reduction.

  • Anti-fogging and seamless designs.

  • Nanostructured surfaces that mitigate optical noise and empower sensing technologies.

Learn More: Nelly Soudakova, General Manager of Digital Marketing and New Business Development, and Naoki Hanashima, Chief Engineer of Corporate R&D at Dexerials Corporation, joined our AutoVision News LIVE broadcasts from the exhibition floor at AutoSens. Soudakova and Hanashima discussed the company’s expansion into the photonics business and potential partnerships in the space.

Sheba Microsystems Inc.

Sheba Microsystems shed light on its recently launched vehicle camera, the Sharp-7. It is the world’s first automotive-grade autofocus camera to incorporate an eight-megapixel sensor and an integrated MEMS driver, generating high-quality imaging across the full automotive temperature range and ultimately improving the safety of vehicles equipped with the latest ADAS features.

“Over the past two decades, we all have seen how high-resolution sensors have dramatically improved cameras on our phones,” said Dr. Faez Ba-Tis, CEO and co-founder of Sheba Microsystems. “A key enabler for such high-resolution imaging was the adoption of autofocus actuators.”

During the press event at AutoSens USA 2024, Sheba shared more about its patented MEMS technology, including:

  • Thermal stability and consistent performance at temperatures ranging from -40 to 150 degrees Celsius.

  • Precise imaging during thermal and mechanical shocks, thermal cycling, vibration, tumble, and microdrop tests.

“But the absence of reliable autofocus actuator technology that can operate in the automotive environment, coupled with the problem of thermal expansion, has been a decades-long blocker and has limited the adoption of high-resolution sensors in automotive cameras,” Dr. Ba-Tis added. “With the Sharp-7 camera, we wanted to demonstrate not only how Sheba MEMS technology solves for thermal expansion and produces consistent, high-quality imaging from existing eight-megapixel sensors, but also how our technology paves the road towards the adoption of even higher resolution image sensors, which will ultimately keep everyone on the road safer, especially with today’s advancements in autonomous vehicle technology.”

Learn More: Ridha Ben Mrad, President, CTO, and Co-Founder of Sheba Microsystems, joined our AutoVision News LIVE broadcasts from the exhibition floor at AutoSens to discuss the Sharp-7.

neurocat

neurocat announced a collaboration with Taiwan-based oToBrite to advance oToBrite’s Vision-AI solution even under the most adverse weather conditions. The collaboration will use neurocat’s image augmentation technology to enhance the performance of that solution, improving:

  • Vulnerable road user detection (e.g., cyclists and pedestrians).

  • In-context detection (e.g., lane departure, wrong-way, and forward collision warnings).

As the companies explained at AutoSens USA 2024, autonomous driving relies heavily on algorithms and data, and collecting that data is a complex, costly task, particularly given the uncertainty of capturing the right weather conditions.

“With its collaboration with neurocat, oToBrite is on the forefront of finding solutions to these data challenges. neurocat’s image augmentations offer a way to quickly generate highly targeted data that precisely fills data gaps that are directly related to the needs of model development,” the companies said in a joint statement. “Leveraging the new augmented data for retraining represents a promising road to ensure oToBrite continues to be a leader in safe autonomous driving systems.”
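
neurocat has not disclosed the internals of its augmentation pipeline, but the general idea can be sketched: take a real clear-weather camera frame and synthesize an adverse-weather variant of it for training. The minimal example below blends a frame toward a uniform haze to mimic fog; the function name, parameters, and the simple blending model are assumptions for illustration, not neurocat’s method.

```python
import numpy as np

def augment_with_fog(image, fog_strength=0.5, fog_grey=200.0):
    """Toy fog augmentation (illustrative only, not neurocat's method).

    image        -- HxWx3 uint8 camera frame
    fog_strength -- 0.0 (clear) to 1.0 (fully washed out), assumed parameter
    fog_grey     -- grey level of the haze, assumed parameter
    """
    img = image.astype(np.float32)
    haze = np.full_like(img, fog_grey)
    # Linear blend toward a uniform haze; a production pipeline would
    # typically scale the effect with scene depth and add blur/scattering.
    fogged = (1.0 - fog_strength) * img + fog_strength * haze
    return np.clip(fogged, 0, 255).astype(np.uint8)
```

Even a simple transform like this shows how targeted data for a missing condition can be generated without additional collection drives, which is the data gap the companies describe.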

Learn More: Florens Gressner, CEO and Co-Founder of neurocat, joined AutoSens Insights to discuss how the company develops validation systems, challenges with data management, and how to solve the “perception puzzle.”


rFpro

rFpro launched a detailed digital model of a road network in Los Angeles, California, to support the development of autonomous vehicles and self-driving technologies. The 36-kilometer route includes highways, split dual carriageways, single-carriageway sections, and the other physical features ADAS and autonomous vehicle engineers need to conduct comprehensive testing before deploying on public roads.

“Los Angeles is one of the leading cities globally for developing and trialling autonomous vehicle technologies,” said Matt Daley, rFpro Technical Director. “Our model provides OEMs with the ability to thoroughly train and test perception systems in a safe and repeatable environment before correlating these simulated results on the public road.”

Key features of the digital model include:

  • Created using survey-grade lidar scan data to produce a road surface accurate to within one millimeter in height across the entire 36-kilometer route.

  • Features 12,400 buildings, 40,000 pieces of vegetation, and over 13,600 items of street furniture (traffic lights, road signs, road markings, walls, and fences).

  • Includes infrastructure elements that challenge automated driving technologies, such as roadside parking, islands separating carriageways, drop curbs in residential areas, rail crossings, bridges, and tunnels.

Learn More: Matt Daley joined AutoSens Insights to discuss high-fidelity synthetic training data, the latest in ray tracing, rFpro’s ASAM membership, and participation in the OpenMATERIAL project.

KDPOF

KDPOF presented its multi-gigabit transceiver, KD7251, for automotive applications. It is a single-chip solution integrating the optical fiber interface, optical engine, photonics, and electronics in a hybrid package that supports a standard reflow assembly process. KD7251 supports IEEE 1722b encapsulation of MIPI and remote-control traffic (GPIO, SPI, I2C) over Ethernet, enabling traffic aggregation and disaggregation, multi-drop functions, and all Ethernet/MIPI-to-Ethernet/MIPI connectivity combinations.

“KD7251 implements PCS, PMA, and PMD sublayers together with MDI by integrating in a single component optics, photonics, and electronics using automotive qualified bulk CMOS process,” wrote Rubén Pérez-Aranda, CTO and co-founder of KDPOF, on LinkedIn. “KD7251 supports assembly in a PCB by standard pick-and-place and reflow processes, which is a game changer in the implementation of high-speed optical transceivers that will enable low-cost, high-volume integration of optical ports in automotive ECUs.”

KD7251 enables new use cases with optical technology, such as:

  • Multi-gigabit Ethernet backbone.

  • Zonal gateway connectivity and smart antenna links.

  • Connectivity for radars, cameras, lidars, displays, and high-performance computing units.

Learn More: Rubén Pérez-Aranda was a guest during our AutoVision News LIVE broadcasts from the exhibition floor at AutoSens USA 2024. Pérez-Aranda spoke more about the single-chip KD7251, including how KDPOF integrates multiple functionalities into an optical transceiver.

Valens Semiconductor

At AutoSens USA 2024, Valens Semiconductor spoke about its collaboration with Sony Semiconductor Solutions Corporation (Sony) for the Electromagnetic Compatibility (EMC) testing of a multi-vendor A-PHY link. The partnership aims to progress towards a mature Sony A-PHY-integrated image sensor compatible with Valens’ VA7000 deserializer chip.

As described in a Valens press release, the Sony image sensor will include integrated high-speed connectivity, providing a number of benefits for ADAS cameras, including a smaller form factor, lower power consumption, and reduced cost. As of April 2024, Valens and Sony are developing an eight-megapixel A-PHY camera module for ADAS applications.

MIPI A-PHY is the first standard in the automotive industry for in-vehicle high-speed connectivity and the only technology optimized to support sensor integration. Since its release in 2020, it has attracted a growing ecosystem of companies designing products based on the technology.

“We are rapidly moving towards bringing to market the first-ever sensor with integrated high-speed connectivity, and these IOT and EMS tests are a testament to the maturity of the A-PHY standard and our camera solution,” said Kenji Onishi, Deputy Senior General Manager of the Automotive Business Division at Sony Semiconductor Solutions. “We believe that A-PHY is uniquely positioned to meet automotive OEM requirements for enhanced ADAS systems.”

“One of the key benefits of MIPI A-PHY technology is its optimization of the transmitter, which is small, inexpensive, and simple,” explained Eyran Lida, Chief Technology Officer at Valens Semiconductor. “This allows for direct integration of the connectivity inside the sensor, a benefit that no competing solution can provide.”

Key benefits of the MIPI A-PHY standard for automotive are:

  • Fosters a multi-vendor ecosystem of interoperability.

  • Delivers greater link robustness and lower packet error rates compared to legacy solutions.

  • Reduces the complexity of the sensor side while delivering exceptional EMC performance throughout the lifecycle of the vehicle.

Learn More: Daniel Shwartzberg, Director of Business Development and System Solutions at Valens Semiconductor, joined AutoVision News Radio host Carl Anthony for a discussion about MIPI A-PHY, the importance of the standard, and how the automotive industry can continue to leverage and embrace it.

MEMS Drive Inc.

MEMS Drive Inc. debuted its Sensor-based Autofocus (AF) and Optical Image Stabilization (OIS) MEMS Actuator during AutoSens USA 2024 in Detroit. According to MEMS Drive, the technology marks a significant advancement in automotive imaging and will deliver enhanced image stabilization and autofocus capabilities for next-generation ADAS cameras.

MEMS Drive uses its proprietary MEMS design and process to allow CMOS sensors to achieve swift and precise sensor shift, and it was the first semiconductor company to implement five-axis stabilization for mobile cameras. The MEMS Drive development team operates globally, with offices in Los Angeles, Nanjing, Taipei, Hong Kong, and Shenzhen.

“We are thrilled to introduce our Autofocus and OIS MEMS Actuator at AutoSens USA,” said Colin Kwan, President and CEO of MEMS Drive Inc. “This cutting-edge technology represents a significant advancement in automotive imaging, offering enhanced stability and focus performance that can elevate the capabilities of automotive camera systems.”

Key features include:

  • Autofocus Capability: MEMS technology ensures ideal image focusing at all times, essential for the safe operation of ADAS features and fully autonomous vehicles.

  • Optical Image Stabilization: MEMS technology eliminates unwanted camera movements, resulting in sharp and accurate images for ADAS and autonomous driving systems (a simplified sketch of the sensor-shift principle follows below).
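
MEMS Drive has not published the control details of its actuator, so purely as an illustration of the general sensor-shift principle behind OIS, the sketch below integrates one gyro sample to a rotation angle and converts it into an opposing sensor displacement using a small-angle approximation. The function, parameters, and units are assumptions for illustration, not MEMS Drive’s algorithm.

```python
import numpy as np

def ois_sensor_shift_px(gyro_rate_rad_s, dt_s, focal_length_mm, pixel_pitch_um):
    """Toy OIS sensor-shift calculation (illustrative only).

    gyro_rate_rad_s -- [pitch, yaw] angular rates from one gyro sample
    dt_s            -- time since the previous sample
    focal_length_mm -- lens focal length (assumed known)
    pixel_pitch_um  -- sensor pixel pitch (assumed known)
    """
    # Integrate the angular rate over the sample interval to estimate how far
    # the camera has rotated since the last correction.
    angle_rad = np.asarray(gyro_rate_rad_s, dtype=float) * dt_s

    # Small-angle approximation: the image shifts on the sensor by roughly
    # focal length * rotation angle.
    shift_mm = focal_length_mm * angle_rad

    # Command the actuator to move the sensor by the same amount in the
    # opposite direction, expressed in pixels.
    return -shift_mm * 1000.0 / pixel_pitch_um
```

A production stabilization loop would run at a high rate and include filtering, drift correction, and actuator travel limits, all of which this sketch omits.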

Learn More: Colin Kwan was a guest during our AutoVision News LIVE broadcasts from the exhibition floor at AutoSens USA 2024. During the interview, Kwan spoke more about the Autofocus and OIS MEMS Actuator, including its durability and other potential benefits for the automotive industry.