
by Jürgen Kogler, Christoph Sulzbachner, Erwin Schoitsch and Wilfried Kubinger

Reliable Advanced Driver Assistance Systems (ADAS) aid drivers in a variety of traffic and environment/weather conditions. Growing traffic volumes require sensors and systems that handle difficult urban and non-urban scenarios. For such systems, the EU FP7 project ADOSE is developing and evaluating new cost-efficient sensor technology that will provide vehicles with a ‘virtual safety belt’ by addressing complementary safety functions.

The EU-funded project ADOSE (reliable Application-specific Detection of road users with vehicle On-board SEnsors), coordinated by Centro Ricerche Fiat (CRF), is evaluating new sensors and sensor systems. Such sensors are necessary for ADAS functions like lane departure warning, collision warning or high-beam assist. Figure 1 illustrates the various sensors contributed by the different project partners, together with the operating distance of each sensor and examples of its use.

Cost-effective solutions for ADAS are still missing, which is preventing both extensive market penetration and an increase in the number of sensors and supported safety functions. Studies were performed in 2005 to evaluate customers’ desire and willingness to pay for active and passive safety systems in passenger cars (Frost & Sullivan, European Markets for Advanced Driver Assistance Systems, B844-18, (2006)). In general, the price that consumers are willing to pay for their ideal package of safety features is significantly lower than what they perceive its market price to be. Researchers and manufacturers therefore need to find ways to reduce the prices of safety options to increase customer acceptance. For this reason, it is critical to develop and implement high-performance sensors that considerably reduce the costs of ADAS for safety in passenger cars. As an example, the penetration of ADAS dependent on classical vision sensors in the highly cost-driven automotive market is still limited by the cost of the electronics required to process images in real time.

The five sensor technologies to be developed further are (partners in brackets):

  • far infrared (FIR) add-on sensor with good thermal and spatial resolution at lower cost, to be combined with a high-resolution imager for enhanced night-vision applications (more reliable obstacle detection and classification) [Bosch]
  • low-cost multi-functional CMOS vision sensor, detecting critical environmental parameters (fog, rain etc) and providing information on the driving scenario (oncoming vehicles, vulnerable road users (VRUs) in night conditions) [CRF, STMicroelectronics, Fraunhofer Institut für Zuverlässigkeit und Mikrointegration (IZM)]
  • high spatial resolution and low-cost 3D range camera based on 3D packaging, optical CMOS and laser radar technology for short-range safety requirements (eg for pre-crash) [Interuniversity MicroElectronics Centre (IMEC)]
  • harmonic radar combined with passive nonlinear reflective and active tags enabling easy detection and identification of traffic obstacles and vulnerable road users, even in dark or adverse weather conditions [Valtion Teknillinen Tutkimuskeskus (VTT), Triad, Uppsala University]
  • high temporal resolution and low-cost bio-inspired silicon retina stereo sensor (SRS), addressing time-critical decision applications [Austrian Institute of Technology (AIT)].

The approach of the Austrian Institute of Technology (AIT) to reducing the costs of ADAS is to use an SRS. An SRS is a vision-based sensor that delivers information about illumination changes (‘events’) in its visual field. Figure 1 illustrates an example of the SRS output, which, rather than reflecting static images, records events whenever the observed object, the vehicle or both are moving. Derived from the human vision system, the bio-inspired silicon retina sensor is a new type of imager: it detects intensity changes in the observed scene, with each pixel delivering its address and event data separately and independently. This type of sensor is intended to overcome several limitations of classical vision systems: its high temporal resolution allows quick reactions to fast motion in the visual field, on-sensor pre-processing significantly reduces both memory requirements and processing power, and its wide dynamic range helps in the difficult lighting situations encountered in real-world traffic.
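
To make the event-based output more concrete, the sketch below shows, in Python, how address-event data from such a sensor could be represented and collected over a short time window. The record layout, field names and the accumulate_events helper are illustrative assumptions, not the actual ADOSE sensor interface.

```python
from dataclasses import dataclass
from typing import List

import numpy as np

# Hypothetical address-event record: each pixel independently reports
# its own address and the sign of the detected intensity change,
# together with a timestamp. Field names are illustrative only.
@dataclass
class AddressEvent:
    x: int          # pixel column address
    y: int          # pixel row address
    polarity: int   # +1 = intensity increase, -1 = intensity decrease
    timestamp: int  # sensor time in microseconds

def accumulate_events(events: List[AddressEvent],
                      width: int, height: int,
                      t_start: int, t_window: int) -> np.ndarray:
    """Collect the events of a short time window into a 2D map.

    Static parts of the scene generate no events, so the map stays
    zero there; only moving edges (caused by object motion or by the
    vehicle's own motion) leave a trace.
    """
    frame = np.zeros((height, width), dtype=np.int16)
    for ev in events:
        if t_start <= ev.timestamp < t_start + t_window:
            frame[ev.y, ev.x] += ev.polarity
    return frame
```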

The SRS is specifically tailored to serve as a pre-crash warning and preparation sensor for side impacts. Pre-crash applications must react reliably and in real time to prepare the vehicle (eg activating the seat-belt pretensioner or preparing a side airbag) for the imminent impact, which, in the case of a side impact, cannot be avoided by any reasonable reaction of the impacted vehicle. The pre-crash sensor must therefore measure the distance of objects approaching the vehicle. Two silicon retinas have consequently been coupled into a stereo vision unit, allowing depth information to be extracted from moving objects in the observed scene. Before depth information becomes available, the corresponding SRS data in the left and right views must be matched; this so-called ‘stereo-matching’ step is an essential part of every stereo vision system. A new kind of stereo vision algorithm has been developed within the ADOSE project especially for the silicon retina sensors, along with advanced silicon retina sensors of higher resolution than previously available.
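
The actual stereo-matching algorithm developed in ADOSE is not reproduced here; the following sketch merely illustrates the basic principle under simplifying assumptions (rectified sensors, so corresponding events lie on the same image row, and matching by temporal coincidence and equal polarity). It reuses the hypothetical AddressEvent records from the sketch above, and depth follows from disparity via the usual triangulation relation depth = f * B / disparity.

```python
def match_events_and_estimate_depth(left_events: List[AddressEvent],
                                    right_events: List[AddressEvent],
                                    focal_length_px: float,
                                    baseline_m: float,
                                    max_dt_us: int = 100,
                                    max_disparity: int = 64):
    """Naive event-based stereo matching for rectified silicon retinas.

    For each left event, search the same row of the right sensor for
    an event of equal polarity that occurred within max_dt_us; the
    column offset is the disparity, and depth = f * B / disparity.
    """
    depths = []
    for le in left_events:
        best = None  # (time difference, disparity) of best candidate
        for re in right_events:
            if (re.y == le.y and re.polarity == le.polarity
                    and abs(re.timestamp - le.timestamp) <= max_dt_us):
                disparity = le.x - re.x
                if 0 < disparity <= max_disparity:
                    dt = abs(re.timestamp - le.timestamp)
                    if best is None or dt < best[0]:
                        best = (dt, disparity)
        if best is not None:
            depths.append((le.x, le.y,
                           focal_length_px * baseline_m / best[1]))
    return depths  # list of (x, y, depth in metres) tuples
```

A real-time implementation would of course exploit the sparse, time-ordered nature of the event stream rather than the brute-force search used in this sketch.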

Links:
http://www.adose-eu.org
http://www.arcs.ac.at
http://www.smart-systems.at

Please contact:
Wilfried Kubinger, Jürgen Kogler, Christoph Sulzbachner, Erwin Schoitsch
Austrian Research Centers / AARIT, Austria
E-mail: {wilfried.kubinger, juergen.kogler, christoph.sulzbachner, erwin.schoitsch}@arcs.ac.at
