The reliability and quality of the support provided by assistance and safety systems are enhanced by sensor fusion – the exchange and interpretation of data from several independent sources, such as radar systems, video cameras and digital road maps.
The following scenario demonstrates the value of sensor fusion: the familiar DISTRONIC proximity control system uses radar to look ahead and select the appropriate speed for the traffic conditions, continuously adjusting it to the situation and always maintaining an appropriate distance from the vehicle in front.
On winding, multi-lane routes, however, the radar might lose sight of the vehicle ahead. Here the abilities of DISTRONIC could be complemented by a camera-equipped Lane Assistant that keeps track of developments even in confusing situations. Because weather conditions such as a low sun or heavy rain could in turn compromise the effectiveness of the camera, further information about the course of the road is required. This is obtained by combining data from digital road maps with the vehicle position established by GPS, in order to determine the curvature of the road.
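One simple way to picture this redundancy is a confidence-weighted fusion of whichever sources are currently usable: radar, camera and map/GPS each contribute an estimate of the road's curvature, and a source that has dropped out (radar losing the car ahead on a bend, a camera blinded by rain) is simply ignored. The sketch below assumes this weighted-average scheme purely for illustration; the data types and weights are not from any actual Mercedes-Benz system.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Estimate:
    """A road-curvature estimate (1/m) from one source,
    with a confidence in [0, 1]."""
    curvature: float
    confidence: float


def fuse_curvature(radar: Optional[Estimate],
                   camera: Optional[Estimate],
                   map_gps: Optional[Estimate]) -> Optional[Estimate]:
    """Confidence-weighted average over the sources that are available.

    A source that has dropped out is passed as None and contributes
    nothing; the weather-independent map/GPS estimate keeps the fusion
    alive even when both physical sensors are degraded.
    """
    available = [e for e in (radar, camera, map_gps)
                 if e is not None and e.confidence > 0.0]
    if not available:
        return None
    total = sum(e.confidence for e in available)
    fused = sum(e.curvature * e.confidence for e in available) / total
    return Estimate(curvature=fused, confidence=min(1.0, total))


# Camera blinded by heavy rain: fusion falls back to radar + map data.
result = fuse_curvature(Estimate(0.010, 0.6), None, Estimate(0.012, 0.8))
```

Real systems use far more sophisticated estimators (typically Kalman-filter variants), but the weighted average captures the principle the text describes: no single source is trusted alone, and the strengths of one cover the blind spots of another.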
As "sensors" which are completely unaffected by the weather, digital road maps could also help to render "visible" an object such as a roadside speed-limit sign which is concealed by a truck. Sensor fusion could also allow the two channels to complement each other effectively at traffic lights: the image-processing function can register and communicate the state of the lights much more easily if it already knows their position thanks to the digitised map.
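The benefit of knowing a traffic light's position in advance can be sketched geometrically: the map and GPS give the light's location relative to the vehicle, a camera model projects that point into the image, and the classifier then only needs to scan a small search window instead of the whole frame. The pinhole projection below is a minimal sketch under assumed intrinsics; the function and parameter names are hypothetical.

```python
from typing import Optional, Tuple


def roi_from_map(light_xyz: Tuple[float, float, float],
                 fx: float, fy: float, cx: float, cy: float,
                 margin_px: float = 40.0) -> Optional[Tuple[float, float, float, float]]:
    """Project a map-known traffic-light position into the image and pad
    it into a search window (left, top, right, bottom) in pixels.

    light_xyz is the light's position in the camera frame (x right,
    y down, z forward, in metres), as derived from map data plus the
    GPS-established vehicle position. fx, fy, cx, cy are standard
    pinhole intrinsics (assumed known from calibration).
    """
    x, y, z = light_xyz
    if z <= 0.0:
        # Behind the camera: nothing to look for in this frame.
        return None
    u = fx * x / z + cx
    v = fy * y / z + cy
    return (u - margin_px, v - margin_px, u + margin_px, v + margin_px)


# A light 30 m ahead and 3 m above the optical axis maps to a small
# window near the top of the image.
roi = roi_from_map((0.0, -3.0, 30.0), fx=1000.0, fy=1000.0, cx=640.0, cy=360.0)
```

Restricting the classifier to this window is what makes the state of the lights "much easier" to register: the hard detection problem shrinks to a verification problem at a known location.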