What will fully automated cars look like?

This is how an automated car works

What does a car need to be able to drive automatically? And which systems are already in use today? We show the current state of the art and explain the challenges that automation brings with it.

Video cameras on the automated car point forwards, backwards and to the sides. They provide real-time images of the road, traffic signs and other road users. They help determine the distance to objects and perceive pedestrians and cyclists.

Radar sensors continuously measure the distance between the car and other road users and objects in the vicinity. The GPS system receives signals from satellites and thus determines the car's position in its surroundings. Acceleration sensors additionally detect which lane the car is currently in, whether it is traveling in the desired direction and whether it is skidding.

Via a cellular or wireless internet connection, the automated car (as a few models already do today) can exchange information with other vehicles and data sources (C2X communication). This enables early warnings of unexpected obstacles.
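The idea behind such a C2X obstacle warning can be sketched in a few lines. The message format and the relevance check below are hypothetical illustrations, not any real C2X protocol:

```python
from dataclasses import dataclass
import math

# Hypothetical C2X obstacle warning: a vehicle ahead broadcasts an
# obstacle's position, and a receiving car checks whether the warning
# is relevant to its own position.
@dataclass
class ObstacleWarning:
    lat: float          # obstacle latitude in degrees
    lon: float          # obstacle longitude in degrees
    obstacle_type: str  # e.g. "stationary_vehicle", "debris"

def is_relevant(warning: ObstacleWarning, own_lat: float, own_lon: float,
                radius_km: float = 2.0) -> bool:
    """Treat a warning as relevant if it lies within radius_km of the car
    (equirectangular approximation, fine for short distances)."""
    km_per_deg = 111.32
    dx = (warning.lon - own_lon) * km_per_deg * math.cos(math.radians(own_lat))
    dy = (warning.lat - own_lat) * km_per_deg
    return math.hypot(dx, dy) <= radius_km

w = ObstacleWarning(lat=48.1375, lon=11.5755, obstacle_type="stationary_vehicle")
print(is_relevant(w, own_lat=48.14, own_lon=11.58))   # nearby -> True
print(is_relevant(w, own_lat=48.50, own_lon=11.58))   # ~40 km away -> False
```

Real C2X messages carry much more (timestamps, confidence, sender identity), but the principle is the same: receive, decode, and decide whether the warning concerns your route.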

The software on board then evaluates all the data and adapts the driving style accordingly. This enables the car to brake, accelerate or steer on its own. The software also knows the traffic rules and takes them into account.
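A deliberately simplified sketch of this evaluation step, under assumed rules of thumb rather than any manufacturer's actual logic, might look like this:

```python
# Toy decision logic: keep a safe gap to the vehicle ahead, respect the
# speed limit, otherwise hold the current speed. The "half the speedometer
# reading in metres" gap is a common rule of thumb, not a real controller.
def decide(own_speed_kmh: float, gap_m: float, speed_limit_kmh: float) -> str:
    safe_gap_m = own_speed_kmh / 2
    if gap_m < safe_gap_m:
        return "brake"       # too close to the vehicle ahead
    if own_speed_kmh < speed_limit_kmh and gap_m > 2 * safe_gap_m:
        return "accelerate"  # room ahead and still below the limit
    return "hold"

print(decide(own_speed_kmh=100, gap_m=30, speed_limit_kmh=120))   # brake
print(decide(own_speed_kmh=80, gap_m=200, speed_limit_kmh=120))   # accelerate
```

Production systems fuse many more inputs and plan full trajectories, but the structure is the same: sensor data in, a driving command out.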

How the computer sees traffic

In order to find their way around in traffic, automated cars categorize the environment into different objects: What is the lane, where are the traffic signs, and where are people and other cars? This subdivision matters because, for example, a car can be expected to move quite differently from a person.
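One way to picture why this classification matters is to pair each object class with different motion assumptions. The classes and numbers below are illustrative placeholders, not values from a real system:

```python
# Each detected object class gets its own motion assumptions, which a
# prediction module would then use. Values are illustrative only.
MOTION_MODEL = {
    "car":          {"max_speed_ms": 60.0, "can_sidestep": False},
    "cyclist":      {"max_speed_ms": 12.0, "can_sidestep": True},
    "pedestrian":   {"max_speed_ms": 3.0,  "can_sidestep": True},
    "traffic_sign": {"max_speed_ms": 0.0,  "can_sidestep": False},
}

def expected_behaviour(object_class: str) -> dict:
    # Unknown objects get the most cautious assumptions.
    return MOTION_MODEL.get(object_class,
                            {"max_speed_ms": 60.0, "can_sidestep": True})

print(expected_behaviour("pedestrian"))
print(expected_behaviour("unknown"))
```

The point is the lookup itself: once an object is labeled, the car knows which kinds of movement to expect from it.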

If the car recognizes a pedestrian, including their direction and speed, it calculates the likely further movement based on probabilities. While a pedestrian can step out of the danger zone within half a second, a car cannot. The automated car therefore has to react differently to the various road users.
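The contrast above can be made concrete with a toy reachability calculation. The speeds are illustrative assumptions:

```python
# In half a second a pedestrian can reach any point in a small circle
# (one step in any direction), while a car can only continue roughly
# along its lane. Numbers below are illustrative, not measured values.
def pedestrian_reach_m(dt_s: float, step_speed_ms: float = 2.0) -> float:
    """Radius of the circle a pedestrian can reach within dt_s seconds."""
    return step_speed_ms * dt_s

def car_travel_m(dt_s: float, speed_ms: float) -> float:
    """Distance a car covers along its lane in dt_s seconds (no sidestepping)."""
    return speed_ms * dt_s

print(pedestrian_reach_m(0.5))            # 1.0 m in any direction
print(car_travel_m(0.5, speed_ms=14.0))   # 7.0 m straight ahead (~50 km/h)
```

A prediction module would attach probabilities to positions inside these reachable regions rather than treating them as hard boundaries.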

Human eye versus computer image

The sensors, especially radar, are unbeatable when it comes to calculating distances and speeds. The computer can calculate exactly when a collision would occur if the parameters stayed the same, and could initiate braking with centimeter accuracy. A human can only estimate. In addition, unlike sensors, people are sometimes distracted. But when do the parameters ever stay the same in a complex situation?

People's strength lies in grasping complex situations and in nonverbal communication. Through eye contact, for example, a person can tell whether a pedestrian has noticed the situation or is stepping into the street inattentively. Cameras can capture the direction in which people are looking, but that is not real eye contact.

A brief wave or a flash of the headlights can also be enough to communicate or to defuse a difficult situation. The computer does not understand this. For it, there are only traffic rules and the associated measured values.
