How can self-driving cars overcome vision problems?

Self-driving cars use special tools to see better when things are blurry or dark.

Imagine you’re playing hide and seek at night, and it’s really hard to see where your friends are hiding. But then someone turns on a flashlight, and suddenly everything is clearer! That's kind of how self-driving cars work. They have sensors, like tiny eyes, that can detect things even in the dark or when it's raining.

How they use more than one tool

Self-driving cars don’t just rely on one way to see. Some cars use cameras, like your phone’s camera, to take pictures of the road. But if it's foggy, those pictures get blurry, just like a smudged drawing.

So they also use radar, which works like a bat uses sound to find insects. Even when you can’t see something clearly, radar can tell the car how far away things are, kind of like feeling around in the dark with your hands.
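The "feeling around in the dark" trick is really just timing an echo: the radar sends out a radio ping, waits for it to bounce back, and turns the waiting time into a distance. Here is a toy sketch of that idea (not a real car's code, just the math):

```python
# Toy sketch: radar turns echo time into distance, like a bat's sonar.
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # radio waves travel at light speed

def distance_from_echo(round_trip_seconds: float) -> float:
    # The ping travels out AND back, so we divide the trip by two.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# An echo that comes back after 200 nanoseconds means the object
# is about 30 meters away.
print(round(distance_from_echo(200e-9), 1))  # 30.0
```

Because radio waves pass through fog and rain much better than visible light does, this timing trick keeps working even when a camera sees only a gray blur.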

By using both cameras and radar together, self-driving cars get a better picture of the world, just like having two eyes instead of one!


Examples

  1. A self-driving car can’t see the road clearly in the rain, so it uses radar to keep track of the cars and objects ahead.
  2. When a car gets stuck in fog, it leans on its radar, whose radio waves can sense distances right through the fog.
  3. A car’s sensors work together like a team: one sees the road, another feels the surroundings.
