Self Driving Cars Learn from Our Eyes

8 September 2024 at 20:00

[Michelle Hampson] reports in IEEE Spectrum that Chinese researchers may improve self-driving cars by mimicking how the human eye works. Some autonomous cars use two cameras with polarizing filters to pick up details about what the car sees. However, because a polarizing filter blocks part of the incoming light, it can degrade the car’s vision in low-light conditions.

Humans, however, have excellent vision in low-light conditions. Retinex theory (based on the Land Effect discovered by [Edwin Land]) attributes this to the fact that our eyes sense both the reflectance of surfaces and the illumination falling on them. The new approach processes polarized light from the car’s cameras in the same way.
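In signal terms, Retinex models each pixel as the product of two factors, I(x, y) = R(x, y) × L(x, y), where R is the surface reflectance and L is the illumination. Recovering the stable R from the measured I is what keeps a scene looking consistent to us as the lighting changes.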

The images pass through two algorithms: one compensates for brightness levels, while the other processes the reflective properties of the incoming light. The researchers mounted cameras on real cars and drove them through genuinely dim environments to test everything out.
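The paper’s algorithms are trained models, so we can’t reproduce the exact math here, but a classical single-scale Retinex pipeline gives a feel for the same two-stage idea. Treat this as a sketch only: the gamma step standing in for brightness compensation and the Gaussian illumination estimate are our assumptions, not the researchers’ method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def compensate_brightness(img, gamma=0.6):
    """Stand-in for the brightness-compensation stage:
    simple gamma correction to lift dark regions."""
    return np.clip(img, 0.0, 1.0) ** gamma

def retinex_reflectance(img, sigma=30, eps=1e-6):
    """Stand-in for the reflectance stage: single-scale Retinex.

    Retinex treats an image as reflectance * illumination, so we
    estimate the illumination with a wide Gaussian blur and take
    the reflectance as the log-domain residual:
        log R = log I - log(blur(I))
    """
    img = np.clip(img, eps, 1.0)
    illumination = gaussian_filter(img, sigma=sigma)
    log_r = np.log(img) - np.log(illumination + eps)
    log_r -= log_r.min()          # rescale to [0, 1] for display
    return log_r / (log_r.max() + eps)

# img: an HxW float array in [0, 1], e.g. one polarization
# channel from a camera, enhanced in two stages:
# enhanced = retinex_reflectance(compensate_brightness(img))
```

The wide blur works because illumination usually varies slowly across a scene while reflectance changes sharply at object edges, which is exactly the separation the researchers’ learned version exploits.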

The result? In those tests, the approach improved driving accuracy by roughly 10%. However, the algorithms require extensive training on data sets that are difficult to obtain, which is one challenge to adoption.

Self-driving cars certainly need improving. Oddly enough, navigation can be done with polarizing-filter cameras and a clear view of the sky. Or you can look under the road.
