The list of concerns about self-driving cars just got longer. In addition to worrying about how safe they are, how they’d handle tricky moral trade-offs on the road, and how they might make traffic worse, we also need to worry about how they could harm people of color.

If you’re a person with dark skin, you may be more likely than your white friends to get hit by a self-driving car, according to a new study out of the Georgia Institute of Technology. That’s because automated vehicles may be better at detecting pedestrians with lighter skin tones.

The authors of the study started out with a simple question: How accurately do state-of-the-art object-detection models, like those used by self-driving cars, detect people from different demographic groups? To find out, they looked at a large dataset of images that contain pedestrians. They divided up the people using the Fitzpatrick scale, a system for classifying human skin tones from light to dark. The researchers then analyzed how often the models correctly detected the presence of people in the light-skinned group versus how often they got it right with people in the dark-skinned group.

The result? Detection was five percentage points less accurate, on average, for the dark-skinned group. That disparity persisted even when researchers controlled for variables like the time of day in images or the occasionally obstructed view of pedestrians.

The report, “Predictive Inequity in Object Detection,” should be taken with a grain of salt. It didn’t test any object-detection models actually being used by self-driving cars, nor did it leverage any training datasets actually being used by autonomous vehicle manufacturers. Instead, it tested several models used by academic researchers, trained on publicly available datasets. “The main takeaway from our work is that vision systems that share common structures to the ones we tested should be looked at more closely,” Jamie Morgenstern, one of the authors of the study, told me.
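To make the comparison concrete: the kind of per-group accuracy analysis described above can be sketched in a few lines of Python. This is a toy illustration, not the study's actual pipeline — the annotations, group labels, and numbers here are invented for demonstration, and the real work involved full object-detection models evaluated on large image datasets.

```python
# Toy sketch of a per-group detection-rate comparison (illustrative only;
# the data and labels below are made up, not drawn from the study).

def detection_rate(flags):
    """Fraction of annotated pedestrians that the detector found."""
    return sum(flags) / len(flags)

# Hypothetical ground-truth annotations: each pedestrian carries a
# Fitzpatrick-based group label ("light" for types I-III, "dark" for
# types IV-VI) and a flag for whether the detector found them.
annotations = [
    {"group": "light", "detected": True},
    {"group": "light", "detected": True},
    {"group": "light", "detected": True},
    {"group": "light", "detected": False},
    {"group": "dark", "detected": True},
    {"group": "dark", "detected": True},
    {"group": "dark", "detected": False},
    {"group": "dark", "detected": False},
]

# Bucket the detection flags by skin-tone group.
by_group = {}
for a in annotations:
    by_group.setdefault(a["group"], []).append(a["detected"])

# Compare detection rates between groups, in percentage points.
rates = {g: detection_rate(flags) for g, flags in by_group.items()}
gap_pct_points = (rates["light"] - rates["dark"]) * 100
print(rates, gap_pct_points)
```

With these invented numbers the gap comes out to 25 percentage points; the study reported an average gap of about five.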