Facial Recognition Programs Racially Biased, MIT Study Finds


Curious, Buolamwini, who is black, began submitting photos of herself to commercial facial-recognition programs.

Microsoft's facial-recognition system failed to identify darker-skinned women in 21 percent of cases, while IBM's and Face++'s error rates were nearly 35 percent. As noted by Georgetown University's Center on Privacy & Technology, these gender and racial disparities could, in the context of airport facial scans, make women and minorities more likely to be targeted for more invasive processing such as manual fingerprinting.

However, the same software correctly identified lighter-skinned women in 93 percent of cases. IBM and Face++ both had light-skinned male error rates of less than one percent. Microsoft's error rate for darker-skinned women was 20.8 percent, while its rate for light-skinned men was effectively zero.


In the paper, Joy Buolamwini of the MIT Media Lab and Timnit Gebru of Microsoft Research discussed the results of a software evaluation carried out in April and May of last year. The systems simply did not work reliably for darker-skinned users. "For the past nine months, IBM has been working towards substantially increasing the accuracy of its new Watson Visual Recognition for facial analysis, which now uses different training data and different recognition capabilities than the service evaluated in this study", the company said, adding: "To deal with possible sources of bias, we have several ongoing projects to address dataset bias in facial analysis - including not only gender and skin type, but also bias related to age groups, different ethnicities, and factors such as pose, illumination, resolution, expression, and decoration".

In another research study, a widely used facial-recognition data set was determined to be over 75 percent men and over 80 percent white, according to the New York Times.

According to the researchers' paper, one "major U.S. technology company" used a data set that was more than 77 percent male and 83 percent white, naturally making it better at picking out lighter-skinned men than darker-skinned women. For darker-skinned women, the three systems' error rates were 20.8 percent, 34.5 percent, and 34.7 percent. The systems may not be intentionally biased, but when the data fed into the algorithms consist mostly of white men, and the people building the technology are mostly white men, such discrepancies are to be expected. Meanwhile, facial-recognition software was increasingly moving out of the lab and into the mainstream.


They found that across all three systems, the error rates for gender classification were consistently higher for females than for males, and for darker-skinned subjects than for lighter-skinned subjects.
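To make the comparison concrete, here is a minimal sketch of how classification error rates can be broken down by intersectional subgroup, the kind of analysis described above. This is not the study's actual code, and the records below are hypothetical examples, not data from the benchmark:

```python
# Sketch: per-subgroup error rates for a gender classifier.
# Records are hypothetical (skin_type, true_gender, predicted_gender) triples.
from collections import defaultdict

records = [
    ("lighter", "male",   "male"),
    ("lighter", "female", "female"),
    ("darker",  "male",   "male"),
    ("darker",  "female", "male"),    # misclassification
    ("darker",  "female", "female"),
]

totals = defaultdict(int)
errors = defaultdict(int)
for skin, gender, predicted in records:
    group = (skin, gender)
    totals[group] += 1
    if predicted != gender:
        errors[group] += 1

# Report the error rate for each (skin type, gender) subgroup.
for group in sorted(totals):
    rate = errors[group] / totals[group]
    print(f"{group[0]} {group[1]}: {rate:.1%} error")
```

Aggregating accuracy over the whole test set would hide exactly the disparities the researchers report; grouping by both skin type and gender is what exposes them.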

Buolamwini is joined on the paper by Timnit Gebru, who was a graduate student at Stanford when the work was done and is now a postdoc at Microsoft Research.

Buolamwini says that the benchmark data set, composed of 1,270 images of people's faces that are labeled by gender as well as skin type, is the first data set of its kind, created to test gender classifiers, that also takes skin tone into account.


Microsoft said that it had "already taken steps to improve the accuracy of our facial recognition technology" and that it was investing in research "to recognise, understand and remove bias". Such disparities trace back to the data sets provided to the systems and the conditions under which the algorithms were created.