9 Oct, 2019 16:17

UK govt gave green light to passport photo checker IGNORING fact it would fail people with dark skin - report

The UK government deployed its passport photo checking system despite knowing the technology had problems identifying people with very dark and very light skin, a Freedom of Information (FOI) request has revealed.

Documents released in response to an FOI request from MedConfidential, a campaign group for confidentiality and consent, show that the UK Home Office went ahead with the facial recognition system despite its own research throwing up major issues with it.

“User research was carried out with a wide range of ethnic groups and did identify that people with very light or very dark skin found it difficult to provide an acceptable passport photograph,” the released documents state.

The UK government decided that, despite the failings, the “overall performance was judged sufficient” and rolled out the automated photo checker regardless, New Scientist reports.

Since deployment, some users have reported a number of issues with it. In September, Joshua Bada, a black sports coach, complained that the system mistook his lips for an open mouth.

Cat Hallam, a black technology officer at Keele University, endured a similar experience in April, claiming that the technology wrongly suggested her eyes were closed and her mouth was open.

“What is very disheartening about all of this is [that] they were aware of it,” she said.

Face detection software is usually tested on thousands of images, and bias can take hold when the training data is too small or not diverse enough to represent the population the system will be used on.
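For illustration, one standard way to surface this kind of bias is to break a checker's false-rejection rate down by demographic group rather than reporting a single aggregate accuracy figure. The following is a minimal, hypothetical Python sketch; the group labels and results are invented for illustration and bear no relation to the Home Office's actual data or system.

# Hypothetical sketch: per-group false-rejection rates for a photo checker.
# All data below is invented for illustration only.
from collections import defaultdict

# (skin_tone_group, checker_accepted) pairs for photos that SHOULD pass
results = [
    ("very_light", True), ("very_light", False),
    ("medium", True), ("medium", True), ("medium", True),
    ("very_dark", True), ("very_dark", False), ("very_dark", False),
]

rejected, total = defaultdict(int), defaultdict(int)
for group, accepted in results:
    total[group] += 1
    if not accepted:
        rejected[group] += 1

for group, n in total.items():
    print(f"{group}: false-rejection rate = {rejected[group] / n:.0%}")

A real evaluation would run the same breakdown over thousands of labelled test photos per group; a single aggregate score can look “sufficient” overall while concealing a much higher failure rate for one group, which is precisely the pattern the FOI documents describe.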

Facial recognition technology has a history of failing to recognize individuals with certain skin tones. Google famously had to apologize in 2015 when its Photos app tagged two black people as gorillas.
