When the new iPhone was announced, its facial recognition software was held out as groundbreaking. Keeping your phone secure was now as easy as glancing at your unique face. Apple assured customers there was only a one-in-a-million chance that someone else could trick a user's phone into granting them access through this security feature.
It seems those one-in-a-million odds shorten quickly for users who are not white. Recent viral videos expose a design flaw in the security system that lets members of the same ethnic group unlock one another's phones because of their similar facial features. This deficiency is causing many to question why more care was not taken to secure the phone for all users, and whether it signals a deeper problem with bias at Apple. Is the Apple iPhone X racist?
The most recent complaint comes from an Apple customer in Nanjing, China. After buying the new iPhone, the woman took it to work to show her friends. The feature that was supposed to keep her phone secure quickly failed: a co-worker was able to use the facial recognition software to unlock the phone without any real effort.
When the phone was returned to the local Apple store, employees exchanged it for a new one. They were sure it was a failure of that one handset; perhaps the scanner was not working right. The woman quickly tried out the second phone and replicated the issue. It was not a defective phone but a design flaw.
The two women the software deemed one and the same did look similar, but they were by no means twins. They had distinct facial differences that should have been detected but were not. Both were of Chinese descent, and that is where the issue started.
It seems the software built to tell one user from the next was not prepared to handle characteristics common within an ethnic group. Instead of picking up the differences between the two Chinese women, differences obvious to anyone seeing them in person, the software read their faces as the same. The implication, in effect: “…they all look alike.”
One of the reasons for more ethnic diversity in tech. Devices can't be biased, but if the creators don't account for their own biases it shows up in things like Asian women being indistinguishable to iPhones and black hands not triggering sensors in soap machines. https://t.co/b0A2IgrsSS
— Simply TC (Not From Concentrate) (@BienSur_JeTaime) December 16, 2017
The customer in China received a full refund after demonstrating to the retail store where she bought the phone that the replacement handset reproduced the same problem as the one they had assumed was broken. While no complaints from white customers have been made public, the biased software also has trouble with ethnic groups beyond the Chinese.
A short time after the phone was released, a video went viral sharing one mother’s frustration at her 10-year-old being able to unlock her phone. Sana Sherwan was excited to get her hands on the new device and liked the idea that things like online shopping could be locked away from everyone but her. That excitement quickly disappeared during her first attempt at setting up the phone.
With the whole family excited about the high-tech gadget, they began passing the phone around. When the couple’s 10-year-old got hold of it, it took him mere seconds to “trick” the phone. Sherwan’s husband Attaullah Malik explained:
“We were sitting down in our bedroom and were just done setting up the Face IDs, our 10-year-old son walked in anxious to get his hands on the new iPhone X.
Right away my wife declared that he was not going to access her phone. Acting exactly as a kid would do when asked to not do something, he picked up her phone and with just a glance got right in.”
While there is a family resemblance between mother and son, the boy should never have passed for his mother. Their faces differ in far too many ways for the software to have missed.
In the case of the mother and child, the access was not consistent, though. Sometimes the software blocked the boy; other times he got right in. Even intermittent access is not something most parents would be happy to see, since it gives children free rein over anything on the phone their parents may not want them to reach.