Facial recognition is improving by leaps and bounds, and some examples of how it is being used are disturbing. In Russia, the website FindFace matches submitted photos to profiles on VK, that country's Facebook knock-off; trolls are using it to identify and harass women who appear in adult videos. China uses cameras with facial recognition to tag jaywalkers, and in Dubai, police wear Google Glass to identify people. In the U.S., government facial recognition systems can already match the faces of half of all American adults.
Psychology Today ticks off more of the ways that facial recognition is being developed, many times with sobering implications. “Now researchers are developing techniques that not only identify people by their faces but also infer what’s in their minds,” using facial expressions to identify emotions and “facial structure [to] hint at our genetic makeup.”
At Carnegie Mellon University, privacy researcher Alessandro Acquisti matched student photos to Facebook profiles to harvest names, interests and other information, then fed these data points into another algorithm to predict Social Security numbers. His team correctly identified the first five digits for about 25 percent of the participants. Next, Acquisti's team built a demo iPhone app that, when the phone's camera was pointed at a stranger, displayed that person's name, SSN, and date and state of birth.
“Informed consent” may be an illusion, says Psychology Today, and “as the machines’ learning advances, step by step, we must make or accept tradeoffs, explicitly or implicitly.” The article also notes a paper that demonstrated the use of machine learning to “guess” sexual orientation from dating-site headshots, far more accurately than human judges. “The algorithm’s AUC — a statistical measure that accounts for both false positives and false negatives, where 0.5 is chance and 1.0 is perfect — was 0.81 for men and 0.71 for women,” versus humans who “scored only 0.61 and 0.51, respectively.”
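For readers unfamiliar with the metric, AUC (area under the ROC curve) has a simple interpretation: it is the probability that a classifier scores a randomly chosen positive example higher than a randomly chosen negative one. A minimal sketch, using made-up toy labels and scores (not data from the study):

```python
def auc(labels, scores):
    """AUC as a pairwise ranking probability: the chance that a random
    positive example outscores a random negative one (ties count half).
    0.5 corresponds to chance, 1.0 to a perfect ranking."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical example: 1 = positive class, scores are classifier outputs.
labels = [1, 1, 0, 0]
scores = [0.9, 0.4, 0.6, 0.2]
print(auc(labels, scores))  # 0.75 — better than chance, short of perfect
```

On this toy data the classifier ranks three of the four positive/negative pairs correctly, giving an AUC of 0.75; a score of 0.81, as reported for the algorithm above, means roughly four out of five such pairs are ranked correctly.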
Chinese researchers recently published a paper about a study that used machine learning (with a standard convolutional neural net) to judge “criminality” based on a headshot. The study’s definition of “criminality” was the existence of a prior criminal conviction. “It’s easy to see race and class biases becoming embedded and amplified,” concluded Psychology Today.