In September 2019, four researchers asked publisher Wiley to immediately retract a study that had trained algorithms to distinguish faces of Uyghur people, a predominantly Muslim minority ethnic group in China. As Richard Van Noorden writes, the study published by Wiley was not alone. Journals from publishers including Springer Nature, Elsevier and the Institute of Electrical and Electronics Engineers (IEEE) had also published peer-reviewed papers describing the use of facial recognition to identify Uyghurs and members of other Chinese minority groups.

"For facial-recognition algorithms to work well, they must be trained and tested on large data sets of images, ideally captured many times under different lighting conditions and at different angles," Van Noorden writes. "In the 1990s and 2000s, scientists generally got volunteers to pose for these photos—but most now collect facial images without asking permission." In some cases, millions of images are collected without consent.

Van Noorden's story included a survey of more than 480 researchers who have published papers in facial recognition, artificial intelligence and computer science. Two-thirds of the respondents said that studies which recognize or predict personal characteristics (such as gender, sexual identity, age or ethnicity) from appearance should be done only with the informed consent of those whose faces were used, or after discussion with representatives of groups that might be affected.

Judge Cathy Edwards of the BBC called the story "an impactful contribution to the debate about facial-recognition technology. The Uyghur study highlights how urgently we need to consider the ethics of such research." While many researchers and journalists have investigated problems with facial recognition technology, Van Noorden said, he wanted to "draw a few threads together and to find out what those who produce this work feel about the dubious ethical foundations of their field, as well as their opinions on how facial recognition technology should be used and regulated. It turns out that some are troubled, but others still don't see their work as problematic."