Databases of faces taken from unsuspecting users are being compiled by companies, with the images being shared worldwide for use with facial recognition technology. Pictures are collected from publicly available profile data on sites such as OKCupid, as well as through less savory means such as cameras placed at university meeting spots and in public restaurants, then added to databases that app designers can use to test facial recognition software. There is as yet no precise count of these datasets, but privacy activists have noted that repositories built by Stanford University, Microsoft, and several other parties contain upwards of ten million images.
Designers around the world use these databases to train artificial intelligence for facial recognition. By analyzing the facial features in the pictures, a neural network can learn to identify faces in a dynamic image fed in from a camera, which an app can then use in its visual processing software. Private companies such as Facebook and Google maintain their own facial databases, but that data has so far been confined to each company's internal facial recognition development.
The Question of Ethics
Facial recognition technology has come to the fore in recent news, with admissions that personnel at the US Government's Immigration and Customs Enforcement (ICE) used facial recognition technology to detect undocumented migrants. Additionally, the Federal Bureau of Investigation (FBI) admitted that it used facial recognition software to compare photos in drivers' licenses and visa applications against known criminal records. No oversight exists over these datasets, and the free use of this data in criminal profiling by authorities raises red flags for activists.
The method by which the data is acquired, combined with the lack of consent from the people who populate these databases, leaves open the question of whether the practice violates individual privacy. Google, Facebook, and Microsoft all declined to comment on the issue.