Given today’s advancements in technology, people around the world have created many software systems that use artificial intelligence to perform, or even replace, many human roles. One notable system that has been in the news recently is Amazon’s Rekognition software, which is said to conduct facial recognition with high accuracy. In light of this, the American Civil Liberties Union (ACLU) of Northern California decided to test it. The ACLU began by building a database of 25,000 publicly accessible arrest photos. When all 535 members of Congress were compared against it at the software’s default setting, lawmakers such as Luis Gutierrez of Illinois, John Lewis of Georgia, and Norma Torres of California were identified as criminals. Of the 28 lawmakers that Rekognition misidentified, 40% were people of color. This led the ACLU to call for a government suspension of the use of Rekognition, a position that gained support among lawmakers.

One of the misidentified members of Congress was US representative Jimmy Gomez. When he found out, however, he was not surprised. In fact, he told his staff, “I wouldn’t be surprised if it was mostly people of color or minorities,” and he was spot on. Gomez said the result was concerning because the biases that exist today will ultimately be digitized and turned against people who already face countless hurdles. Although facial recognition errors for women and communities of color are a common problem, experts said this incident shows the need for extensive discussion of the ethics of such technology, as well as the responsibility to use it fairly. Gomez and Lewis proceeded to send Amazon’s CEO, Jeff Bezos, a letter requesting an emergency meeting to address the flaws that produce incorrect results.
Other senators have also requested details of any in-house accuracy or bias evaluations Amazon has run on Rekognition, which law enforcement or intelligence agencies are using it, and whether Amazon audits their use of the software. Amazon, for its part, countered that the confidence threshold the ACLU used was too low, arguing that it should be set at 95% or higher for facial recognition. A computer scientist from the University of Utah said that since Amazon chose to launch a potentially sensitive technology, it should have ensured that users understand its proper use. Microsoft’s president, Brad Smith, likewise proposed that, given facial recognition technology’s vast societal consequences and potential for misuse, its use should be regulated. Additionally, Amazon’s own shareholders and civil rights groups have asked Amazon to halt sales of the technology to governments.
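The dispute over thresholds is easier to see with a small sketch. The function and similarity scores below are invented for illustration, not the actual Rekognition API; they merely mimic the idea that a recognition system returns a similarity score per candidate match and reports only those above a caller-chosen threshold:

```python
# Illustrative sketch: how a confidence threshold filters candidate
# face matches. The names and similarity scores below are invented;
# a real system such as Rekognition returns a similarity score per
# candidate and lets the caller set a minimum reporting threshold.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the similarity threshold."""
    return [name for name, similarity in candidates if similarity >= threshold]

candidates = [
    ("person_a", 81.5),  # hypothetical similarity scores (0-100)
    ("person_b", 88.0),
    ("person_c", 96.2),
]

# At an 80% default, all three candidates would be reported as matches.
print(filter_matches(candidates, 80))  # ['person_a', 'person_b', 'person_c']

# At the 95% level Amazon recommends, only the strongest match remains.
print(filter_matches(candidates, 95))  # ['person_c']
```

The point of contention, then, is that the same underlying scores yield very different "match" lists depending on where the reporting threshold is set.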
Amazon’s Rekognition software can identify objects, people, text, scenes, and activities, and can even flag inappropriate content (Amazon, n.d.). It claims to be highly accurate at facial analysis and recognition on supplied images and videos, and is built on the same proven technology developed by Amazon’s scientists (Amazon, n.d.). The software is available to the public at low cost: the ACLU’s entire test cost just USD 12.33 (Snow, 2018). Rekognition is said to be biased against women and people of color, and this is not the first such incident.
In another study, Joy Buolamwini of the MIT Media Lab built a database of 1,270 politicians’ faces, chosen based on their countries’ rankings for gender equality (Goode, 2018). She tested three facial recognition systems, from Microsoft, IBM, and Megvii of China, and all three showed gender-classification errors that varied with skin color (Goode, 2018). Error rates were higher for darker-skinned subjects, and for women (Goode, 2018). The MIT Media Lab also argues that comprehensive artificial intelligence testing and subgroup accuracy reporting should be mandatory, since businesses often do not disclose how well their artificial intelligence performs across subgroups; some even admit they do not check (MIT, n.d.). In addition, these assessments should be intersectional, measuring accuracy across phenotypes as well (MIT, n.d.). This bias creeping into today’s software, known as algorithmic bias (MIT, n.d.), clearly shows how social and cultural factors can affect the ethics of new technology.
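The kind of intersectional subgroup reporting the MIT Media Lab calls for can be illustrated with a short sketch. The records below are invented; a real audit evaluates a classifier on a labeled benchmark dataset and reports accuracy separately for each subgroup, rather than one aggregate number:

```python
# Illustrative sketch of intersectional subgroup accuracy reporting.
# The records are invented; each is (gender, skin_type, correct),
# where `correct` says whether the classifier labeled that face right.
from collections import defaultdict

def subgroup_accuracy(records):
    """Return accuracy per (gender, skin_type) subgroup."""
    totals = defaultdict(lambda: [0, 0])  # subgroup -> [num correct, num total]
    for gender, skin_type, correct in records:
        totals[(gender, skin_type)][0] += int(correct)
        totals[(gender, skin_type)][1] += 1
    return {group: c / n for group, (c, n) in totals.items()}

records = [
    ("female", "darker", False), ("female", "darker", True),
    ("female", "lighter", True), ("female", "lighter", True),
    ("male", "darker", True),    ("male", "darker", True),
    ("male", "lighter", True),   ("male", "lighter", True),
]

for group, acc in sorted(subgroup_accuracy(records).items()):
    print(group, f"{acc:.0%}")
```

In this toy data the aggregate accuracy is 87.5%, which hides the fact that darker-skinned women are recognized only 50% of the time; that gap is exactly what aggregate-only reporting conceals.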
The ethical dilemma in the news story of Amazon’s Rekognition software is the fairness of facial recognition technology. Given that tests of this software, and of other similar systems, showed prejudice towards women and communities of color, the technology’s credibility, reliability, and impartiality are all in question. Moreover, with such technology readily available to the public, it also raises issues of security and privacy.
The pros of facial recognition technology include improved security: should an intruder enter a secured premises, the system takes note and immediately informs whoever is in charge (Dao, 2018). It also allows for fewer mistakes than having security guards manually check a person’s face against some form of identification photo, and it may conceivably lower the cost of engaging security staff and companies. Secondly, technology improves constantly, and the same goes for facial recognition. Even though problems of prejudice remain, as mentioned previously, there is no doubt that this technology is improving and slowly becoming more reliable. As it continues to develop, its accuracy will increase as well, allowing users to be more confident in this kind of system (Dao, 2018). Lastly, security guards used to be required to oversee the system and make sure it ran correctly. With improved technology, however, the process can be completely automated while maintaining high accuracy (Dao, 2018), which potentially saves a great deal of time and cuts costs.
The cons of facial recognition technology include the data storage problem (Dao, 2018). Data is everywhere and easily collected in today’s world, which means enormous storage is required to hold it all. If a facial recognition system must store all this data, the processing time required will lower its efficiency. To counter this, companies use large numbers of computers to process the data and shorten the time needed (Dao, 2018), but storage will remain a stumbling block for facial recognition systems until technology overcomes it. Another con is the camera angle, which plays a crucial role in processing a face (Dao, 2018). Numerous angles are required to accurately identify a person’s face, and even minor interference such as facial hair or hair accessories can affect the outcome. The only way to avoid such obstacles is to keep the database constantly revised and up to date (Dao, 2018). At the institutional level, in educational, professional, or even government institutions, such technology may be beneficial, allowing for greater security and convenience at lower cost. However, given the system’s defects regarding bias, it may not be ethical for government institutions to rely fully on facial recognition technology for law enforcement, as they must account for security and privacy.
On the cultural level, the effect is more detrimental: when such bias seeps into technology, it further promotes discrimination against women and people of color. People in these groups are likely to face even more hostility when digitized systems are also against them. Lastly, at the level of the global economy, facial recognition technology may improve the world’s finances. With advancing research in this area, people, companies, and even government agencies around the world can be expected to adopt facial recognition technology, which not only keeps a country up to date but also provides extra security for its people. It is important to keep in mind, though, that for all the benefits of this technology, there are many other aspects to consider when employing it. The central ethical issue of facial recognition technology and its fairness is one of social justice: egalitarian philosophers hold that the ethical choice is whichever brings the most social justice, or in other words, whichever is fairest for everyone (Pavlik & McIntosh, 2017).
The case of Amazon’s Rekognition software opened my eyes to how the biases we are culturally rooted in can be carried forward into technology and used against people without power. It is therefore a dilemma whether people should use such technology despite the clear ethical defect of biased facial recognition. To me, this is certainly not social justice, given the discrimination going on in the world. People have opinions and can voice their thoughts; technology does not, and it is people who create it. The fact that even this human-made technology carries prejudice against minorities accentuates the need for social justice. There is no one-time solution, as discrimination against minorities has existed for years. What people can do, however, is keep such bias out of technology and ensure that facial recognition systems are used fairly.