Over the last decade, drones and UAVs have risen quickly in popularity and have come to define a new age of comprehensive aerial technology. These vehicles are controlled wirelessly via a remote or an established connection, and the controller can be thousands of kilometers away. This technology has been doing remarkable work around the world: saving human lives, advancing further technology, and improving everyday life. There are many types of drones, each coming in different shapes and sizes and with its own share of benefits. The three main categories of drone use are recreational, commercial, and military. Recreational drones are used for leisure purposes, such as hobbies and photography. Commercial drones are used for business purposes, such as surveillance, policing, exploration, and delivery. Military drones are used for reconnaissance, attack and defense, military operations, and warfare. In today's society, recreational and commercial drones vastly outnumber military ones. However, though the benefits seem endless, numerous ethical and moral dilemmas come with this aerial technology. Thus, this essay will focus on the ethical issues that arise with the use of military drones and UAVs, in particular the ethical issues surrounding autonomous drones: civilian casualties, the right of authority, our moral compass, and invasion of privacy.
"The US military hopes that drones will be capable of changing their own missions, altering course without a human command, and buzzing through the skies in coordinated groups within the next 25 years" (US Defense Department, 2014). Weighing the benefits of drones against their consequences, the benefits usually win out; with autonomy, however, the balance shifts. Autonomy in technology is when a machine can function by itself without the help of a human, and when it comes to drones there are four levels of autonomy. The first is tele-operation, which is piloting the drone via a remote control. The second level is programmed autonomy, where the drone is programmed to perform a range of tasks and behave in a certain way (we programmed drones at this level in class). The third level is supervised autonomy, where the drone operates on its own, but if an error occurs, a human can step in and correct it. The last and most dangerous level is "complete autonomy", which is essentially what the US Defense Department describes: the ability to handle a problem entirely without human intervention. Completely autonomous drones raise a number of ethical issues.
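The four levels can be sketched as a simple ordering, where what distinguishes complete autonomy is that no human remains in the loop. This is only an illustrative sketch; the class and function names are my own and not part of any real drone system.

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """The four levels of drone autonomy (illustrative names)."""
    TELE_OPERATION = 1  # a human pilots the drone via remote control
    PROGRAMMED = 2      # the drone follows pre-programmed behaviour
    SUPERVISED = 3      # the drone acts alone, but a human can intervene
    COMPLETE = 4        # no human involvement at any point

def human_in_the_loop(level: AutonomyLevel) -> bool:
    """A human retains some control at every level except complete autonomy."""
    return level < AutonomyLevel.COMPLETE
```

The ordering makes the ethical break point explicit: the first three levels all return `True` here, and only complete autonomy removes the human entirely.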
The first issue is civilian casualties. Autonomous drones have complete control of their systems and programs, and considering that they are weaponized, the chances of civilian casualties are high. According to The Bureau of Investigative Journalism, the US has reportedly carried out a minimum of 430 drone strikes in Pakistan since the beginning of 2004. In these strikes, between 2,515 and 4,026 people were killed, of whom 424 to 969 were civilians, including 172 to 207 children. That equates to roughly 17%–24% of those killed being civilians, or on average about one to two civilian deaths per strike. This is an astounding number of civilian casualties considering that these drones come equipped with state-of-the-art remote sensing and recognition technology. In addition, the latest US drone strike took place less than a week ago, on 19 August 2018, when multiple drones were deployed in Afghanistan to eliminate high-profile militants. Though the strike succeeded, it is impossible to state that no civilian casualties were caused. Thus, autonomous drones eliminate the risk of soldiers being harmed or killed, and they may well reduce overall deaths in warfare, but the ethical consequences of deploying them remain poorly examined.
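The percentages above can be checked directly from the reported figures. This is a quick sketch; pairing the low counts together and the high counts together is an assumption about how the quoted range was derived.

```python
# Figures reported by The Bureau of Investigative Journalism
# (US drone strikes in Pakistan, 2004 onwards).
strikes = 430                        # minimum reported strikes
killed_low, killed_high = 2515, 4026 # total reported deaths
civ_low, civ_high = 424, 969         # reported civilian deaths

# Civilian share of all deaths: low over low, high over high,
# giving the quoted 17%-24% range.
share_low = civ_low / killed_low      # ~0.169
share_high = civ_high / killed_high   # ~0.241

# Averaged over all strikes: roughly one to two civilians per strike.
per_strike_low = civ_low / strikes    # ~0.99
per_strike_high = civ_high / strikes  # ~2.25
```

Note that dividing civilian deaths by strikes gives about one, not seven, civilian deaths per strike on the low end.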
The second issue is the right of authority. Exactly what goes into the construction of an autonomous drone is unknown, but it is certain that complex AI and sensory computer programs are entrusted with the task of identifying, targeting, and killing specified targets. A machine capable of mass destruction is fully self-functioning and can kill, and we are handing that task over to it. From the perspective of a country's national defense and military, the main goal is to protect citizens and ensure that all threats are eliminated, or at least kept tabs on, and using drones may be the most efficient way to do so. However, giving drones the authority to kill raises many questions about the state of human ethics. As long as our enemies are eliminated, does it really matter whether humans or machines killed them? Are we dehumanizing not only our enemies but ourselves as well? Why should machines have more authority over life-and-death decisions than their own creators? Is safety more important than rights? All of these vital questions feed into the larger conversation of why drones should be allowed to kill and make decisions in place of humans, and they need to be answered.
But as we give authority over to the drones, it is important to pause and reflect on the consequences of this decision, because we are affected too. If drones do end up having the ability to kill, our own morals come into play. Have we removed the responsibility of killing from our own hands? Does the act of killing mean less because machines are suddenly the ones doing it? This is the third issue: we would have to recalibrate our own moral compass. The decisions we make regarding drones could well reshape our shared framework of morals and ethics. Consider this: when a drone kills, it kills without remorse. When a person commits an act, killing or otherwise, there is at least the possibility of remorse. So if the decision to kill our enemies belongs to machines, have we intentionally surrendered our emotional stake? Do we give up the possibility of feeling remorse for the act of killing because we are no longer the ones committing it? The difference between a person and a machine is that a person's actions carry moral weight for the actor, unlike those of a drone, which is not self-conscious and has no emotions.
The final issue is privacy. With any drone, if footage of a person is taken without their consent, it is an invasion of privacy, and the person in charge of the drone can be held responsible. However, because autonomous drones have no controller and act "completely autonomously", this accountability technically does not apply to them. Currently, few laws are in place for this kind of situation, so any information gathered could be used for unintended purposes. Without human intervention, autonomous drones can effectively choose whomever they deem suitable to gather information on. With their complex technology, an immense amount of information could be gained; they can act as a hidden surveillance camera. In a military context, this would be a great opportunity to gain insight into an enemy: you could follow them, track them down, and collect valuable intelligence. Yet if this practice were put before society at large, it would likely be rejected and deemed inappropriate, even though it breaks no law. Here we have a legal gray area in which basic human rights can be violated, because drones can be used to gather information about people without their permission.
In conclusion, it is recommended that more precautions be taken with drones and that new laws be implemented to minimize their negative consequences. I would also suggest that new programs be built specifically for military drones.