From the class: Sample Replies to a Classmate’s Discussion Post
(1) Excellent post, Jordan! I also agree with Dr. Saxe’s moral arguments, as it seems strange and paternalistic that the State can determine what people can and cannot do with their own bodies, especially regarding something as intimate and important as one’s death. I think that as long as there are robust informed consent processes prior to undergoing physician-assisted suicide, this is an ethically justifiable practice that should be legalized. Do you think that all U.S. states will eventually legally allow physician-assisted suicide?
(2) Respectfully, Jordan, I must disagree with your post. While perhaps physician-assisted suicide itself is not bad, I believe there are serious concerns about physician-assisted suicide becoming a “slippery slope” into more nefarious and immoral practices, such as active euthanasia and/or involuntary active euthanasia of the mentally ill. Are you not concerned that legalizing physician-assisted suicide could lead to more problematic practices later on?
I just need 2 different responses to these 2 different classmates.
Response 2) Module 9 Discussion Post 2: Micro Drones Killer Arms Robots – Autonomous Artificial Intelligence – Warning!!
What is the speaker’s point of view about the topic? Is the speaker biased?
Stuart Russell, a professor of computer science/AI, shows a sobering video meant to strike fear into those who would support computers/AI making decisions about killing humans. The speaker’s viewpoint is biased because his company supports arguments against AI/data-driven killing.
What ethical issues and ethical reasoning are argued?
The ethical issues argued in this video are targeted killing and drones. Proponents of targeted killing argue from a realist or even a just war theory view. These killings are justified when they are in self-defense or defense of others (Mackinnon and Fiala 2018, 541). Those in agreement would also support the use of drones and the increased accuracy they provide in discriminating targets, which helps prevent civilians from being injured. However, civilian non-combatants have frequently been killed by these drones. We also get into a sticky situation with the preventative targeting of people likely to commit terrorism who have not yet committed a crime, and with defining what constitutes a battlefield and a combatant.
Do you think that the use of drone technology has moral and ethical implications? Why or why not?
Yes. Rights of privacy and freedom are under threat here. I understand the argument that computers/AI make fewer mistakes because human emotion is not involved; however, they also don’t have empathy. If drones are used as demonstrated in the video, then they would be categorized as weapons of mass destruction and would fall under such regulations.
Do you agree with the speaker’s point of view?
I take a moderate just war stance on the speaker’s viewpoint. When drones are used for self-defense or the defense of others, their use is acceptable (Mackinnon and Fiala 2018, 531). I agree with the data on the improved accuracy and reduction of civilian losses when drones are used. But when drones are used as in the video, where they violate human rights and freedom of speech and cause mass loss of life, it is unacceptable. There are too many moral and ethical considerations involved to allow a computer/AI to make the decision to take a life.
Response 1) I watched The Edge of Drone Warfare by James Rogers.
What is the speaker’s point of view about the topic? Is the speaker biased?
Mr. Rogers argues that the widespread availability of precisely controlled, long-range, consumer drone systems presents significant threats to public safety and national security. He is a defense strategy consultant and, among other things, advises the UK government on drone policy. Because he acts in a professional capacity to consult with governments and other organizations on the threat drones pose, he has an interest in persuading people that drones are a threat in order to justify his consultancy. That said, the cases he discusses are well known, and his perspective on them seems balanced.
What ethical issues and ethical reasoning are argued?
He argues that we ought to be cautious in embracing this new technology and invest in methods to mitigate its risks. It seems to be a wholly utilitarian/consequentialist argument: based on the clear evidence that these devices can be used to disrupt governments and kill innocents, and because such disruption and killing are bad, we ought to control the technology.
Do you think that the use of drone technology has moral and ethical implications? Why or why not?
Certainly. There are three domains of drone use with distinct ethical concerns: consumer drone use and the potential for unlawful or terroristic uses of that technology; law enforcement use, with its concerns around surveillance and a changed risk calculus that might result in excessive use of force by police; and the use of drones in warfare. Police use of drones ought to be heavily restricted, given the history of police abuse of force, especially against minority communities. Giving them the power to spy on and kill their fellow citizens without putting officers at risk, in a society where they already kill and steal without consequence, presents an unacceptable risk for no benefit. The use of drones in warfare is not without problems, but the benefit to soldiers’ safety and the precision they offer compared to traditional airstrikes far outweigh the cost of the relatively small amount of collateral damage that has been demonstrated.
Do you agree with the speaker’s point of view? Be specific and thorough. Express how and why you agree or disagree and discuss how ethics and values contribute to your opinion. Consider the theoretical concepts discussed in the course. Do not state your viewpoint; rather, provide relevant details to support your findings and/or position.
I agree with what Mr. Rogers says, although I feel his position is underdeveloped. He doesn’t appear to address the potential benefits of drones for business and hobby use, or to consider the degree to which the burden of controlling illegitimate use should fall on users through remote ID systems (FAA, 2020) versus the kinds of defensive technologies he describes in his talk. The threat he describes is real. The consequences that can come from weaponized drones being used unlawfully are something we have a duty to protect against, based on the idea that killing and breaking the law are wrong and that this technology empowers such acts when abused.