CMSC 304

Social and Ethical Issues in Information Technology

Ethics Frameworks

Deontology

Ethics Frameworks: Quiz Review

3 Ethical Frameworks

  • Deontological: This framework is based on a set of rules such as company rules, government rules, or religious rules such as the Ten Commandments. If the action in question adheres to the rules, it is considered to be the right thing to do in that context.

  • Utilitarian: There are many varieties of this approach, but the basic idea is that the right thing to do is the thing that brings the most utility, or happiness, to the most people. While the action may help some and hurt others, if the benefits to some outweigh the costs to others, the action is “the right thing.”

  • Virtue Ethics examines the background of our moral decisions and actions. For example, it asks how a person who is generous will see a situation and its moral demands differently from someone who is greedy. The “right thing to do” is the thing that would be chosen by the virtuous person.

2 Rounds of Discussion + Report Out

  • Groups must be 3 people minimum, 4 maximum
  • Each person should take some individual notes during the discussion
  • Choose a scribe to write down the group's summary for each question
    • For each question in a round, choose a different scribe
  • The goal is NOT to gain consensus, but rather to appreciate and consider different viewpoints + examples
  • Feel free to reference the readings while discussing
    • Reference, not "read for the first time"
  • When the round ends, I'll ask the groups to volunteer a summary of their discussion
    • Anyone from the group can answer, doesn't have to just be the scribe
  • I'll then ask if anyone disagrees, or has a counterpoint, or if you had a different interpretation
  • Each group that participates will get some points
  • Please raise your hand if you need clarification or have a question, or need some inspiration! This is not a quiz or test
    • No ChatGPT or other AI during discussions

Week 4 Discussion: Deontology

This week we read 3 things:

Deontology Highlights

  • Deontology emphasizes the rightness or wrongness of an action by reference to certain action-guiding principles:

    • laws, rules, maxims, imperatives, or commands

  • In deontology, following these principles makes an action "ethical"

    • even when breaking them would produce consequences understood to be desirable or good!

    • i.e. no "ends justify the means"

  • Instead, your "means" must be based on duty, and there must be a choice

  • Deontology can seem deceptively simple, and is subject to several misconceptions:

    • It's black and white: either you follow the law (ethical) or you don't (unethical)
    • Only intention matters, not circumstances or consequences

    • Sometimes multiple duties conflict, so just pick one and stick to it blindly

Round 1

1. In the proposed third law, what do the authors mean by the term "situated autonomy"?

2. In the video “Artificial Intelligence Explosion,” the team wrestles with the idea of “solving ethics” by creating a set of “correct, unambiguous rules.” Do you think the authors of the article “Beyond Asimov: The Three Laws of Responsible Robotics” have done this? Why or why not?  

3. The alternative laws emphasize human responsibility in the design and deployment of robots, rather than robot-centered accountability. What are the potential challenges and benefits of placing this responsibility on humans rather than on the robots themselves?

4. In what ways could the alternative laws of responsible robotics presented in the article “Beyond Asimov: The Three Laws of Responsible Robotics” be applied to current or emerging technologies (e.g., autonomous vehicles, medical robots, etc?)? Can you think of any scenarios where these laws might be difficult to implement?

5. (bonus question, if you’re done early) In the video, Lucy says “following rules cannot produce understanding.” Do you agree with this? Does it matter whether artificial intelligence has "understanding"? What does "having understanding" even mean?




Deontology Highlights

  • Deontology can seem deceptively simple, and is subject to several misconceptions:

    • It's black and white: either you follow the law (ethical) or you don't (unethical)
      • Not all laws are equally important: e.g. jaywalking vs. murder
    • Only intention matters, not circumstances or consequences

      • The task of deontology is to understand how best to honor one's duties within those circumstances

    • Sometimes multiple duties conflict, so just pick one and stick to it blindly
      • This type of thinking is called moral relativism, and is generally frowned upon


moral relativist: 

Someone who believes that all moral judgments are based on individual viewpoints and that no one viewpoint ought to be privileged above any other—save that person’s own, because most moral relativists are critical of anyone who disagrees with their position on the matter (Midgley 1991).


When we say "law," the kind we follow from the government is only one kind of law. Some rules or duties can also be:

  • community based
  • religious
  • professional

Different communities may have different ideas about which parts of human life must be guided by these rules vs. which parts are free of moral obligation.

Main Guy of Deontology

Immanuel Kant (1700s Enlightenment)

  • Kant sez: human reason is the most important guide to making moral choices (as opposed to some deity)

  • Most of the time, our laws approximate some absolute moral law, or at least serve as a good reminder

  • In order for an action to have moral worth, it must be an action that you, the agent, recognize as right (not just blindly following the rules)

    • But just as importantly, it must be true for EVERYONE

  • A common way to test this is to ask: "If everyone did it, would it still be ethical?"

    • e.g. lying: if everyone lied whenever it was convenient, no one could trust anyone's word, so lying fails the test

  • Never treat others merely as a means, but always also as an end

Image from DALLE: "Here are the full-body images of Immanuel Kant as a robot, complete with his Prussian-inspired outfit and futuristic details. These should give you a better sense of his overall appearance." lol thanks DALLE

Deontology in Practice

  • What to do when several duties conflict?
    • For example, you have to choose between saving a stranger and saving your child
    • You have a duty to try and save both, so which to choose?
  • Answer by asking:
    • which is a more fundamental duty to your role in the world? 
      • some would argue that duty to one's child >> duty to a rando
      • what if you are a firefighter on duty, and your role is public servant?
    • which is the more relevant duty?
      • if instead you could either save your kid or stop a terrorist, which to choose?
      • is it your job as a civilian to stop terrorists? how likely is it that an amateur could stop them?

Some say deontology is more interested in "what is right" than in "what is good"

Deontology: Best and Worst

  • At its worst, deontological thinking can enable fanaticism, leading people to believe they are justified in punishing others for doing things they take to be wrong
  • At its best, deontology can afford people the moral courage to stand against the majority, even when there is no obvious reward for doing so
  • Common criticisms
    • not enough focus on consequences: I may have good intentions, but still f*ck it up
    • too heavy a focus on authority
    • overly righteous and impractical: "an action has no moral worth if it is not done out of duty"
      • what about joy? isn't it good to seek happiness in life?
      • does any emotion matter at all? Kant sez no

Deontology and Artificial Intelligence

  • In order for responsibility to exist there must be a choice, and the agent must have the capacity to choose differently.
  • Defining deontology becomes complicated when we start thinking about automated agents, which are programmed to do certain tasks and which learn new things according to programs that have been written by humans.
  • Are automated entities exercising responsibility in the deontological sense?
  • Do we want them to?

Burton, Emanuelle; Goldsmith, Judy; Mattei, Nicholas; Siler, Cory; Swiatek, Sara-Jo. Computing and Technology Ethics: Engaging through Science Fiction (p. 42). MIT Press.

Activity: Deontology

  • The virtue ethics framework asks the question “what would a virtuous person do” to determine what is right.

  • A virtuous person, according to Aristotle, would have good underlying character traits that would cause them to make good choices.

  • The main benefit of this framework is that it is very flexible. It can be applied to a wide variety of decisions.

  • What are some challenges of using the virtue ethics framework?

Activity: Deontology

  • What are some challenges of using the virtue ethics framework?
  • Instead, might it be easier to develop a set of rules to follow that will always produce the ethical action?
  • Using sets of rules can be intuitive for computer science and Artificial Intelligence, as they often use algorithms
  • An algorithm is defined as:
    • A set of rules and procedures that leads to a decision
    • Could be machine learning, or simply control flow logic (rule-based if-then-else)
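The rule-based (if-then-else) flavor of this definition can be sketched in a few lines. This is a minimal illustration, not a real ethical system: the duties checked and the `Action` fields are hypothetical, chosen only to show how a deontological-style algorithm ignores outcomes.

```python
# A minimal sketch of a rule-based (if-then-else) decision procedure,
# in the spirit of "a set of rules and procedures that leads to a decision."
# The duties and the Action fields are hypothetical, purely for illustration.

from dataclasses import dataclass

@dataclass
class Action:
    involves_lying: bool
    breaks_promise: bool
    harms_someone: bool

def permitted(action: Action) -> bool:
    """Deontological-style check: an action is permitted only if it
    violates none of the duties, regardless of its consequences."""
    if action.involves_lying:
        return False  # duty: do not lie
    if action.breaks_promise:
        return False  # duty: keep promises
    if action.harms_someone:
        return False  # duty: do not harm
    return True       # no duty violated -> permitted

# Note what the rules never consult: outcomes. A lie that would produce
# a wonderful result is still ruled out ("no ends justify the means").
```

Note that this sketch also exhibits the misconceptions from the earlier slides: it treats every duty as equally important and gives no way to resolve conflicts between them.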

Activity: Deontology

  • According to Oxford Languages, Artificial Intelligence is “the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.”

  • Any time you’ve written a computer program that uses “if statements” you’re writing a program that makes decisions – that employs AI.

    • Your program basically consisted of a set of rules (an algorithm) for making the decision

    • Perhaps the decisions your programs have made in the past have not seemed like ethical decisions.
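Even a mundane "if statement" can quietly encode a value judgment. The example below is hypothetical (the age threshold and function name are invented for illustration), but it shows how an ordinary control-flow rule is also a policy choice the programmer made.

```python
# An ordinary-looking if statement that quietly encodes a value judgment:
# who gets to see this content? The threshold here is a hypothetical rule.

MIN_AGE = 18  # the "rule" in this tiny algorithm

def can_view_restricted(user_age: int) -> bool:
    # Control-flow logic: a rule applied to an input produces a decision.
    if user_age >= MIN_AGE:
        return True
    return False
```

Choosing 18 rather than 16 or 21 is not a technical decision; it is an ethical and policy decision that the program then enforces automatically, every time it runs.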

A primer on the readings for next week

  • As you read, keep this ancient wisdom in your mind: "Go touch grass"
  • Also as you read these stories, consider that they could make great talking points in an interview, advertising that you are a well-rounded and thoughtful individual, beyond leetcode and grinding

Upcoming Schedule

  • Next week: Utilitarianism with our usual format
  • Week 6: Begin the writing!
    • You'll have readings for that Monday on some writing topics
    • We'll have a guest lecture + activity from the library on how to do research for papers, citing, and other notes on academic research
  • Week 7: Writing week #1!
    • First paper will be due
    • No readings so you can focus on your paper
    • We'll do an in-class peer review activity after you submit it
  • Week 8: All about data + ethics
    • Very light reading because you'll be revising your paper, which means:
      • I'll do a slightly longer lecture than usual on Monday
      • Read the article by Wednesday
  • Week 9: Back to our usual format

This will be posted on Blackboard and I'll send a Discord announcement with this info!