
Is Digital Trolley a Problem?

University of Ljubljana


Moral dilemmas are situations in which individuals must choose between two or more actions, where each option has a moral implication that conflicts with the other. These dilemmas test an individual’s values and ethics, as they are forced to weigh the consequences of their choices and determine what is right and wrong. Examples of moral dilemmas include deciding between telling the truth and keeping a promise, or choosing between helping someone in need and following the rules. Moral dilemmas can arise in personal, professional, and societal contexts and can have lasting impacts on individuals and communities. They can arise in digital environments as well.

The digital environment has given rise to a number of ethical dilemmas, as technology continues to advance and permeate all aspects of our lives. These ethical challenges stem from the intersection of rapidly developing technology and the conflicting values and norms that exist in society. Important dilemmas that arise in digital environments include privacy vs. security, freedom of speech vs. hate speech, autonomy vs. targeted advertising, transparency vs. proprietary data ownership, and more. Some dilemmas are old ones in a new, digital context, while others are completely new.


In addition to introducing important concepts from this field, the workshop contains two activities, which form its central part. In the first activity, students learn about the role of personal values in moral decision-making: what values are, what a hierarchy of values is, and how values and their hierarchy influence moral decision-making. The second activity is dedicated to case studies: students encounter practical examples of moral dilemmas, propose a solution to each dilemma, and provide arguments for their decision. At the end, there is a discussion about the students’ own experiences with moral dilemmas in the digital environment.

The expected duration of the workshop is 90 minutes. If necessary, individual activities can be skipped to shorten the workshop to 45 minutes.

Materials that should be made available to students: blank sheets of paper (A4), cards with values written on them (one set of cards per intended group), larger sheets of paper (A3), markers, and printed examples of moral dilemmas for the case studies.

Learning outcomes that will be attained through the workshop:

  • students understand the concepts of ethics, morality, moral dilemmas, the digital environment, and the role of values in moral decision-making;
  • students can identify conflicting values in moral dilemmas and assess the hierarchy of values;
  • students are aware of different perspectives on solving moral dilemmas and can empathise with conflicting views;
  • students gain experience in arguing their position, which can help them in future discussions of moral dilemmas;
  • students are familiar with some important moral dilemmas that often arise in the digital environment and are better able to identify new dilemmas in the future.




1. Introduction

The moderator stimulates the students at the beginning with the question: “Have you ever found yourself in a moral dilemma while using the Internet?” The moderator allows the students to list a few examples.

The moderator then opens a short discussion with questions to stimulate students’ thinking and help them to participate actively in the workshop. Possible questions include:

What is the first thing you think of when you hear the term “moral dilemma”? Why is it important to talk about it?

Have you ever been faced with a decision that you felt would have moral implications? How did you react?

Can you think of any historical events that involved moral dilemmas?

Do you feel that technology and social media have an impact on your ability to solve moral dilemmas?

2. Definitions of important concepts and examples of moral dilemmas

The moderator explains to the students the basic concepts relevant to this lesson: ethics, morality, moral dilemmas, and the digital environment. (See Annex 1 for help)

The moderator presents an example of a moral dilemma – The Trolley Problem. (See Annex 2 for help)

The moderator then asks the students if they know any other examples of moral dilemmas. If the students do not offer examples themselves, the moderator can give some other examples before continuing with the lesson. (See Annex 1 for help)

3. The role of personal values ​​in decision-making

The moderator explains that moral dilemmas are often the result of a conflict between two or more values that are important to us.

The moderator can illustrate this with a somewhat simplified version of the example given by the French philosopher Jean-Paul Sartre (See Annex 3): During the Second World War, a young French man finds himself in a moral dilemma: should he join the army and defend his country, or stay at home and take care of his ill mother, who relies on him? The young man finds it difficult to make a decision – and this is a clear sign that it really is a dilemma – because the value of country (and, for example, respect for authority, freedom, …) conflicts with the value of family. He feels an important duty to both, because both are truly valuable to him.

The moderator divides the students into smaller groups (4–5 students per group) and distributes to each group a set of cards with words that indicate possible values (truth, respect, justice, sincerity, friendship, family, …) – for an example of a list of values, see Annex 4.

In the first task, each group sorts the cards into stacks so that the values they think belong together end up in the same stack. They can write down on a sheet which groups of values they have identified. You can also hold a discussion about what connects the values that students recognise as related. (You can skip this exercise if time is short.)

For the second task, each group must choose from all the cards the values they think are the most important (the moderator can set a limit, for example, “Choose the five most important values”).

In the third task, each group must rank the values they recognise as the most important in a hierarchy from most important to least important.

Finally, conclude this section with a brief discussion of how a clear idea of one’s own values can make it easier to navigate moral dilemmas, because decisions can often be made more easily in favour of a value that is higher on the list.

4. Central part of the workshop: CASE STUDIES

The moderator divides the students into groups again, or uses already created groups from the previous task. Each of the groups receives one example of a moral dilemma from the digital environment. (See Annex 5 for suggestions)

Each group must consider different perspectives and arguments for each of the two (or more) options presented as potential solutions to the moral dilemma. It can help to first identify which values are in conflict in the dilemma. In the end, the group must decide on one of the proposed solutions.

At the end of the exercise, each group writes down their findings on A3 paper and briefly presents the dilemma they dealt with, the views they considered and the reasons for their final decision.

The other groups are then given the opportunity to comment on the presenting group’s dilemma and its chosen solution.

5. Discussion

In the final part of the workshop, the moderator talks with the students about additional relevant questions related to the lesson. (See the next chapter “Discussion check”)


  • How do moral dilemmas in the real world differ from those in the digital world?
  • Have you ever known what you “must do,” but simply did not “feel” like doing it? 
  • When faced with a dilemma, do you listen to that “little voice” and follow your moral compass?
  • What are some examples of ethical dilemmas in which you personally find it particularly difficult to choose what is the right thing to do?
  • How do cultural and social norms influence what appears to us as a moral dilemma? Do you have any examples of moral dilemmas that are really problematic in the West, but in other countries might not be a dilemma at all?
  • Do you feel that we can (or even should) ever break the law in the name of morality?
  • Is there any way we could avoid moral dilemmas altogether? If so, in what ways?
  • Do you think that teaching ethics in high schools would help to make it easier for students to deal with moral dilemmas? Can ethical decision-making be taught?
  • Can moral dilemmas be solved in a way that the solution is good for all involved, or does the solution to a moral dilemma inevitably involve making compromises?


Morality Play: You are presented with 19 different moral scenarios in which you have to make a judgement about what the morally right thing to do is. At the end, you are presented with an analysis of your responses.

Moral Machine: A platform for gathering a human perspective on moral decisions made by machine intelligence, such as self-driving cars.


ANNEX 1: Key concepts

Ethics and morality

Although many use the terms ethics and morality interchangeably, they can be distinguished. Several distinctions are possible, but one of the most frequently cited is the following:

Ethics refers to the branch of philosophy concerned with the study of moral principles and values that determine appropriate conduct in various situations. It is concerned with questions about what is right or wrong, and what is fair and just. Ethics helps individuals and organizations make decisions about how to act in ways that are considered morally right, and can inform the development of laws, policies, and codes of conduct.

Morality, on the other hand, refers to an individual’s personal beliefs and principles concerning right and wrong behavior, and the goodness or badness of certain actions, habits, and character traits. Morality is often shaped by cultural, religious, and personal beliefs, and can influence an individual’s ethical decisions and behavior.

Moral Dilemma

In philosophy, a moral dilemma is a situation that presents two or more options, none of which can be chosen without violating some ethical principle. The “right” choice is thus not obvious. A moral dilemma often involves a conflict between moral obligations, and forces individuals to consider their values and beliefs, as well as perhaps the motivations for or consequences of their actions. This type of dilemma raises important philosophical questions about the nature of ethics and morality, and can challenge an individual’s beliefs and understanding of right and wrong.

Digital Environment

A digital environment refers to the virtual space created by technology, where individuals and organizations interact, communicate, and carry out various activities. It encompasses all the digital devices, systems, and networks that are used to store, process, and transmit information. Examples of digital environments include the internet, social media platforms, online gaming communities, virtual reality systems, and cloud computing networks. The digital environment has revolutionized the way people live, work, and interact with each other, and has led to new opportunities and challenges in areas such as privacy, security, and ethics.

ANNEX 2: The Trolley Problem


The Trolley Problem is a classic ethical thought experiment used to explore the nature of moral decision making. It is often presented as follows:

A trolley is heading towards a group of five people who will be killed if nothing is done. The only way to save them is to divert the trolley onto a different track where there is only one person.

The dilemma is whether it is morally permissible to intentionally steer the trolley onto the alternate track, killing the one person in order to save the five.

This thought experiment raises important questions about the value of human life, the ethics of sacrifice, and the morality of indirect responsibility. It has been debated by philosophers and ethicists for decades, and has been modified and expanded into several variations to further explore different aspects of moral decision making. The Trolley Problem continues to be a relevant and widely discussed ethical dilemma, and has applications in fields such as artificial intelligence, autonomous vehicles, and military ethics.

ANNEX 3: Sartre

Sartre uses the anecdote of a former student’s moral dilemma during World War II to illustrate both the limits of making decisions based on a defined moral code and the erroneousness of blaming “passions” for people’s actions. The French student’s brother was killed in 1940 by the Germans, but his father nonetheless later abandoned the family to collaborate with the Germans. The student had to choose between staying in France with his mother, who “found her only comfort in him,” and leaving to fight with the Free French against the German occupation. After realizing he was caught between moral principles—family and nation, or the obligation to care for his mother and the obligation to avenge his brother’s death—he came to Sartre for advice. The philosopher told his student that there was no correct or incorrect decision. Neither moral codes nor the strength of his affections for one or the other party could determine what to do; rather, the student had to “invent” his own solution to the problem.


ANNEX 4: An Example of a Possible Set of Values

1. pleasure

2. freedom

3. truth

4. family

5. knowledge

6. power

7. health

8. wisdom

9. honour

10. respect

11. love

12. safety

13. friendship

14. reputation

15. kindness

16. beauty

17. creativity

18. courage

19. responsibility

20. honesty

ANNEX 5: Case studies for group work

Case 1: Text generating AI systems

For one of your school subjects, you have to submit an essay on a topic of your choice by a certain deadline. You try very hard, write the essay in detail and have it ready on time. One hour before the deadline, you want to send the essay to your professor’s email address as agreed. You notice that the document with your essay is no longer on your computer – you must have accidentally deleted it. One hour before the deadline, you certainly don’t have time to write a new essay. You remember your classmate telling you that there are now computer programmes based on artificial intelligence that can write an essay on a topic of your choice. You find yourself in a moral dilemma: should you use a programme that will write the essay for you? After all, you have already done the work and put in the effort, and you really need to submit the essay on time, otherwise you cannot finish the course. But you worry that it is not moral to submit a piece of work that is not yours. Should you use the software or not?

Case 2: Fake identity

You are the parent of a 16-year-old student whose grades at school have recently dropped. It seems to you that the way she spends her leisure time may have something to do with this deterioration. Since you know that your daughter posts on social networks very often, you would like to keep track of what she does in her free time, because you think you will be able to spot bad influences. Of course, since you don’t want her to know that you regularly follow her posts, you consider creating a profile with fake data (name, gender, age, address). You find yourself in a moral dilemma: do you create an online account with a fake identity? You are not directly harming anyone, and you ultimately want the best for your daughter. But knowing that you are actively contributing to making fake online identities seem perfectly acceptable makes you wonder whether it is appropriate – you certainly wouldn’t want many of your own friends on social networks to be “fake” either.

Case 3: Hate speech problem

You are the owner of one of the major social networks (e.g. Facebook or Twitter). You do not want people to be completely free to insult each other on your platform and to use hate speech against vulnerable groups of people. In particular, of course, you do not want to allow any threats or incitement to morally objectionable acts. At the same time, you do not want to restrict the speech of users of your platform. There are several reasons for this: you want to ensure freedom of speech (everyone is free to express their opinion), you do not want to impose your values by restricting others, and, last but not least, you do not want people to leave your platform in frustration. You know that there are algorithmic systems available that will filter and delete hate speech on your platform by themselves. After all, it would be impossible for a human being to review all the posts of all the people in time. But you also know that algorithms are sometimes biased and make mistakes, and will probably also delete well-intentioned posts by people whom they wrongly identify as threats, thereby preventing well-intentioned people from communicating freely. You are faced with a moral dilemma: should you use automated algorithmic systems to control hate speech?

Case 4: Privacy of data

You are the owner of an online shop. You would like all customers of your online shop to give you their basic information (name, gender, age, address) before using your services – this will help you to tailor the offer that a potential customer sees on your site to them. This is good for customers, because it will make it easier for them to receive offers that are relevant to them and in most cases will reduce their search time. But it will also be good for you, because customer-specific offers drastically increase the chances of the customer actually making a purchase. However, you know that you are infringing on the privacy of customers who may wish to remain anonymous. You may also be discouraging them from making a purchase at all. You are faced with a moral dilemma: do you request basic personal data from customers who wish to make a purchase on your site?

Case 5: Cheating for an item

You regularly play a multiplayer online video game with many players from all over the world. On a special occasion, a teammate you do not know offers you a very rare in-game item for free. You check who the player offering the item is on the forum dedicated to the game’s community of players. You see that many people point out that this player cheats in the game. You can be almost certain that the player obtained the item he wants to give you by cheating. You are faced with a moral dilemma: do you accept the item anyway? You don’t cause trouble for anyone directly, and it wasn’t you who cheated. On the other hand, you are accepting a gift that was not obtained fairly, and you are actively contributing to this behaviour going unpunished and to the cheater continuing to cheat.

Case 6: Boldly expressing an opinion

You notice on one of the social networking platforms you use (e.g. Facebook or Twitter) that your friend has made a post that is not true and may even be offensive to some people. Because you know your friend well, you know that he will certainly not delete the post, even if you suggest it in a private conversation, because he sincerely believes in it and will defend it fiercely. You don’t want your friend to spread false information or, in particular, to potentially offend certain groups of people, even though that may not be his intention. You could write your opinion in a comment on his post; this would allow you to point out factual information on the subject, perhaps mitigate the offence that some may be taking, and participate in a frank discussion. Of course, in doing so, you are challenging your friend’s credibility and possibly jeopardising your friendship. You are faced with a moral dilemma: do you publicly point out your friend’s mistake on a social network and express your opinion?

Case 7: Autonomous vehicle

The ethical dilemmas surrounding digital technologies often extend into the real world – their impact reaches well beyond the digital environment. One such example is the self-driving cars that have recently been developed by many companies investing heavily in the use of artificial intelligence in transport. The car will have to make certain decisions on the road while driving itself – when to overtake the vehicle in front, when to drive more carefully because of road conditions, etc. But a self-driving car will also have to decide what to do in more complex situations. Suppose a child jumps onto the road in front of the car after her ball. The car may be programmed to swerve off the road to save the person on the road, endangering the life of the car’s occupant. Or it can be programmed not to swerve off the road, protecting the passenger in the car. You are the developer of the decision-making system for a self-driving car. You find yourself in a moral dilemma: how do you program the car? To ensure the safety of the passenger at the possible expense of the person on the road, or to protect the person on the road, possibly risking the welfare of the passenger?


2021 – 1 – SK01 – KA220-SCH-000034395

This website reflects the views only of the PLATO’s EU project consortium, and the Commission cannot be held responsible for any use which may be made of the information contained therein.


