University of Ljubljana
INTRODUCTION TO THE TOPIC
This lesson focuses on three main concepts. It first tries to establish what a bias is. After giving a broad definition of it, students are led to explore concrete examples of the biases that they might hold themselves.
The second concept explained is an algorithm. Through a basic definition a link is established towards what algorithms might be doing in an online setting, especially in the field of social media. This leads to the final concept of information bubbles or echo chambers, as algorithms reinforcing bias in the perceptions of reality.
In general, bias can be defined as a systematic error or deviation from the truth that is introduced by subjective factors. This can lead to skewed or distorted perceptions, judgments, and decisions that are not based on objective evidence or facts.
Bias refers to a prejudice or inclination towards a particular perspective, ideology, or outcome that influences one’s judgment or decision-making processes. Bias can arise from a variety of sources, including personal experiences, cultural background, and cognitive biases that affect perception and interpretation.
Bias can impact individuals, organizations, and societies in many ways, leading to unequal treatment and unfair outcomes, as well as the spread of misinformation and the reinforcement of existing inequalities and power dynamics.
An algorithm is a step-by-step procedure for solving a problem or achieving a specific task, typically expressed in a computer programming language. It is a sequence of well-defined instructions designed to perform a specific task or solve a well-specified computational problem, with a finite running time, given specific inputs.
In the context of AI and machine learning, bias refers to a systematic error or discrepancy in algorithms or models that results in unequal treatment of different groups. Bias can be introduced into AI systems in various ways, such as through the training data used to develop the models, the algorithms and methods used to build the models, or the decision-making processes used by the models.
For example, a machine learning model trained on biased data may generate biased results, such as unfairly classifying certain groups or making incorrect predictions about individuals based on their race, gender, or other factors.
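As a minimal sketch of this idea (with entirely made-up data, not any real system): a naive "model" that simply learns the historical hiring rate per group will faithfully reproduce whatever skew those historical decisions contained.

```python
from collections import defaultdict

def train_majority_model(history):
    """Learn the past hiring rate per group from (group, hired) decisions."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for group, hired in history:
        counts[group][0] += int(hired)
        counts[group][1] += 1
    return {g: hired / total for g, (hired, total) in counts.items()}

# Hypothetical historical decisions that favoured group A
# for reasons unrelated to skill.
biased_history = [("A", True)] * 8 + [("A", False)] * 2 \
               + [("B", True)] * 3 + [("B", False)] * 7

model = train_majority_model(biased_history)
print(model)  # {'A': 0.8, 'B': 0.3} -- the "model" inherits the skew
```

The point of the sketch: nothing in the code is malicious, yet any future decision based on these learned rates will treat group B worse, simply because the training data did.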
In the context of social media, algorithms refer to the mathematical formulas and processes used by platforms like Facebook, Twitter, and Instagram to determine what content is shown to users. These algorithms use various factors, such as user behavior, past interactions, and machine learning models, to prioritize and personalize the content that is displayed in a user’s feed, search results, and advertisements. The goal is often to maximize user engagement and keep users on the platform for as long as possible.
In the context of social media, bias refers to the ways in which the algorithms used by these platforms can influence the information that users are exposed to and limit their exposure to diverse perspectives and ideas. This phenomenon is sometimes referred to as an “information bubble” or a “filter bubble”.
The algorithms used by social media platforms are designed to personalize the user experience and maximize engagement by showing users content that is most likely to interest them. This can lead to the formation of information bubbles, where users are primarily exposed to information and perspectives that align with their existing beliefs and biases, and are less likely to encounter opposing viewpoints or challenging ideas.
The impact of information bubbles on social media can be significant, as they can reinforce existing biases and limit the exposure of users to diverse perspectives and ideas. This can contribute to the spread of misinformation, the reinforcement of echo chambers, and the erosion of public trust in information and institutions.
It’s important for social media platforms and users to be aware of the potential for information bubbles and to take steps to promote diversity and expose users to a wider range of perspectives and ideas. This can include providing users with tools and options to customize their feed, promoting quality information and journalism, and encouraging users to engage with diverse perspectives and sources.
LESSON OVERVIEW
The lesson is planned in three separate parts – each of them offering opportunities for delving deeper into the subject matter. The first part deals with the topic of bias in general, the second explains algorithms, and the third joins the two concepts in the context of a digital environment.
Each segment includes an activity for the students to participate in. As the lesson is devised as a workshop, the general idea is to keep the students as active as possible. In the lesson breakdown, teachers are therefore encouraged to actively involve students not only in the foreseen activities but in the lesson implementation in general.
Students will perform judgements on insufficient data, try to act according to a simple algorithm, and reflect on “defensive tactics” to neutralise the effects of echo chambers in their lives. The minimal duration of the workshop is 45 minutes, with 15 minutes per segment; ideally, each segment would offer more opportunities to engage students in discussion over a 30-minute period, bringing the total duration of the workshop to 90 minutes. The workshop is best conducted in person but can also be carried out online (in a virtual classroom) with slight modifications.
Materials that should be made available to students:
- Activity 1 handouts (optional – they can do the activity orally or in their notebooks).
- Activity 2 materials (printed on several pieces of paper).
- Larger posters and markers for activity 3.
Learning outcomes that will be attained through the workshop:
- The student understands the concepts of bias, algorithm and echo chamber/information bubble.
- The student recognises how biases are formed and maintained in society.
- The student understands the effects and dangers of biases and the need to work on their mitigation and suppression.
- The student understands how algorithms in social media work in terms of amplifying pre-existing biases.
- The student recognises activities and behaviours that lead to algorithms reinforcing biases and is aware of how to lessen the impact of bias on their own social media.
- The student makes a personal commitment towards activities aimed at diminishing biases in their digital environment.
LESSON BREAKDOWN – WORKSHOP ACTIVITIES
- The workshop begins with a student activity, The Nile story, found in the appendix. The activity takes at least 15 minutes even if you rush through it. It would be ideal to dedicate at least 30 minutes to it, and if you go deep into discussion you can easily expand the material to 45 minutes with more student interaction.
- For the second part of the lesson, you can include a really fun and interesting activity that will also make you seem quite cool to the students. It will take about 5 minutes. As a transition towards algorithms, you can perform an easy magic trick. You can watch the tutorial online: https://www.youtube.com/watch?v=ogHjO4vRtJ0. The point is that it is a self-working magic trick, meaning it relies on a simple algorithm and no special skill (no sleight of hand).
- When you finish the trick you can explain that it was done using an algorithm. You can have the students try to guess how the trick was done; you can explain it to them and have them write out the algorithm for the trick; or you can leave the situation in a shroud of mystery and tell the students to trust you that it was in fact an algorithm, not magic. … Or you can claim to be a real magician – up to you. … In any case, if you decide to go into the explanation of the algorithm, you can come up with something like this for the final stage:
- If the card cut to is not a spade – have the spectators cut again.
- If the card cut to is a spade: 1. remember the number of the card; 2. turn the deck around; 3. count to the number that you remembered; 4. turn the card; 5. watch the spectators being amazed by your skill.
- Using the card trick activity or another introduction to algorithms, you can now try to explain a bit about algorithms to the students. You can use some of the information in the appendix. This would take between 5 and 15 minutes.
- Once you have explained algorithms, you can implement the Algorithms are hiring activity, where students can also try being the algorithm themselves. This should take 5 – 10 minutes.
- The Trying to be the algorithm activity is a great segue towards explaining echo chambers or information bubbles. This should be done in a more solemn and serious manner. There is some information provided for the teacher in the annexes, as well as stories of recent occurrences that they might use. The teacher should stress this part of the lesson as the new knowledge acquired. The teacher should also invite students to reflect together on how information bubbles are formed, to try to name some concrete bubbles, and to give examples of bubbles that they might have in their own lives. This activity should take 15 – 40 minutes.
- The final part of the lesson should be the creation of a pledge as a call-to-action summary of the entire workshop. This activity can take between 10 and 20 minutes.
DISCUSSION CHECK
These are possible follow-up or discussion questions that can be used to deepen and expand on the topics explored through the course of the workshop:
- How does bias in the media and social media platforms impact public opinion and decision making?
- What are the consequences of information echo chambers on society?
- How can individuals protect themselves from the influence of biased information and echo chambers on social media?
- To what extent is bias in the media and social media platforms responsible for the rise of political polarization?
- How do algorithms on social media platforms contribute to the formation of echo chambers and the spread of biased information?
- Should social media platforms be held responsible for mitigating the spread of biased information and echo chambers on their platforms?
- What role do individual users play in avoiding echo chambers and promoting diverse perspectives online?
- How can education and media literacy help individuals recognize and counteract the influence of bias and echo chambers?
- How can social media platforms better balance the promotion of diverse perspectives and the prevention of misinformation and propaganda?
- Can the trend towards the creation of echo chambers on social media be reversed, and if so, how?
ADDITIONAL RESOURCES
- “Algorithmic Illusions: Hidden Biases of Big Data” by Kate Crawford (2013) – This talk explores the hidden biases in algorithms and how they can perpetuate inequality. 17:25 (https://www.youtube.com/watch?v=irP5RCdpilc)
- “The Filter Bubble” by Eli Pariser (2011) – This talk discusses how algorithms can create echo chambers and limit the diversity of information and ideas we are exposed to. 8:48 (https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles)
- “The Bias in AI” by Joy Buolamwini (2016) – This talk highlights how algorithmic bias can perpetuate and amplify existing societal biases, and the need for accountability and transparency in the development of AI systems. 8:35 (https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms)
- “Making Technology Less Manipulative” by Tristan Harris (2017) – This talk discusses how technology companies design products to keep users engaged and create information bubbles that reinforce existing biases and beliefs, and how we can try to avoid that. 57:01 (https://www.youtube.com/watch?v=8YGv5vtDsiQ)
- “The Dangers of Online Filter Bubbles” by Zeynep Tufekci (2017) – This talk explains how online algorithms can lead to information bubbles, limiting exposure to diverse perspectives and leading to the reinforcement of existing biases. 22:46 (https://www.ted.com/talks/zeynep_tufekci_we_re_building_a_dystopia_just_to_make_people_click_on_ads)
ANNEXES
The Nile story
Implementation of the activity and talking points
This is a warm-up activity that shows how easily and quickly we form uninformed decisions and make judgements on incomplete data. In essence, the activity demonstrates our readiness to give in to our own biases.
Students can either be read the first part of the story or given handouts with the first part printed on them. After they are acquainted with the first part, they are asked to determine who is the most and who is the least ethical person in the story. They should rank the participants 1 to 5. They can do this either in groups or individually.
After they make the assessment, they are either read or given the second part of the story, which, by revealing more of the background, shows the participants in a very different light.
Because this activity is intended to show how bias works, and because bias works best on the subconscious level, it is best if we do not tell the students the aim and purpose of the activity beforehand. Perhaps we might even mislead them – letting them think that the activity will focus on the topic of ethics (as they are asked to assess the ethical conduct of the participants).
After the second part of the story the students are asked if they would like to change their assessments. At this point a segue can be made towards the topic of bias. It would be most effective if the teacher asks the students to try to explain what happened. They can do so using the following questions:
- Can someone explain to me, what just happened?
- Why did you make your initial decisions the way you made them?
- What made you change your decision?
- Do you know what making assertions on insufficient or corrupt data is called?
- Can you give me some more examples of times when people make assertions or decisions based on incomplete data?
- Did you notice anything else when thinking about the story – in addition to the level of ethical behavior of participants? Did their age, gender, presumed nationality play a role in your decision making?
- Do you think you hold any other biases?
After a discussion with the students, the teacher should explain more on the topic of biases. They can give a general definition.
Bias refers to a prejudice or inclination towards a particular perspective, ideology, or outcome that influences one’s judgment or decision-making processes. Bias can arise from a variety of sources, including personal experiences, cultural background, and cognitive biases that affect perception and interpretation.
Bias can impact individuals, organizations, and societies in many ways, leading to unequal treatment and unfair outcomes, as well as the spread of misinformation and the reinforcement of existing inequalities and power dynamics.
In general, bias can be defined as a systematic error or deviation from the truth that is introduced by subjective factors. This can lead to skewed or distorted perceptions, judgments, and decisions that are not based on objective evidence or facts.
It’s important to be aware of and address bias in order to promote fairness and accuracy, and to ensure that decisions and outcomes are based on objective evidence and sound reasoning. This can involve recognizing and challenging one’s own biases, seeking out diverse perspectives and information sources, and using methods such as critical thinking and fact-checking to verify information and reduce the impact of bias.
When talking about bias we need to realise that there are several different types of bias. Bias can be held personally (an individual holds personal bias based on subjective factors influencing their perceptions and decisions) or systemically (bias built into the structure of society or organizations, seemingly inherent, producing disparities in opportunities and treatment for certain social groups). Some bias is explicit (consciously held prejudices), while a lot of it is implicit (unconscious attitudes and beliefs).
We can also name a couple of the most common biases present in our societies. Students can of course help identify them, but here are some common biases we can include:
Ageism: Prejudice against the elderly or young people based on their age.
Racism: Prejudice against people of different races or ethnicities.
Sexism: Prejudice against people based on their gender or sex.
Xenophobia: Fear or hatred of foreigners or anything that is perceived to be foreign.
Ableism: Prejudice against people with disabilities.
Classism: Prejudice against people based on their social class or economic status.
Anti-Semitism: Prejudice against Jewish people.
Islamophobia: Fear or hatred of Muslims or Islam.
Nativism: Prejudice against people who were not born in a particular country.
Homophobia: Fear or hatred of people who identify as homosexual, bisexual, or any other non-heterosexual orientation.
Here it might be prudent to stress the difference between bias and discrimination. Bias and discrimination are related but distinct concepts.
Bias refers to a preconceived attitude or judgment towards a group or individual, based on factors such as race, gender, sexual orientation, etc. Bias can be conscious or unconscious, and can influence one’s thoughts, feelings, and actions towards others.
Discrimination, on the other hand, refers to the unequal treatment of individuals or groups based on their membership in a particular category (e.g. race, gender, sexual orientation, etc.). Discrimination can take many forms, such as unequal access to education, employment, housing, and other resources. Discrimination can be a result of biases and prejudices, but it is also shaped by systemic and institutional factors.
In summary, bias refers to an attitude, while discrimination refers to actions or behaviors that result in unequal treatment. A society that values freedom of speech and thought may allow biases to exist, but it does not have to condone or accept discrimination based on those biases.
Freedom of speech and thought are important principles that protect individual rights to express and hold their opinions, beliefs, and ideas, even if those ideas may be controversial or unpopular. However, freedom of speech and thought do not give people the right to discriminate or harm others based on those biases.
In a society that values freedom, it is important to strike a balance between protecting individual rights to express their opinions and beliefs and promoting equality and non-discrimination for all. This can involve setting limits on hate speech, for example, or promoting education and awareness about the harms of discrimination and bias.
The story
Fatima is a lovely young girl who lives on the left bank of the great Egyptian river Nile. She is very much in love. Her parents are not supportive of her infatuation, because they think that she is too young and that it is inappropriate. Her beloved, Omar, lives on the other side of the river, and she cannot visit him easily. One day she decides that she will run away from home and try to visit Omar. Fatima tells her friend Tarek, and he is very much against her plan. He threatens to tell her parents and says she should not try to cross the river. She asks him for his help and he refuses her. He thinks, among other things, that it is wrong not to listen to your parents. Nonetheless, Fatima escapes home one evening and gets to the river bank, where, because the Nile is such a great river, there are no bridges or crossings, just the ferrymen. She finds only one boat still at the bank, and the owner of the crossing barge is the old man Ahmed. Fatima explains her plight to Ahmed, and he is reluctant to take her across. He finally agrees to take her across if she spends the night with him. Fatima is not happy, but she really wants to see Omar, and she agrees. Next morning, Ahmed takes Fatima across the river, and she finally reaches the right bank and looks for Omar. When she finds him, she leaps into his arms, overfilled with joy, and Omar abruptly refuses her and pushes her away. He tells her that he cannot be with her and that she is stupid to think otherwise. Fatima starts crying, and as Omar is leaving, Hassan, who saw everything, rushes to him and punches him hard in the face.
Who is the least and who is the most ethical person in this story? Rank the participants and try to think about why you are ranking them the way you are.
Fatima (main protagonist)
Omar (Fatima’s beloved)
Tarek (Fatima’s friend)
Ahmed (the old ferry barge operator)
Hassan (the boxing enthusiast)
Bet you didn’t count on this … Here is some additional information.
Fatima is 14 years old. Tarek, her friend, is her classmate. In addition, not that it is really important for our story, he is also in love with her. Omar, her beloved, is their 35-year-old teacher, who is married and has three children. Ahmed is in fact 69 years old and is Fatima’s grandfather, who has not seen her in quite a while. He loves her dearly, and because it was already really dark when she came to his ferry barge, he thought that crossing at that hour would have been dangerous. He also wanted to drink tea with Fatima and chat with her. And sensing that he was dealing with the sorrows of unrequited love, he wanted to try to reason with Fatima as well. That leaves us with only Hassan. Well, we do not fully know who Hassan is or why he was at the river bank when he was. But we do know that the local radio station reported early in the morning that a male individual had escaped earlier from the local mental institution. He was said to be irrational and violent, and he might be going around punching people in the face for no reason. If anyone should see him, they need to contact the local authorities immediately.
Who is the least and who is the most ethical person in this story? Rank the participants and try to think about why you are ranking them the way you are.
Fatima (main protagonist)
Omar (Fatima’s beloved)
Tarek (Fatima’s friend)
Ahmed (the old ferry barge operator)
Hassan (the boxing enthusiast)
Trying to be the algorithm
Introduction to algorithms
An algorithm is a step-by-step procedure for solving a problem or achieving a specific task, typically but not necessarily expressed in a computer programming language. It is a sequence of well-defined instructions designed to perform a specific task or solve a well-specified computational problem, with a finite running time, given specific inputs.
Perhaps it can be easier for the students to understand algorithms through real-life examples.
Recipe for baking a cake
Input: A list of ingredients, a desired type of cake
Output: A baked cake
Steps:
- Preheat oven to specified temperature.
- Mix dry ingredients (flour, baking powder, salt) in one bowl.
- In another bowl, cream together butter and sugar.
- Add eggs to the butter-sugar mixture, one at a time, mixing well after each addition.
- Alternately add the dry ingredients and milk to the egg mixture, starting and ending with the dry ingredients.
- Pour batter into a greased baking pan.
- Bake in the oven for the specified time.
- Test the cake by inserting a toothpick in the center. If it comes out clean, the cake is done. If not, bake for a few more minutes.
- Let the cake cool, then decorate and serve as desired.
GPS navigation
Input: A starting location, a destination, and real-time traffic data
Output: The most efficient route from start to finish
Steps:
- Collect current location data.
- Input destination information.
- Use real-time traffic data to calculate the quickest route.
- Provide turn-by-turn directions to the driver.
- Continuously monitor traffic and recalculate the route if necessary.
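Step 3 above (calculating the quickest route) can be sketched with Dijkstra's shortest-path algorithm. The road network and travel times below are hypothetical; edge weights stand in for the current travel times a navigation system would derive from traffic data.

```python
import heapq

def quickest_route(graph, start, goal):
    """Return (total_time, path) for the quickest route, or (inf, []) if none."""
    queue = [(0, start, [start])]  # (time so far, current junction, path)
    visited = set()
    while queue:
        time, node, path = heapq.heappop(queue)
        if node == goal:
            return time, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, travel_time in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (time + travel_time, neighbour, path + [neighbour]))
    return float("inf"), []

roads = {  # minutes between junctions (made-up numbers)
    "Home": {"A": 5, "B": 2},
    "A": {"Work": 4},
    "B": {"A": 1, "Work": 10},
}
print(quickest_route(roads, "Home", "Work"))  # (7, ['Home', 'B', 'A', 'Work'])
```

Re-running the function with updated edge weights corresponds to step 5: recalculating the route as traffic changes.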
Washing the clothes
Input: A load of dirty clothes, preferred wash cycle (e.g. delicate, normal, heavy duty)
Output: Clean clothes
Steps:
- Sort clothes by color and fabric type.
- Choose the preferred wash cycle based on the types of clothes and level of dirtiness.
- Fill the machine with the appropriate amount of water.
- Add detergent and clothes to the machine.
- Agitate the clothes for a specified time.
- Drain the machine and rinse the clothes.
- Spin the clothes to remove excess water.
- Transfer the clothes to the dryer or hang them to air dry.
Moving towards computer languages, it might be easier to first understand a more mathematical algorithm, such as finding an average.
What is actually going on is:
- Add up all the numbers in a list.
- Divide the sum by the number of numbers in the list.
- The result is the average of the numbers.
Here is what we might input into the computer in a simple case, using MS Excel.
Let’s say that we have all the numbers in the list in the A column from A1 to A10.
In the field where we want to display the average we will write the “formula for the algorithm”:
=SUM(A1:A10)/COUNT(A1:A10)
SUM is the command for summing everything in the range.
COUNT is the command for counting non-empty cells in the range.
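The same three-step algorithm can also be written in Python, mirroring the Excel formula (sum the list, then divide by the count):

```python
def average(numbers):
    """Step 1: add up the numbers. Step 2: divide by how many there are."""
    return sum(numbers) / len(numbers)

# Step 3: the result is the average of the numbers.
print(average([4, 8, 15, 16, 23, 42]))  # 18.0
```

Whether expressed as an Excel formula or as Python code, the underlying algorithm is identical; only the notation changes.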
Algorithms in the digital environment
In the context of social media, algorithms refer to the mathematical formulas and processes used by platforms like Facebook, Twitter, and Instagram to determine what content is shown to users. These algorithms use various factors, such as user behavior, past interactions, and machine learning models, to prioritize and personalize the content that is displayed in a user’s feed, search results, and advertisements. The goal is often to maximize user engagement and keep users on the platform for as long as possible.
To be more concrete and try to show how the algorithms would work, here is a general idea of how the Facebook News Feed algorithm works to select the next several posts in your feed:
- Relevance: Facebook’s algorithm prioritizes content that is relevant to you, based on your interests, activity, and other signals such as the pages you follow or have engaged with in the past.
- Timeliness: The algorithm also takes into account the timeliness of the content, so that the most recent and relevant posts are shown first.
- Engagement: Posts that are likely to generate engagement, such as comments, likes, and shares, are also prioritized by the algorithm.
- Friends and Family: Content from friends and family is given higher priority, as this type of content is generally more personal and relevant to the user.
- Past Behavior: The algorithm takes into account your past behavior on the platform, such as which posts you have liked, shared, or commented on, to determine what content is relevant to you.
Based on these factors, the Facebook News Feed algorithm will try to predict which posts are likely to be the most relevant and engaging to you, and show those first.
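To make this concrete for students, here is a toy feed-ranking sketch in Python. The weights, posts, and scoring rule are entirely made up for illustration (this is not Facebook's actual code): each post gets a score from the factors listed above, and the feed shows the highest-scoring posts first.

```python
def score(post, weights):
    """Combine the ranking factors into a single score for one post."""
    return sum(weights[factor] * post[factor] for factor in weights)

# Hypothetical importance of each factor (higher = weighted more heavily).
WEIGHTS = {"relevance": 3.0, "timeliness": 1.5, "engagement": 2.0, "from_friend": 2.5}

posts = [  # each factor rated 0-1 (made-up values)
    {"id": 1, "relevance": 0.9, "timeliness": 0.2, "engagement": 0.4, "from_friend": 0},
    {"id": 2, "relevance": 0.3, "timeliness": 0.9, "engagement": 0.2, "from_friend": 1},
    {"id": 3, "relevance": 0.6, "timeliness": 0.5, "engagement": 0.9, "from_friend": 0},
]

feed = sorted(posts, key=lambda p: score(p, WEIGHTS), reverse=True)
print([p["id"] for p in feed])  # [2, 3, 1]
```

A useful discussion point: post 2 wins mostly because it comes from a friend, even though post 1 is the most relevant. Changing the weights changes the feed, which is exactly the lever the platform controls.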
At this point we can stress some potential bias in these processes:
– Algorithms and artificial intelligence (AI) can also be influenced by bias, particularly if they are trained on biased data. This can result in unfair or discriminatory outcomes.
– To address bias in algorithms, it’s important to carefully evaluate the data used to train AI systems and to design systems with fairness and inclusivity in mind. This can help to ensure that AI systems are transparent, accountable, and free from discriminatory outcomes.
– Bias can also be present in digital environments, such as social media platforms and online forums. This can affect how information is shared, what perspectives are emphasized, and who is included or excluded in online communities. It’s important to be aware of these biases and to actively seek out diverse perspectives in order to have a well-rounded understanding of digital environments.
Algorithms are hiring
Here is a list of articles that focus on the topic of migration in the EU and Europe. You can see that some of them are more and some less favorable towards the question. Generally, the odd-numbered articles represent more favorable views and the even-numbered ones less favorable views. Ask a student to select either article 1 or 2, and then have another student play the role of the algorithm and serve them the next article. This student should use the algorithm written below. You can suggest that this student experiment in step 2, which generates a RANDOM number. Have them try to reach a next article that holds the opposite view from the previous one.
This is of course impossible, because the algorithm is designed so that starting from an odd number always produces an odd number, and starting from an even number always produces an even number.
Alternatively to one student playing the role of the algorithm you can all do this together as a class.
At the end of this activity, the teacher should explain that there was no way to cross from even to odd or the other way around. Real algorithms usually don’t work exactly this way, but the point to stress is that algorithms can be designed to serve a specific end. In the case of social media, that end is mostly maintaining engagement.
- Start with a number, say “N”.
- Generate a random number between 2 and 10, call it “P”.
- If N is larger than P subtract P from N to get “Q”.
- If N is smaller than P add them together to get “Q”.
- Multiply Q by 2.
- If N is odd multiply P by 4 and subtract 5 to get “R”.
- If N is even add 3 to P and multiply that by 2 to get “R”.
- If R is smaller than Q, subtract R from Q to get “S”.
- If R is larger than Q, subtract Q from R to get “S”.
- If S is larger than 28, subtract 20 to get “T”; otherwise S is your “T”.
- T is your next article in the feed.
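The steps above can be sketched in Python. One assumption is added: when N equals P (a case the steps leave open), we treat N as smaller and add. Running the sketch shows the parity lock-in: an odd starting article always leads to another odd article, and an even one to an even one. (Like the original steps, the sketch does not guarantee T stays within 1–28.)

```python
import random

def next_article(n):
    """One pass through the feed 'algorithm' above (if N == P, we add)."""
    p = random.randint(2, 10)            # step 2: the RANDOM number
    q = (n - p if n > p else n + p) * 2  # steps 3-5: Q is always even
    r = p * 4 - 5 if n % 2 == 1 else (p + 3) * 2  # steps 6-7: R has N's parity
    s = abs(q - r)                       # steps 8-9: even minus odd stays odd
    return s - 20 if s > 28 else s       # step 10: subtracting 20 keeps parity

random.seed(7)  # fixed seed so the demo is repeatable
for start in (1, 2):
    chain = [start]
    for _ in range(6):
        chain.append(next_article(chain[-1]))
    print(chain)  # every article number in the chain shares start's parity
```

This mirrors the classroom activity: no matter which random numbers the "algorithm student" picks, the reader never crosses to an article of the opposite view.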
List of articles:
- “The Benefits of Migration: A European Perspective”
- “The Burden of Migration on Europe’s Social Services”
- “Diversity and Inclusivity: The Strength of Europe’s Migration Policy”
- “The Threat of Terrorism and Migration: A European Concern”
- “The Importance of Protecting Migrant Rights in the EU”
- “The Impact of Migration on Wages and Employment in the EU”
- “The Positive Impact of Migration on the EU Economy”
- “The Strain on Housing and Infrastructure in Europe’s Cities”
- “Migration as a Driver of Cultural Exchange and Understanding in Europe”
- “Why Europe Needs to Control Its Borders”
- “Why Europe Needs a Compassionate Approach to Migration”
- “The Risks of Mass Migration to Europe’s Cultural Identity”
- “The Role of Migration in Fostering Innovation and Growth in the EU”
- “The Failure of the EU’s Migration Policy: A Critique”
- “How the EU Can Address the Global Migration Crisis”
- “The Economic Costs of Illegal Migration in the EU”
- “The Promise of Migration for Building a More Inclusive and Diverse Europe”
- “The Negative Impact of Migration on Public Health in Europe”
- “Breaking Down Barriers: The Benefits of a Free Movement of People in the EU”
- “The High Costs of Providing Services for Migrants in the EU”
- “The Benefits of Migration: A Humanitarian and Moral Imperative”
- “The Risks of Criminal Activity and Human Trafficking Associated with Migration”
- “The Vital Role of Migration in Tackling the EU’s Skills Shortage”
- “The Negative Impact of Migration on Public Services and Quality of Life in Europe”
- “The Contribution of Migrants to Europe’s Cultural Heritage”
- “The Threat of Overcrowding and Environmental Degradation in Europe’s Cities”
- “The Potential of Migration to Drive Entrepreneurship and Job Creation in the EU”
- “The Failure of the EU to Address the Root Causes of Migration”
Creation of information bubbles
General introduction
The algorithms used by social media platforms are designed to personalize the user experience and maximize engagement by showing users content that is most likely to interest them. This can lead to the formation of information bubbles, where users are primarily exposed to information and perspectives that align with their existing beliefs and biases, and are less likely to encounter opposing viewpoints or challenging ideas.
The impact of information bubbles on social media can be significant, as they can reinforce existing biases and limit the exposure of users to diverse perspectives and ideas. This can contribute to the spread of misinformation, the reinforcement of echo chambers, and the erosion of public trust in information and institutions.
It’s important for social media platforms and users to be aware of the potential for information bubbles and to take steps to promote diversity and expose users to a wider range of perspectives and ideas. This can include providing users with tools and options to customize their feed, promoting quality information and journalism, and encouraging users to engage with diverse perspectives and sources.
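A toy simulation can make the bubble-forming feedback loop visible. The allocation rule below is hypothetical (no platform's real code): each round, the 10 feed slots are divided in proportion to the square of past clicks, so engagement is rewarded more than proportionally and a small initial lean towards topic "A" snowballs.

```python
def next_feed(clicks, size=10):
    """Allocate feed slots in proportion to the square of past clicks."""
    weights = {topic: c * c for topic, c in clicks.items()}
    total = sum(weights.values())
    return {topic: round(size * w / total) for topic, w in weights.items()}

clicks = {"A": 6, "B": 4}          # a slight initial preference for topic A
history = []
for _ in range(4):
    feed = next_feed(clicks)
    history.append(dict(feed))
    for topic, shown in feed.items():
        clicks[topic] += shown     # the user clicks what they are shown
print(history)
# [{'A': 7, 'B': 3}, {'A': 8, 'B': 2}, {'A': 8, 'B': 2}, {'A': 9, 'B': 1}]
```

After four rounds, topic B has nearly vanished from the feed even though the user's initial preference was only 60/40: the loop "show more of what is clicked, clicks follow what is shown" narrows exposure on its own, with no editorial intent required.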
Some concrete examples of echo chambers/information bubbles
Cambridge Analytica and the 2016 US Presidential Election: Cambridge Analytica, a political consulting firm, used data from Facebook to create personalized political advertisements for Donald Trump’s campaign. This helped to create a “filter bubble” of information for specific groups of voters, which influenced their opinions and voting behavior.
The Spread of Misinformation during COVID-19 Pandemic: During the COVID-19 pandemic, false information about the virus spread rapidly on social media. The algorithms of social media platforms often amplified misinformation and reinforced people’s preexisting beliefs, creating information bubbles and hindering efforts to combat the spread of the virus.
The Use of Targeted Ads during the Brexit Referendum: During the Brexit referendum in the UK, Vote Leave, the official campaign for Brexit, used targeted advertisements on Facebook to reach specific groups of voters. These ads often contained false information, and helped to create information bubbles that reinforced voters’ preexisting beliefs and influenced their opinions on Brexit.
Polarization in the US: In recent years, the US has become increasingly polarized, with people on opposite sides of political and cultural issues forming their own information bubbles. Social media algorithms often reinforce these bubbles by showing users content that aligns with their beliefs, making it harder for them to see different perspectives and understand opposing viewpoints.
Pledge
The summation point of this lesson should be a drafting of a pledge on what to do (personally) to avoid getting trapped in information bubbles.
The main point of this activity is to have students come up with their own ideas about how to avoid getting trapped in information bubbles. You can do this as a class, but perhaps it would be best to do this in groups and have the groups present their final list of suggestions.
Here is a list of good examples that the teacher might use to guide the students towards:
- Follow a diverse range of news sources and opinions.
- Seek out sources that challenge your beliefs and opinions.
- Turn off recommended content and explore beyond what is being recommended to you.
- Avoid relying solely on social media for news and information.
- Fact-check information before accepting it as true.
- Take a critical approach to the information you encounter.
- Engage in healthy debates and discussions with people from different backgrounds and perspectives.
- Read articles and watch videos from sources you don’t agree with.
- Seek out alternative narratives and explanations.
- Don’t just rely on one source for information.
- Be mindful of who is behind the information you are consuming.
- Turn off notifications for sources that reinforce your beliefs.
- Be mindful of the language and tone used in information and news sources.
- Avoid spending excessive amounts of time on one source of information.
- Practice information literacy and critical thinking skills.