
Can an algorithm affect how I vote?

University of Ljubljana

INTRODUCTION TO THE TOPIC

Online algorithms play a critical role in shaping the information that people are exposed to and the opinions they form. These algorithms, which are often used by tech companies such as Google, Facebook, and Twitter, determine what content is displayed in search results, news feeds, and timelines, based on factors such as users’ online behaviour and demographics.

As a result, online algorithms have the potential to significantly affect democracy. On the one hand, they can provide people with access to a wider range of information and perspectives, promoting informed debate and decision-making. On the other hand, they can also perpetuate echo chambers, in which users see only information that reinforces their existing views, reducing the diversity of opinions they encounter and limiting their ability to critically assess different arguments.

Online algorithms can also be manipulated to spread misinformation, which can have serious consequences for democracy. For example, false information can be spread through social media to influence public opinion and interfere with elections. Additionally, online algorithms can be designed to prioritize certain types of content, such as sensationalist or polarizing news, over more balanced and accurate information. This can lead to a fragmented public discourse, where people are unable to engage in constructive dialogue and find common ground.

The use of algorithms online has some very positive effects, but their impact on democracy has also led to growing concerns about the need for greater transparency and accountability in the design and deployment of these algorithms. Regulators and governments are exploring ways to ensure that algorithms serve the public interest and do not undermine democratic values and institutions. Can individuals do something about it as well?

Photo source: https://www.pressenterprise.com/2018/03/20/cambridge-analytica-breached-your-facebook-political-cartoons/

LESSON OVERVIEW

Although the workshop also includes lecturing, in which the moderator explains the basic concepts relevant to this topic (democracy, algorithms, echo chambers, etc.), the main part of the workshop is condensed into four activities. In the first activity, students learn that algorithms can target individual people and groups with advertisements and news of interest to them. In the second, in connection with the concept of echo chambers, students learn how the same news can be told in many ways. In the third, the moderator discusses with the students possible ways to be more careful online. The fourth is a final discussion, which allows the students to share their own experiences related to the topic of the workshop. The workshop is expected to last 90 minutes. If you only have 45 minutes, you can treat the second activity as optional and include it only if time allows. The workshop is best conducted in person but can also be carried out online (in a virtual classroom) with slight modifications.

Materials that should be made available to students:

Paper and pens.

Learning outcomes that will be attained through the workshop:

  • students understand the concepts of democracy, algorithms, “framing” of a story, and echo chambers
  • students know examples of the use of algorithms in the online environment and their general impact on democratic values
  • students are aware of a well-known case of the impact of algorithms on democratic processes (the Cambridge Analytica scandal)
  • students acquire the basic ability to detect biased news
  • students know some general tips on how they can be more cautious when online

LESSON BREAKDOWN – WORKSHOP ACTIVITIES

Photo source: https://right.ly/our-views-and-opinions/cambridge-analytica-explained/

1. Introduction

The moderator asks the students if they are aware of the Cambridge Analytica scandal and allows them to tell what they know about it. The moderator then explains the case in more detail, if necessary (Annex 1).

2. Definitions of relevant concepts

The moderator then introduces the basic concepts of democracy and algorithms (Annex 2). The moderator may first ask the students what they themselves think these two concepts mean before explaining them.

The moderator then goes on to explain how the two concepts are linked, giving the students some examples of how algorithms were used in the Cambridge Analytica scandal and the impact they potentially had on democratic processes (Annex 3).

3. First activity: Targeting specific demographic groups              

PART 1:

The moderator divides the students into 5 groups and explains the same scenario to all groups:  

You work for a political campaign that is trying to get a particular candidate elected. The campaign wants to use targeting to reach specific demographic groups with personalized messages to influence their vote.

Each group must prepare a short message to be shown as an advertisement (in the context of the election campaign) to a specific target group on their social networks. The aim is for each group to present the objectives of their political candidate in a way that is interesting to their specific target group.

Group 1: Targeting young adults (18-24 years old) who are politically active and mainly concerned about environmental issues (e.g. climate change) and social justice (e.g. equality of races, genders, sexuality or religious views)

Group 2: Targeting middle-aged individuals (35-54 years old) who are working professionals and are mainly concerned about job security and economy (e.g. costs of living, public debt, financial services – credit and savings)

Group 3: Targeting seniors (65+ years old) who are retired and mainly concerned about social security (e.g. pensions and benefits) and health care (e.g. access to and cost of healthcare)

Group 4: Targeting students (18-24 years old) who are mainly concerned about education (e.g. access to and cost), housing shortage, and job opportunities

Group 5: Targeting recent immigrants who are concerned about immigration policy (e.g. asylum seeking, financial assistance), access to public services (e.g. schools and healthcare), and cultural integration

PART 2:

Each group reads out the short political message they have prepared, and the other groups guess which target group the political message is aimed at. (A general description of the potential target group should be enough; the students don’t have to guess exactly. If you think the task will be too difficult for the (perhaps younger) students with whom you are conducting the workshop, you can tell them in advance which target groups to choose from.)

4. Second activity: The same story in a different way

PART 1:

The moderator first tells the existing five groups (or new ones) that the same information can be presented in different ways. The moderator can explain the concept of “framing” as the angle or perspective from which a particular story is told. In the case of the media, it is about being able to tell the same news story in many different ways. The way the story is told can depend on what we want to achieve. In this way, the media can be used to shape mass opinion.

The moderator distributes the same set of information to all groups of students:

News: A new study was released today

What the study found: The number of endangered species has increased by 30% in the last decade

The most likely causes: Habitat loss and climate change

Who did the research: A team of international researchers [1]

The moderator instructs each group to write a short news story (based on the information given) in a few sentences, using one of the specific framings.

Group 1: Negative framing of the news story

Group 2: Positive framing of the news story

Group 3: Political framing of the news story

Group 4: Environmental framing of the news story

Group 5: Scientific framing of the news story

Some other options include neutral, economic, and social framings.

If the task is too difficult, or if students find it hard to come up with a way to give a specific emphasis (e.g. political) to the news story, the moderator can (quietly) give each group hints on what they can pay attention to, or what they can mention in the news (for examples of potential news stories see Annex 4).

PART 2:

Then each group reads their short news story and the other groups guess what framing they used (the moderator tells the guessing groups in advance which options to choose from) and whether they added any facts in their writing that were not included in the original information (i.e. if they used “fake” facts).

PART 3:

The moderator explains that algorithms play a significant role in this problem of different framings of the same story. Algorithms are used on many online platforms, such as social media and news websites, to personalize and filter the content that is shown to each user. The news we see on social networks may come from different news sources, which are often biased. This means that even if the underlying information is the same every time (and often it isn't), different users may see different versions of the same story, based on the algorithms' understanding of their interests, preferences, and behaviors.

For example, if a user has shown an interest in environmental news, an algorithm may present them with a very “environmental” or even negative framing of a news story that emphasizes its environmental impact. On the other hand, a user who has shown an interest in scientific news may see a very “scientific”, neutral or even positive framing that focuses on the importance of research and the power of science.
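To make this concrete, the short Python sketch below illustrates the general idea of interest-based content selection. The interest profile, the articles, and the scoring rule are invented for illustration only and do not reflect any real platform's algorithm.

# A minimal, invented illustration of interest-based content selection.
# Real platforms use far more complex models and signals.

articles = [
    {"framing": "environmental", "headline": "Human activity drives 30% rise in endangered species"},
    {"framing": "scientific", "headline": "New study quantifies a decade of biodiversity loss"},
    {"framing": "political", "headline": "Leaders urged to act as endangered species list grows"},
]

# Toy user profile: how often this user engaged with each topic in the past.
user_interests = {"environmental": 12, "scientific": 3, "political": 1}

def pick_framing(articles, interests):
    """Return the version of the story the user is most likely to engage with."""
    return max(articles, key=lambda a: interests.get(a["framing"], 0))

print(pick_framing(articles, user_interests)["headline"])
# -> the "environmental" framing, because that topic dominates the profile

Because the selection depends only on the stored interest profile, two users reading about the same study can end up seeing quite different versions of it.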

This personalized filtering of content can result in the creation of echo chambers (for a description of this concept, see Annex 5), where users only see information that reinforces their existing beliefs and biases, which can limit exposure to diverse perspectives and news narratives. This can be harmful to democracy, as it can contribute to the spread of misinformation and the polarization of society.

5. What can I do about it all?

While tackling the problems raised in the workshop also requires social and legal solutions, individuals can perhaps take some steps towards safer and more responsible use of the internet as well. Some possible steps to discuss with students include:

  • not posting confidential information
  • not agreeing to cookies being stored on websites
  • seeking out diverse sources of information
  • fact-checking information
  • engaging in a meaningful dialogue (perhaps especially with people with different opinions)

6. Closing discussion

Here the moderator can refer to the questions in the next section “Discussion check”.

DISCUSSION CHECK

The following questions may be helpful for further discussion on the topic:

  • Have you come across news that you thought was biased?
  • What do you think are the pros and cons of targeted advertising and news reporting?
  • Have you ever seen an instance of fake news or misinformation spreading quickly online, and do you think algorithms played a role in this?
  • Do you think there is a role for government regulation in ensuring that algorithms are used in ethical and democratic ways, or should this be left up to corporations and individual choice?
  • Do you often notice that the algorithm has incorrectly assessed your interests (e.g. showing you, across all platforms, similar ads for things you are not interested in)?

ADDITIONAL RESOURCES

#ForYou: A card-based pattern-matching game that helps youth aged 13-18 understand the role that algorithms play in their online and offline lives, and the value of their personal information to companies that use those algorithms: https://mediasmarts.ca/digital-media-literacy/educational-games/foryou-game-about-algorithms

More info and short video clip on echo chambers: https://edu.gcfglobal.org/en/digital-media-literacy/what-is-an-echo-chamber/1/

ANNEX

ANNEX 1: Cambridge Analytica scandal

The Cambridge Analytica scandal was a political scandal involving the misuse of Facebook users’ personal data. Cambridge Analytica was a political consulting firm that specialized in using data to influence elections.

In 2014, Cambridge Analytica obtained data on millions of Facebook users through an app called “This is Your Digital Life.” The app was marketed as a personality quiz, but it also collected data on users’ friends, allowing Cambridge Analytica to gather information on a vast number of people. This data was then used to create political advertisements that were specifically targeted to individual Facebook users, based on their personal information and online behavior.

The gathered data was allegedly used to influence elections, including the 2016 US presidential election. Cambridge Analytica reportedly used the data to create psychological profiles of voters and target them with political ads designed to sway their opinions. The scandal raised serious concerns about the potential for the manipulation of public opinion through the use of personal data and the role of technology in elections. However, the exact extent of Cambridge Analytica’s influence on election outcomes remains unclear.

ANNEX 2: Definitions of the concepts: democracy and algorithms

Democracy:

A democracy is a form of government in which power is held by the people, usually through elected representatives. The people have a say in the decisions that affect their lives, and they elect leaders to make decisions on their behalf. There are different types of democracies, but the basic idea is that all citizens have an equal say in the government and in the decisions that affect their lives. This can be achieved through regular elections, the ability to vote on important issues, and a free and open press. A key aspect of a democracy is the protection of individual rights and freedoms, such as freedom of speech, religion, and the press.

Algorithm (in the context of computer science and digital environment):

An algorithm is a set of instructions that a computer or machine can follow to perform a specific task. In the context of artificial intelligence (AI) and online use, algorithms are used to automate processes and make decisions based on data. This allows them to operate quickly and efficiently, and to make predictions or recommendations based on patterns and relationships in the data. For example, algorithms are used in social media platforms to determine what content to show a user, or in recommendation systems to suggest products or services based on a user’s past behavior.
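As a deliberately simplified illustration of the recommendation example above, the short Python sketch below implements a "bought together" rule. The purchase data and the co-occurrence rule are invented for this example; real recommender systems are far more sophisticated.

from collections import Counter
from itertools import combinations

# Toy purchase history: each inner list is one user's purchases (invented data).
purchases = [
    ["headphones", "phone case", "charger"],
    ["headphones", "charger"],
    ["phone case", "screen protector"],
    ["headphones", "phone case"],
]

# Count how often each pair of products is bought together.
pair_counts = Counter()
for basket in purchases:
    for a, b in combinations(sorted(set(basket)), 2):
        pair_counts[(a, b)] += 1

def recommend(product):
    """Suggest the item most often bought together with the given product."""
    candidates = Counter()
    for (a, b), count in pair_counts.items():
        if a == product:
            candidates[b] += count
        elif b == product:
            candidates[a] += count
    return candidates.most_common(1)[0][0] if candidates else None

print(recommend("headphones"))  # -> "charger" (tied with "phone case" in this toy data)

Even this tiny set of instructions already makes an automated decision based on patterns in past behavior, which is the core idea behind the much larger systems discussed in the workshop.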

ANNEX 3: Examples of the use and effect of algorithms on democratic values (in the context of Cambridge Analytica scandal)

How algorithms were used in the Cambridge Analytica scandal:

As not all information on the extent of the Cambridge Analytica operation is publicly known, some of the examples below describe alleged uses.

Algorithms played a significant role in the Cambridge Analytica scandal and were used in many different aspects. Some of the examples include:

For data analysis: Algorithms were used to analyze the personal data obtained from Facebook users to identify their political views, personality traits, and other characteristics. This information was then used to create targeted political advertisements.

For predictions:  Algorithms were used to create predictive models that could predict which messages would be most effective for different groups of people, based on their personal data and online behavior. This allowed Cambridge Analytica to tailor their advertisements to maximize their impact.

For targeting: Algorithms were used to create targeted political advertisements that were specifically designed for individual Facebook users, based on their personal information and online behavior. The advertisements were intended to influence their political views and voting decisions.

The main difference between predictions and targeting is that algorithms were first used to detect behavioral patterns and predict what kind of content would be most suitable for specific people (predictions), and were then also used for the creation of this particular content (targeting) – almost all steps were thus automated, at least to some extent (a toy sketch of this two-step pattern is given after these examples).

Social media algorithms: Facebook’s algorithms were used to help spread the targeted political advertisements to a wider audience by showing them to users’ friends and followers.
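The following toy Python sketch illustrates the general "predict, then target" pattern described above. The voter profile, the signals, and the ad texts are entirely invented; this is not the actual system used in the scandal, only an illustration of the two steps.

# Step 1 (prediction): estimate which issue a person cares about most,
# based on simple invented signals such as pages followed or posts liked.
def predict_top_issue(profile):
    signals = {
        "environment": profile.get("green_pages_followed", 0),
        "economy": profile.get("finance_posts_liked", 0),
        "security": profile.get("security_groups_joined", 0),
    }
    return max(signals, key=signals.get)

# Step 2 (targeting): pick the pre-written ad variant matching the predicted
# issue, so each person sees a message tailored specifically to them.
ad_variants = {
    "environment": "Candidate X will fight climate change.",
    "economy": "Candidate X will protect your job and savings.",
    "security": "Candidate X will keep your community safe.",
}

voter = {"green_pages_followed": 14, "finance_posts_liked": 2}
issue = predict_top_issue(voter)   # -> "environment"
print(ad_variants[issue])          # the tailored ad shown to this voter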

Why does this use of algorithms pose a threat to democracy?

Such use of algorithms is dangerous for democracy for several reasons. Some of these include:

Manipulation of public opinion: By using personal data to create targeted political advertisements, Cambridge Analytica was able to manipulate public opinion and potentially influence the outcome of elections. This undermines the democratic process by allowing outside forces to shape public opinion and decision-making.

Increasing bias: By creating filter bubbles and showing users only the information and viewpoints that align with their existing beliefs and opinions, algorithms can reinforce biases and prevent exposure to diverse perspectives. This can lead to increased political polarization and a lack of understanding between different groups of people.

Threats to privacy: The collection of personal data without consent is a violation of privacy, and it raises concerns about the potential misuse of personal information. This undermines people’s trust in institutions and erodes the foundation of a democratic society.

Problem of transparency: The use of algorithms and personal data in political advertising raises questions about transparency and accountability. Voters may not be aware that they are being targeted with specific messages, and there is no way for them to easily determine the source of the information they are receiving.

ANNEX 4: Examples of potential written news stories

Negative: The number of endangered species in the world has skyrocketed by 30% in just the past decade, according to a new study. Habitat loss and climate change are being blamed for this devastating loss of biodiversity, which threatens to destabilize ecosystems and disrupt food chains. This could also have serious consequences for the human race.

Positive: Despite the challenges posed by habitat loss and climate change, a new study shows that conservation efforts are making a positive impact on endangered species populations. In the past decade, the number of endangered species has only increased by 30%, which represents a slower decline in biodiversity than had been predicted.

Political: A new study reveals that the number of endangered species has risen by 30% in the past decade, largely due to the lack of government action on climate change and habitat protection. The findings are a call to action for political leaders to take decisive steps to address these issues.

Environmental: A new study highlights the dire consequences of human activity on the natural world, with a 30% increase in the number of endangered species in just the past decade. The report serves as a reminder of the urgent need for increased conservation efforts to protect our planet’s biodiversity.

Scientific: A new study has used data from various sources to quantify the decline in global biodiversity, finding a 30% increase in the number of endangered species in the past decade. The results offer important insights into the causes and consequences of habitat loss and climate change and can inform future efforts to protect species and their habitats.

ANNEX 5: Echo chambers

An echo chamber is a phenomenon in which individuals are exposed primarily to information, opinions, or beliefs that align with their own. This can occur in various settings, including online communities, social media, and news media.

In the context of online news and social media, echo chambers can be exacerbated by algorithms that are designed to personalize and filter content for individual users, based on their interests, preferences, and behaviors. This can result in users only seeing information that reinforces their existing beliefs, rather than being exposed to diverse perspectives and news narratives.

Echo chambers can lead to a lack of diversity in opinions and perspectives, which can contribute to the spread of misinformation and polarize society. They can also reinforce existing biases and beliefs and make it difficult for individuals to engage in meaningful dialogue and compromise, as they are less likely to be exposed to opposing views and ideas.

Reinforcing biases and misinformation, polarizing society, and suppressing voices are all examples of ways in which echo chambers can undermine democratic values.


