Lesson 6: Algorithmic Bias
45 minutes
Overview
This lesson centers around the How AI Works: Equal Access and Algorithmic Bias video from the How AI Works video series. Watch this video first before exploring the lesson plan.
In this lesson, students will practice cropping images to uncover the bias underlying the Twitter cropping algorithm. Then, students will read and watch a video about the discovery of this biased algorithm. Finally, students will discuss ways to recognize and reduce bias along with analyzing Twitter's response to the allegations of bias in their cropping algorithm.
This lesson can be taught on its own, or as part of a 7-lesson sequence on How AI Works - click here to view all lessons in this sequence.
Standards
IC - Impacts of Computing
- 2-IC-20 - Compare tradeoffs associated with computing technologies that affect people's everyday activities and career options.
- 2-IC-21 - Discuss issues of bias and accessibility in the design of existing technologies.
- 3A-IC-24 - Evaluate the ways computing impacts personal, ethical, social, economic, and cultural practices.
- 3B-IC-25 - Evaluate computational artifacts to maximize their beneficial effects and minimize harmful effects on society.
- 3B-IC-26 - Evaluate the impact of equity, access, and influence on the distribution of computing resources in a global society.
Objectives
Students will be able to:
- Reason about which types of tasks should not be completed by an algorithm
Preparation
- Complete the cropping widget activity yourself so you can see what your students will experience with the widget and what questions or issues they may bring up.
- Watch the parts of the video “Are We Automating Racism?” referenced in the lesson plan.
- Read the two articles from the links section below.
Links
Heads Up! Please make a copy of any documents you plan to share with students.
For the teachers
- Algorithmic Bias - Slides
- Sharing learnings about our image cropping algorithm (Twitter) - Resource
For the students
Teaching Guide
Before the Lesson
Getting Started with Code.org: Consider watching our Getting Started with Code.org video series for an overview of how to navigate lesson plans, set up a classroom section, and use other important features of the Code.org platform. Each video also has a support article if you'd prefer to read or print instructions - click here to learn more.
Set Up a Classroom Section: You can use a class section in Code.org to manage your students, view their progress, and assign specific curriculum - click here to learn more.
If you are using a learning management system, there may be additional steps to sync your classes with Code.org:
- Click here for steps to set up your classes with Google Classroom
- Click here for steps to set up your classes with Clever
Become a Verified Teacher: Lesson plans and levels have additional resources and answer keys for Verified Teachers. Verification is a quick process that confirms your position at an educational institution. Click here to complete the form, and you should have access to verified teacher resources in ~1 business day.
Supports for Teachers: You can learn more about bringing AI lessons to your classroom through AI 101 for Teachers, a free, foundational online professional learning series for any teacher or educator interested in the groundbreaking world of artificial intelligence (AI) and its transformative potential in education.
Warm Up (5 minutes)
Discuss: Think about a time when you shared a picture with your friends or on social media. Have you ever had to crop a photo you shared? What did you crop out and why?
Discussion Goal: Use this discussion to show that users usually have a reason for sharing photos, and that the focus of a photo depends on that reason. Students will most likely bring up a wide range of examples, which you can use to highlight that there isn't a one-size-fits-all approach to sharing and cropping photos.
Remarks
Today we are going to try our own hand at cropping and then closely examine a real-world situation where training a model to crop photos leads to algorithmic bias.
Activity (35 minutes)
Cropping Widget (10 minutes)
Code Studio: Have students log into Code Studio and begin interacting with the Cropping Widget. Each student should interact with the widget on their own, but they will be discussing with a partner.
Do This: Tell students to use the widget to crop the images as if they were uploading them to a social media account.
Circulate: Monitor students as they complete this task. When you notice students pause on certain images, ask probing questions: What are they thinking? What do they notice in the image? What is causing them to pause or be unsure?
Intentionally Mischievous: The first few images are fairly innocuous to crop, but as the images continue, the decisions become harder and harder. Students may decide to prioritize faces, text, color, or foreground vs. background when cropping. Eventually, images will deliberately pit these patterns against each other and force some uncomfortable decisions about what information is important (i.e. belongs in the frame) and what information isn't (i.e. can be cropped out).
Lean Into Discomfort: Students may feel unsure how to crop certain images and may even feel slightly uncomfortable with a decision. This is okay and expected - the goal is to surface that even something as seemingly simple as "cropping a photo" can contain complex decisions. Emphasize that it's the decision itself - that the system is forcing them to decide in the first place - that creates this uncertainty, and not whatever they happen to decide with each individual picture.
Regroup: As you notice students reaching the end of the activity, pause the activity and regroup.
Discuss: Which images were challenging for you to decide how to crop? Which images did you and your partner crop the same? Which images did you and your partner crop differently?
Discussion Goal: Use this discussion to highlight the different experiences and perspectives everyone brings to deciding how to crop an image. Direct students to consider not only what they decided to keep in the picture, but also what they decided to crop out of the picture. What happens when we crop something out of a picture? It never gets seen by the audience. Look back at the photos you cropped. What or who got left out? You may choose to guide the class toward describing general characteristics of images that were challenging to crop.
Remarks
Now imagine we recorded how everyone in this room cropped the images from this widget. We could use that information to train a model. This model could be used to crop other photos being uploaded to the social media site.
Discuss: What patterns do you think might emerge from the data we collectively contribute to this model?
Discussion Goal: Help students generalize some of the patterns or "rules" they may have been using when cropping, such as:
- emphasizing faces
- emphasizing text
- emphasizing things in light vs dark
- emphasizing things in the foreground instead of background
Depending on your level of comfort with the class, you could probe implicit biases with how the class may have been cropping, such as:
- did the class tend to prefer the left-side of the screen or the right-side of the screen?
- did the class tend to prefer younger faces, older faces, or faces their age?
- did the class tend to prefer men or women or crops without people at all?
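If you do record the class's cropping choices, a quick tally can make these patterns visible. Here is a minimal sketch in Python, assuming hypothetical records of what each student kept in frame (the field names and sample data are invented for illustration):

```python
# Hypothetical sketch: tallying a class's crop choices to surface patterns.
# Each record describes what one (made-up) student kept in frame for an image.
from collections import Counter

crops = [
    {"kept": "face", "side": "left"},
    {"kept": "face", "side": "left"},
    {"kept": "text", "side": "right"},
    {"kept": "face", "side": "left"},
    {"kept": "background", "side": "right"},
]

# Count what students emphasized and which side of the image they favored.
kept_counts = Counter(record["kept"] for record in crops)
side_counts = Counter(record["side"] for record in crops)

print(kept_counts.most_common(1))  # the most common emphasis in this sample
print(side_counts)                 # left vs. right preference
```

Even a tally this simple shows how individual "rules" become class-wide patterns - exactly the kind of patterns a model trained on this data would learn.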
Twitter Cropping Algorithm (25 minutes)
Remarks
It turns out that the issue of how to crop an image is something social media platforms have been working on for some time. We are going to watch a video that shows the issues that arose when Twitter used machine learning to train an algorithm to do this cropping.
Video Part 1: Watch the video “Are We Automating Racism?” from 0:00-4:05
Video Part 2: Watch the video “Are We Automating Racism?” from 9:30-13:46
Display: Show the next slide, which shows the definition of racism to use in the following remark.
Remarks
In the video, the primary reporter Jos asks her two colleagues, “So do you think this machine is racist?” We might all have different definitions of the term racism, but for the purposes of this discussion we are going to use this definition of racism - prejudice, discrimination, or antagonism directed against a person or people on the basis of their membership in a particular racial or ethnic group, typically one that is a minority or marginalized.
Discuss: Using this definition of racism, what aspect(s) of racism did the Twitter cropping algorithm do: prejudice, discrimination, or antagonism?
Discussion Goal: Students should gravitate towards prejudice and discrimination, since the Twitter algorithm gave one group different treatment based purely on skin color. Use this discussion to highlight the evidence brought up in the video, both from the reporters and from the tweets by other users they mention, showing how the algorithm was biased or prejudiced towards lighter faces compared with darker faces.
Lived Experiences: Students may bring in other examples from their own lives where technology has appeared biased or prejudiced against them based on their race or skin color, especially involving sensors and image recognition. Facial recognition in particular has a history of having racial biases against dark-skinned people, which may have impacted students in your classroom. Consider how you may want to validate the lived experiences of the students in your room while continuing to guide the conversation towards the solution-focused section of the lesson, where Twitter ultimately implemented a solution for this particular algorithm.
Looking to the Future: This activity puts students in a position to see how innocuous problems (like cropping photos) can lead to unintended consequences, and students may be uncomfortable reflecting on their experience with the widget and the consequences of their decisions. Assure students that these types of decisions are everywhere, beyond just this image-cropping issue, and that it's how we respond that matters (as we'll see shortly with how Twitter responded). Remind students that they are future leaders, and that being aware of how bias creeps into decisions and into our data will make them better equipped to solve problems like these when they see them in the future.
Discuss: How was the Twitter cropping algorithm trained? According to the video, where is a potential source of bias when training similar cropping algorithms?
Discussion Goal: The most important thing for students to come away from this discussion with is the fact that the machine is not "racist", but it is biased towards lighter faces because of the datasets used to train it. Use this discussion to ensure students understand how the model used the saliency datasets discussed in the video and how some of those datasets were trained on photo libraries containing images that do not represent the population as a whole.
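To make the idea of a saliency-driven crop concrete for yourself, here is a minimal sketch (not Twitter's actual model). It assumes the model has already produced a grid of saliency scores for an image, and it simply slides a fixed-size crop window to the placement with the highest total score. Whichever faces the training data taught the model to score as more "salient" will always win the crop, which is exactly how dataset bias becomes cropping bias:

```python
# Hypothetical sketch of saliency-based cropping: given a grid of saliency
# scores (how likely people are to look at each region), slide a fixed-size
# crop window and keep the placement with the highest total score.

def best_crop(saliency, crop_h, crop_w):
    """Return the (row, col) of the crop window with the highest saliency sum."""
    rows, cols = len(saliency), len(saliency[0])
    best_score, best_pos = float("-inf"), (0, 0)
    for r in range(rows - crop_h + 1):
        for c in range(cols - crop_w + 1):
            total = sum(saliency[r + dr][c + dc]
                        for dr in range(crop_h)
                        for dc in range(crop_w))
            if total > best_score:
                best_score, best_pos = total, (r, c)
    return best_pos

# A made-up 4x6 saliency grid where the model scored the right side higher:
grid = [
    [0, 0, 0, 1, 2, 3],
    [0, 0, 0, 2, 5, 4],
    [0, 0, 0, 1, 3, 2],
    [0, 0, 0, 0, 1, 1],
]

print(best_crop(grid, 2, 2))  # the crop lands on the high-scoring right side
```

The algorithm itself is neutral; the bias lives in the scores. If the saliency model was trained on photo libraries that under-represent darker faces, it will systematically score them lower, and this window will systematically crop them out.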
Prompt: If you were the CEO of Twitter and found evidence of this bias in your cropping algorithm, how would you respond? What steps would you take?
You may choose to have students share their responses if there is time, otherwise move to reading the article.
Remarks
Now that you’ve had time to consider how you might respond, we are going to read about how this actually played out and Twitter’s response when presented with evidence of bias in their cropping algorithm.
Do This: Have students read the article “Twitter says its image-cropping algorithm was biased, so it’s ditching it (CNN)”. After they are finished reading they should mark up the text with the following:
- Highlight / Underline: Any information in this article that you want to know more about
- At The End: Write a 10-word summary of the article
To Print or Not To Print? This lesson is written assuming that you have printed out the article and have it physically available for students to write on, even though it is also possible to have students interact with this text digitally. If students read the article digitally, it is most important that they still follow the active reading strategies outlined in this lesson - highlighting the text, writing in the margins, and summarizing. This may require some additional time & instruction to teach students your preferred tools of digital annotation, and may require some additional adjustments to some of the later annotation strategies in this lesson.
Wrap Up (5 minutes)
Discuss: Have students discuss their reactions to the quotes pulled out from Rumman Chowdhury, a software engineering director for Twitter’s machine learning ethics, transparency, and accountability team.
- Do you think Twitter’s response was appropriate?
- What are the risks and potential harms of systems, such as social media platforms like Twitter, becoming too dependent on machine learning?
Remarks
The story of Twitter and its photo cropping algorithm shows how something built with one intention (in this case, to better highlight photos) can lead to unintended effects (in this case, bias favoring white faces over Black faces) that can have harmful impacts on society or culture. It also provides an example of a software company taking actionable steps to address algorithmic bias: Twitter established its machine learning ethics, transparency, and accountability team, led by Rumman Chowdhury.
Diving Deeper - Rumman Chowdhury: Consider exploring the CSEdWeek CS Heroes resources on Rumman Chowdhury for additional activities, posters, and resources about Rumman, AI Ethics, and her work on bias in machine learning models.
(Optional) Video: Watch the video How AI Works - Societal Impact for an extended look at AI's impact on society and concerns about bias beyond just this example.
After the Lesson
Teacher Survey
We'd love to learn more about the folks teaching these lessons and the classroom experience. Please let us know in this How AI Works Teacher Survey.
Additional Lessons
If you'd like to teach additional lessons from the How AI Works video series, click here to explore additional resources.
AI and Machine Learning Unit
If you'd like to dive even deeper into AI and Machine Learning, consider exploring our 5-week unit on AI and Machine Learning. Click here to view the unit and learn more.
This work is available under a Creative Commons License (CC BY-NC-SA 4.0).
If you are interested in licensing Code.org materials for commercial purposes contact us.