Question 3: Map the Algorithm
- As you read or view a piece of information on-screen, pause and take in the full screen.
- Are there articles or videos suggested for you to view next?
- What do they look like they are about?
- What does this site predict about you?
- What intellectual or emotional direction is it funneling you toward?
Introducing This Reading Response to Students
Most of us—including our students—have had the experience of being on YouTube or Instagram, or some other platform, watching one video, and then scrolling or clicking to the next video, and the next, and the next, in a landslide of endless content that holds our attention. A few years ago, we may have marveled, How does it know what we like? But now, we know: one click sends us down an algorithmic pathway that is meant to be eternal. And the mechanisms behind the algorithms are constantly honing their abilities.
Students today know this. Still, it is important for them to remember they know it. Putting a layer of awareness between their minds and the next video is crucial to their critical abilities. Teaching them to have even a second’s awareness of This website thinks I want to see more of this will help them remember that they can control their own consumption. And more importantly, teaching students to think, The website knows I like x, so it thinks I might like y, can help them see how some algorithms are programmed to direct their thinking.
The power of algorithms to direct thinking has already been documented. One 2020 study found that YouTube’s algorithm channeled viewers toward extremist thinking (Basu). Software engineer Guillaume Chaslot (2019), who helped develop the artificial intelligence (AI) behind YouTube’s video recommendations, has traced how the algorithm has led viewers down pathways toward terrorist thinking and child pornography. He posits that as AI becomes more sophisticated, its predictions will fit viewers’ preferences almost seamlessly; this will result in less and less inappropriate and illegal material being reported or flagged. I recommend reading Chaslot’s Wired article “The Toxic Potential of YouTube’s Feedback Loop” for your own understanding.
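For teachers who want to peek under the hood with students, the feedback loop can be sketched in a few lines of code. The toy Python example below is a hypothetical illustration, not YouTube’s actual system: it simply tallies the topics a viewer has clicked and then recommends more of whatever has been clicked most, which is enough to show how a single click begins to narrow what comes next.

```python
import random
from collections import Counter

# A toy catalog of videos, each tagged with a topic.
# Titles and topics are invented for illustration only.
CATALOG = [
    {"title": "F1 race highlights", "topic": "formula1"},
    {"title": "Andretti-Cadillac bid explained", "topic": "formula1"},
    {"title": "Watercolor basics", "topic": "art"},
    {"title": "Sketchbook tour", "topic": "art"},
    {"title": "Snowboarding in Vermont", "topic": "snowboarding"},
    {"title": "Burton board review", "topic": "snowboarding"},
]

def recommend(click_history, catalog, n=3):
    """Suggest videos weighted toward the viewer's most-clicked topics."""
    topic_counts = Counter(video["topic"] for video in click_history)
    if not topic_counts:
        return random.sample(catalog, n)  # no history yet: show anything
    # Score each video by how often its topic has already been clicked.
    scored = sorted(catalog,
                    key=lambda v: topic_counts.get(v["topic"], 0),
                    reverse=True)
    return scored[:n]

# Simulate a viewer who clicks one Formula 1 video.
history = [CATALOG[0]]
for video in recommend(history, CATALOG):
    print(video["title"])
# After a single click, the top suggestions are already skewed toward
# Formula 1: the loop feeds the viewer more of what they just watched.
```

Even this crude sketch makes the commercial logic visible: the system does not know what the viewer values, only what the viewer has clicked, and it optimizes for more of the same.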
To introduce the idea in class, have students share their own stories of videos or posts that popped up in their feeds and what they thought about them. Students could discuss in small groups the times when video or post suggestions were well chosen, surprising, uninteresting, or even uncomfortable. Students probably understand the feedback loop aspect of their favorite platforms, but they may not understand the depth of the intentionality behind it or its commercial aims. Teaching them to become aware of this, even a little bit, will help them not only as critical thinkers but as reflective citizens and humans, ones who choose their own ideas.
Student Examples
Map the Algorithm for “Wolff: Andretti F1 Bid Made a Statement with Cadillac and GM Tie-Up,” by Armaan (grade 10)
As I was reading articles about recent news on Formula 1, I kept getting suggestions to read about other related articles. In these suggestions I would get statements like “Read Also: Why Andretti is both right and wrong with his F1 ‘greed’ accusations” (Cleeren and Noble). All the other articles, videos, and news were related to the current article I was on. All of them were about recent news in Formula 1. The website knows I was curious about information about recent Formula 1 news so it suggests more articles for me to read that might spark my interest, making me click on the article which makes me spend more time on their website which for the company means more money. All the algorithm is doing is making the consumer spend more time on their website making the reader feel curious and relieved while on the site but lazy and ashamed after wasting their time with mind-numbing information. The algorithm is way more intelligent than we think and manipulates our time and emotions starting with a simple suggestion. [TEXT: CLEEREN AND NOBLE 2023]
Map the Algorithm for “Dear Rider (2021): Official Trailer | HBO,” by Phoebe (grade 11)
After watching the movie trailer for the movie Dear Rider: The Jake Burton Story I noticed that the YouTube recommended videos changed from my usual art and drawing videos to not just snowboarding videos, but also movie and TV show reviews, specifically for the show Euphoria, which I thought was strange because I’ve never watched or searched up anything about that show. Dear Rider: The Jake Burton Story is about late Burton Snowboards founder Jake Burton Carpenter and how he advanced the sport of snowboarding. From what I heard Euphoria is about the lives of teenagers as they deal with a number of things such as drug addiction, gun violence, sexual identity issues, and mental health issues. I was confused as to why these videos were coming up on my recommended page because the movie and movie trailer have nothing to do with any of those things, but then I read the description box under the trailer. The description box said, “Dear Rider, an HBO original documentary and true story of how Jake Burton Carpenter turned a childhood pastime into a cultural phenomenon, premieres November 9 on HBOMax.” Euphoria is on HBOMax as well, so the YouTube algorithm must have associated the words HBOMax with the show Euphoria and began suggesting those videos to me, assuming I would be interested in those as well. [TEXT: VILLENA 2021]
First, Respond to the Student
When I saw Armaan’s comment about the reader “feeling lazy and ashamed,” I noted that Armaan might feel this way himself after too much surfing—who doesn’t? We talked about how websites are designed to keep viewers engaged but also about how we can come up with plans to disrupt that engagement if we would rather spend our time more meaningfully. While Phoebe seemed unfazed by the commercials for Euphoria, I wondered if she recognized how they could be extremely harmful to other viewers.
Then, Develop Critical Consciousness
Both Armaan and Phoebe touched upon deeper realities about algorithms in their RRs. Recognizing truths such as the looping design of algorithms and their financial goals is a major first step in practicing critical consciousness when online. Algorithms are designed to engage; the viewer is supposed to zone out. The shame that Armaan described is even part of the design, and algorithms can distract from that as well. Viewers will find themselves in new spaces, such as Euphoria, to the financial gain of big companies. Our work can help students become aware of these pathways.