
INFO 3050: Living an Informed Life through Information Literacy

This guide contains readings for the course, INFO 3050: Living an Informed Life through Information Literacy.

Module 2: Critical Thinking in the Twenty-First Century


(Image:  "News room of the New York Times newspaper" by Marjory Collins [Public domain])

In contemporary America there is a near constant barrage of messages (ideas, opinions, advertisements, etc.). Whether you encounter these messages on TV, online, on the radio, in print, or in conversations with others, your mind processes them somehow. If a message aligns with what you believe, you may not look too closely, accepting it almost unconsciously. If it challenges or contradicts what you believe, you may feel angry, afraid, or threatened. How many times have you dismissed information because it came from that one politician or celebrity you can't stand? How often do you repeat something you heard on a trusted news website or read on a trusted social media account without verifying it? In our hyper-mediated world, it's easy to consume and share information without really thinking about whether it's factual or has any real intellectual value. Critical thinking can be a powerful means of managing your feelings and beliefs about the world and the other humans in it.

Before we dive into critical thinking as a concept and skill, let's examine some of the factors that influence our lives as twenty-first-century Americans. For the purposes of this course, we will focus here on how we encounter information, how it's created, and some of its uses. We will also look more closely at some big ideas that can help us understand why critical thinking is so important.

As mentioned above, we live in a "hyper-mediated world". Let's break this term down to get a clearer understanding of what it means. "Hyper" comes from Greek and means "over, above, beyond," and can imply "exceedingly, to excess" [1]. The word "mediated" has its roots in a Latin word that means to "divide in half," and evolved into something like to "be in the middle" [2]. So, "hyper-mediated" means there is an excess of things in the middle, standing between us and what we encounter. Think of how much you can do with one smart phone. In a matter of minutes a person could take pictures of their lunch, distribute those pictures to hundreds of people through a social media app, read about a flood in Bangladesh, buy shoes, text a meme to a friend, read an email from their boss, finish an episode of a TV show, and schedule a dentist appointment. Day-to-day activities that used to require a variety of analog tools, networks of people, and long stretches of time can now flow seamlessly and almost instantly through a single piece of extremely advanced technology. While this is convenient, and in many ways an improvement, having this single device between us and almost everything we do puts us at the mercy of apps and the Internet. We don't always know how they function, where the digital data goes, or who uses it for what.

Nowadays, when people use the word "technology," they are likely referring to computing or information technology. They mean their smart phone or laptop. This suggests that these particular technologies have become central to our daily lives, how we work, play, and communicate. Let's take a step back and define "technology" more broadly. In 1937, the American sociologist Read Bain wrote that "technology includes all tools, machines, utensils, weapons, instruments, housing, clothing, communicating and transporting devices and the skills by which we produce and use them" [3]. In other words, anything used to extend our abilities beyond what we can do with our bodies and minds is a technology. Even a stick used by a chimp to extract ants from a hole in order to eat them could be called "technology" [4]. Still, it can be difficult to separate the word from the devices most of us rely on for our employment, entertainment, social interaction, and education. While these technologies facilitate twenty-first-century life, they have disrupted, replaced, or eliminated many of the processes and standards humans have traditionally relied on to legitimate authority, regulate quality, and authenticate accuracy.

A hundred years ago, film and radio were in their infancy, and the average American would have likely relied on newspapers or magazines to find out the scores of recent sports games, the result of the latest election, or what was happening on the battlefields of the Great War. Newspaper and magazine publishers relied on reporters, researchers, copy editors, managing editors, and even paper boys to get information out to the public. Although fact-checking had yet to become common practice [5], news stories passed through several hands before publication. Often called "gatekeeping," the approval process for news and magazine articles could be extensive. Other forms of print media, like books, usually faced similar scrutiny. Today, anyone with a webcam and a YouTube account can rant in their bedroom about the latest law passed by Congress and reach as many viewers as mainstream news anchors in multi-million-dollar production studios covering the same story. This makes the idea of gatekeeping seem almost old-fashioned. Even so, there are fair-minded people in all types of media spaces creating all types of content who strive for accuracy and quality in the information they produce. One major challenge we face in our hyper-mediated, technology-driven world is knowing the difference. We have to be the gatekeepers of our own minds, and critical thinking provides a means to ensure our beliefs about the world and others in it are fair, accurate, and well-reasoned.

Defining Critical Thinking

Though many of the principles behind critical thinking can be traced back to Socrates, a modern definition was formulated in the early 1940s by an American educator named Edward Glaser. He defined critical thinking this way:

"The ability to think critically, as conceived in this volume, involves three things: ( 1 ) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one's experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods. Critical thinking calls for a persistent effort to examine any belief or supposed form of knowledge in the light of the evidence that supports it and the further conclusions to which it tends. It also generally requires ability to recognize problems, to find workable means for meeting those problems, to gather and marshal pertinent information, to recognize unstated assumptions and values, to comprehend and use language with accuracy, clarity, and discrimination, to interpret data, to appraise evidence and evaluate arguments, to recognize the existence (or non-existence) of logical relationships between propositions, to draw warranted conclusions and generalizations, to put to test the conclusions and generalizations at which one arrives, to reconstruct one's patterns of beliefs on the basis of wider experience, and to render accurate judgments about specific things and qualities in everyday life."[6] 

For a simpler explanation of critical thinking, watch the following video.

Video: "What is Critical Thinking?" by teachphilosophy <>


[3] Bain, Read (1937). "Technology and State Government". American Sociological Review. 2 (6): 860–874. doi:10.2307/2084365 <>. JSTOR 2084365 <>.
[4] To see chimps do this, watch the video "Chimps & Tools | National Geographic" on YouTube <>.
[5] See this TIME article for a short history of fact-checking <>.

[6] Edward M. Glaser, An Experiment in the Development of Critical Thinking, Teacher's College, Columbia University, 1941.