Today Library Journal published the follow-up to my previous column on information literacy as an unnatural state: Education is no Salvation. In it I try to explore the motivated reasoning and cognitive bias literature a little more, with the goal of showing what we’re up against when educating people to be “information literate.” It’s definitely still a work in progress.
The question I’m now asking myself is why. What difference does it make if we’re more aware of cognitive bias, motivated reasoning, and all the tricks the mind plays? For the most part I’m content with Aristotle’s maxim that humans by nature desire to know, but librarians tend to be a practical breed, and the question I’ve often gotten when doing anything theoretical is what difference it will make in practice. Right now, I don’t know, but every practice is based on some theoretical construct, usually one we apply unawares.
In providing some context to the last column, I used the phrase “scholarly habitude” to describe what I think is one of the aims of higher education, at least in the traditional arts and sciences. It’s not a list of things we can do, but a state of being, a frame of mind, something along those lines. In some ways I’m going back to Aristotle and the notion of virtue ethics. Scholarly habitude captures better than “information literacy” the sense that being a scholar or academic researcher isn’t just about having a set of rules to follow. It’s also about being a certain kind of person: intellectually curious, skeptical, requiring evidence for at least some beliefs, etc. These traits aren’t necessarily abundant in people.
I’m thinking about this mostly in terms of teaching undergraduates how to research and write scholarly essays, which most of them are expected to do at some point. An example of one approach I mentioned in today’s column: the student who has something to say and wants some scholarly sources to support it. That’s the exact opposite of how people should approach research, but it’s the most natural way, and the one most in accord with how the human mind seems to work. We make snap judgments and then try to justify them. As Daniel Kahneman puts it in Thinking, Fast and Slow, our “System 1” comes to a conclusion very quickly, while our slower and more thorough “System 2” is usually happy just to accommodate System 1 without further prodding, because it’s also lazy. As Michael Shermer puts it in Why People Believe Weird Things, “smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.”
The wrong way to approach research on an unfamiliar topic is to have an opinion and then look for sources to justify it. The right way is to look for evidence and follow the evidence where it leads. There’s an academic analogy for this, but I’m not sure how far I want to pursue it. It’s similar to the distinction between theology and religious studies. I don’t want to say theology isn’t scholarly, just that it’s not really in accord with current information literacy standards in some ways. Theology can be defined as faith seeking understanding, meaning the theologian believes something to be true and then seeks to understand and justify that belief. Although I’ll leave open the possibility that some people do, in general people don’t hold religious beliefs for rational reasons based on evidence that could withstand public scrutiny. That’s why most religious people tend to practice the religion they grew up with and few convert to a different religion. Children growing up in a religion didn’t rationally choose to follow that religion, although later on many of them seek to understand their faith in a rational way. Hence, theology.
Religious studies, on the other hand, takes a different approach. We can use the insider/outsider distinction. Theologians study a religion from the inside, while scholars of religion often come at religions from the outside, trying to understand those religions without necessarily practicing them. They approach the available evidence and try to make sense of practices that might seem bizarre to outsiders, and to outsiders all religions have their bizarre practices. Understanding a religion as an outsider partly means explaining why strange practices don’t just exist because they’re practiced by crazy people. “They eat the body and drink the blood of who again?” “What kind of loving god would forbid bacon!?” “Your religion says I can’t publish a picture of this guy? WTF?” I’ve noticed lots of people like to make fun of Scientology without considering what their own religion looks like to people who don’t practice it.
Or a slightly different analogy, a bit broader. The traditional foil of theology is philosophy, and during the European Middle Ages philosophy was considered the handmaiden of theology, at least by the Catholic Church, whose intellectual standard was the one that mattered. During the 17th and 18th centuries, philosophy broadly conceived came into its own again, and philosophy became the queen of the sciences. Every study that wasn’t motivated by religion could be considered philosophy, and indeed what we now call natural science was called natural philosophy in the 18th century. That’s why we now have PhDs, doctorates of philosophy, in disciplines that we don’t consider to be philosophy by contemporary standards: they’re all involved in the same Enlightenment-driven enterprise, to discover and disseminate knowledge about the world. The way to do that is to approach the world with as few preconceptions as possible and see what you find. That approach explains why we (or perhaps “we”) no longer believe that demons cause epilepsy or that the earth is the center of the universe. Academics follow the evidence, unless they’re economists or philosophers, because those people just make stuff up.
If we use the theology/philosophy analogy, what we’re trying to do when we teach students about academic research is move them from a theological mindset to a philosophical one, where preconceptions, uninformed beliefs, and cognitive biases don’t motivate all of their reasoning. Writing what they know isn’t a good idea, because they don’t know very much; their experience of the world is limited, and their experience of scholarship even more so. Those preconceptions and biases should instead become objects of investigation themselves. That boundary has to be crossed before students can begin to examine evidence in the way information literacy standards suggest. Part of a good liberal education is about breaking down your past self to prepare to develop a better self.
So where does this leave library instruction? If all these cognitive biases and preconceptions are completely natural, extremely difficult to overcome, and probably impossible ever to overcome completely, how does this affect us practically? For one thing, it should lower high expectations. If you were unaware of all the ways the mind obscures and distorts reality for our benefit, and of how difficult making the philosophical leap really is, and you were already frustrated by how little you could get done in the hour you might spend with a class, this news should lower your expectations and perhaps explain your frustrations. If you thought a little library research instruction was going to have a remarkable effect, you should probably change your opinion.
Then there’s the question, what the heck do librarians do instead, or in addition? I don’t have any ideas on that yet, but I’m convinced so far that librarians play much more of a support role in this enterprise than some think we do.