Like It Was Written Yesterday

Another part of historical change rhetoric I’m looking at is the persistence of themes. Read these quotes and see if you could tell when they were written purely based on the language:

Applying technology is not a “one time” event, it is a continuing activity, since technology, whatever form it takes, is constantly changing. This reality is a key aspect of librarianship today and helps explain why our profession is clearly in transition….

Even without our deliberate choice, changes are being imposed on our working environment by technology as well as by other pressures external to and within our profession. In the past decade librarians have discovered that we must either initiate change or adapt to it. We simply can’t ignore new developments and hope they will leave us untouched. An ostrich-like attitude is downright dangerous….

Perhaps, as is already true in some specialized library service assignments, advanced degree studies in addition to MLS/information science type training will be frequently expected. Position ads in current library journals already show quite a variety of preferred background qualities….

Those are from an article in the Library Journal from 1985, “Managing Change: Technology and the Profession” by Karen L. Horny. The “past decade” in which librarians discovered they had to change was a big decade for library automation, although Horny also does a pretty good job of predicting how the rise of personal computers, the digitization of content, and the ability to do things “online” will change libraries and information. (“Perhaps electronic readers will become so compact and legible that it will be possible to curl up with a good online novel!”)

Statements almost identical to this appear in the current library literature all the time. The age of such themes, combined with the significant changes that have occurred in libraries over the last 30 years, seems to me an indication that angry or frustrated attacks on current librarians as hopelessly resistant to change don't have much evidence to support them. The elder librarians around today were the very ones implementing all the significant technological changes that resulted not from the Internet or the rise of social media, but from the initial automation of catalogs and indexes starting in the 1960s. It seems to me that wave of technological change was much more shocking for librarians of the time than our current situation, which is more or less a steady development building upon the drastic and rapid change that really happened 30 or so years ago. As people get older, perhaps they get more resistant to change, or perhaps not, but the retiring generation of librarians certainly lived through and implemented significant and rapid change.

Here’s another quote I left out because of the date:

It has been especially rewarding to see that some of a library’s longer-term employees have the greatest sense of the new technology’s benefits, since they can recall, often quite vividly, the limitations of former manual operations. It is also true that for most people who have entered the library field since the early 1970s, change is the accepted norm.

I’ve encountered plenty of examples of the exact same sentiment among librarians writing today, except the time frame is since the beginning of the 21st century or some such. Librarians without a historical knowledge of how technology has affected librarianship for the past 45 years or so are always in danger of making foolish claims about the current state of the profession.

30 Years of Change and Hype

For a possible research project, I’m reading around in the historical library literature about change in libraries. Here’s a great quote from John Berry in a Library Journal editorial from 10/15/83 about the first LITA conference:

The usual band of cheerleaders delivered typical, often condescending, pleas for everyone to get on this or that automation bandwagon, and the usual “experts” delivered typical indictments of working librarians who offered any resistance to the cosmic imperatives of the new age.

I’m trying to get an idea of just how long hyperbolic change rhetoric in librarianship has generated a specific kind of criticism, not of the change, but of the rhetoric. Now I know it’s been at least 30 years.

Mac: For the Middle Aged Librarian

Generally I take devout advocates of just about anything with a grain of salt, and in the tech world there are no more devout advocates for anything than for Apple products. It's not that I think Apple doesn't make great products; it's more that I don't think their computers are three times better than some competitors, which they should be to justify being three times the price. Also, I object philosophically to worshiping commercial products, companies, or CEOs. It just seems wrong.

Over the years I’ve owned numerous Apple products as well, so I’m not an anti-Apple purist, although I have on occasion poked fun at Apple purists. I got my first computer 28 years ago for my 16th birthday. It was an Apple IIc, and it didn’t do a whole lot. Fortunately the only thing I needed it to do was process words, and I wrote like a demon on that thing for eight years, when I replaced it with a Mac Color Classic, which was a great computer. If the Internet hadn’t come along and ruined things for it, I’d probably still have that computer. That lasted four years and I only upgraded because I needed something Internet-capable. Ever since then I’ve had a series of Windows desktops and laptops, most of which I’ve been happy with. Being cool or having a computer with a slick chassis just didn’t seem worth the price. I’ve used enough devices over the years that I’m pretty OS-neutral at this point, so that’s not a factor either. I also had an iPhone, but I never liked it as much as I like my current Samsung GS4. And although I have a 7″ tablet computer, I use it almost exclusively to watch Netflix while on my treadmill, so I’ve had no desire for the much more expensive iPad.

Thus, I’ve been coasting along for years without any reason to love or praise Apple, but then something happened. I got old. Specifically, my eyes got old. After a five-year hiatus I had an eye exam a couple of months ago. I hadn’t even been having problems and only went because I was taking my daughter and thought what the heck. The heck was that my eyesight had deteriorated such that they prescribed me trifocal progressive lenses. When I put them on the first time and started looking around, I thought I was going to throw up. It’s made worse by the current vogue for glasses with short lenses. I wear a comfortable pair of plastic frames much of the time, and my optician could find only one pair with temples long enough for my large head, and they have a short span from top to bottom, meaning that each of the focal lengths has a really small vertical span.

Suddenly, I had trouble seeing lots of things, but especially my computers, which is a serious problem given how much time I spend every day interacting with one. The only screens I could see at all well were my 27″ iMac at work, my 1080p, 15.6″, 6-pound Asus laptop at home, and my phone, and with the computers my horizontal vision has narrowed enough that I can only really use part of the screen comfortably and just keep shifting windows into that portion. My lighter, smaller netbook that I like to carry around the house, travel with, etc.? Things were looking very dim, especially if I moved from a higher resolution device. The larger computers are both good machines and I’ve been happy with them, and I’ll be using both for serious work where I have a lot of windows open, but I needed something more portable for everything else, and something with a keyboard because of all the writing I do.

So, since I have another birthday this month and got a promotion, I decided to splurge a bit and got a 13″ MacBook Pro with the Retina display. And I have to say, in all honesty, this is the best ultraportable laptop a middle-aged librarian with dodgy eyes and a weird eyeglass prescription could ask for. Apple is welcome to use that quote from me for advertising purposes. It's an even higher resolution than my Asus laptop, plus a lot lighter and smaller. I wrapped it in one of these obsidian babies so that I don't even have to pretend I'm trying to be a cool Mac owner (having avoided the covers where the ratings said things like "and the glowing apple still shows through!"). It also looks like the one the Winchesters have on their MacBook in Supernatural, and my daughter and I have been enjoying a Supernatural marathon on Netflix Instant (which is a show that could make for an interesting discussion of research methods, if you ask me). After so many years of gently mocking friends and family for their blind allegiance to Apple, I finally can wax enthusiastic about something from Apple that no one else I know of is offering. If your eyesight isn't what it used to be, and you spend a lot of time reading and writing with computers like this middle-aged librarian does, and you want something very light and portable, you can't go wrong with the 13″ MacBook Pro Retina. Praise doesn't get any less qualified than that, at least from me.
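For anyone curious about the arithmetic behind that "even higher resolution" claim, here's a minimal back-of-the-envelope sketch in Python. It assumes the 13″ Retina panel is the 2560×1600, 13.3″ display Apple lists for that model, and takes the Asus at the 1080p, 15.6″ figures mentioned above; the numbers are illustrative, not benchmarks.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed specs: 13.3" Retina MacBook Pro (2560x1600, Apple's listed
# resolution for that model) vs. the 15.6" 1080p Asus described above.
retina_ppi = ppi(2560, 1600, 13.3)  # roughly 227 ppi
asus_ppi = ppi(1920, 1080, 15.6)    # roughly 141 ppi

print(f"Retina: {retina_ppi:.0f} ppi, Asus: {asus_ppi:.0f} ppi")
```

Roughly 227 pixels per inch against 141, which is why text on the smaller screen can look sharper even though the screen itself is smaller.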

Putting Things in Perspective

For some reason a couple of older posts have been getting some recent traffic, one from a few months ago where I wrote about big name librarians and another much older one where I meditated upon my lack of fame. That last one is almost five years old, and while I’m probably better known among librarians than I was then, I don’t think I’m any more famous in any of the ways I wrote about. Still, after rereading those posts I felt like there was at least one more thing to say.

Some things have changed for me since I started this blog six years ago. My first post came on the day I officially received a promotion to “librarian with continuing appointment.” That’s our non-faculty-status equivalent of associate professor with tenure, and comes with more or less the same benefits. This month I officially received promotion to “senior librarian,” which is our equivalent to full professor. It was a nice honor, plus I got a higher percentage raise than usual. All to the good.

Also, since I started the blog, I’ve had a lot more opportunities to write and speak than before, and I’d have to credit this blog for leading to a lot of them. A blog post here inadvertently led to the book deal with Library Juice Press, and writing Libraries and the Enlightenment was the most professional fun I’ve had. It probably had something to do with being invited to join the great group of Peer to Peer Review columnists at the Library Journal. Possibly the name recognition has helped me win some elections within ALA that have been beneficial for my career. Good things have flowed from it. However, although I’ve done the writing, even the blog has benefitted from the person in OIT who supports WordPress and from all the other people I’ve interacted with online over the years.

I could focus on the me, me, me part of all this. I’ve had some success and I’ve also done a lot of work for it. In a sense, whatever success I have managed to have I’ve deserved, one could say. And I’m not saying I don’t deserve it. Those who know me well know that modesty isn’t exactly one of my strengths. Earning this recent promotion took a lot of work on my part. Some things I did deliberately over the years with the chances of promotion in mind, and some things just sort of happened, but nevertheless I wasn’t slacking. Just gathering up materials for my dossier took a lot of time.

On the other hand, what has struck me most about the whole process was how much the responsibility for it rested in other people’s hands, in fact a lot of other people’s hands. The more I consider it, the wider the circle of people and institutions that contributed gets. It’s kind of staggering when I start to think about it.

Just considering the promotion process directly, my supervisor had to write on my behalf. Somewhere between 10 and 20 other people in the library, in a couple of academic departments, and across the profession wrote positive letters of reference for me (at least I'm assuming they were positive). That alone was one of the best parts of the process, knowing that so many people were willing to write on my behalf, and I'm very grateful to them. A group of my colleagues had to read all that stuff and make a decision, which other people had to approve.

But it keeps on going. One of the factors was probably having a book published. Rory Litwin is responsible for offering me the contract and support through the process, but I could never have gotten the book done on time without two research leaves from the university, which required other people writing on my behalf or agreeing to grant them. Sure I wrote the book, and it was a lot of work, but without Rory and the Dean of the Faculty it wouldn’t have happened.

Another thing I assume played a role was my leadership within ALA. I know I’ve done some good work in the organization, but that wouldn’t have been possible without the generous conference travel support I’ve had my entire career, both at Gettysburg and at Princeton. Without that support, my career would have been very different. I wouldn’t have worked with as many good people over the years and wouldn’t have had the opportunities to do the things I have. There are also all the people within the organization who helped me out, taught me stuff, worked with me, and enabled me to do what I did. I’ve written before on how to run a good ALA committee meeting, but to get anything done you’ve got to have the actual committee. It’s not a one-person operation.

Keeping the faculty and students in the departments I serve happy is one part of my job, and a couple of professors probably wrote on my behalf. Typically that role for the faculty involves either buying stuff or solving problems. Buying stuff is a lot easier when you have generous acquisitions budgets, as I generally have had. Even with the buying, I'm just the middleman. There's a whole team of people who make the actual purchases, catalog them, link to them, make them accessible, and often quickly so that faculty and students get things as rapidly as possible. It's unfair that I probably get more of the credit for that than I deserve.

Solving problems presents the same situation. I "solve problems" usually by being a conduit between the professor or student with the problem and the person in the library who actually solves the problem. Database isn't working right? Well, I can diagnose and troubleshoot, but if there's a bill to be paid or tech support to be contacted, I'm not the one doing that. OPAC glitch? Um, yes, we have people for that. I'll contact them. Whatever the problem is, unless it involves a research project of some kind, my role is to find the person who really can solve the problem. I'm not saying that doesn't take some knowledge and skill or that I'm not a responsive and capable liaison. I'm just saying that without all those other people, I'm pretty useless for a lot of things and I know it. I'm pretty good at what I do, but without a whole bunch of other people being good at what they do, I couldn't be as good.

There are a lot of things I do more or less on my own and I could write about those things, but my goal here is to remember just how much I can’t. Thus, back to “big name” librarians, or “famous librarians,” or the amusing category of “rock star” librarians. I’ve never met any librarians who seemed especially stoked about their own alleged fame or celebrity status. They’re possibly out there, but we don’t run in the same circles. It’s always like that with me and celebrities, I guess. I’ve lived in New Jersey for eight years and have yet to socialize with Bruce Springsteen or even the cast of Jersey Shore. If they are out there, it would be impossible for me to take them seriously. For one, they’re still just librarians. Mostly, though, I know that however externally successful you are, and no matter how great you might actually be, you’re dependent on opportunities you didn’t necessarily create and a whole network of people who enable you to do what you do. Even real rock stars need great sound engineers.

I Don’t Appreciate that Fact

I’m learning all sorts of new things reading Will Storr Versus the Supernatural. From page 273:

'Do you believe in heaven and hell?' I ask [David Vee, the Founder of Ghosts-UK].

‘No,’ he says, ‘people were seeing ghosts BC–Before Christ–so that rules that one out totally. The earth is millions and millions of years old. You know, I think the Bible is a damn good book, but it’s nothing like the original translation. How can we translate that when we still have difficulty translating the original Latin, which is only five hundred years old? It’s very difficult, because it has so many syllabuses and nouns and whatever. It’s like the voices on the EVPs I’ve recorded here. Most of them are in German Latin, which is what people spoke until the nineteenth century. It wasn’t until eighteen-twenty-something that we began to speak English. A lot of people don’t appreciate that fact.

I read through that paragraph a couple of times trying to make sense of it but eventually gave up.

Stories We Tell Ourselves

During my travels to and from ALA I read a fun new book, Will Storr’s The Heretics: Adventures with the Enemies of Science. This is the latest example I know of in the genre of books about pseudoscience, although it differs significantly from the ones I read over Christmas break and blogged about here. Storr’s book is more informal, with his personal views and demons inserted alongside the reporting about various groups, from parapsychologists to alleged Morgellons sufferers (that was a new one to me).  This turns out to be a good thing, as his troubled mind and basic decency come through to allow the subjects of investigation to be seen with as much respect as possible. People who claim to suffer from Morgellons, for example, may indeed actually suffer from delusional parasitosis, but they get a fair shake from Storr.

Also, while he is clearly on the side of science and the skeptics, he's not afraid to expose dogmatic skepticism when it rears its supposedly rational head. A number of skeptics loudly declaring homeopathy to be bunk (which Storr and I both agree it is) don't like to be asked whether they've read actual scientific studies on homeopathy, and if so which ones. The harshest treatment anyone gets in the book (and that isn't very harsh) is when Storr catches James Randi up in a number of potential lies about his past. Apparently the hero of the skeptics isn't always a paragon of honesty. None of us are, though, which is one of the points the book makes. A tour of concentration camps with the Holocaust denier David Irving is less disturbing than it might have been because of Storr's focus on the illogical rather than the horrific. At one point Irving declares that a gas chamber couldn't have been used for executions because there are handles on the inside doors, although he fails to notice that the locks to the room are all on the outside. Another luminescent moment is Irving's declaration that he doesn't want to be anti-Semitic, but "the Jews don't make it easy for" him. We see what we believe.

The Irving chapter is an outlier of sorts in a book devoted to science and pseudoscience, but that’s because unlike some such studies, Storr is very much concerned with how the mind works and the tricks it plays on us. Even the skeptics can become quite dogmatic without being able to point to evidence for their beliefs. Storr tries to take the perspective of the agnostic, saying in effect, “I believe I’m right, but I could be wrong, and if possible I withhold judgment until I have real evidence.” It’s not very easy to do, if it’s possible at all, but Storr does a better job than I’ve seen in books like this. (His book Will Storr Versus the Supernatural, which I started reading after enjoying this, is much the same.)

I found the conclusions he reaches through readings and interviews regarding cognitive psychology the most interesting, and reminiscent of several articles I have read about such studies. Instead of explaining, I pulled out a few representative quotes summing up some of what he found out about cognitive dissonance, confirmation bias, confabulation, the Hero-Maker, and the stories we tell ourselves that make us out to be better and more moral than we really are. I pulled them selectively, in the order they appear in the book:

Humans are subject to a menagerie of biases, a troubling proportion of which hiss seductive half-truths in the ear of our consciousness. They tell us that we are better looking, wiser, more capable, more moral and have a more glittering future in store than is true. (110)


We typically have a bias that tells us we are less susceptible to bias than everyone else. Our default position tends to be that our opinions are the result of learning, experience and personal reflection. The things we believe are obviously true–and everyone would agree if only they could look at the issue with clear, objective, unimpeded sight. But they don’t because they’re biased. Their judgements are confused by ill-informed hunches and personal grudges. They might think they’re beautiful and clever and right but their view of reality is skewed….

Most of us think we are the exception. This most disturbing of truths has even been widely demonstrated in study after study. When individuals are educated about these ego-defending biases and then have their biases re-examined, they usually fail to change their opinions of themselves. Even though they accept, rationally, that they are not immune, they still think as if they are. It is a cognitive trap that we just can't seem to climb out of. (112)


Just as the knifefish assumes his realm of electricity is the only possible reality, just as the hominin believes his tricolor palette allows him to see all the colours, just as John Mackay is convinced that lesbian nuns are going to hell, we look out into the world mostly to reaffirm our prior beliefs about it. We imagine that the invisible forces that silently guide our beliefs and behavior, coaxing us like flocks of deviant angels, do not exist. We are comforted by the feeling that we have ultimate control over our thoughts, our actions, our lives….

There are seven billion individual worlds living on the surface of this one. We are–all of us–lost inside our own personal realities, our own brain-generated models of how things really are. And if, after reading all of that, you still believe you are the exception, that you really are wise and objective and above the powers of bias, then you might as well not fight it. You are, after all, only human. (113)


But all this is not enough. Cognitive dissonance, confirmation bias, the brain’s desire to have the outer, real world match its inner models of it–it takes us part of the way there. It tells us that a properly functioning brain cannot be trusted to think rationally and, because our minds play these tricks without telling us, that owners of brains cannot be trusted to judge their own rationality. (224)


We are natural-born storytellers, who have a propensity to believe our own tales…. A series of remarkable scientific discoveries, going back to the nineteenth century, have bolstered this view. They have assigned it a word, which describes what we do when we unknowingly invent explanations for behaviors and beliefs whose causes we are actually ignorant of: confabulation. (234)


The stories that we tell ourselves are another essential component to all this. The model of the world that we build for ourselves to live within is made of observations of cause and effect that are soaked in emotion. These micro-stories, whose purpose is to explain and predict the world, can grow into staggering tales of magnificent drama and complexity. In The Political Brain, Professor Westen writes 'research suggests that our minds naturally search for stories with a particular kind of structure, readily recognizable to elementary school children and similar across cultures.' In this structure, a crisis strikes a settled world, heroic efforts are begun to solve it, terrible obstacles are surmounted and dreadful enemies are battled, until a new and blissful state is achieved. According to Professor Westen, the political left and the right each has a 'master narrative' that reflects this structure–a grand, over-arching plot that comes loaded with a set of core assumptions, that defines the identity of heroes and villains and promises a paradisiacal denouement. (254)


The Hero-Maker tells us why intelligence is no forcefield and facts are no bullets.... Facts do not exist in isolation. They are like single pixels in a person's generated reality. Each fact is connected to other facts and those facts to networks of other facts still. When they are all knitted together, they take the form of an emotional and dramatic plot at the center of which lives the individual. When a climate scientist argues with a denier, it is not a matter of data versus data, it is hero narrative versus hero narrative. David versus David, tjukurpa* versus tjukurpa. It is a clash of worlds.

The Hero-Maker exposes this strange urge that so many humans have, to force their views aggressively on others. We must make them see things as we do. They must agree, we will make them agree. We are neural imperialists, seeking to colonise the worlds of others, installing our own private culture of beliefs into their minds. (384)


*Tjukurpa: Every Aboriginal newborn is assigned a 'tjukurpa'–a story from the time of the world's creation which, in its details, will tell them everything they need to know about where to find food, medicine and water for hundreds of miles around. It will teach them about magic and spirits and detail an elaborate moral code. (372)

We all tell stories about ourselves where we’re the heroes, other people are the villains, and our heroic acts save the day somehow. Well, we don’t all tell such stories. Apparently, really depressed people tend to have a more realistic understanding of their own lives than the majority of us who can believe our own hero narratives. There are a couple of ways to look at this. Modern psychology seems to be in the business of tricking our brains back into believing we’re living meaningful lives and not thinking about what relatively insignificant specks of matter we are in the universal scheme of things. The other way out is to try to back away from conventional views and interpretations of the world and just accept it as it is, understanding as Nietzsche put it that “it’s only as an aesthetic phenomenon that the world and existence are continually justified.”

What we shouldn't do is believe that modern psychology is coming across something so startlingly new about our self-narrative skills that the knowledge is completely unprecedented. It seems to me like we're finally starting to understand the details of things that even some ancients understood in very broad terms. At least two ancient philosophical traditions, the Daoist and the Stoic, seem to have been aware of just how tricky and biased the mind can be when interpreting reality; only instead of being suspicious of reality, as the Platonic tradition was, and positing some more real reality beneath the appearances, they recommended not allowing conventional knowledge and prejudice to judge that reality.

For example, here’s a passage from the Stoic Marcus Aurelius’s Meditations (the OUP Farquharson translation), Book 8, section 49:

8.49. Do not say more to yourself than the first impressions report. You have been told that some one speaks evil of you. This is what you have been told; you have not been told that you are injured. I see that the little child is ill; this is what I see, but that he is in danger I do not see. In this way then abide always by the first impressions and add nothing of your own from within, and that’s an end of it….

Marcus's point seems to me to be a similar understanding of the ways we bring our prejudices and biases automatically to help us interpret the world. Say something bad about me? I'll hate you for harming me! The relevant Greek here is: μένε ἀεὶ ἐπὶ τῶν πρώτων φαντασιῶν, literally "always stay with first impressions," or perhaps "appearances." (I double-checked that one with our Classics librarian. Thanks, Dave!) If the scientists Storr consulted are right, that might not be possible to do, since even our awareness of our biased brain isn't enough to make us think we're not biased. It also seems true that intelligence as such is no corrective. Even philosophical training, which helped shake loose a good number of my childhood prejudices, doesn't keep us from telling biased and heroic stories about ourselves. (For some evidence, follow the self-defensive moves in the Colin McGinn scandal within philosophy. You might conclude, as I did, that sometimes a handjob is really a handjob.)

The same general idea shows up in the Handbook of Epictetus (translation from the Everyman edition):

45. Does someone take his bath quickly? Do not say that he does it badly, but that he does it quickly. Does any one drink a great quantity of wine? Do not say that he drinks badly, but that he drinks a great quantity. For, unless you understand the judgment from which he acts, how should you know that he is acting badly? And thus it will not come to pass that you receive convincing impressions of some things, but give your assent to different ones.

The Daoist tradition makes what looks to me like a demand similar to the Stoics'. Here's a passage from stanza 3 of the Dao De Jing (the Ames and Hall translation):

They weaken their aspirations and strengthen their bones,

Ever teaching the common people to be unprincipled in their knowing (wuzhi)

And objectless in their desires (wuyu),

They keep the hawkers of knowledge at bay.

It is simply in doing things noncoercively (wuwei)

The key term here is wuzhi, which Ames and Hall translate as “unprincipled knowing,” although based on their explanation I prefer the phrase “unprejudiced understanding,” as in trying to understand something without the biases and judgements we bring to everything. In the introduction, they analyze the “wu forms”:

Wuzhi, often translated as "no-knowledge," actually means the absence of a certain kind of knowledge–the kind of knowledge that is dependent upon ontological presence: that is, the assumption that there is some unchanging reality behind appearance. Knowledge grounded in a denial of ontological presence involves "acosmotic" thinking: the type of thinking that does not presuppose a single-ordered ("One behind the many") world, and its intellectual accoutrements. It is, therefore, unprincipled knowing. Such knowing does not appeal to rules or principles determining the existence, the meaning, or the activity of a phenomenon. Wuzhi provides one with a sense of the de of a thing–its particular uniqueness and focus–rather than yielding an understanding of that thing in relation to some concept or natural kind or universal. Ultimately, wuzhi is a grasp of the daode relationship of each encountered item that permits an understanding of this particular focus (de) and the field that it construes. (40-41)

At least as I’m understanding it, practicing wuzhi would be akin to relying upon Marcus’s proton phantasion, or first impressions. This might be ultimately impossible, and after his investigations Storr seems to think so. Even if we’re aware that we have biases, prejudices, or “principles,” we can’t necessarily be aware of what they are, and it could be we’re no better off than we were before.

This is the point at which I'm torn. Perhaps we are the center of the stories we tell about ourselves, and we tend to dismiss those unlike us and secure ourselves in a cocoon of self-congratulatory good feeling, but couldn't keeping that awareness as constant as possible be helpful in our dealings with others, as well as in our understanding of ourselves in relation to the world? We might not be able to escape the mind's trap, but if we know we're in a trap we're maybe a little better off, or at least a little less arrogant and cocksure. An awareness of the problem all round can only help communication.

I was going to apply some of this to various library disagreements I’ve encountered recently, but I’ve gone on long enough and will save that for another post or column. It does have application to problems in the profession and the workplace, but right now I’m still pondering. It’s a lot to think about.

Predictions of the Library’s Future

I’m working on another library history project and having fun reading through some old library literature. Here’s a good example of librarians trying to predict the future, from a 1933 Jesse Shera article in the Library Quarterly, “Recent Social Trends and Future Library Policy.”

With the older people constituting an increasingly larger percentage of our population, the demand for leisure-time activities and the services of the librarian should increase, while the children’s librarians, relieved of the burden of ever increasing numbers to serve, can shift their attentions from quantity to quality. Further, the curtailing of immigration will not only be reflected in our rapidly falling birth-rate, but our population will more and more become racially homogeneous,* so that library work with the foreign born will become decreasingly important.

*T. J. Woofter, "The Status of racial and ethnic groups," Social trends, I, 553-601.

He sure got that one wrong.

On Librarians Writing

It seems to be the month for librarians to write about writing. Within the past week I’ve read three different articles or blog posts about librarians writing: Emily Ford’s Becoming a Writer-Librarian in In the Library with the Lead Pipe, Trudi Bellardo Hahn’s and Paul T. Jaeger’s From Practice to Publication: a Path for Academic Library Professionals in College & Research Libraries News, and Joanna June’s Learn to Write (Well) at the Hack Library School blog. They’re all worth reading for potential writing librarians, and they made me reflect a bit on my own life as a writer. As an experienced writing teacher who has managed to publish some professional writing in a variety of formats, I thought I’d toss out my thoughts on writing as well.

Hahn and Jaeger write for academic librarians wanting to publish research, and their advice is more specific than the other articles. They have a helpful chart of different ways to proceed toward publication, with four categories: A) Highly Competitive Publications, B) Less Competitive Publications, C) Unpublished Presentations, and D) Support/Service/Recognition. I've done a bit of A, C, and D, and a whole lot of B. Since I don't care whether my publications are "highly competitive" or not, I can work comfortably in the "less competitive" domain. This is also what they suggest for new librarian writers. "Academic librarians who are just starting out should consider all the options available in column B." Column B includes articles in non-refereed journals, magazines, or newsletters; columns in a journal or magazine (permanent); guest columns; and blogs among others. That seems like good advice. Looking back at my CV, two of my first three publications were in C&RL News itself, and the next few were guest columns or editorials or entries for a column I edited. I was a librarian for years before I published anything peer-reviewed, and I avoided tenure-track jobs so I wouldn't have to write before I had something to say. They also give the good rhetorical advice that "a highly competitive outlet is not necessarily always the best fit for a project, and the desire to have materials published in a certain type of outlet should not be prioritized at the expense of determining the most appropriate outlet and audience." That's hard advice for someone needing to publish peer-reviewed articles for tenure, but still sound. Some topics need a book and some a blog post.

Ford also gives some good advice. The “Writing is Social” section reports about her participation in Academic Writing Month and Digital Writing Month, which provide social incentives and support for writers. I’ve never been a social writer myself, but if there’s one thing I’ve learned about writing it’s that writers use all kinds of different tactics to be productive. I usually don’t show my writing to anyone before sending it off for publication, but I’m a good self-editor with a lot of experience. Most people would benefit from a “community of practice,” I suspect. “Reflecting on Writing” suggests reading about writing, which might be something librarians don’t think about doing. I haven’t read her suggestions before, but with 17 years off and on teaching writing, I’ve read a lot of books on writing. Ford’s choices seem concerned with making writers more productive or overcoming blocks. I’ve never had writer’s block or difficulty organizing a writing project, so those aren’t issues for me, but I know they are for a lot of people.

However, I also think it's a good idea for potential librarian writers to read nuts and bolts type books they might otherwise skip. Examples I've profited from in the past include: Jacques Barzun's Simple and Direct: a Rhetoric for Writers, Joseph Williams' Style: Lessons in Clarity and Grace, Richard Lanham's Revising Prose, and Diana Hacker's A Writer's Reference. Hacker is the most basic, and I probably wouldn't have read anything like it had I never taught writing. However, it's still worth the time. I've encountered lots of writing from students and other unpublished writers who still have basic problems with subject-verb agreement or pronoun-antecedent agreement, or who don't know whether to use which or that with restrictive clauses, or for that matter don't know what a restrictive clause is. Williams' book is an advanced guide to prose style and Barzun's is full of solid rhetorical advice. Lanham's short book is a good guide to revision, which is something lots of writers dread.

June’s short blog post contains useful tidbits for writers: read critically, write a lot, step away from your writing for a while before revising, and proofread by reading your writing aloud. All are sound suggestions.

Ford admits that she always wanted to be a writer. I'm sort of like that. I remember in the 5th grade wanting to be a fiction writer, but as I grew up that goal changed as I studied new subjects. At various times from age 13 on I've wanted to be a journalist, a photojournalist, an architect, a musician, a fiction writer (again), and an English professor. But mostly it's been writer. I still write fiction sometimes, and have a couple of finished novels and lots of drafts, but I don't bother trying to publish any of it because it takes too much effort and I don't really care if it's published because I write for my own satisfaction. Librarianship was what I ended up with after I'd abandoned other ideas, and it's turned out that librarianship has offered me plenty of opportunity to write anyway.

When you’ve done something for so long, it’s sometimes difficult to articulate how you do it, but I decided to try. Below are the activities that I think have had the most positive influence on my writing.

Teaching writing

Teaching writing has probably done more to improve my writing than anything else. My writing was already strong in college, and I got through many a class with an A merely by my ability to crank out thousand-word essays at a rapid pace. However, that ability was mostly because of reading (see below). Teaching writing improved that ability significantly because I did two things I'd never done before: I read a lot of writing by inexperienced writers, and I studied the nuts and bolts of writing. People who read a lot learn what good prose reads like, but they don't necessarily think analytically about how writing works. As a teacher, my job wasn't just to grade essays, but to give specific advice for improvement, and to do that I needed to figure out what was wrong. Why didn't that sentence work? What's wrong with the organization of this essay? And what specifically can that writer do to improve? Doing that is a lot harder than you might think. Teaching writing gave me a vocabulary for discussing writing that helps with my own writing, but that I might not have gotten otherwise.

Teaching also exposed me to a lot of bad writing. People who read a lot of published prose might not realize how very bad most writing is. Especially in the pre-blog days, truly awful writing was unlikely to be published anywhere with much of a readership. But wade through a stack of 40 student essays and you realize that writing is a far from natural experience. Reading bad writing makes it easier to figure out what good writing is, though. Reading great novels or polished essays critically can teach you a lot (see below), but reading bad writing gives you a new perspective. If you can figure out why it’s bad, you’re on your way to looking at your own writing more critically.

Reading Everything

Writers read. This shouldn't even have to be mentioned, but I can almost guarantee you that if you never read, you'll never be much of a writer. By reading, I don't just mean novels or great literature. In fact, unless you want to write fiction, there's not any need to read a lot of fiction. But it helps to read everything: novels, short stories, poems, essays, news articles, encyclopedia entries, cereal boxes, blogs, tweets, comic books, textbooks, book blurbs, and even scholarly articles in library science. I've been an avid and indiscriminate reader ever since I can remember, and every little bit has contributed to my development as a writer.

Reading Critically

On the other hand, being an indiscriminate reader all the time isn't a good idea. You need to learn to read critically. Good writing as well as bad writing can be read critically. Understanding how bad writing works can help you avoid it in your own work, but understanding how good writing works is more important and more difficult. My writing efforts are mostly novels and essays. I'm curious about how both work. My time as an English major and grad student has given me a large vocabulary to understand and describe writing in a critical way, which is sometimes different than how I might think about writing as a writing teacher. Thinking about writing as a critic and thinking critically as a practitioner are both helpful.

Writing Every Day

Writers write. This was the advice I once gave someone who was pestering me over drinks about writing. She thought the drunken escapades of her life would make great blog fodder. Maybe. But although writers often like to drink, the most important thing is writing. Writers write, usually every day. It’s a bad sign when people keep talking about what they would write if only they could get around to it. It doesn’t even matter that much what you write. Sure, if you’re working on an article under a deadline, you might focus on that, but writing about anything helps. You can even write about how you want to write and have nothing to write about. It’s still good practice. And if you can just write a page a day on a writing project, that still adds up pretty quickly. Writing has been a daily part of my life for 25 years, and that constant practice is part of what makes it so easy for me.

Writing in my Head

Writers write and read about writing, but a lot of writing looks suspiciously like staring into space, or walking, or (in my case) lying in bed early in the morning with your eyes closed. William Wordsworth used to go for long walks composing poetry in his head. I’m frequently thinking about things I might write, conjuring up thesis and motive in my head, pondering possible organization. The actual typing part is often the last and fastest part of the writing process, at least for me. So while you should write every day, you could also spend time every day thinking about whatever you might want to write.

Accepting Constructive Criticism and Editing

Inexperienced writers are often insecure writers as well. They don't like criticism. It offends them that someone would think their writing needs improvement. The writers most resistant to criticism are also usually the worst writers. Experienced writers learn that writing is just words on a page, and they learn to appreciate constructive criticism. This is different from merely negative feedback, which can be dismissed. Someone recently called a piece I wrote "pure bilge" in public, but that's the sort of emotional and derisive comment that makes me question the reader's judgment, not my writing.

Writers should also learn to accept editing. I write pretty well, but I’ve had numerous editorial suggestions over the years. With very few exceptions, I’ve taken the suggestions and revised accordingly, because good editors aren’t trying to change your message but improve its communication. Sometimes writers resist editing out of pride or insecurity, and sometimes it’s out of exhaustion. Years ago a librarian asked me to read an article he’d written before submission to a journal editor. I agreed. It was an incoherent 50-page article. That was the bad news. The good news was that with a bit of work it could be turned into two very good 25-page articles, and I suggested how to do that. He was exhausted and said he’d just trust the editor. I replied that if the editor was any good he’d say the same thing I did. Despite considering my comment arrogant at the time, he later sheepishly admitted that the editor was pretty good, plus he got two good articles published instead of one.

Revising

Some writers can produce polished prose on the first sitting without any revision at all. I can occasionally do that with short blog posts, but for most writing I revise, sometimes a little and sometimes significantly. Even with writing as informal as blog posts, I often reread the piece several times, making major or minor corrections as I go through. This is where knowing the nuts and bolts of writing helps the most. Writing a draft is sometimes mindless, because the goal should be to get some words down, but revising should be thoughtfully done. For blog posts my goal is clarity and coherence, but when writing for publication I spend more time thinking about everything from sentence structure to organization.

So that’s my story. Every writer’s story is a little different, but those are the activities that I think have helped my writing the most.