Sunday, December 27, 2015

RSL Published!

2015 has been an exciting year for this blog. I hope you have enjoyed it as much as I have. To put a cap on it, I'm happy to announce that an essay that was constructed out of posts that go as far back as 2005 has just been published in a special issue of Knowledge Cultures. I am grateful to Sharon Rider for her encouragement and editorship, which has yielded a text that I'm very happy to call my own and see in print. I am also happy to see it in the context of what looks like a very interesting issue, which I'm looking forward to reading and engaging with.

The essay is dedicated to Steve Fuller, for what I hope are obvious reasons to anyone who is familiar with his work. Here's the abstract:


Since 1968, at least, academia has been subject to the “crisis of representation.” This essay explores the consequences of the “postmodern condition” for the identity formation of academics. It is informed by Foucault’s and Deleuze’s understanding of the pivotal intellectual developments in the late 20th century, which are taken to challenge Wittgenstein’s presumption that language is essentially about the assertion of facts. Instead of abandoning representation, however, it proposes to meet this challenge squarely, proposing a disciplined engagement with its particular difficulty. Facts are deployed in academic writing, it argues, through the act of scholarship. The ability to represent a fact is at the core of the knowledge that is implicit in the self-formation of the scholar.

Keywords: postmodernism; self-fashioning; representation; scholarship; writing; facticity

Readers of this blog will, hopefully, recognize some of the ideas and the general spirit of the thing. If I find time, I'll make a little index here of the posts that it draws from. Do feel free to use this post also as a place to engage with the ideas in the essay, with which I'm sure there's much to disagree.

Monday, December 21, 2015

Scientific Writing and "Science Writing"

"The standards of this criticism alter to the degree that historiography approaches journalism." (Martin Heidegger)

For me, 2015 will be the year that I finally lost all respect for "science writing". Not that I had been holding the genre in especially high esteem until now. Five years ago I participated in what Brayden King called a "backlash" against Malcolm Gladwell's "tsunami of wrong". I was especially worried about the reverence that social scientists have for his writing, and the influence it seems to have on the way they think and teach. I even found myself comparing Gladwell with Daniel Pinchbeck, who was at the time among those arguing that December 21, 2012 marked "the end of time" (in an admittedly only vaguely specified sense). (Last week, I finally got around to buying Terence McKenna's Food of the Gods, which has inspired much of Pinchbeck's project. More on that in the new year.) To my own surprise, I found Pinchbeck's writing to be more credible than Gladwell's (in a sense that I really hope you'll let me specify).

Even back when I was writing my PhD, my romance with popular science was beginning to unravel. I recently rediscovered an old anecdote about Erwin Schrödinger that I read in Leon Lederman's popular The God Particle, written before the Higgs boson was discovered, and before I lost my faith in the genre. At that time I was just beginning my master's studies in what would turn out to be the philosophy of science. But I was getting my history of science by this highly unscientific means. Indeed, this was the source of my understanding of physics beyond the high school level, and I cringe a little now recalling the confidence with which I declared what the metaphysical consequences of quantum mechanics are.

It was probably while reading Richard Dawkins' The Selfish Gene as a PhD student, that I started to realize that something was amiss with my approach to what is known by science. I had gone into my reading of the book expecting simply to "deconstruct" it (that's the sort of thing I was doing back then) but I found it to be a very compelling and illuminating read. It really did teach me, or so I thought, a great deal about evolutionary theory, and about how evolution "works". It helped me to understand things I had previously only vaguely believed. But at the time, a friend of mine was also telling me about the somewhat radical ideas of Paul Shepard and his "pleistocene paradigm" so I was thinking very hard about the evolutionary account of what it means to "be human". Also, though Steve Fuller hadn't yet testified in the Dover case, I'm sure I had discussed intelligent design with him by then. There was much to think about, in short, and I was trying to make up my mind.

I remember very clearly reading Dawkins' chapter on "the extended phenotype". I won't try to do it justice except to say that Dawkins made a compelling case for the idea that some of our features are the expression of genes that are not our own. (Some snails, as I recall, have thick shells because these shells are not to their own evolutionary advantage but to the advantage of parasites they carry. The parasites, not the snails, transmit the "selfish gene" that produces this thickness.) Bringing together my conversations about Paul Shepard and intelligent design, and no doubt my earlier reading of Lederman, I began to work out a theory of the "God Genome", i.e., a sense in which all the human body's traits are actually to God's (or some "advanced" alien species') overall advantage in the universe, or simply incidental products of some other, more or less divine, advantage, and not really to the benefit of our own genes. It was heady and exciting stuff.

But then a kind of depression set in. Dawkins himself said that his chapter on the extended phenotype was really just a summary of a book he'd written for a less popular audience, i.e., an actual work of science, a piece of scientific writing. My friend was writing a dissertation about the evolution of cognition and would constantly correct me on elementary factual errors. It was frustrating for us both. He felt like he was teaching me high school biology; I felt like he was stifling my creativity. He, of course, was more right than I was. I simply had no basis to propose a paradigm-shifting account of human nature that makes of our bodies a divine "emanation". Though it was very exciting to think about these things, it just wasn't a very serious intellectual activity. I lacked a proper basis in science. I lacked knowledge. I was an ignoramus.

This, like I say, despite reading a great deal of popular science writing. As I've come to understand, especially since the invention of the TED talk (a "dark art"), it gave me the feeling of knowing without actually providing me with knowledge. Popular presentations of science tell us stories about what is known without giving us the critical foundations we need to engage with that knowledge, i.e., to question those stories. I know there are some people who will say that Darwin's Origin of Species is essentially a work of popular non-fiction. But the important difference is that his "public" was highly educated. They didn't lack the knowledge to engage with his ideas, only, perhaps, the time and equipment. Someone who had the necessary resources would not need a more "specialized" version of his argument before his criticism could be of use.

At some point, perhaps around the time of The Selfish Gene, this stopped being true. Evolution became a theory you should believe even if you don't understand it, and even if it is beyond your abilities to understand it. The public became thankful for popularizers who could give people the feeling that they were "in on" this important theory. I don't have a good historical account of this process worked out yet, but as I write this post, it seems to me to be a worthwhile project to try to pinpoint the moment that scientific belief and a real scientific understanding were separated from each other. It is a consequence of the enormous advances in science and technology, of course, and the specialization that has driven it. I fear this has also affected the quality of our scientific writing.

I will definitely have to say more about this in the weeks to come. After all, the concept of "academic writing", my bread and butter, tells us a little about what is being lost. Knowledge was once something you acquired through years of study, guided by books, but framed by a classroom (other people), an observatory (other vistas), a laboratory (other experiences), a library (other books). If you did not have access to these "academic" conditions you did not presume to understand the topic. Scientists wrote about their discoveries for people who had the knowledge, intelligence, time and apparatus to test them. These days, "science" is becoming something that is produced in a lab and consumed in a book you buy at the airport.

Thursday, December 10, 2015

Academic Writing Is for the Birds?

"It’s tempting to just give up, and I usually do. But it’s also tempting at times not to, so here I go." (Duncan Richter)

As I've had occasion to say a few times before, one of the advantages of blogging is that it opens the possibility that Thomas Presskorn will read something you've written and think out loud about it. Recently, it gave him the chance to point me in the direction of Duncan Richter, whose blogging I hadn't been aware of. His taste in music notwithstanding, he's written a brilliant post in a somewhat back-handed defense of the humanities, very much in a key I resonate with. (I think I managed to justify that otherwise gratuitous jab at Belle and Sebastian at the end of that sentence, didn't I?)

Richter brings together two troubling trends in (indeed, inside) higher education, under the rubric of "making meaning" as an alternative to, I guess, actually knowing things. My own view, of course, is that higher education should make us more knowledgeable. It should not just imbue our students with greater knowledge of the world in which they live, and the history that shapes their lives, but also make them better able to know things going forward, giving them greater mastery over themselves and, therefore, making them more bearable to their fellow humans.

The first example that Richter has found is from a recent column in Inside Higher Ed by Barbara Fister, a librarian at Gustavus Adolphus College. (I'm the resident writing consultant at the Copenhagen Business School Library, and work very closely with librarians, so this connection is highly fortuitous.) The crucial paragraph reads as follows:

From a librarian’s perspective, I'm wondering how can we address a situation where the basic epistemological foundations of our practice are up for debate. For academic librarians, the new Framework for Information Literacy has a strong emphasis on context and on making meaning rather than finding and evaluating it in finished form. It’s not so much “here’s how to do it right” as “if you have a critical understanding of how these social systems operate, you’re better positioned to participate and raise questions.” I’m still a bit skeptical that librarians can effect this shift in perspective – it has to be built into students’ coursework – but it invites us to model a more critical and big-picture understanding, from the fifty-minute one-shot instruction session on up.

I'm not going to hold Fister accountable for the entire ACRL information literacy framework, but I will, it seems, have to engage with it as closely as I've been engaging with the "post-process" tradition in composition studies. It looks as though we're going to need to make specific efforts to bring about a shift back from "if you have a critical understanding of how these social systems operate, you’re better positioned to participate and raise questions" to, simply, "here’s how to do it right". It's not that it isn't important to understand social systems critically or to participate by raising questions. It's just that getting some basic things "right" is, in fact, foundational.

I've long been arguing that our "basic epistemological foundations" are not discovered by theoretical debate but by practical engagement. There's a craft to scholarship and the state of our foundations depends on the state of the craft. The only way to learn a craft, in turn, is "by doing". And this does actually mean "doing it right", in the sense of working towards an ideal, held to a standard that allows others to judge that you've done it wrong, which is the essential experience of learning.

And this brings us to the second reading that would have driven Richter to "drink and despair" if it hadn't occurred to him just to go out and look at birds. This is another article in Inside Higher Ed that presents "findings" to show that it's the quality of a writing assignment, not the amount, that matters. As Thomas pointed out in a comment, Richter's summary of this result is rather apt: "the report provides weak evidence that the bleeding obvious is indeed true." I think this is the sort of issue Andrew Gelman sorts under the importance of considering your "priors". But Richter makes an important point about how far the field of composition studies has moved us away from a grounded, common-sense understanding of what writing is, and what turning the problem of writing into an "empirical" question has done to our literary sensibilities.

This became clear to me a few months ago, when I was reading Freddie deBoer's critique of the empirical standards of composition studies. He mentioned some weaknesses of Arum and Roksa's Academically Adrift, which I had been confidently invoking in my writing seminars and lectures as proof positive that writing makes students smarter and group work makes them stupid. "Science proves it," I would say, if always with a little ironic wink. But it turns out that Arum and Roksa's evidence is being used to justify a number of rather dubious conclusions. This has forced Arum to say things like, "It is hard for me to imagine that any thoughtful educator believes that increasing the quantity of assigned writing is the most effective pedagogical approach to improving the quality of student writing," which apparently needed saying.

As I wrote in my reply to Thomas's comment, there may actually be a good explanation for the caricature of the Arum and Roksa result that "more writing" correlates with improved thinking, though not one that a ghost need come from the grave to tell us about. If we want to know why, statistically, teaching programs that assign a lot of group work generally make students dumber and those that assign a lot of writing make them smarter, we have to recognize that it's generally easier to "craft" a smart writing task than to craft a smart group task. At the extremes: if you tell the students in one cohort to just "write about this week's reading" and the students in another cohort to just "talk in groups about this week's reading", then the students who've been given the most thoughtless writing task imaginable will get more out of it than those who've been given the most thoughtless group task imaginable. If you add to this "just grading" the result, i.e., giving them a grade for their performance (of "just writing" or "just talking"), it seems obvious that the grade for writing will be more informative and foster better learning.*

I think we who work in the academic writing racket need to take seriously observations like Richter's: "A field in which this makes waves is not one in which I want to work." Ouch! In any case, there's lots to think about here. Before the birds return from the south in earnest.

*Update (11/12/15): It occurs to me that this could be tested experimentally (although I'm not sure it would pass IRB). Give a cohort the CLA at the beginning of the school year and at the end. Give them the same readings and the same lectures (ideally, let them attend all the same lectures). But split them (randomly) into two sections for the purposes of class work and grading. The first section is split into groups that are told to meet weekly to discuss the course content. Every week, they submit 30 minutes of recorded conversation in the group (their choice of when to start the recording). The second group is to work individually, submitting one paragraph of prose every week that is "relevant" to their readings or lectures. Now, at the start of each week the students are given a grade on the past week's work. Here's the kicker: the grade is assigned randomly, albeit "on a curve" (to give a normal distribution of As, Bs, Cs, Ds and Fs). That is, there is no "intelligence" either in the assignment or the grading. The students are completely on their own to make sense of both and get out of it what they can. My hypothesis is that the students who are writing will still outperform the students working in groups on the year-end CLA. And also on any actually graded final exam.
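The "random grading on a curve" step in this design is easy to misread, so here is a minimal simulation of what it would mean in practice. This sketch is my own illustration, not part of the original proposal: the cohort size and the exact letter-grade proportions are assumptions chosen only to approximate a normal distribution over As through Fs.

```python
import random

def curved_random_grades(n_students, seed=None):
    """Assign letter grades at random, but "on a curve": the
    proportions roughly follow a normal distribution over A-F,
    independent of any actual student performance."""
    # Illustrative curve (an assumption): 10% A, 20% B, 40% C, 20% D, 10% F.
    curve = [("A", 0.10), ("B", 0.20), ("C", 0.40), ("D", 0.20), ("F", 0.10)]
    rng = random.Random(seed)
    # Build the exact list of grades the curve dictates, then shuffle:
    # the "grading" carries no information about the work submitted.
    grades = []
    for letter, share in curve:
        grades.extend([letter] * round(n_students * share))
    while len(grades) < n_students:  # any rounding slack becomes a C
        grades.append("C")
    grades = grades[:n_students]
    rng.shuffle(grades)
    return grades
```

The point of the sketch is the hypothesis itself: since no "intelligence" goes into either the assignment or the grade, any year-end difference between the writing section and the group-work section would have to come from the tasks, not the feedback.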

Thursday, December 03, 2015

Against "Preparation"

"The readiness is all."

I was recently asked by a reader of this blog whether I had any suggestions about how to "prepare" for writing. He said he had found the 27-minute key-sentence approach useful, but wanted some advice about how to set up the writing task. I'm happy to oblige.

I've written a little about preparation before, sometimes in the context of "where" you think your knowledge resides before being written down. But these days, I should say, I usually say there is no such thing as "preparation" for writing. There is research and there is planning, but there's no need to prepare your writing session beyond deciding when to write and where to write. These decisions have to be made wisely, but can be made quite swiftly. Choose something you know well to write about. Choose a place where you will not be disturbed.

Can it really be so easy? Before I answer that, please notice that I'm not here denying the difficulty but locating it. Research is hard. And writing is also hard. Your efforts should be concentrated directly on those difficulties most of the time. The (relatively) easy part is deciding whether you are going to be doing research or writing. By creating a third problem, namely, "preparation", you are actually opening the whole space in which things like "writer's block" can thrive. Just don't do that and you'll be fine.

So, my advice is to reduce the problem of preparation to the problem of making a decision, and then to sort the remainder under "research". If you need preparation to write, you really just need to do more research, to gain more knowledge. And it is knowledge proper when you are able to quickly and easily identify it by way of a key sentence, which can then serve as the focus of a paragraph that you can write in 27 minutes.

That does, of course, make certain demands of your research (i.e., your "knowledge formation") process. Your research has to result in, roughly speaking, paragraph-sized claims about the world. You have to make up your mind about things that can be expressed in at least six sentences and at most two hundred words. If you're writing such paragraphs in a regular disciplined way, and always deciding on what to write the day before, you will develop habits of mind that make such well-formed beliefs more likely. Your knowledge will be "ready" for writing, if you will, even as you are acquiring it. You'll be preparing your beliefs for their expression in prose. Indeed, it is my view that academic beliefs should, by definition, be prepared to be expressed in prose. If they don't play well in prose they aren't very good academic beliefs.
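The two bounds above (at least six sentences, at most two hundred words) are concrete enough to check mechanically. As a rough illustration only, and with the caveat that the function name and the sentence-splitting heuristic are my own assumptions rather than anything proposed in the text:

```python
import re

def is_paragraph_sized(text, min_sentences=6, max_words=200):
    """Check the stated bounds for a paragraph-sized claim:
    at least six sentences and at most two hundred words.
    Sentence counting is a crude heuristic (terminal . ! ?),
    so treat the result as a rule of thumb, not a verdict."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = text.split()
    return len(sentences) >= min_sentences and len(words) <= max_words
```

A draft paragraph that fails the first test has probably not yet been "made up your mind about"; one that fails the second is probably two claims, not one.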

The trick, I've found, when deciding what to write tomorrow, is not to choose something you only recently discovered. Certainly not something you discovered today. Let the discovery settle into the apparatus of your prose a little before demanding it express itself in writing. To ensure this, try only writing tomorrow about something you knew before the weekend at the earliest. At the end of the day, then, spend five or ten minutes choosing between one and six things to write individual paragraphs about tomorrow. And make sure that each choice is marked by a clear, simple, declarative sentence. Then sit down tomorrow and write a paragraph to support each sentence. It's your whole mind that is prepared for this task now.