Friday, November 28, 2014

Where Is Your Knowledge?

If you know something, I always say, you can compose a coherent prose paragraph about it in 27 minutes. This sometimes starts a discussion about what you are allowed to bring with you into that 27-minute session, and sometimes a discussion about what sort of preparation is required. These two issues are of course related.

Let's start with preparation. On my rules, you are not allowed to do any "preparation" for your planned writing sessions. That may sound odd, so let me explain. A "planned" writing session is one that you have decided on at the latest the day before. At the end of the working day, you have decided when, where and what you will write (also, by implication, why and for whom you will write). At 5:00 pm today, for example, you might decide to write a paragraph between 8:00 and 8:27 am on Monday, in your office, that will say that Virginia Woolf thought novels communicate "the loneliness which is the truth about things". (Or, merely inspired by Woolf, and unsure what she herself believed, you may have decided to write a paragraph that says that novels communicate what Virginia Woolf called "the loneliness which is the truth about things". Do note the difference between those two tasks.) You have thereby chosen to write down something you know, first thing Monday morning. You have not decided to learn something by Monday morning. You have decided that you already know it, that you are, in that sense, already prepared.

It's like deciding you'll go for a 5k run on Monday. You're not going to spend the weekend getting into shape for it. If that is necessary, you should decide on a shorter run. Seems simple and obvious in the case of jogging, but it needs to be said in the case of writing.

What, then, does it mean to know something at the end of the day on Friday well enough to be able to write comfortably about it in your office on Monday morning? Notice that the place you will be sitting is part of the decision to write. In this case, you are predicting that you will know what novels do (or what Woolf thought novels do) in your office on Monday morning. What difference could the location make? Well, you may have a number of books in that office. I encourage you not to open them, however, unless you've clearly marked the pages you will be needing, so as not to spend most of the 27 minutes searching through them or even succumbing to the temptation to actually read them. More usefully, your office contains the notes you have from your reading, and you can select the relevant pages from those notes and lay them out beside your computer (or whatever you write on) before you leave the office for the weekend. With those notes at hand, then, you will know what you are talking about come Monday.

What this shows is that knowledge is not something you have in your head. Knowing something is a relationship you establish between, on the one hand, your memory, your habits, your imagination, even your hands, and, on the other, your notes, your books, your university's library, your data, and the vast complexity of the real world that it represents. When you know something you may not be able to quote your source verbatim, but you know exactly where to find it. (These days, of course, you may know only what search terms you can plug into Google to lead you directly to the source. There's something unseemly about this to me, but that may just be an indication of my age.) There is certainly a component of your knowledge in your head, in fact, in your whole body (and it remains important to test our students for the presence of this component), but it does not suffice without the network of support that knowing something implies.

Still, my test remains those 27 minutes. If you can't decide in advance to write something down, and arrange a set of circumstances under which such writing can reliably happen, then you simply don't "know" what you are talking about. You may be very close. You may almost have learned it. But until you know how to set up a situation that lets you compose a coherent prose paragraph of at least six sentences and at most 200 words in 27 minutes, you have not reached that particular state of competence scholars valorize as "knowing". Keep at it. And later today, just choose something else to write about when you get in on Monday morning.
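For those who like to check themselves mechanically, the two formal constraints (at least six sentences, at most 200 words) are easy to test with a few lines of code. Here is a minimal sketch in Python; the sentence splitting is deliberately naive, and the file name in the example is only a placeholder.

import re

def check_paragraph(text, min_sentences=6, max_words=200):
    """Check a draft paragraph against the six-sentence / 200-word rule."""
    # Naive sentence split on terminal punctuation; good enough for a quick self-check.
    sentences = [s for s in re.split(r'[.!?]+\s+', text.strip()) if s]
    words = text.split()
    return {
        "sentences": len(sentences),
        "words": len(words),
        "long_enough": len(sentences) >= min_sentences,
        "short_enough": len(words) <= max_words,
    }

# Example (the file name is hypothetical):
# print(check_paragraph(open("monday_paragraph.txt").read()))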

Wednesday, November 26, 2014

Normal Distributions

I think it was Albert Camus who said that we often underestimate the effort people make to be normal. Though that may be a misinterpretation of what Camus himself intended, I think it's unwise to let the truth of this statement lead us to abandon "normativity" in the sense in which this notion is used in identity studies. I suppose I risk being called conservative, and will no doubt be asked to "check my privilege", but Andrew Gelman's post on grade inflation got me thinking about the impossible burden of identity work in a world without norms. Let's leave aside the important issues around race, gender, class, and sexual orientation, and consider just the question of academic achievement.

One of the things you're supposed to discover about yourself at university is whether or not you're inclined towards research and, of course, whether you have an aptitude for it. Obviously, not everyone is cut out for a professorship, and that's no shame on anyone. People go through years and years of schooling and then, at some point, many of them leave school to go into business, or politics, or entertainment, or gainful unemployment. It makes sense to have "elite" schools, like Princeton, where exceptionally high-achieving high-school students go to get a(n even) higher education. But once there, it would be really surprising if all of them turn out to have the intelligence and curiosity to impress "academically". It also makes sense to have less elite universities, where people who didn't do quite as well in high school can go and, again, try to impress their academic teachers. This creates a career path for straight-A high school students through an Ivy League BA, to, say, a top law school and into the legal profession, but also a path for a B-student in high school, through a less selective state university, a master's degree somewhat higher up the ladder and, finally, a PhD at Princeton. That's because what it takes to succeed in academia isn't exactly the same thing as what it takes to succeed in high school. You've probably seen my point coming: different norms apply.

I'm focusing on academic outcomes here, but they are of course affected by extracurricular distractions. The important thing is to have a system that actually registers the students' relative success at meeting the specifically academic standard at a particular point of their life path. At some point, the student runs into a limitation. Having received easy As in math all her life, she suddenly finds herself getting Bs in advanced statistics. This should not be a tragedy for her; she's just learning what she's good at. Having struggled for his Cs in high school English, he suddenly discovers he's able to earn As in philosophy. This isn't an indictment of high-school English. It's just, again, an exposure to a different set of norms.

What about the curve? I don't think there's anything wrong with the idea of meaningfully graduating at the "top of your class", i.e., of letting academic achievement be relative to your cohort, not some Platonic ideal grasp of a subject matter. And most people in most classes really should be satisfied with the Bs and Cs that are available to them after all the well-deserved As have been given out to people with abnormal intelligence or curiosity, and the well-deserved Ds and Fs have been assigned to those who need to find other things to do (or learn to show up to the courses they have enrolled in).

My point is that there are enough different kinds of "normal" out there for everyone to be normal in some ways, exceptional in others. By refusing to articulate clear, even pedantically clear, standards for "academic" work in higher education out of a "respect for difference", i.e., by refusing to mark out a space of perfectly respectable "normal" achievement (Bs and Cs), as well as a range of high and low achievement (As and Ds), we are robbing students of the opportunity to find out exactly where and how they are normal. Sure, some will still make the tragic effort to be normal (or brilliant) in an area they are simply not normal (or brilliant) in. They may be trying to impress their parents, for example, or embarrass them. The truly sad cases are of course those who pretend to be average where they are really brilliant.

Camus' insight is important, finally, because any effort we make risks being wasted. There should be vast regions of normalcy out there that most people, in most of their activities, can enjoy effortlessly. Being yourself should by and large ... on the whole and in the long run, on average, however you want to put it ... be easy. Our opposition to normalcy is really a demand for uniqueness. We are asking everyone to be unique in every way. And we then ask our already beleaguered faculty to grade these singularities by way of an assessment of the "whole person". Can't we see how impossible we're making things for ourselves? Just assign those damn 5-paragraph essays. Tell the students there are such things as good and bad writing, greater or lesser ignorance. Then spend the five or ten minutes per paper it will take to distribute their efforts under a normal curve. These "whole people" will be fine knowing only how well they did relative to each other in the art of composing themselves into five, more or less coherent, more or less intelligent, more or less knowledgeable, paragraphs.
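Since I'm invoking the curve, it may help to show how mechanical the procedure can be. The following sketch (Python, with invented cutoffs and invented marks) assigns letters by rank within the cohort, which is one common reading of "grading on the curve"; the exact proportions are mine, not anyone's policy.

import numpy as np

def curve_grades(scores, cutoffs=((0.05, "F"), (0.20, "D"), (0.60, "C"), (0.90, "B"), (1.00, "A"))):
    """Assign letter grades by rank within the cohort, not against an absolute ideal."""
    scores = np.asarray(scores, dtype=float)
    # Percentile rank of each paper within the class, from 0 (lowest) to 1 (highest).
    ranks = scores.argsort().argsort() / (len(scores) - 1)
    grades = []
    for r in ranks:
        for cumulative, letter in cutoffs:
            if r <= cumulative:
                grades.append(letter)
                break
    return grades

# Ten essay marks for an imaginary class: one F, one D, a broad middle of Cs and Bs, one A.
print(curve_grades([52, 67, 71, 74, 78, 81, 83, 85, 88, 94]))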

Monday, November 24, 2014

Originality, Plagiarism and Pierre Menard, Part 2

Jonathan sets me straight. At least partly. On my reading, Pierre Menard neither "re-wrote Don Quixote without ever having read it" nor "transcribed" it through some unknown process that rendered it an "original" composition of his own. Both ideas are belied by Borges' text. "When I was twelve or thirteen years old I read it," writes Menard in his letter to the narrator, "perhaps in its entirety. Since then I've reread several chapters attentively." Our narrator also tells us that Menard's "aim was never to produce a mechanical transcription of the original; he did not propose to copy it." Jonathan at one point suggests Menard "reproduces or 'transcribes' it through an unexplained science-fictiony device" or alternatively (and I think more plausibly) "memorize[s] sections of it and then sit down to write, but never writing down something unless he felt it as his own". Jonathan emphasises that honesty is the key to this, since in one sense what Menard is doing is in fact transcribing: he is "writing across" from one text to his own. It's only when he has actually appropriated the words, so that they are no longer Cervantes' but his own, that his project has succeeded. The standards by which one can evaluate this process are of course unknown.

I'm still not convinced this is exactly what Borges, Menard or the fictional literary critic had in mind. I'm entirely willing to play at being "more Borgesian than Borges" as Jonathan suggests, of course. But I need to square my understanding of the text with, especially, this description of Menard's process, provided in that same letter to the narrator:

My [Menard's] general memory of Don Quixote [from his reading], simplified by forgetfulness and indifference, is much the same as the imprecise, anterior image of a book not yet written. Once this image (which no one can deny me in good faith) has been postulated, my problems are undeniably considerably more difficult than those which Cervantes faced. My affable precursor did not refuse the collaboration of fate; he went along composing his immortal work a little a la diable, swept along by inertias of language and invention. I have contracted the mysterious duty of reconstructing literally his spontaneous work. My solitary game is governed by two polar laws. The first permits me to attempt variants of a formal and psychological nature; the second obliges me to sacrifice them to the 'original' text and irrefutably to rationalize this annihilation.

Here the suggestion is that he'll work with his memory of the story, not his memory of the text, which he insists is as imperfect as a novelist's image of a book he's not yet written. It's out of that imaginary that he will attempt to produce a text that is identical to Cervantes'. The claim is that he succeeded in writing two chapters and part of another.

I'm being pedantic mainly for the sake of making this clear to myself. And also because something Jonathan said reminded me of another remark of Borges' in his "Note on (towards) Bernard Shaw". "The heroic aspect of the feat," says Jonathan, "[is] bridging the distance between the two sensibilities without ever cheating. The exact mechanism ... is deliberately obscure since what matters is the negotiation between the two subjectivities." In his "Note", Borges dismisses a series of literary "devices"—Lully's metaphysical discs, Mill's worry about the exhaustion of the possibilities of music, Lasswitz's "total library" (which Borges successfully made his own)—because they turned the problem into "a kind of play with combinations". I think Susan Blum's "folk anthropologists" are in the same category. "Those who practice this game," says Borges, "forget that a book is more than a verbal structure or series of verbal structures; it is the dialog it establishes with its reader and the intonation it imposes upon his voice and the changing and durable images it leaves in his memory." I think we have to remember that Menard was not trying to do something like those patchwriters who want to know the minimum number of changes they have to make to a text in order to turn it into a paraphrase. He was, as Jonathan says, attempting a "negotiation between two subjectivities" in the most difficult terrain imaginable, i.e., in the mental space that differentiates the meaning of two identical texts.

Sunday, November 23, 2014

Originality, Plagiarism and Pierre Menard

A recent post of Jonathan Mayhew's reminded me of an old complaint I have about the blurbs on my Penguin paperbacks. My 1981 King Penguin edition of Borges' Labyrinths describes Pierre Menard as "the man who re-wrote Don Quixote word for word without ever reading the original" on the back cover. (This sort of thing happens a lot, I've found. I wonder if it's a convention I've never been told about. Perhaps blurbs are supposed to be misleading so as not to ruin the plot?) In any case, my reading of "Pierre Menard" doesn't have him doing any "transcribing", as Jonathan seems to say. In fact, I thought the opposite was true.

Pierre Menard, as I read Borges, was trying to write Cervantes' Don Quixote without plagiarizing it. The task seems to be an impossible one; indeed, it seems absurd. Menard intends to write the exact same words as Cervantes, but he, Menard, is now to be their author. As Borges's fictional literary critic points out, the words will be the same, but their meaning will be entirely different. Menard wanted to, literally, write Don Quixote.

How can you become the author of a book that has already been written? We can imagine a parallel universe in which, as in ours, Cervantes writes the Quixote in the early seventeenth century but, unlike ours, does not publish it, and does not achieve the fame he enjoys here. Then, four hundred years later, Menard discovers the manuscript and publishes it as an original creation of his own mind. This would of course still make him a plagiarist, but it would be very difficult to discover (as long as he kept the secret). Menard would now become the author, and, if he really did present it as something he had just written, his words would be interpreted as those of a contemporary.

Though it is hugely unlikely, we could also imagine another universe in which Menard, in a true coincidence, produces a work that is identical to Cervantes' unpublished manuscript, exactly as Penguin's blurb writer suggests. In this parallel universe, then, two people write the same manuscript independently; both versions spring from ("originate" in) the imagination of their unique authors. This, interestingly enough, is the sort of "impossible originality" that I've argued we demand of students. We want them to "come up with" ideas that are in most cases already available in the published literature they just haven't read yet.

But these are not the universes that Borges would have us imagine. Menard desires a universe in which Cervantes wrote and published Don Quixote and in which Menard, fully aware of Cervantes' achievement, could also write and publish the same sequence of words, but in his own name, and, like I say, without plagiarizing them. As Borges and Menard are aware, this requires Menard to forget Cervantes' version. The odds against Menard's project are formidable*: the odds of writing the Quixote without plagiarizing it are exactly the odds of writing an exact copy of any book that one has never read. In our parallel universe we need only posit that Menard does not actually discover Cervantes' manuscript. Rather, someone else discovers it after Menard has become famous (if writing an original Quixote in 1905-1935 warrants literary fame). I suppose there would be a scandal. No one would believe Menard had not transcribed Cervantes.

And that's what happens when we find that a student who has, as expected, submitted an "unoriginal" idea in an essay has also, contrary to expectation, used the exact same words as either another student, an academic blogger, or a published scholar. We would not be entirely surprised to find a sophomore English major propose that Nick Carraway was gay. But we would raise an eyebrow if the student wrote "It’s a testament to Fitzgerald’s talent as a novelist that he was able to provide so much textual evidence that Nick is gay without confirming it or drawing undue attention to it. Subtlety is an art." Here a set of quotation marks and a reference to Greg Olear, not to mention an ellipsis, would, of course, be expected.

___________
*Perhaps this is why Andrew Gelman is so passionate about plagiarism. The excuses are so often an affront to probability theory.
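To put a rough number on that affront: even under absurdly generous assumptions about how constrained an author's choices are, the probability of independently reproducing an existing chapter word for word is vanishingly small. The figures below are invented purely for illustration.

import math

# Suppose, very generously, that at each word the independent author faces only
# ten equally attractive choices, and that a chapter runs to about 2,000 words.
choices_per_word = 10
words_in_chapter = 2000

# The chance of matching the chapter word for word by coincidence is 10^(-2000),
# which underflows to 0.0 in ordinary floating point.
print((1 / choices_per_word) ** words_in_chapter)
print(-words_in_chapter * math.log10(choices_per_word))   # the exponent: -2000.0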

Wednesday, November 19, 2014

A Revision of Solitude?

According to Susan Blum, academia is beholden to an "eighteenth century model of the self and author [which assumes] a singularity and essence that [is] fixed, unchanging and in some ways isolated (unaffected by others’ influence)." But she and her fellow anthropologists have been questioning these assumptions, noting that recent technological developments render them obsolete and should have us rethinking our basic approach to higher education. Shaped by social media, "our students have become folk anthropologists, speaking out about the impossibility of singularity, the shared quality of discourse, the reality of fragments of texts incorporated into every utterance (or written document) and the collective nature of cultural creation." As Jonathan Mayhew has pointed out, this sort of thing has become pretty orthodox in the social sciences, travelling under the banner of "postmodernism". He's not exactly impressed.

As I was reading Jonathan's post, a remark about Rosmarie Waldrop's use of the "I" in her introduction to Curves to the Apple came to mind. "This 'I'," she says, "has lately been confused with the expression of unquestioned subjectivity and identity. But it simply indicates that language is taking place." She doesn't say who "has lately been confused", but it may well be those anthropologists and their students, who think that demanding "originality" of authors is tantamount to requiring them to be "geniuses". Now, Waldrop is a poet and her remarks resonate nicely with those of another poet, Tony Tost. He also doesn't say exactly who he has in mind*, but he seems to be correcting a common misconception when he says, "One is not condemned to a perpetual present, nor to the immediacy of seemingly random, unconnected signifiers. In summary, one is here because one has remembered to be here. In conversation, one discusses what rises" (Invisible Bride, p. 46). There's something distinctly postmodern about the "immediacy" he rejects. But, like Waldrop, he suggests that we should just keep talking. Perhaps it's just language.

Allen Grossman, a poet of Waldrop's generation (b. 1932) who recently died, also seems to hold an "eighteenth century" notion of the "in some ways isolated" self. Explicitly so, in fact: he invokes Descartes, the godfather of the "isolated subject". In his postscript to Descartes' Loneliness he tells us that "We, each one of us alone, think in our solitude about our own mind and about the world, in language—and each finds out thought about the self, about other persons and their claims upon the self, speaking and answering by means of language." There it is again—language. Grossman, if I recall, is one of Tost's influences, and perhaps we see a bit of it on the same page I already quoted: "Talking becomes a conscious stammering not in one's language, but in how one thinks," he writes; "a conversation represents not so much a break with solitude, but a newer form of solitude, a revision of the logic of solitude."

I became aware of Tost's work back in 2003, when I read a poem that, interestingly enough, was made by patching together materials found on the Internet by searching for variations on the phrase that constitutes the title, "I Am Not the Pilot". It had a profoundly liberating effect on me. The poet, as I've noted elsewhere, is rejecting the sort of "competence" that is demanded of him, and is performing that rejection precisely by plagiarising every word of the poem. (This "Google sculpting" has since become the hallmark of so-called "Flarf" poetry.) I have never held this against him. He remains my favourite living poet.

I'll continue this soon. There's an obvious tension here between the poetic sense of self and language and the anthropological one. At the same time, Tost's "I Am Not the Pilot" is perhaps a sign of a "revision of the logic of solitude", the logic that is characterized by Grossman as "Descartes' loneliness". That revision may be, as Blum suggests, driven by technology. That doesn't mean that we are, to use Tost's word, "condemned" to lose ourselves. What was it William Carlos Williams said? "When I am alone I am happy."

_________________
[Update: I just googled his phrasing to find his source. It can be found on page 93 of Pamela Odih and David Knights' "Just in time?", in the anthology Knowledge, Space, Economy, edited by John Bryson, Peter Daniels, Nick Henry, and Jane Pollard (Taylor & Francis, 2000). It turns out Tony has patch-written this line! This is not surprising given what we know about "I Am Not the Pilot".]

Monday, November 17, 2014

"Originality is Impossible"

One of the most interesting professional tensions that I experience in my work as a coach is the resistance of anthropologists to my ideas about the writing process. So I guess I shouldn't be surprised to find myself in a disagreement with an anthropologist about the nature of authorship itself.

Susan Blum has provided a useful summary of her distinction between the "authentic self" and the "performance self" in the American Anthropological Association's Anthropology News (March 2008). Two things stand out for me. First, she casts the "anthropological" notion of self as a foil for the traditional "academic" sense of self. That is, she suggests that there is a tension between what professional anthropologists know to be true about the self and what academics in general presume about it. Second, and more worryingly, she believes that students, unlike their university teachers, are in possession of this anthropological truth about themselves. That is, the students, qua "folk anthropologists", are right.

In defending the "academic", "authentic" self let me begin with what I think is a common misconception among patchwriters about originality. Here's one of Blum's subjects, i.e., a student she talked to during her fieldwork:

Ideas are gonna get recycled. There’s no way that a hundred kids in a class could write papers with all fresh ideas. That’d be a hell of a class if you could. In fact, I’d be willing to say that no one—not even one student—will come up with something that’s never been come up with before. And that’s not an indictment of them, it’s just these ideas are all over the place.

Now, academics know this as well as any student. When teachers ask students to submit "original" work, they are not asking them to "come up with something that’s never been come up with before"; they are merely asking them to submit for evaluation their own ideas, i.e., ideas that, whether actually "original" or not (in the hyperbolic sense invoked by the student), are ones they actually "came up with". They will have arrived at these ideas on the basis of their reading, and it's therefore important for the student to properly reference the reading they have done, leading up to the part that they came up with themselves, so that the teacher can assess their abilities and give them a grade. Now, if they pass off some part of their reading as their own ideas they are plagiarizing, cheating. They are pretending they came up with something themselves that they just read in a book. But the fact that the teacher already knows what the student has "discovered" is not in and of itself a strike against the student. It's only a problem if the student hides the source.

Originality in the sense of something "new under the sun" is of course very rare. But it is possible to distinguish clearly between what you have learned from reading and what you have thought out yourself. This is very important in school, where almost all of what you learn is already known to others. But it remains important in a research career, where "originality", in the strong sense of making that rare "novel" contribution, depends on knowing what is already known.

Wednesday, November 12, 2014

Against Patchwriting

I've decided to confront the issue head-on, if only for the sake of clarity. So I'll just announce straight off that I am against patchwriting. I use that term in the sense coined by Rebecca Moore Howard: "copying from a source text and deleting some words, altering grammatical structures, or plugging in one synonym for another" (Howard 1999: p. xviii). And when I say I'm against it I mean that I refuse to "celebrate" it as some writing instructors do:

Describing the textual strategies of Tanya, a student who in traditional pedagogy might be labeled "remedial," Glynda Hull and Mike Rose celebrate her patchwriting as a valuable stage toward becoming an authoritative academic writer: "we depend upon membership in a community for our language, our voices, our very arguments. We forget that we, like Tanya, continually appropriate each other's language to establish group membership, to grow, and to define ourselves in new ways, and that such appropriation is a fundamental part of language use, even as the appearance of our texts belies it" (152).

These and other studies describe patchwriting as a pedagogical opportunity, not a juridical problem. They recommend that teachers treat it as an important transitional strategy in the student's progress toward membership in a discourse community. To treat it negatively, as a "problem" to be "cured" or punished, would be to undermine its positive intellectual value, thereby obstructing rather than facilitating the learning process. (Howard 1995: 788-9)

I believe, in short, that patchwriting is a problem that should be addressed, even a disease that should be "cured", and in some cases a crime that should be "punished". Though I don't think it really is a "punishment", one simple technique here is to ensure that patchwritten work receives a lower grade. But this is where the "criminal element" comes in, because, like classical plagiarism, it is often not immediately apparent on the surface of the text. The first problem with patchwriting, like other kinds of plagiarism, is that it must be detected. Patchwriting conceals the relationship between one's own writing and the writing of others, and that alone should dampen any possible "celebration" of the student's accomplishment in this art.

The toleration—and encouragement, if that's what "celebrating" can be taken to imply—seems to be founded on a fundamental misunderstanding about scholarly writing, which is clearly on display in the passage I've quoted. It is simply not true that "we forget that we ... continually appropriate each other's language to establish group membership". Good scholars are constantly mindful of these acts of appropriation and therefore continually acknowledge their sources. There are acceptable ways of appropriating the work of others, namely, through paraphrase and quotation, always with adequate citation. There is no mystery (though there are of course a few subtleties) about how this is done, nor when it is done right.

I'll be writing about this in the weeks to come, mainly as a way of reflecting on the work of Rebecca Howard and Susan Blum, both of whom I've written about before. Like I say, I'm going to be taking a hard line on this, mainly in the interest of being clear. Let there be no doubt that I think patchwriting is a problem, and one we need to do something about. It is no more "an important transitional strategy" toward mastery of scholarly writing than any other form of plagiarism, nor does it have "positive intellectual value". True, like plagiarism in general, it does offer a "pedagogical opportunity", or what we also sometimes call a "teachable moment", but only in the sense that it provides an occasion to talk about intellectual honesty. Patchwriters are faking their linguistic competence, and they must be told that that is what they are doing, and that that is the opinion competent scholars form of them when they discover the real source of their language.

It's not, I should add, just a problem among students.

Update: it's not a coincidence that I'm returning to this subject today. Andrew Gelman had warned us that a post about this was "on deck" today. And sure enough: here it is.

Friday, November 07, 2014

What are the implications of a theory paper?

Two years ago, thinking myself wittily obvious, I said that theory papers "accomplish their theoretical aims by purely theoretical means". Yesterday, talking to a PhD student about her theory paper, I found myself saying, perhaps, the opposite. Theory papers, I said, do not have theoretical implications; only empirical papers can truly have "implications for theory". Just because you've thought about something, I said, your peers don't necessarily have to change their minds. That would require some actual, empirical results—a tested theory.

Now, in one sense, that's not really true, of course. When you write a theory paper, you are actually trying to affect the minds of your readers. You're trying to get them to see the world differently, to expect different things to appear under particular circumstances. Rather than showing them such things under such circumstances, as you would in an empirical paper, you confront them with aspects of the available literature that they are unfamiliar with or, perhaps, have just forgotten about. Once those writings, or your particular reading of them, is presented to them, you presume, they will come to expect familiar objects to behave in hitherto unthought-of ways.

If you write your theory paper very convincingly you can accomplish this goal—of changing someone's expectations about an object of inquiry—without any new empirical evidence. At the very least, you can shake the reader's confidence in currently held assumptions about how the object behaves in practice. So was I simply misleading that PhD student when I said a theory paper doesn't have theoretical implications?

Not quite. I was making a formal point about the rhetoric of theory papers. The section that corresponds to the "implications" section of an empirical paper has a particular rhetorical goal, namely, to make explicit what "follows" (logically, rationally) from the rest of the paper. Since the whole paper is about theory, the "analysis" will already have established how the theory must change. It will not just have provided premises from which to draw "theoretical" conclusions; it will have presented a complete theoretical argument, conclusions and all, just as an empirical paper will draw empirical conclusions already in the analysis (or "results" section), from which (again, in the empirical paper) either "practical" or "theoretical" implications will then follow.

Just as the implications of an empirical paper reach beyond the empirical material itself (into theory and/or practice), so too must the implications of a theory paper reach beyond the purely theoretical arguments the paper makes. As I said two years ago, and again two days ago, these implications will often be methodological. That is, if you convince your reader to expect something different of the object of their research, this will, probably, have consequences for how they do that research. If you convince them to see the world differently, they'll probably begin to do things differently. Minimally, it suggests doing a study to find out if you're right.

A theory paper may also have "meta-theoretical" implications, or what can properly be called epistemological implications. That is, a reflection upon theory qua theory may lead us to rethink what knowledge is or at least what kind of knowledge we produce. Thus, the choice between "theoretical" and "practical" implications in an empirical paper is transformed into a choice between "epistemological" and "methodological" implications in a theory paper. (Imagine the permutations for a methods paper!)

To sum up then: a theory paper does make a theoretical contribution but it does not, formally speaking, have theoretical implications.

Wednesday, November 05, 2014

Theoretical and Conceptual Papers

I originally proposed my forty-paragraph outline as a guide for the writing of what I call "the standard social science paper". This is the kind of paper that presents the result of an empirical study, framed by a familiar theory, guided by an accepted methodology, and with definite implications for theory or practice. I was recently asked about theoretical papers and, since I get this question often, I was sure that I could just point to a post on this blog that answered it. It wasn't quite as easy as I thought (though there is this post), and I thought the best solution would be to just write a fresh post on the subject.

What I will be offering here is not a normative guideline for what a theory paper should accomplish, of course. I'll leave that to the major theorists, especially those who serve as the editors of the journals that publish such papers. Instead, I will propose a way of organizing twenty hours' work such that, at the end of it, you have produced the first draft of a 40-paragraph theory paper. This draft can then be edited into shape for publication. In outline, it will look as follows:

1. Introduction (3 paras)
2. Historical Background (5)
3. State of the Art (5)
4. Critical Occasion (5)
5. Conceptual Analysis (3 x 5)
6. Discussion (5)
7. Conclusion (2)

Remember that each paragraph should make a single, easily identifiable claim and either support it or elaborate it. It should consist of at least six sentences and at most 200 words. It should be written in exactly 27 minutes.
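If you want to check the arithmetic of the outline, it can be done in a few lines. The three-minute pause between paragraphs is my own assumption about how the twenty hours are filled; the post itself only fixes the 27 minutes of composition.

# Paragraph counts from the outline above.
outline = {
    "Introduction": 3,
    "Historical Background": 5,
    "State of the Art": 5,
    "Critical Occasion": 5,
    "Conceptual Analysis": 3 * 5,
    "Discussion": 5,
    "Conclusion": 2,
}

paragraphs = sum(outline.values())           # 40
writing_hours = paragraphs * 27 / 60         # 18.0 hours of actual composition
budget_hours = paragraphs * (27 + 3) / 60    # 20.0 hours, assuming a 3-minute break each
print(paragraphs, writing_hours, budget_hours)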

The introduction will consist of three paragraphs. The first paragraph should be devoted to a history of your field up to the present. The scope of this history will depend on your judgment. Whether your history starts in ancient Athens, in eighteenth-century England, or in Paris of 1968 depends on the contribution you want to make. The second paragraph should be devoted to the present state of the theory. What is the reigning consensus or standing controversy that defines your field of research? This, obviously, should be the state you want to transform in some interesting way, either by settling a dispute or unsettling an agreement.

The third paragraph should announce your contribution. "In this paper, I will argue that..." Notice that "supporting or elaborating" this claim, which is about your paper not your theory, does not yet require you to argue your position. You only have to describe a paper that would make such a contribution. And that means you will essentially be outlining your paper. Now, you have already introduced the historical background in paragraph 1, and you will have space to talk about it in part two of the paper, so you don't have to say anything more about it here. Also, in the second paragraph you have introduced the current state of the theory, which you will elaborate in greater detail in the third part of the paper. What is left is to say something about how the theoretical problem you are interested in arose and why you are the right person to deal with it, to outline your analysis a little more, and to tell us why it is important, i.e., to summarize your discussion. That is, the introduction ends with an outline of parts 4, 5 and 6 of the paper.

Part 4 takes the place of the methods section of a standard empirical paper. In a sense, you are still saying what you did, but it is perhaps more accurate to say that you are explaining what happened to you to force you into a theoretical reflection. It may simply be a development within your field (someone else's or your own empirical results, published elsewhere) or it may be an "event" like the publication of a correspondence or a translation of a previously untranslated work by a major theorist. World events, too, may be relevant here. After 1989 and 2001 there were all kinds of reasons to "rethink" the theories that framed work in a whole range of social sciences. Since you're saying how the problem arose, you will also need to say what materials came into view: what texts have you read and how have you read them?

Part 5 will present your argument in detail. It's a good idea to divide the argument into sub-theses each of which can be demonstrated separately. Two to four sections of three to six paragraphs gives you some manageable space to work with here. Finally, part 6 will cash out your analysis in consequences, usually for theory, though sometimes for practice. (You might want to emphasize the important political consequences of your line of thinking.) An important class of "theoretical" implications here is "method". If you're right that we have to see the world in a new way (a theory is always a way of seeing the world) then perhaps we will have to do things differently too?

The conclusion should consist of two paragraphs, one of which states your conceptual argument in the strongest, simplest terms you can imagine. You may want to use the sentence that completes the key sentence of paragraph three (i.e., everything after "I will argue that") as a key sentence here. The last paragraph could suitably extend the history of the field that you presented in paragraph 1 and elaborated in part 2 by imagining a possible future.

I hope that's useful. Don't hesitate to add your own suggestions or questions in the comments.

Monday, November 03, 2014

What We're Doing

I'm grateful to Jonathan for bringing The Universal Mind of Bill Evans to my attention. As I point out in the comments to Jonathan's post, the difference he demonstrates may not be apparent to everyone. If we had not been told, we might experience all three improvisations simply as much better than anything we're capable of ourselves. The same goes for writing. We're not always paying close enough attention to be precise. We "overwrite", let's say.

It's interesting to see Evans's brother Harry push back on the demand for simplicity and accuracy. "To thousands of musicians such as myself: we have to overplay," he says, "because we don't have time to even get to the keyboard to sustain the rudimentary thing." Maybe I'm projecting, but I can feel exasperation in Bill's response. He can only repeat himself: "It's better to do something simple which is real ... It's something you can build on because you know what you're doing." When people explain their faults by saying they don't have time to do it well, I get a little sad. If we were half as "productive" in academia, half as "advanced", but twice as real and precise, we would be so much better off. We would, precisely, know what we're doing.
