Friday, May 20, 2016

The End

Martinus Rørbye, Scene Near Sorrento Overlooking the Sea, 1835.
(Source: Nivaagaard Collection)

This is my last post at RSL. For more than ten years, the blog has served as a semi-public place in which to develop my ideas about the art of academic writing, which is not, I stress, merely the business of "writing for publication". My aim here has been to help people develop their ability to write down what they know for the purpose of discussing it with other knowledgeable people. Unfortunately, this art does not stand in a simple relation to the purpose of advancing their careers as scholars.

In 1837, Bernard Bolzano conceived of his Wissenschaftslehre as the principles of composition of a book containing the totality of human knowledge. Not long after, Søren Kierkegaard described his age as "confused by too much knowledge". A hundred years later, Heidegger announced that, with the rise of modern research, the scholar would disappear. In 1972, about a year after my own birth, Deleuze and Foucault decried "the indignity of speaking for others", marking (in my mind at least) the high point of "the crisis of representation".

Twenty years ago, I read Steve Fuller's Philosophy, Rhetoric, and the End of Knowledge, interpreting its title somewhat cynically, at times nihilistically. With time, I came to see the double meaning of the word "end"—its teleological, not just (if you will) eschatological, sense. Our discourse is never finished, never perfect. But it does have an end, a purpose. Perhaps it will always be unclear to us, will come to us in flashes of insight always slightly beyond the reach of reason. Here in the blogosphere, Jonathan Mayhew has helped me to see that everything depends on how we appropriate this sense of purpose, how we make it, irreducibly, our own.

And to that "end", then, I leave these social media behind and retreat to a calmer, more autonomous place. There are no cookies there and you cannot leave a comment. But you are welcome to have a look around.

The Library

The archetype of a library is a collection of books. But the focus is not so much on the books as on their collection. A pile of 4000 books in the middle of a warehouse is not a library. But if they are catalogued and put on shelves everything changes. The same 4000 books arranged for easy access in that same warehouse is, in fact, a library. We might want to add that the books should be collected on some principle, which is just to say they should be brought together for some reason.

Books have never really been the only things in a library collection. There were libraries full of scrolls, no doubt, even before there were books. For centuries, libraries have also collected a wide range of documents and other materials, some of which are less portable than books, less amenable to being taken out. As a result, libraries have become not just places where these materials are stored, but also places where they are studied. Libraries normally include a reading room or study spaces for this purpose. You don't just come to the library to find books; you come there to read them. You come there to work with the materials that are collected there.

With the invention of the Internet, the "there" of the library has been challenged. There no longer seems to be a direct need for places that collect books (broadly understood as representations of knowledge you can hold in your hand). Rather, a library needs mainly to provide access to the knowledge that is available online. While it should, perhaps, also provide an access point—i.e., a physical interface with the internet, a place where you can go to get online—most people will "go to the library" online as well, from home. This lets us ask the question: if a library were just a website, what would it provide? What would the user find there? What would a librarian do?

Today, the answer is that the library buys access to a range of proprietary databases. Research libraries, especially, buy access on behalf of their users to the databases of the major academic publishers. A growing portion of their book collections also consists of e-books, which is to say, books that are accessible online but only through the subscription arranged by the library. That is, a library is, increasingly, a budget for buying access to materials that have been hoarded behind a paywall.

Libraries used to make relatively scarce materials (there can only be so many physical copies of a book) available to the public (or a student body). Libraries are now in the business of managing—indeed, restricting—access to materials that exist in superabundance (there is no limit on the number of times a text can be downloaded from a website). That is, libraries today help us to maintain the illusion that knowledge is a scarce resource and, thereby, the privileges of the select few to present themselves as knowers.

But here's the thing. Knowledge isn't actually "stored" in the texts in the library. Knowledge has always existed mainly in the living conversation that goes on between knowledge-able people (people who are able to know). Traditionally, the library supported that conversation. It provided a place for it. It brought knowers together with the resources they needed to examine each other's ideas. I hope the library can regain that place one day.

Thursday, May 19, 2016

The University

Almost ten years ago, I found myself proposing that we stop complaining about the demand to "publish or perish". Instead, I suggested a "more constructive" approach: we could accept that our administrators have time to take only a superficial interest in our work; then we could set ourselves to the task of addressing our readers. This morning I took the radical further step of proposing we do away with academic publishing. This raises the question of how academics should be evaluated for purposes of hiring and promotion. The role of publishing in these decisions, after all, is the main source of its power.

The system I propose is one in which candidates for positions and tenure submit hypertext CVs that link to work they have published on a personal website as well as work that cites them (published on other, also personal, websites). The CV would, ideally, have a "narrative" form, in which the candidate summarizes "the story so far", noting the engagements during which they made their most impressive contributions, acknowledged by the most prominent members of their field. This process would in fact begin with the PhD defense, which could take as its point of departure a "dissertation", also published online as a hypertext. Since post-graduate degrees, as well as academic hiring and promotion, are all decided by committee anyway, all of this writing, along with the candidate's online demeanor and rhetorical posture, would be "peer-reviewed" in a highly transparent manner. I do not think any information relevant to finding and rewarding competent academics would be lost by removing the evidence of having survived the increasingly arbitrary torture of the soi-disant "peer review" that currently does the gatekeeping in our for-profit journal literature.

"If its object were scientific and philosophical discovery," said John Henry Newman in 1852, "I do not see why a University should have students." A century and a half later, I found myself making the same point when the marketing director of a global consultancy helpfully suggested that academics need to have a "product launch". We already have one, I said: it is called "commencement". Universities do not, at least not primarily, "produce knowledge" in the form of novel ideas that can transform reality, or radical theories to guide new practices. Rather, universities produce knowledge-able students, people who are able to to know things. The graduates of the variously famous, variously "elite", schools are well-known for their relative competence in this regard, though I'm sure employers are looking at our universities with increasing concern.

We might say that "academic knowledge" is precisely the sort of thing that can be imparted to young people between the ages of, say, 17 and 25 during four more or less consecutive years of full-time study. After such a program a certain degree of ignorance (on particular subjects) should rightly be an embarrassment for the individual and a scandal for the institution that conferred a certificate of academic achievement (i.e., a university degree). The primary duty of university administrators is to arrange those four years of study in the most effective way possible.

This requires bringing the right sorts of students together with the right sorts of teachers. A university, let's say, exists at the intersection of a set of admissions requirements and a set of hiring practices. This intersection is then governed by what is called "the curriculum". It is essential that the student's mind is prepared to receive the curriculum, but it is also essential that the teacher's mind is qualified to present it. We look to university administrators to make sure that the right students arrive in the right classrooms to sit before the right teachers. Though I won't belabor the point here, I am increasingly worried that administrators today see their mission more as micromanaging what happens in the classroom on the (increasingly accurate) presumption that a good many of the students and teachers don't really belong there.

Universities should be places where intelligent people satisfy their curiosity. The degree of intelligence and the intensity of curiosity simply indicate the quality of the university. The reason that university teachers should also be researchers (scholars, scientists) is that the students should be guided in the satisfaction of their curiosity by people who have made a habit of satisfying their own. They should not be guided by people who have made a habit merely of "writing for publication". Like I say, it is not difficult to imagine a system of hiring and promotion that would put the right sorts of mind in front of students. Sadly, it is also not difficult to see that we are a far cry from it.


The Internet

The Internet as we know it, i.e., the World Wide Web, is the brainchild of Tim Berners-Lee (1989). He invented it specifically to make communication between scientists easier. Indeed, it is my view that, in the early 1990s, technology had made academic publishers largely obsolete. The scientists themselves, supported by their librarians, now had the means to communicate their results to each other, and to discuss their validity, without the need of a for-profit press. No doubt there would still be a market for books of exceptionally high quality. But there was no longer any need to maintain an expensive publishing and indexing infrastructure to support the workaday communication of ideas and observations.

Consider what the World Wide Web is. It is a collection of "pages" that is accessible to anyone with an internet connection. "Publishing" something is no more complicated than saving a file of a particular type in a particular directory on a particular server. The skills needed to make the file (the "web page") are not hard to learn. Coding a document in html is no more difficult than using Word to set up a document to conform to a journal's formatting guidelines or constructing a proper APA-style reference. Most importantly, any page on the Web can be linked easily to any other page. That is, any result that is being communicated can be linked to any previously published result. Everything is available at the click of the proverbial mouse.
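To make the mechanics concrete, here is a minimal sketch of what such a "publication" can amount to: a single HTML file, saved to a public directory on a university server, that links directly to an earlier result it builds on. (The names, titles, and addresses below are invented purely for illustration.)

    <!-- observations.html: a hypothetical working paper published as a plain web page -->
    <html>
      <head><title>Observations on X</title></head>
      <body>
        <h1>Observations on X</h1>
        <p>A. Scholar, Department of Y, Example University</p>
        <p>Our observations extend the result reported by
          <!-- a single link is all it takes to connect one result to another -->
          <a href="https://example.edu/~colleague/earlier-result.html">Colleague (2014)</a>,
          which we discuss below.</p>
      </body>
    </html>

Saving that file to the right directory is the whole act of publication; anyone with the address can read it, and anyone with a page of their own can link to it.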

That this technology has not brought about a revolution in scientific communication—like I say, effectively the end of academic publishing—is a scandal and one that scientists—or should I say academics?—are complicit in. Berners-Lee invented a way to expose every idea currently held by scientists and scholars to the criticism of every other scholar, plainly and directly. It allowed academic communities to sort the wheat from the chaff of their research in their own way and at their own pace, while making both their knowledge and the basis of that knowledge available to anyone with the requisite interest and expertise, whether inside or outside the academy.

Instead we have a system that is better suited to extracting rent from scientific research than to contributing its results, as knowledge, to the culture. While the economic rents of course accrue to the publishers and their owners, the academics themselves are, like I say, complicit in the business. By tying their careers to their success in the for-profit "publish or perish" system, they ensure that the work of others becomes robustly path-dependent on their own work, and they allow each other to free-ride on a network of gratuitous citations that have very little to do with actually understanding how the particular corner of the world they are interested in works.

It is claimed that we can't imagine anything better. But this is simply not true. I talk to academics all the time, especially early career academics, whose complaints about the problem of getting published most certainly imply a better way of doing things. It is simply this: PhD students should publish all their results on a webpage (hosted by their university, of course). Their committee should evaluate this page and grant or withhold their degree on this basis. The committee should then link, as they see fit, their own pages to the pages of the new doctor. The new doctor would immediately create a page that links back to these engagements with their work, engagements which may of course be very critical and which offer insight into how useful the contribution is.

One of the pages that an academic will publish online will, of course, be their CV. It will include links to places on the Web where their ideas have been discussed. A hiring or tenure committee should have no difficulty evaluating these engagements, both in terms of the content and the context in which they go on. It will matter both what is being said, and who is engaging with those statements. Whether someone should be promoted or not, whether they should perish or persist, will be as clear as it could possibly be from their online presence. The correspondence between scholars and the coherence of their ideas would be made entirely transparent.
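A fragment of such a CV might look something like the following sketch (again, every name and address is invented for illustration): each contribution links outward both to the work itself on the candidate's own site and to the pages of those who have engaged with it.

    <!-- cv.html: a hypothetical entry in a hypertext CV -->
    <h2>Selected contributions</h2>
    <ul>
      <li>
        <a href="https://example.edu/~ascholar/observations-on-x.html">Observations on X</a> (2015),
        discussed critically by
        <a href="https://example.org/~breviewer/on-observations.html">B. Reviewer</a>
        and taken up in
        <a href="https://example.net/~ccolleague/further-work.html">C. Colleague's further work</a>.
      </li>
    </ul>

A committee following these links would see both what was claimed and who thought it worth engaging with, which is roughly the information a citation index now purports to summarize.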

To implement this system would require the cooperation of university administrators and librarians. I'll write about this in my next two posts.


Tuesday, May 17, 2016

Against Social Media

Since New Year's I've been having increasing doubts about the so-called social media. I include under this heading Twitter, blogging, YouTube and, of course, Facebook, but I think I can also offer a more functional definition. A website is part of the social media if it allows people to post, comment, and evaluate ("like", etc.) content as they please, with no editorial oversight. (A "moderated" forum or comment field does not count as editorial oversight.) Social media also implies an ability to track views. That is, when writing (or filming) for social media you are making a unilateral decision to express yourself, and will receive feedback (or not) in real time. Your audience's interest in your ideas is known to you, in an important sense, immediately.

When I say I am "against" social media, this is the sort of thing I'm thinking of. I've come to my low opinion of it honestly, of course. This blog is very much a social media experience, and until a few months ago I was also an avid twit. There are number of blogs that I've at one time or another regularly commented on—indeed, I got my start in the blogosphere as a commenter on the blog of my favorite poet. I had a Facebook account for about a week at the very beginning, but something about it immediately put me off. I've tried Reddit and have, albeit very rarely, posted comments in the articles of news websites. I've also contributed to Wikipedia. In short, I know what I'm talking about.

What I've come to realize is that writing for an audience that is immediately present undermines my ability to finish a thought. We talked about this here at RSL a few years ago, when Oliver Reichenstein brought Kleist's essay on "The Gradual Perfection of Thought while Speaking" to my attention. My ideas are forming mainly to engage with what is "in the talk" (as Heidegger might put it) at the moment, not to make a more or less permanent contribution to the totality of what is known. I find myself writing and posting and then "bracing" myself for the reaction. I must confess I am bracing myself even as I write. That must have consequences for my style, and it has certainly had consequences for my thinking.

Back in the days of the Snowden exposure, I compared the feeling of being watched by the NSA to living in the dystopia of George Orwell's 1984. The most effective surveillance, however, is not that of the state but that of our fellow citizens. There's nothing really wrong with being accountable to our fellow humans, of course. The point is just to give each other some privacy. Social media, I fear, is eroding that privacy. In 1984, we should remember, "Nothing was your own except the few cubic centimetres inside your skull."

Monday, May 16, 2016

The Last Week

"We make ourselves pictures of the facts."

This sentence from Wittgenstein's Tractatus Logico-Philosophicus can serve as a fitting start to the last week of this blog's active existence. It aptly summarizes the problem that I have been trying to solve for more than ten years. It identifies the difficulty of representation. For how, after all, do we make these pictures? And who do we think we are? These questions indicate the crisis of modern scholarship. Answering them amounts to learning the language of research.

"What relation must one fact (such as a sentence) have to another in order to be capable of being a symbol for that other?" asked Bertrand Russell in his introduction to the Tractatus. In this blog I have asked a related question. What does it take to become capable of representing a fact in the world? The short answer is that it takes discipline. Indeed, it takes a discipline to represent a fact, just as "it takes a village" to raise a child. Representation is the difficult business of relating one's self to one's peers in one's prose. It is a craft and, sometimes, it is an art.

Friday's post will be my last, after which I will complete, if I can, my withdrawal from social media. I'll try to explain why in the coming days.

Sunday, May 15, 2016

Apperception

It has been demonstrated by Sernier (and others, although without violence) that the outer gaze alters the inner thing, that by looking at an object we destroy it with our desire, that for accurate vision to occur the thing must be trained to see itself, or otherwise perish in blindness, flawed. (Ben Marcus, The Age of Wire and String)

It looks like the Young Americans for Liberty at the University of Oregon found a way to maintain exactly the sort of order I've been talking about in my last few posts. They appear to have completely avoided the disruptive protests that have been marking Milo Yiannopoulos's talks on US college campuses. You can see the whole thing on YouTube. The most famous contrast case is, of course, the "triggering" of UMass Amherst students. While it is possible that Oregon students just have a much better sense of decorum than their Massachusetts counterparts, or are just more polite and reasonable, or, I guess, just hold more "offensive" opinions, I have a theory about what happened that I want to note down.

First of all, it must be kept in mind that the UMass protests reached a sort of high-water mark of unreasonableness. I imagine many protesters lost the taste for the tactic of disrupting speaking events after that debacle. But the Oregon students apparently hit on a brilliant way of amplifying the threat of a similar embarrassment at their event. They put a camera at the front of the room pointed at the audience. This promised high-quality, continuous footage of disruptive behavior and therefore set an impossibly high bar for maintaining one's dignity while protesting.

I've said that decorum is "the condition of the possibility" of rational debate. The Kantian language is deliberate, and I think we here see another sense in which I'm offering a "critique" of the protests. What was installed at the Oregon event, and not, it seems, at UMass, was a moment of "apperception": the act of perceiving oneself in the act of perceiving. It can be extended to catching oneself in the act of thinking, i.e., of applying a concept. Milo Yiannopoulos clearly has a high degree of self-awareness. The Oregon students found a way of distributing it to the audience. The effect was almost transcendental.

Protesting is a kind of theater. What has been so disappointing this past school year is that the "drama" has been so poor. It's been a series of bad melodramas, not clever absurdist happenings. In a comment to my first post on this subject, Thomas Presskorn reminded us that there is much to appreciate in a good protest. In Oregon, I hope, we saw what happens when they are held to a higher standard.

Friday, May 13, 2016

The Arbitrariness of Politeness

One of the most familiar pieces of advice in coaching, I imagine, is what Barry Michels calls "the arbitrary use of time". Readers of this blog know I recommend writing in well-defined 18- or 27-minute "writing moments", each devoted to a single paragraph. I also like to hold my coaching meetings for 27 or 54 minutes, never stopping early and never going late. When I take a break in a lecture or seminar, it always lasts exactly 7 or 11 minutes. We start again exactly when the allotted time has passed.

The reason for this is simple. If you know how long a moment is going to last, you are more likely to concentrate on the task it has been assigned. Imagine listening to a boring lecture, for example. If you know the speaker has been given only 20 minutes and will be stopped by the moderator when that time is up, you can tolerate the passage of time. If you think the speaker will stop only when the audience's patience has been exhausted, then you're going to get antsy. The same goes for working on a tricky passage of your own prose. If you are only allowed to stop when you give up in frustration, that will affect the mood you work in. If you know you're going to be stuck with the task for 27 minutes but then, just as arbitrarily, freed from it, you are more likely to give it your best attention. Try it. I think you'll find I'm right.

And the idea applies not just to facing boredom or difficult material. It also applies to managing your outrage. There is a difference between the experience of a family member dominating a dinner conversation with his offensive political views and listening to a formal lecture on the subject. In the first case, there are no rules for making him stop; in the second, it is not your job to make him stop at all. This lets your mind shift into a more intellectual space.

The 45-minute lecture followed by an orderly Q&A is a way of structuring your attention. More complicated schemes can be considered. A lecture, a formal discussant or two, and then questions from the floor. Or the even more dynamic form of the Oxford-style debate, with equal time given to two sides of an issue, and a moderator to enforce the rules.

All of these are orderings of time. They are deliberate and therefore in an important sense not "arbitrary". They serve a particular purpose and can be evaluated accordingly. But they are arbitrary in an important respect: they are insensitive to the quality and content of the speech itself. Once the speaker has been given those 45 minutes it doesn't matter what she says or how badly she says it. (The exception, of course, is where the speaker says something so puzzling or offensive that there is doubt about her sanity or whether she is in the wrong room. The host can apologize and say that there must be some mistake.) I've wasted many hours in an auditorium being bored for 45 minutes out of simple politeness. That politeness is not a trivial matter. It is the condition of the possibility of all the interesting lectures I've attended that were not interrupted by people who merely disagreed with the speaker. They were being polite too.

Wednesday, May 11, 2016

Freedom from Consequences of Speech

It is often said that "free speech does not mean freedom from consequences of speech". The idea is that only a government restriction on your ability to speak counts as a violation of free speech. I want to challenge this piece of conventional wisdom, which I have in fact deployed in my own writing in the past. Here's how I put it when discussing some aftershocks of the Cartoon Crisis in Denmark:

freedom of speech is threatened when the state prevents people from speaking freely. When your neighbour punches you in the mouth for saying something he finds offensive your freedom of speech has not been threatened.

I still stand by that. But there's a corollary that I think we too often forget. If my neighbor punches me in the mouth because I offend him, he is still guilty of assault. If the state considers my "offensive" remark to be a mitigating circumstance and therefore does not prosecute the assault charge as it would otherwise do, then my free speech rights have in fact been violated by the state, since it is essentially declaring me fair game for violent reprisals. Similarly, I would argue, if the state does not help me secure the site of a peaceful assembly, minimally by threatening to enforce trespassing laws against disruptive protesters, then my right to free speech has been violated.

Alice MacLachlan says she teaches John Stuart Mill's conception of free speech. As I pointed out in another context, Mill made the important point that free speech isn't just the right of a person to speak; it's also the right of an audience to hear. Accordingly, if a state, or a university, values free speech, it must enforce rules of decorum. It must protect what Mill called, not just "free speech", but "the liberty of thought and discussion".

That is, the state must not take a characterization of a speaker's ideas (whether as "false" or as "drivel" or as "hurtful") as a good reason to prevent that speaker from speaking. It is especially the right to express false notions that Mill would have wanted us to protect. How else could they be corrected?

One last point. While it is true that constitutionally protected speech is only respected or violated by the state, there's nothing to prevent a university from declaring itself a "free speech zone", meaning that it refrains from punishing its students and faculty for speaking their minds and enforces the rules of decorum that make rational debate possible. Even a bold corporation could guarantee its employees free speech rights, meaning simply that it would levy no consequences against people merely on the basis of what they say in public, even about the company. In that sense, "free speech" is actually freedom from (and protection against) at least some of the consequences of speaking. It is also, implicitly, a promise of the maintenance of order.

__________
Update: one could probably always footnote a post on this issue with a post by Ken White at Popehat.

Assembly

Continuing my thoughts on "decorum", and inspired by an oldish conversation on the Agenda, a simple argument against efforts to prevent speakers from speaking on campus just occurred to me. It is sometimes said that your right to free speech does not guarantee you a platform to express your views. When scheduled events are cancelled or interrupted, it is said, these are merely "consequences" of free speech, not the prevention of free speech. I don't think that argument works, or perhaps it is simply trumped by another line of thinking, which has to do with protecting the right to peaceful assembly.

Consider the following situation. A university professor invites a controversial intellectual to come by his office for an exchange of ideas. The guest arrives, they close the door, and the conversation begins. I think we can all agree that some sort of right would be violated if a student group physically prevented this meeting from happening. Now, let's suppose that the professor shares some of the ideas that were discussed in the meeting with colleagues and these colleagues are sufficiently intrigued to suggest that the intellectual be invited back for a department seminar. Faculty and graduate students in the department are invited to attend, and those who are interested in fact show up. Again, I think we would agree that there's nothing admirable in a student group that attempts either to have the seminar cancelled by some higher authority or, that failing, to prevent the seminar from happening on the day by blocking entrances or storming the room.

Now that our controversialist's ideas are becoming more familiar around the department, let's suppose that one or more of the faculty invite him into their classes as a guest speaker. Again, I think we would agree that preventing this class from happening would be a violation of the teacher's academic freedom. (Keep in mind that if the content is actually inappropriate to the course, the teacher can be held accountable in other ways.) The teacher would, moreover, have the authority to moderate the discussion, i.e., to make an agenda and keep order—a thirty-minute talk followed by a Q&A, for example. Students who don't observe the usual classroom decorum would, of course, be asked to leave, backed up by a threat of disciplinary action. (Here we can also talk about "free speech with consequences," I suppose.) Likewise, if a student group unrelated to the class were to protest or disrupt the class, this would, I think we can agree, be deemed entirely unacceptable. Students on a university campus must respect the classes of their fellow students, again on pain of disciplinary action.

Now, let's suppose that some of the students who were present at the guest lecture get interested enough in the controversial ideas to invite the intellectual to speak to their campus club. The leadership of the club determines that this would be of broad interest to the student body, and a good opportunity to promote their club. They book a suitable auditorium, and promote the event with posters. My question now is whether this opportunity for free speech and assembly is in any way different from the previous ones. I don't think it is. The inviting club and the speaker have the right to hold an event structured as they please. If they want a 45-minute lecture, followed by an invited response or two, followed by a Q&A with the audience, they have the right to make observing this order a requirement for being in the audience. Merely showing up on the day does not entitle you to decide that a "lecture isn't a fair and balanced debate", or to raise whatever other criticism of the form no-platformers sometimes claim is the real basis of their protest in order to avoid admitting that they are trying to prevent the expression of ideas they don't like.

The rules of decorum for large events are merely ways of making the exchange of ideas possible. They are less necessary when two people have a conversation, but even a small group discussion sometimes needs a moderator that everyone respects. Assembly under mutually agreed upon rules of decorum is not less free just because it is orderly. In fact, freedom requires such order. This right of assembly and right to determine the style of a particular encounter seem to me to be fundamental to orderly discourse. Everyone has a right to attend or not as they choose, of course. But no one has an equal and opposite "free speech" right to protest in such a way as to disrupt the proceedings. Just as you have no right to come into my office and prevent me from exchanging ideas with an invited guest. There's just no way to justify such a thing. If you cannot be persuaded to go away, my only recourse is to call the police. That's exactly what the police are for—to make the space around me safe for my activities with mutually consenting adults.

My point here is that any protest that makes it necessary to call in security in order for a conversation (even when it is as one-sided as a lecture) to proceed must be held accountable for violating the conditions of rational discussion, and therefore the founding principles of a university. If students are protesting in this way, they should do so at the risk of being expelled. If the protesters are not (or will not identify themselves as) students, they should be removed, and possibly arrested, as any other trespasser would be. Those "consequences" are what civil disobedience has always been about. If you're not willing to be punished for your civil disobedience, you're not doing it right. Or am I missing something here?

Monday, May 09, 2016

Trigger Warnings

[Update 04.09.16: The relevance of this discussion can be seen in the discussion around the University of Chicago's explicit non-support of trigger warnings. Good coverage by Samantha Harris at The Fire and Robby Soave at Reason. For a defense of trigger warnings, see Angus Johnston, here and here. Freddie deBoer is always worth reading too.]

Let me try out a simple argument against trigger warnings. It has two simple premises, which taken together lead, I think inescapably, to the conclusion that trigger warnings have no place in education. Not only are trigger warnings bad psychology, they are poor pedagogy.

1. If trigger warnings are not mandatory, they are pointless at best, dangerous at worst. That is, if a teacher cannot be disciplined for failing to warn students of potentially triggering content, then students are effectively not protected against being triggered. Moreover, the use of a trigger warning in one course is essentially a false promise that the student will be protected from uncomfortable reading in other courses. If they are not protected in those other courses, the discomfort may be amplified by a sense of betrayal. If there is any risk of exposure to triggering content, then, this risk must be part of the deal going in, i.e., it is simply what you sign up for when you enroll at a university.

2. Trigger warnings cannot be made mandatory without ruining the educational experience. A simple example should suffice to show this. Ernest Hemingway's "Up In Michigan" is essential reading in a course on early twentieth-century American literature, or literary modernism in general, or the modern short story, or countless other courses. It includes a scene that Gertrude Stein famously called "inaccrochable", as recounted in Hemingway's A Moveable Feast (which is another reason to include it in the syllabus). Anyone who reads this story will, I hope, agree that (a) if any story requires a trigger warning then this one does and (b) if the reader is warned of the triggering content the story is ruined.

It might be argued that universities have no obligation not to ruin a piece of literature, that students should in any case re-read the story, or that a truly great piece of literature can survive a spoiler. But I would argue that a course in American literature should, among the many things it does, expose students to literature they might not otherwise read, and it should then reward the student by giving them the full literary experience of letting a story proceed toward its uncertain and, in this case, ambiguous conclusion.

I'm leaving aside the question of whether anyone would be seriously harmed by reading "Up In Michigan" unprepared. I think very few people would be "re-traumatized" by it, and no one, of course, would be traumatized by it. But there is a very definite, very important experience to be had by reading it. A trigger warning here would rob all the students of that experience. Indeed, a student who opts out on the basis of the trigger warning might also be unfairly robbed, since the triggering content may not be severe enough to have actually caused any discomfort, but the careful (and now frightened) student would not know this. For this reason, the demand for trigger warnings should have been summarily rejected by universities from the outset, just as a demand for "easier math" in a physics curriculum should be rejected. Such demands simply misunderstand the nature of higher education. It is only natural that some people will turn out to be unfit for particular courses of study, owing to the "difficulty" (in whatever sense you like) of the material.

Friday, May 06, 2016

Decorum

I hadn't noticed it at the time, but the story of Ray Kelly's cancelled lecture at Brown University in October 2013 got me thinking. This was an extreme case of the sort of protest that social justice activists have staged in various forms on university campuses. Most cases I'm aware of are merely disruptive; the protesters are voicing their disapproval by making it difficult for the speaker to speak. In this case, they succeeded in preventing the speaker from speaking.

I'm among those who think this sort of behavior is unbecoming of university students.** In this case, they prevented the statement and defense of a policy to which they are opposed. It was to be made by someone who is exceedingly qualified to mount such a defense, i.e., a top-ranking enforcer of the policy in question. In a democracy, it seems to me, you would want universities to provide a "safe space" for discussion of policies that are actually governing practices. That someone like Kelly would be willing to contribute to that discussion should have been valued, not denounced.

As Christina Paxson, president of Brown, wrote afterwards, “The conduct of disruptive members of the audience is indefensible and an affront both to civil democratic society and to the University’s core values of dialogue and the free exchange of views.” There was, to my mind rightly, some discussion of reviewing the "policy of allowing all members of the community, as opposed to only individuals with Brown IDs, into the event."

The more recent protests give this idea scope. What students seem to lack these days is a sense of decorum, a respect for the modicum of order that allows a free intellectual discussion to happen. In the course of a speech there can, of course, be booing and hissing just as well as clapping and laughter. An audience is not expected to just sit quietly and passively and listen. But there is a point at which individuals can be said to be "out of order", and at this point rules of decorum can be appealed to. First there would be a warning, and thereafter the disruptive person would simply be asked to leave.

Now, what if they say, "Hell no! We won't go!"

This is where requiring university IDs becomes an important mechanism for avoiding the need to use force. Upon being given the warning, the audience member could be asked to prove that they have the right to be in the room. Once this right is established they have identified themselves by name, and can now be subject to disciplinary action, including expulsion, for not complying with the rules of decorum of the university.*** These rules, it should be noted, are an integral part of the right to be in the room. They stipulate the responsibilities that go with the right, and amount to respecting the rights of the others in the room to hear what the speaker has to say.

It would be possible to extend the same rights and responsibilities to guests, for whom a host or student would be personally responsible. If a disruptive person can find no one in the room to vouch for them, they would be immediately evicted. If someone did vouch for them, that student would now be responsible for the behavior of the guest. This means that in practice only unaccompanied non-students who draw attention to themselves by being disruptive would ever need to be forcibly removed. The authority to do this would derive from trespassing laws.

In the case of the Brown protests, I think the campus police did the right thing by not forcibly ejecting anyone. The procedure should have been to ID the protesters and ask them to leave if they didn't identify themselves, arresting them for trespassing only if they refused. Students who provided ID would be noted (and perhaps filmed) and would be at risk of being expelled (depending on their behavior*).

In my view, speakers who have been invited, either by the university, a department, or a student organization, should be treated with respect. They should be treated as effectively guests of the president of the university, and the audience (including protesters) are answerable to that president for failures to maintain decorum.

Indeed, I sometimes think that protests should not be allowed on the day of the speech itself, since this is simply rude and inhospitable. To "demonstrate" to someone who has been invited to speak on your campus that they are "not welcome" there is incoherent. The offended students should be directing their protest at those who extended the invitation and the president who approved the event. This can be done on any other day, before or after the event. There's no need to be rude.

But to actually disrupt or prevent the event is more than rude; it is indecent, or, more formally, it is indecorous. It is, as Christina Paxson rightly put it back in 2013, "an affront both to civil democratic society and to the University’s core values". Students who do such things do not belong at a university. Once the invitation has been approved, extended and accepted, there can be all kinds of discussion within the community. But the speaker must now be allowed to speak. That's the only right and proper way to proceed.

___________

[A video about the situation at Brown here.]

*The punishment should be commensurate with the behavior and take account of its actual effects. If the disruption actually succeeds in preventing the event from going forward, the students who have been identified as disruptive should face severe punishment, like suspension. A student who merely erupts in anger and leaves the room might get off with a warning—if it's a first-time offense, etc. In any case, all this would be sorted out by the usual disciplinary processes after the event. The students should be maintaining decorum for its own sake, of course. But also under this sort of threat.

**I think Fran Lebowitz is also among us. It was interesting to hear her call Ray Kelly a friend in this talk (she also uses the word "decorum" earlier in the talk). It's also interesting that she was speaking at Claremont McKenna, conceivably to some of the same students who were responsible for this unseemly display. If the students at Oregon figured out how to avoid disruption through apperception, the organizers of the Lebowitz Q&A were also onto something with the simple notion of not giving the audience a microphone. Instead, all the questions needed to be understood and restated by Lebowitz, which is literally to say she had to respect the question. That's also a way of assuring decorum.

***Ohio State appears to have gotten this exactly right.