Friday, May 20, 2016

The End

Martinus Rørbye, Scene Near Sorrento Overlooking the Sea, 1835.
(Source: Nivaagaard Collection)

This is my last post at RSL. For more than ten years, the blog has served as a semi-public place in which to develop my ideas about the art of academic writing, which is not, I stress, merely the business of "writing for publication". My aim here has been to help people develop their ability to write down what they know for the purpose of discussing it with other knowledgeable people. Unfortunately, this art does not stand in a simple relation to the purpose of advancing their careers as scholars.

In 1837, Bernard Bolzano conceived of his Wissenschaftslehre as the principles of composition of a book containing the totality of human knowledge. Not long after, Søren Kierkegaard described his age as "confused by too much knowledge". A hundred years later, Heidegger announced that, with the rise of modern research, the scholar would disappear. In 1972, about a year after my own birth, Deleuze and Foucault decried "the indignity of speaking for others", marking (in my mind at least) the high point of "the crisis of representation".

Twenty years ago, I read Steve Fuller's Philosophy, Rhetoric, and the End of Knowledge, interpreting its title somewhat cynically, at times nihilistically. With time, I came to see the double meaning of the word "end"—its teleological, not just (if you will) eschatological, sense. Our discourse is never finished, never perfect. But it does have an end, a purpose. Perhaps it will always be unclear to us, will come to us in flashes of insight always slightly beyond the reach of reason. Here in the blogosphere, Jonathan Mayhew has helped me to see that everything depends on how we appropriate this sense of purpose, how we make it, irreducibly, our own.

And to that "end", then, I leave these social media behind and retreat to a calmer, more autonomous place. There are no cookies there and you cannot leave a comment. But you are welcome to have a look around.

The Library

The archetype of a library is a collection of books. But the focus is not so much on the books as on their collection. A pile of 4000 books in the middle of a warehouse is not a library. But if they are catalogued and put on shelves, everything changes. The same 4000 books arranged for easy access in that same warehouse is, in fact, a library. We might want to add that the books should be collected on some principle, which is just to say they should be brought together for some reason.

Books have never really been the only things in a library collection. There were libraries full of scrolls, no doubt, even before there were books. For centuries, libraries have also collected a wide range of documents and other materials, some of which are less portable than books, less amenable to being taken out. As a result, libraries have become not just places where these materials are stored, but also places where they are studied. Libraries normally include a reading room or study spaces for this purpose. You don't just come to the library to find books, you come there to read them. You come there to work with the materials that are collected there.

With the invention of the Internet, the "there" of the library has been challenged. There no longer seems to be a direct need for places that collect books (broadly understood as representations of knowledge you can hold in your hand). Rather, a library needs mainly to provide access to the knowledge that is available online. While it should, perhaps, also provide an access point—i.e., a physical interface with the internet, a place where you can go to get online—most people will "go to the library" online as well, from home. This lets us ask the question, If a library were just a website, what would it provide? What would the user find there? What would a librarian do?

Today, the answer is that the library buys access to a range of proprietary databases. Research libraries, especially, buy access on behalf of their users to the databases of the major academic publishers. A growing portion of their book collections also consists of e-books, which is to say, books that are accessible online but only through the subscription arranged by the library. That is, a library is, increasingly, a budget for buying access to materials that have been hoarded behind a paywall.

Libraries used to make relatively scarce materials (there can only be so many physical copies of a book) available to the public (or a student body). Libraries are now in the business of managing—indeed, restricting—access to materials that exist in superabundance (there is no limit on the number of times a text can be downloaded from a website). That is, libraries today help us to maintain the illusion that knowledge is a scarce resource and, thereby, the privileges of the select few to present themselves as knowers.

But here's the thing. Knowledge isn't actually "stored" in the texts in the library. Knowledge has always existed mainly in the living conversation that goes on between knowledge-able people (people who are able to know). Traditionally, the library supported that conversation. It provided a place for it. It brought knowers together with the resources they needed to examine each other's ideas. I hope the library can regain that place one day.

Thursday, May 19, 2016

The University

Almost ten years ago, I found myself proposing that we stop complaining about the demand to "publish or perish". Instead, I suggested a "more constructive" approach: we could accept that our administrators have time to take only a superficial interest in our work; then we could set ourselves to the task of addressing our readers. This morning I took the further, more radical step of proposing that we do away with academic publishing. This raises the question of how academics should be evaluated for purposes of hiring and promotion. The role of publishing in these decisions, after all, is the main source of its power.

The system I propose is one in which candidates for positions and tenure submit hypertext CVs that link to work they have published on a personal website as well as work that cites them (published on other, also personal, websites). The CV would, ideally, have a "narrative" form, in which the candidate summarizes "the story so far", noting the engagements during which they made their most impressive contributions, acknowledged by the most prominent members of their field. This process would in fact begin with the PhD defense, which could take as its point of departure a "dissertation", also published online as a hypertext. Since post-graduate degrees, as well as academic hiring and promotion, are all decided by committee anyway, all of this writing, along with the candidate's online demeanor and rhetorical posture, would be "peer-reviewed" in a highly transparent manner. I do not think any information that is relevant to finding and rewarding competent academics would be lost by removing the evidence of having survived the torture of the increasingly arbitrary process of soi-disant "peer-review" that does the gatekeeping in our for-profit journal literature at present.
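To make this a little more concrete, here is a sketch, in plain html, of what a single entry in such a narrative, hypertext CV might look like. Every name, title, and address in it is a placeholder of my own invention, not an existing page.

    <!-- cv.html: a hypothetical fragment of a narrative, hypertext CV -->
    <h2>2014-2016: The measurement debate</h2>
    <p>
      My <a href="https://example.edu/~candidate/measurement.html">paper on measurement</a>
      drew a <a href="https://example.org/~critic/reply.html">detailed reply</a>
      from a senior colleague; my
      <a href="https://example.edu/~candidate/rejoinder.html">rejoinder</a> closed the exchange.
    </p>

A committee could follow those links straight to the work and to the engagement with it, which is, after all, what it is trying to assess.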

"If its object were scientific and philosophical discovery," said John Henry Newman in 1852, "I do not see why a University should have students." A century and a half later, I found myself making the same point when the marketing director of a global consultancy helpfully suggested that academics need to have a "product launch". We already have one, I said: it is called "commencement". Universities do not, at least not primarily, "produce knowledge" in the form of novel ideas that can transform reality, or radical theories to guide new practices. Rather, universities produce knowledge-able students, people who are able to to know things. The graduates of the variously famous, variously "elite" schools are well-known for their relative competence in this regard, though I'm sure employers are looking at our universities with increasing concern.

We might say that "academic knowledge" is precisely the sort of thing that can be imparted to young people between the ages of, say, 17 and 25 during four more or less consecutive years of full-time study. After such a program a certain degree of ignorance (on particular subjects) should rightly be an embarrassment for the individual and a scandal for the institution that conferred a certificate of academic achievement (i.e., a university degree). The primary duty of university administrators is to arrange those four years of study in the most effective way possible.

This requires bringing the right sorts of students together with the right sorts of teachers. A university, let's say, exists at the intersection of a set of admissions requirements and a set of hiring practices. This intersection is then governed by what is called "the curriculum". It is essential that the student's mind is prepared to receive the curriculum, but it is also essential that the teacher's mind is qualified to present it. We look to university administrators to make sure that the right students arrive in the right classrooms to sit before the right teachers. Though I won't belabor the point here, I am increasingly worried that administrators today see their mission more as micromanaging what happens in the classroom, on the (increasingly accurate) presumption that a good many of the students and teachers don't really belong there.

Universities should be places where intelligent people satisfy their curiosity. The degree of intelligence and the intensity of curiosity simply indicate the quality of the university. The reason that university teachers should also be researchers (scholars, scientists) is that the students should be guided in the satisfaction of their curiosity by people who have made a habit of satisfying their own. They should not be guided by people who have made a habit merely of "writing for publication". Like I say, it is not difficult to imagine a system of hiring and promotion that would put the right sorts of mind in front of students. Sadly, it is also not difficult to see that we are a far cry from it.

The Internet

The Internet as we know it, i.e., the World Wide Web, is the brainchild of Tim Berners-Lee (1989)*. He invented it specifically to make communication between scientists easier. Indeed, it is my view that, in the early 1990s, technology had made academic publishers largely obsolete. The scientists themselves, supported by their librarians, now had the means to communicate their results to each other, and to discuss their validity, without the need of a for-profit press. No doubt there would still be a market for books of exceptionally high quality. But there was no longer any need to maintain an expensive publishing and indexing infrastructure to support the workaday communication of ideas and observations.

Consider what the World Wide Web is. It is a collection of "pages" that is accessible to anyone with an internet connection. "Publishing" something is no more complicated than saving a file of a particular type in a particular directory on a particular server. The skills needed to make the file (the "web page") are not hard to learn. Coding a document in html is no more difficult than using Word to set up a document to conform to a journal's formatting guidelines or constructing a proper APA-style reference. Most importantly, any page on the Web can be linked easily to any other page. That is, any result that is being communicated can be linked to any previously published result. Everything is available at the click of the proverbial mouse.
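If the simplicity of this is hard to believe, here is roughly what such a "publication" could amount to: a single html file, saved to a directory on a server. The title, the finding, and the linked address are all invented for the sake of illustration.

    <!-- results.html: a hypothetical page saved to a researcher's directory on a university server -->
    <html>
      <head><title>Observations on reading habits</title></head>
      <body>
        <h1>Observations on reading habits</h1>
        <p>Our small survey suggests that scholars read rather more than they cite.</p>
        <!-- a link to a previously published result; the address is a placeholder -->
        <p>This extends the result reported at
           <a href="https://example.edu/~colleague/earlier-result.html">example.edu/~colleague/earlier-result.html</a>.</p>
      </body>
    </html>

Saving that file where the university's web server can find it just is publishing it; linking to it just is citing it.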

That this technology has not brought about a revolution in scientific communication—like I say, effectively the end of academic publishing—is a scandal and one that scientists—or should I say academics?—are complicit in. Berners-Lee invented a way to expose every idea currently held by scientists and scholars to the criticism of every other scholar, plainly and directly. It allowed academic communities to sort the wheat from the chaff of their research in their own way and at their own pace, while making both their knowledge and the basis of that knowledge available to anyone with the requisite interest and expertise, whether inside or outside the academy.

Instead we have a system that is better suited to extracting rent from scientific research than to contributing its results, as knowledge, to the culture. While the economic rents of course accrue to the publishers and their owners, the academics themselves are, like I say, complicit in the business. By tying their careers to their success in the for-profit "publish or perish" system, they ensure that the work of others becomes robustly path-dependent on their own work, and they allow each other to free-ride on a network of gratuitous citations that have very little to do with actually understanding how the particular corner of the world they are interested in works.

It is claimed that we can't imagine anything better. But this is simply not true. I talk to academics all the time, especially early-career academics, whose complaints about the problem of getting published most certainly imply a better way of doing things. It is simply this: PhD students should publish all their results on a webpage (hosted by their university, of course). Their committee should evaluate this page and grant or withhold their degree on this basis. The committee should then link, as they see fit, their own pages to the pages of the new doctor. The new doctor would immediately create a page that links back to these engagements with their work, which may of course be very critical, and which offer insight into how useful the contribution is.
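As a rough sketch, again with invented names and addresses, the new doctor's link-back page might be as simple as this:

    <!-- engagements.html: a hypothetical page collecting the committee's responses to the dissertation -->
    <h2>Engagements with my dissertation</h2>
    <ul>
      <li><a href="https://example.edu/~supervisor/assessment.html">The supervisor's assessment</a></li>
      <li><a href="https://example.org/~examiner/objections.html">The external examiner's objections</a> (sharply critical)</li>
    </ul>

Anyone who wants to know how useful the contribution is can read the criticism alongside the work itself.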

One of the pages that an academic will publish online will, of course, be their CV. It will include links to places on the Web where their ideas have been discussed. A hiring or tenure committee should have no difficulty evaluating these engagements, both in terms of the content and the context in which they go on. It will matter both what is being said, and who is engaging with those statements. Whether someone should be promoted or not, whether they should perish or persist, will be as clear as it could possibly be from their online presence. The correspondence between scholars and the coherence of their ideas would be made entirely transparent.

To implement this system would require the cooperation of university administrators and librarians. I'll write about this in my next two posts. Then I will withdraw from these "social"* media and practice what I preach on my own website*. It is a work in progress. As it should be.

_______
*One of these links is not like the other. Berners-Lee's proposal and my own website's homepage are simple html pages. Notice how quickly they load. Web designers talk a great deal about the look and feel of the "user experience". But their designs are inefficient and clumsy by comparison to the straight presentation of what you think. (A lot of it no doubt has to do with the need to "track" your browsing. And, often, the demands of advertising.) You can feel the difference simply by clicking on the "social" link to this blog itself.

Tuesday, May 17, 2016

Against Social Media

Since New Year's I've been having increasing doubts about the so-called social media. I include under this heading Twitter, blogging, YouTube and, of course, Facebook, but I think I can also offer a more functional definition. A website is part of the social media if it allows people to post, comment, and evaluate ("like", etc.) content as they please, with no editorial oversight. (A "moderated" forum or comment field does not count as editorial oversight.) Social media also implies an ability to track views. That is, when writing (or filming) for social media you are making a unilateral decision to express yourself, and will receive feedback (or not) in real time. Your audience's interest in your ideas is known to you, in an important sense, immediately.

When I say I am "against" social media, this is the sort of thing I'm thinking of. I've come to my low opinion of it honestly, of course. This blog is very much a social media experience, and until a few months ago I was also an avid twit. There are a number of blogs that I've at one time or another regularly commented on—indeed, I got my start in the blogosphere as a commenter on the blog of my favorite poet. I had a Facebook account for about a week at the very beginning, but something about it immediately put me off. I've tried Reddit and have, albeit very rarely, posted comments on the articles of news websites. I've also contributed to Wikipedia. In short, I know what I'm talking about.

What I've come to realize is that writing for an audience that is immediately present undermines my ability to finish a thought. We talked about this here at RSL a few years ago, when Oliver Reichenstein brought Kleist's essay on "The Gradual Perfection of Thought while Speaking" to my attention. My ideas are forming mainly to engage with what is "in the talk" (as Heidegger might put it) at the moment, not to make a more or less permanent contribution to the totality of what is known. I find myself writing and posting and then "bracing" myself for the reaction. I must confess I am bracing myself even as I write. That must have consequences for my style, and it has certainly had consequences for my thinking.

Back in the days of the Snowden revelations, I compared the feeling of being watched by the NSA to living in George Orwell's dystopian 1984. The most effective surveillance, however, is not that of the state but that of our fellow citizens. There's nothing really wrong with being accountable to our fellow humans, of course. The point is just to give each other some privacy. Social media, I fear, is eroding that privacy. In 1984, we should remember, "Nothing was your own except the few cubic centimetres inside your skull."