"The Only Responsible
Intellectual is One Who
is WIRED"

Mark Taylor/Esa Saarinen, Imagologies

Introduction

The tendentious title of my talk is taken from Imagologies, the 1994 book by Mark Taylor and Esa Saarinen, a kind of hyper-Platonic dialogue that aims to exemplify media-oriented philosophy. My purpose in citing the assertion is to interrogate it, and not necessarily to affirm it. The question it raises, and that I want to consider, is this: What is it to be an intellectual, what is it to be wired, and what does each mean in the light of the other?

What are my qualifications to address this question? I am a member of the English department: I teach graduate and undergraduate students, I edit a peer-reviewed scholarly journal, I direct a humanities research institute, and I write essays for publication in scholarly journals and books. On the wired side (and I hope that the remainder of this talk will convince you that this is a confession, not a boast) I have five phone numbers; I also have several times that many login accounts on networked computers, plus three or four personal computers (none of which I own). When I go on a trip, I take a laptop, and I check voicemail and email once or twice a day. I read two newspapers a day, watch TV news in the morning and listen to radio news in the evening. I get filtered newsfeeds daily by computer, too. When I'm home, I'm rarely out of touch with these forms of communication for more than six hours at a time: I receive fifty or more email messages a day, probably a dozen phone calls, and one or two faxes. I spend something more than twelve hours a day intermittently interacting with computers. Most of my reading is done from the computer screen, often while doing something else at the same time (meeting, talking on the phone, watching television).

Frankly, I find it difficult to concentrate--and I'm not always sure that's a bad thing. In fact, I feel a pang of bad faith when I tell my thirteen-year-old son to turn off the rock and roll when he's doing his homework: in many ways, I feel that the most important skill he could acquire would be the ability to do his homework with the music--and the television, and the radio, and the phone--going at the same time. He's not growing up in a world where attention can be devoted to one thing at a time.

So: "the only responsible intellectual is one who is wired"? I believe this is true, in certain respects, but I also think it needs to be carefully examined, under the heading of three different, and sometimes conflicting, imperatives--namely:

These three--communicate, collaborate, contemplate--form a kind of mantra that helps, for me at least, to define the pressures that form and deform intellectual life.

The Cultural Imperative: Communicate

In Keywords, Raymond Williams notes that the word "culture" comes from a Latin root, colere, that means to inhabit, cultivate, protect, honor with worship; from the same root we get the words "colony" and "cult." The modern sense of the word "culture" as an independent, abstract noun describing "the works and practices of intellectual and especially artistic activity" does not become common until the mid 19th century, developing slowly and (as it were) organically from the original meaning of cultivating natural resources. Of particular interest here is the early example Williams cites from Milton's last pamphlet, "The Readie and Easie Way to Establish a Free Commonwealth," in which Milton writes of spreading "much more Knowledg and Civility, yea, Religion, through all parts of the Land, by communicating the natural heat of Government and Culture more distributively to all extreme parts, which now lie num and neglected." In this passage, the metaphorical sense of cultivation is present in the phrase "natural heat," but culture itself has migrated to the side of Government, and is yoked to the idea of education as an instrument of social control. In fact, culture's root-word cousin, "colonize," is present here as well, in the image of "extreme parts...now...num and neglected" and awaiting a civilizing influence.

Leaving aside the irony of the pamphlet's date (two months before the restoration of Charles II), what I would like to focus on is the early connection between our modern sense of the word "culture" and the idea of "communicating the natural heat of government and culture more distributively to all extreme parts"--the function now assigned to our various and layered communications networks, including book and magazine publishing but more literally the networks of broadcast radio and television, and most recently the internet.

What I want to propose is that the imperative to communicate is properly and almost pleonastically an imperative of culture--not of our cultural moment, but of the root idea of culture itself. The form that this imperative takes at present is the social imperative to "be wired"--with five phone numbers, internet access, cable television, and all the rest. If you doubt that this is a pervasive requirement, consider some anecdotal evidence: Bob Dole--Bob Dole--takes questions from voters online; in the comics, BC picks up a "shellular phone" off the prehistoric beach; in the back of a New York City cab, you will notice that the cab company has an address on the World-Wide Web.

Without a doubt, the most concise and elegant statement of the cultural imperative to communicate was AT&T's "you will" advertising campaign (on the air in the same year, 1994, in which Taylor's book was published, and accompanied by the inevitable URL: http://youwill.com, now defunct). You may remember the question-and-answer form of Tom Selleck's voiceover in these ads: "Have you ever wanted to go to a meeting in your bathrobe? You will!" It's worth noting the grammatical finesse that's practiced here: strictly speaking, the ad must mean "you will want to go to a meeting in your bathrobe," but in fact it poses a rhetorical question, the answer to which must be yes, and then offers the products to meet that hypothetical need. The point of the finesse is that, in fact, we should not ask ourselves "do I want to go to a meeting in my bathrobe," but should be swept up in the nervous realization that we ought to want this, that we need this need.

The ad campaign was intended to suggest the many ways in which AT&T's futuristic products and services (some actually available, all at least in development) would make our lives better: the spots were notable for their implicit definition of "better" as "more human"--almost all of them featured technology enhancing human contact: videophoning home to see the baby, distance education with an inspiring teleprofessor. D. N. Rodowick has performed an excellent analysis of these film texts in "Audiovisual Culture and Interdisciplinary Knowledge," originally published in 1995 in New Literary History and now available also on the Web, in which he argues that there are

"three fundamental questions we need to ask in order to understand audiovisual culture:
  1. First, how is the form of the commodity changing along with its determinations of the space and time of the market, and the nature and value of exchange?
  2. Second, how is the nature of representation and communication changing with respect to the digital creation, manipulation, and distribution of signs?
  3. And finally, how is our experience of collectivity changing in this new audiovisual culture, and how is our collective experience of social time and space restructured by the communicational architecture of audiovisual culture?"

In addressing the first of these questions, Rodowick makes a point I would like to underscore here:

"...the idea of "free" or measureless time is disappearing. In fact, time is becoming increasing commodified in a number of ways. Commercial broadcasting and telephony are again the innovators here....the value of access to information or entertainment via cable or telephone lines is determined not by spatial quantity--weight, volume, or number--rather, it is measured by units of time. Alternatively, the value of services is measured by the time they "create." The idea of "free" time as a commodity has a paradoxical status, then, since it assumes that time indeed has a value that is quantifiable and tenderable in a system of exchange."

Those of you who are familiar with the writings of the Frankfurt school will recognize in Rodowick's assertion an idea first expressed by those cultural critics, coincident with the birth of mass media, namely, the idea of leisure as a commodity: take, for example, this passage from Adorno and Horkheimer's essay on "The Culture Industry" in The Dialectic of Enlightenment:

"Something is provided for all so that none may escape.... Everybody must behave (as if spontaneously) in accordance with his previously determined and indexed level, and choose the category of mass product turned out for his type."

In short, "You WILL." For Adorno and Horkheimer, "culture as a common denominator already contains in embryo that schematization and process of cataloging and classification which bring culture within the sphere of administration"--an administration most effectively managed by mass media, which communicate "the natural heat of Government and Culture more distributively to all extreme parts, which now lie num and neglected."

The cultural imperative to communicate is also, then, an economic imperative, and as such it embodies capitalism's drive to colonize and create new markets: in the present moment, our private lives, our "free time," are the "num and neglected" territory, and the wiring of that territory is late capitalism's project. The issue is not whether you will have access to information, but whether information will have access to you--at the beach, on the road, anywhere anytime. That's not to say that access will be uniform, universal, and classless: it is still true, as Adorno and Horkheimer point out, that "the standard of life enjoyed corresponds very closely to the degree to which classes and individuals are essentially bound up with the system....and apart from certain capital crimes, the most mortal of sins is to be an outsider." But academics are without question an information-rich class, and the enterprise of academic humanism is inescapably tied up in this same imperative to communicate. Therefore, far from being able to turn away from the problems of being wired, we have to face the contradictions, compromises, and constraints it entails.

The Professional Imperative: Collaborate

The word "collaborate" has an interesting double meaning: to cooperate traitorously with the enemy, and to work in conjunction with others, especially in literary or artistic endeavors. Both senses are available in the professional imperative I have in mind, and the key is to understand which is which.

If the current meaning of the word "culture" precipitated in the 19th century, as Williams suggests, then its most effective proponent was certainly Matthew Arnold. In the famous "sweetness and light" chapter of Culture and Anarchy, Arnold writes that

"the great men of culture are those who have had a passion for diffusing, for making prevail, for carrying from end of society to the other, the best knowledge, the best ideas of their time; who have laboured to divest knowledge of all that was harsh, uncouth, difficult, abstract, professional, exclusive; to humanise it, to make it efficient outside the clique of cultivated and learned, yet still remaining the best knowledge and thought of the time, and a true source, therefore, of sweetness and light."

This is, if anything could be, the statement that defines the mission of the academy in the 19th and 20th centuries: in spite of post-structuralism, nihilism, postmodernism, and other forms of negative capability, most of us would still admit to sharing at least some of Arnold's ideals. Even Mark Taylor, whose Imagologies is clearly intended as a provocation to his profession--stuffed with epigrams like "The imagination must be undisciplined. That is why the university can't bear it"--even Taylor turns out to be pursuing a version of the Arnoldian program, as, for example, when he says: "if the young are not reading as much as we would like, the problem might be the way we are writing. Perhaps if we were to write televisually, kids would want to read more."

Here's the problem, then: if communication is the vector of transmission for culture, and if culture is the means by which a social and economic order reproduces itself, how are we to avoid collaborating in the injustice and mindless affirmation of that enterprise, especially considering that we are, more than most, subject to the wiring of our free time? Shouldn't we, instead, resist being consumed by communication?

Honestly, I don't think that's an option: professionally speaking, we exist to communicate, and denying that would be to deny that there is a value in cultural activity. We are responsible for interrogating, understanding, yes replicating but sometimes changing the way we value culture, and for that reason, I agree with Taylor's dictum: the only responsible intellectual is the one who is wired. That's not to say that all wired intellectuals are responsible, but it does imply that the core of our activity is communication, and therefore we have to engage the new forms and methods communication is now taking. It would be extremely irresponsible to categorically condemn or abjure any one of these channels: the effect of doing so would not be to improve the culture, but to ensure that we are left out of its continually contested development.

There is no doubt, though, that engaging with the wired world will change what it means to be an intellectual, an academic, a scholar, a critic, a humanist. I think we can already see what one of these changes will be, and I think it involves the second, more positive sense of the word "collaborate." Specifically, I think that the character of academic work in the humanities is already in the process of shifting from a cooperative to a collaborative model: in the cooperative model, the individual produces scholarship that refers to and draws on the work of other individuals; in the collaborative model, one works in conjunction with others, jointly producing scholarship that cannot be attributed to a single author. This will happen, and is already happening, because of computers and computer networks. Many of us already cooperate, on networked discussion groups and in private email, in the research of others: we answer questions, provide references for citations, engage in discussion. From here, it's a small step to collaboration, using those same channels as a way to overcome geographical dispersion, the difference in time zones, and the limitations of our own knowledge.

There is another reason, more compelling than the ease of communication, for predicting that computers will make us work collaboratively. Computers make it possible to pose questions, to frame research problems, that would otherwise be impossible to imagine. The computer provides us with the ability to keep track of enormous amounts of information, to sort and select that information rapidly and in many different ways, and to uncover in reams of mute data the aesthetically and intellectually apprehensible patterns on which understanding depends. But in order to take advantage of these capabilities, we first have to gather and structure the data: this requires collaboration of two sorts. First, because of the sheer size of the undertaking, it requires collaboration with colleagues in one's discipline: it takes many hands to assemble the enormous quantities of raw data on which this kind of research depends. Second, it requires collaboration with professionals of another sort, namely computer professionals. It may be the case, at some point in the utopian future, that computers will understand people; for now, we need to do this work in conjunction with people who understand computers, and who can help us to make them do what we want them to. This is more true the more we depart from the kind of operations on data that are current in the world of business, science, and entertainment. Spreadsheets, database programs, and multimedia authoring systems are tools adapted to the analytical and communicative practices of those worlds, and to the extent that our needs fail to fit those models, these tools will be useless. The computer, however, is at bottom a general purpose modelling machine, and with the right collaborators, we can use it to model analytical and expressive practices not yet imagined by Lotus, Microsoft, or Disney.
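To make the point concrete, here is a minimal sketch, in Python and with entirely invented data, of the kind of gathering, sorting, selecting, and pattern-finding over structured records that this paragraph describes; the record fields, values, and queries are hypothetical illustrations, not any particular project's design.

```python
from collections import defaultdict

# A hypothetical, hand-built "corpus" of structured records: the kind of raw
# data that many collaborating hands would first have to gather and encode.
letters = [
    {"author": "A. Brown", "year": 1851, "place": "London", "words": 412},
    {"author": "C. Dunn",  "year": 1848, "place": "Boston", "words": 1093},
    {"author": "A. Brown", "year": 1853, "place": "Boston", "words": 287},
    {"author": "E. Field", "year": 1851, "place": "London", "words": 640},
]

# Sort the same data "in many different ways": chronologically, then by length.
by_year = sorted(letters, key=lambda r: r["year"])
by_length = sorted(letters, key=lambda r: r["words"], reverse=True)

# Select rapidly by whatever criterion a researcher cares to pose.
london_1851 = [r for r in letters
               if r["place"] == "London" and r["year"] == 1851]

# Uncover a simple pattern in the "mute data": words written per place and year.
totals = defaultdict(int)
for r in letters:
    totals[(r["place"], r["year"])] += r["words"]

print(by_year[0]["year"])   # earliest record in the corpus
print(london_1851)          # everything written in London in 1851
print(dict(totals))         # an aggregate pattern across the whole corpus
```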

Collaboration may well make us uncomfortable, since it implies dependence on others and the consequent loss of autonomy. In exchange, though, we get a vastly expanded territory of intellectual inquiry: instead of concentrating on major events, historians can examine and compare the lives of individuals; instead of establishing a single text, editors can present the whole layered history of composition and dissemination; instead of opening for the reader a single path through a thicket of text, the critic can provide her with a map and a machete. This is not an abdication of the responsibility to educate or illuminate: on the contrary, it engages the reader, the user, as a third kind of collaborator, a collaborator in the construction of meaning. This third kind of collaboration is, I want to emphasize, a game with a net: the reader's participation is bounded by the perspective of the researchers and the availability of information; the result can be an understanding of the practice of the discipline, on the part of the reader, that is experiential rather than received; it can also be a conclusion unforeseen by the researchers yet supported by the data. I'd go even further and argue that it is our responsibility, not only to provide the opportunity for this kind of collaboration in our research, but to teach our students to work collaboratively with one another in our classes: this will be the way they work when they leave the university, even if they enter our profession.

These are good things, and good reasons to collaborate with one another, with computer professionals, and with our readers. To do that, we need to be wired. Unfortunately, the cultural imperative to communicate and the professional imperative to collaborate are in more or less direct conflict with the final, vocational imperative to contemplate.

The Vocational Imperative: Contemplate

I'm using the word "vocational" here not to refer to a job, but to refer to a calling. The academic, the intellectual, the monk, the philosopher (to rewind the job title from the present to its origins) has always been a contemplative calling. That this is still the case is made clear in the term "ivory tower" and in the contrast frequently drawn, by us as well, between "the real world" and ours. And in fact it is true: in order to think, and to write, you need time to yourself. And yet, it is precisely this kind of privacy that is targeted now as the last new market. The difficulty, then, is to meet the responsibility to communicate and take advantage of the opportunity to collaborate, while avoiding the continual distraction of market-driven communication--to preserve, in short, a space for the kind of uninterrupted contemplation that is required to create or to analyze.


Give me, kind Heaven, a private station,
A mind serene for contemplation!
Title and profit I resign;
The post of honour shall be mine.

In their original context, Gay's lines suggest that we opt out of public life and its rewards in order to pursue the more honorable, if less remunerative, activity of the philosopher. But now that the private station is in fact a workstation, and the private sphere has been irreversibly colonized by the machinery of communication, we have a different problem, namely whether "a mind serene for contemplation" is a possibility at all. I take this question very seriously, because I think it is both inevitable and in many ways desirable that we should be wired, and because I understand from experience the exposure this entails.

The struggle to preserve a space for contemplation is, necessarily, a personal one, and therefore my statement of the problem may strike you as overly, and overtly, autobiographical: don't bet on it. For better or worse--for better and worse--I think it is a central problem for the humanities in general, for the enterprises of scholarship and criticism, and perhaps most of all for contemporary cultural studies, where the scene of communication itself is always in some sense the subject of analysis.

What to do? Take the phone off the hook every afternoon? Refuse to check email after dinner? Leave the laptop behind when you go to the beach? Perhaps, and perhaps the corollary to Taylor's assertion that "the only responsible intellectual is the one who is wired" would be the qualifying statement that "the responsible intellectual is not wired all the time." I'd like to think that is a possibility, that we could, individually and collectively, say "you wish!" to AT&T's "you will," without at the same time forswearing communication altogether.