Technology In Literature Essay

Information technology made Plato anxious. Writing, he feared, would lead people to abandon their memory, to trust in “external characters which are no part of themselves.” Now we find ourselves living through a new revolution in information technology, one with consequences every bit as dramatic and likely even more profound. How could we not be anxious? Our old ways of communicating are either becoming obsolete or finding themselves dramatically ‘repurposed’ before our very eyes.

            Including the grandest one of all: literature.

            Literature is one of those categories that have vexed the human intellect for centuries. Typically we think of the classics – Shakespeare, Melville, Joyce, and so on – when we think of literature. If we don’t know exactly what it is, we like to think we know what it looks like. In other words, we use resemblance as our primary criterion. And indeed when you look at the output of contemporary literary authors you find no shortage of family resemblances: lyricism of prose, thematic sophistication, quotidian subject matter, and of course the all-important yen for experimentation.

            The morphology of what we like to call literature has remained fairly stable since at least the beginning of the twentieth century. The ‘norms of representation’ have been smashed and gratuitously rearranged; the protagonist has been subjected to endless sessions of existential water torture; the language has been stripped pornographically bare and heaped with gaudy ornamentation, again and again and again. All the patterns have become easily recognizable, so much so that you can typically identify a literary piece within the first few sentences of reading. Literature, as it is typically understood, is a very distinct cultural animal. Most of us can smell it even before it comes into view.

            The problem, I would like to argue, is one of habitat. The fact is, the baroque morphology of literature belongs to a far different social and technological environment than our own. We are presently witnessing what is already the most profound transformation of human communication in history (short of the written word, maybe). The internet, the smartphone, the tablet, satellite and cable on-demand television, market segmentation, algorithmic marketing: the list of game-changers goes on and on. Make no mistake, we are talking about social and semantic habitat destruction without compare. The old rainforests of culture have been cleared away, and literature, with its prehensile hands and brachiating arms, now reaches for heights it can no longer climb and stares into distances it can no longer see.            

             No generation has witnessed such a sudden change in cultural environment, period. And yet, if anything, the health of the literary animal seems entirely unaffected. When Professor John Mullan of University College was recently asked by The Guardian to provide an overview of the ‘state of British literary fiction,’ he called it “one of the most extraordinary publishing phenomena of recent decades.”

            Mullan paints his own picture of social transformation, one where the slow trickle of writers and readers through the post-secondary bottleneck has managed to rewrite the culture of reading. On the composition side, he notes the explosion in creative writing programs, and how almost all writers of literary fiction have some sort of university background. On the reception side, he notes that “there are more graduates from literature, especially English literature, degrees than ever.”

            The situation is precisely opposite what Alvin Kernan predicted in The Death of Literature some twenty years ago: far from killing literature (by adopting postmodern critiques of its rationale in a time of profound social change), academia has transformed it into a cultural juggernaut. In the course of teaching theory and the classics, universities have inadvertently produced both the suppliers and the consumers of literary fiction, to the point where work that was once the province of intellectual avant-garde movements now enjoys mass consumption and pride of place in many media. The results are so profound that Mullan dares imagine the unthinkable: that far from retreating “before the forces of electronic media and consumer idiotism,” higher literacy is carrying the day.

            Assuming that this account applies to the whole English-speaking world as much as Britain, you might say that the literary animal is flourishing. Somehow, the implication seems to be, the ongoing communication revolution has all but passed literature over, allowing an old institution, the university, to bring about a happy revolution all its own. Far from threatened with extinction, literature is thriving in the age of information technology…

            So why does it all feel so, well, dusty?

            To be sure, not everyone in the literary world shares Mullan’s triumphal outlook. The sales figures may be difficult to argue with, but for many this is more cause for worry than celebration. In his notorious “Where Have All the Mailers Gone?” Lee Siegel declares that “fiction has become a museum-piece genre,” that readers wanting to be challenged and illuminated had better turn to nonfiction. In his most recent interview in The Guardian, Gabriel Josipovici, author of What Ever Happened to Modernism? claims that the recent efflorescence so extolled by Mullan is little more than “prep-school boys showing off.”

            A kind of shadowy consensus has grown among certain critics and academics that something has gone drastically wrong in the world of literature, that far from healthy, the literary animal is in fact dead or on death’s door. Everyone has their own diagnosis: for Siegel it is the professionalization of what should be a vocation; for Josipovici it is a failure of nerve and imagination in the face of market temptation. But for almost all of them, the problem is that literature, despite all the ways it resembles literary works from days gone by, no longer does what it once did. Where’s the scandal? Where’s the daring? The revelation?

            The tendency among these critics is to gloss the communications revolution and blame the practitioners, to think the problem is primarily one of execution. Literature isn’t doing what it’s supposed to do because contemporary literary writers and editors are too institutionalized, too timid or too inept. But what if the old morphology is to blame?

            What if information technology has so transformed the social and economic conditions of literature that the old forms are simply no longer capable of reliably producing literary effects?

            In order to be stable, communication must mutually benefit both the sender and the receiver, otherwise the incentive to communicate evaporates. Receivers typically assess the value of any communication through what is called trust calibration, where we evaluate the motives of the sender, and coherence checking, where we evaluate the ‘fit’ between the message and our background beliefs. If a cold-calling salesperson makes a pitch, we close the door because we don’t trust their motives. If an otherwise trusted friend tells us something we think outlandish, we change the topic to avoid arguing at the dinner table. All communication is biased toward ingroup identification and a shared background of beliefs and assumptions.

            We have a strong inclination, in other words, to ‘talk amongst ourselves.’
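The two filters just described, trust calibration and coherence checking, can be sketched as a toy scoring function. This is a loose illustration of the essay’s model, not any real system; every name, number, and threshold here is invented:

```python
# Toy model of the two filters the essay describes: trust calibration
# (evaluating the sender's motives) and coherence checking (evaluating
# the 'fit' between message and background beliefs). All values invented.

def trust(sender_motive: str) -> float:
    """Crude trust calibration: how much do we trust this kind of sender?"""
    return {"friend": 0.9, "stranger": 0.5, "salesperson": 0.1}.get(sender_motive, 0.5)

def coherence(message_beliefs: set, background_beliefs: set) -> float:
    """Crude coherence check: what fraction of the message already 'fits'?"""
    if not message_beliefs:
        return 1.0
    return len(message_beliefs & background_beliefs) / len(message_beliefs)

def accept(sender_motive: str, message_beliefs: set, background_beliefs: set) -> bool:
    """A communication is 'stable' only when both filters pass."""
    return (trust(sender_motive) > 0.3
            and coherence(message_beliefs, background_beliefs) > 0.5)

# The cold-calling salesperson fails on trust, however coherent the pitch:
print(accept("salesperson", {"good deal"}, {"good deal"}))       # False
# The trusted friend with an outlandish claim fails on coherence:
print(accept("friend", {"aliens built it"}, {"physics works"}))  # False
```

The point of the sketch is simply that both filters must pass at once, which is why communication drifts toward the ingroup, where both pass by default.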

            As antithetical to ‘unfettered creative expression’ as this social psychological approach sounds, it actually provides a clear way to understand something essential to literary communication. Literature, you could say, is the kind of narrative message that challenges rather than reinforces our background assumptions. If a given form of narrative reinforces assumptions, then it is quite simply not literature, no matter what it resembles. This is why we think literature has a special relationship with risk: a literary communication is one where the sender actively works against the coherence of his or her message relative to some reader. It is inherently unstable.

            Or should be.

            This is the reason we should be suspicious of the stability of the happy picture offered by Mullan. In Mullan’s account, literary fiction has evolved into what could only be called a spectacular ingroup exercise: thousands of university trained writers writing for millions of university trained readers. As a product of the same institution, the sender can be trusted to provide content that will readily conform to the receiver’s background beliefs. No matter what purported difficulty they encounter, they can be sure that it will fit. In Mullan’s account, the literary animal is so healthy simply because it lives in a communicative zoo, a place where no one need fear that the animal does anything really unexpected because everyone has been trained to anticipate its wiles.

            Human beings are parochial, blinkered creatures, loath to relinquish any number of injurious views no matter what their political stripe. The social value of literature has always turned on its ability to reveal and mitigate these shortcomings, to ‘shake things up,’ and so, bit by corrosive bit, effect cultural reform. But doing this requires forming stable communicative relationships despite the absence of ‘fit’ between the sender’s and receiver’s default assumptions. Not an easy thing to do. This is why ‘finding the reader’ has always been the great problem faced by literary fiction, so much so that posterity is ritually called upon to redeem its insularity: as a form of communication antagonistic to existing conditions of communication, it often has to wait for the rest of the world to catch up.

            And this, I want to argue, is where the information revolution becomes a fundamental game-changer.

            Time and place have always been the great communicative constraints. Before the advent of writing, senders and receivers always had to communicate face to face. Writing more or less banished time from the equation, and minimized the importance of geography to a certain degree. The printing press revolutionized the economics, and therefore the efficiencies, of this first great transformation. And now, with information technology, both time and place have been rendered moot, more or less. We can receive communiques from Plato anywhere, at any time.

            The great communication constraint of today has to do with sorting, finding those communications that you want in an ocean of shouting pixels. Whole industries have sprung up around the problem of finding in the internet age. And with them, the old world of connecting suppliers and buyers has been utterly swept away.

            Armed with ever more sophisticated ways of gathering consumer information, and ever more powerful mathematical tools for mining and interpreting that information, suppliers have been able to segment markets and target buyers in ways their business forebears could scarce imagine. The tools have become so powerful, in fact, that many commentators, like Stephen Baker, author of The Numerati, worry we are turning ourselves into ‘data serfs,’ slaves to the very systems that anticipate our merest desires. For the bulk of human history, need has driven the economic connection of supplier and buyer. The industrial revolution ushered in the advent of want as the main economic driver. We are now entering what might be called the Age of Whimsy.

            As a luxury good, the literary novel is an artifact of the Age of Want, a time when suppliers could only connect with buyers in bulk, lumping large populations together in the hope of hitting ‘targets’ they could never definitively define. Relying on ‘hunches’ rather than hard data, suppliers had to take a ‘shotgun’ approach. The result was a far more amorphous marketplace, one where the chances of forming less than optimal supplier-buyer connections were relatively high.

            In the publishing industry, the connection of suppliers and buyers is at once the connection of senders and receivers, simply because this latter, communicative connection is the very commodity supplied. The ‘misses’ of the former actually facilitated the possibility of less-than-stable connections between senders and receivers. The literary writer could, as the truism goes, ‘write for themselves,’ according to their own want and whimsy, confident that the inefficiencies of the system would allow them to ‘find their reader,’ receivers with incompatible background beliefs. At the same time, you might imagine that buyer-receivers, who were accustomed to misses, would be more prone to forgive discrepancies, to ‘settle’ for less than stable communicative relationships and so be more open to literary experiences.

            The last two decades have all but swept this social and economic environment away. The kinds of preference parsing algorithms behind Amazon’s ubiquitous, ‘You might also like…’ feature allow suppliers to target buyers with uncanny accuracy and provide us with exactly what we want. The problem is that we want to be right. Even though challenging background beliefs typically benefits everyone, human beings are averse to criticism. We are literally hardwired to seek out confirmation and to overlook or dismiss incompatible information. As a consequence marketing algorithms such as those employed by Amazon typically connect readers with novels that accord with their attitudes and assumptions.

            The ‘flat world,’ it turns out, is an increasingly sycophantic one.
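The preference-parsing idea behind a ‘You might also like…’ feature can be sketched in miniature as item co-occurrence counting: readers who bought X also bought Y. This is an illustrative toy, not Amazon’s actual algorithm, and the baskets and titles below are invented:

```python
# Minimal 'you might also like' sketch: rank titles by how often they
# co-occur with a given title in purchase baskets. Data is invented.

from collections import Counter
from itertools import combinations

purchases = [
    {"Novel A", "Novel B"},
    {"Novel A", "Novel B", "Novel C"},
    {"Novel A", "Novel C"},
    {"Novel D"},
]

# Count how often each pair of titles appears together in a basket.
co_occurs = Counter()
for basket in purchases:
    for pair in combinations(sorted(basket), 2):
        co_occurs[pair] += 1

def you_might_also_like(title: str) -> list:
    """Rank other titles by how often they were bought alongside `title`."""
    scores = Counter()
    for (a, b), n in co_occurs.items():
        if a == title:
            scores[b] += n
        elif b == title:
            scores[a] += n
    return [t for t, _ in scores.most_common()]

print(you_might_also_like("Novel A"))  # ['Novel B', 'Novel C']
```

Even this toy exhibits the sycophancy the essay describes: the recommendations for any title are, by construction, the titles most like what its readers already chose.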

            In the Age of Whimsy, the ever increasing efficiency with which suppliers connect with buyers assures that ‘writing for yourself’ amounts to writing to people like yourself, to people who (thanks to the indoctrinating power of the university system) share the bulk of your values and attitudes. ‘Writing for yourself’ now means writing books entirely amenable to trust calibration and coherence checking, and so forging communicative relationships as stable as any other form of commercial fiction.

            To ‘write for yourself,’ you might say, is in the process of becoming indistinguishable from ‘selling out.’ Literary fiction is becoming precisely what you might expect given the way information technology is transforming markets: a fixed form with a dedicated audience.

            One genre among many.

            In other words, writing literary fiction today amounts to writing entertainment in the guise of writing literature. Some authors, such as Jonathan Franzen, have retreated from the lofty concepts of our recent literary past, realizing that things have changed. Others, like Tom McCarthy, persist in making the same old claims and pronouncements, and talk of ‘disrupting’ a culture of receivers with which they have little or no connection. More and more, you find references to what might be called the ‘Ideal Philistine’ in literary culture, to people with dissenting beliefs who would be challenged by literary works, were they to read them.

            Where some have given up the literary ghost, others simply pretend that nothing has changed.

            Does this mean the information revolution has rendered genuine literary communication impossible? Not at all. Just as dramatic environmental change begets evolutionary innovations (like us), literary writers actually find themselves in a time of profound opportunity. Even as technology threatens the old literary animal with extinction, it has provided powerful tools for the evolution of something new, and perhaps even better.

            The primary dilemma for the contemporary literary author is simply this: how do you find a reader who doesn’t necessarily want to find you?

            The luxury of ‘writing for yourself’ is simply no longer an option. As should be clear by this point, the worst possible thing one could do is write literary fiction, serving a market where almost no one is challenged and nearly everyone is gratified. You need to be both more expansive and more savvy.

            So how do you find readers who don’t necessarily want to find you? In the absence of all the old inefficiencies, the literary author has to exploit the efficiencies of the new marketplace. Despite the dire pronouncements of recent years, the ‘reading public’ exists the same as before: according to the American Association of Publishers, 2010 book sales actually rose 3.6% over the 2009 calendar year. What has changed is all the socio-economic machinery between the author and the reader, machinery that the former can no longer afford to ignore. Since a work only produces literary effects relative to some audience of readers, literary authors need to know their readers. They need to identify audiences possessing dissenting values and attitudes. Then they need to either hijack or embrace the narrative forms most commonly marketed to them.

            This means all the old and largely unfounded prejudices against genre fiction must be set aside. Genre only seems antithetical to ‘literature’ because the literary have turned it into a flattering foil, abandoned it, in effect, leaving a rhetorical fog of self-congratulation in their wake. In my own case, I chose epic fantasy because I knew the best way to provoke readers with a narrative meditation on the nature and consequences of belief was to reach actual believers. And provoke I did. Other writers, like China Miéville, M. John Harrison, Gene Wolfe, John Crowley, to name just a few, are doing the same thing, producing work that is obviously literary, openly provocative, yet unheard of in literary circles for the simple sin of wearing the wrong generic skin. These are the writers who are genuinely shaking things up, as opposed to hawking intellectual and aesthetic buzzes inside the literary echo chamber.

            Commercial genres must be seen for what they are, relatively fixed channels of communication to relatively dedicated audiences, not as ‘cages’ preventing some mythic ‘free expression.’ All channels of communication force senders to ‘play the game’ to reach a given group of receivers. English is such a game. The rules only seem coercive, ‘like work,’ when you don’t enjoy the game or if you think it’s ‘stupid’ or ‘beneath’ you. The literary author has to move past these old and embarrassing conceits. The idea is to play the margins, to play the game well enough to be identified as a ‘trusted sender’ by the receiver, all the while exploring ways to challenge their background assumptions.

            This is no easy task. Luckily, information technology has brought about a curious and potentially revolutionary reversal of the roles traditionally assigned to writers and readers. Before the internet, writers were almost exclusively senders and readers were almost exclusively receivers. The effort required to contact an author effectively restricted communication to ‘fan mail’ and ‘kaffeeklatsches.’ This assured that most of the feedback a writer received would be complimentary, something useful for motivation perhaps, but not so useful for calibrating communicative tactics. Now, every author living is simply one ‘vanity google’ away from all stripes of unfiltered feedback from blogs, messageboards, and special interest sites (such as Goodreads).

            The internet allows the contemporary author to understand their readers better than at any time in modern history, simply because it allows them to literally see the consequences of their artistic decisions. This can become something of a masochistic exercise, to be sure, but if you are serious about writing something that actually challenges actual readers without scaring them away, then access to this kind of information is invaluable. Senders no longer have to rely on blind guesswork. In my own novels I have used the internet to craft everything from storylines that collapse pulp into philosophy, to protagonists designed to simultaneously gratify and deny the kinds of wish-fulfilment that underwrite ‘character identification’–things that no English department in the world teaches, let alone considers.

            The internet, in other words, allows the contemporary literary author to run genuine experiments. The old literary use of the term ‘experiment’ was largely specious: formal innovations in the absence of consequence testing can only be ‘for their own sake,’ or the sake of readers who have been trained to expect them. Thanks to the internet, I have been able to develop a fairly detailed understanding of which experiments have failed and which have succeeded. Once you adopt a genre as a vehicle for expression, everything becomes a matter of give and take. Some points are simply not worth scoring because they crash your communicative relationship with too many readers. Some tactics allow you to get away with ideological murder, if executed with enough elegance and momentum. Others end up having the exact opposite effect you intended!

            If there’s one thing the internet shows you as a writer, it’s that there is no such thing as ‘the Reader.’ As a writer you are communicating to populations of readers. And as a genre writer, you’re communicating to populations of readers with a far more eclectic set of background beliefs than you could ever hope to find in the ‘literary mainstream.’ Genre, in fact, is where you find almost all the people who disagree.

            There’s a reason why only Harry Potter gets burned anymore.

            My argument is simple: To thrive in the fluid, multifarious information habitat of today, the literary animal must become a chameleon. Authors who want to be part of the cultural solution can no longer trust in posterity or the ‘power of their art’; they have to game the new social, economic, and technological conditions of their practice. Either you stick with literary resemblance, gratify your tastes and sense of superiority, and simply entertain (which is quite alright, so long as your rhetoric reflects as much), or you get serious about literary effects and begin creating the new, many-coloured literature of the information age.

            Even if you disagree with my analysis, there can be no doubt that the consequences of information technology imperil literature in a multitude of ways, only a few of which have been considered here. The threat is existential. Literary culture must reinvent itself or risk extinction: there can be no question about this.

            But will it?

            If Mullan is right, and universities are the primary engine of contemporary literary culture, then the prospects are dim simply because of the way academia is entrenched outside the demands of mainstream society. Short of some sweeping, generational change in ideological fashion, it has the demonstrated capacity to cling to its values, no matter how maladapted, in perpetuity.

            The fact that these values are so flattering, that readers and writers of literary fiction are so prone to identify themselves (despite their complicity) against ‘consumer idiocy,’ will only make them that much more difficult to dislodge. Concepts are bigots: if you identify yourself as literary, then you will automatically and unconsciously sort the ‘serious’ from the ‘silly’ in ways that conserve the literary status quo. Thanks to the psychological mechanisms of value attribution, we pass judgement with our every breath, no matter how ‘self-critical’ we pretend to be.

            Our brains have preference parsing algorithms of their own!

            And perhaps worst of all, these values allow the so-called literary writer to be lazy, to indulge their own tastes and assumptions under the guise of ‘making the world a better place.’ Wherever you find a high opinion, hypocrisy is never far.

            These three things, institutional inertia, value attribution, and good old-fashioned laziness, all but guarantee years, if not decades, of denial and rationalization from literary culture. Defectors will be dismissed, lampooned, and ignored, the same as defectors from any other vested institution. This is why the path I’m advocating is sure to remain the lesser travelled one: It involves real professional risk and real creative toil.

            Something we once expected from our literary authors.

My Futuristic Past

I was born on 20 August 1968 – eleven months to the day before the first Apollo Moon Landing. The Space Age was always something to which I aspired rather than belonged.

For several years, between approximately 1976 and 1979, I wasn’t interested in anything earthbound. The two most important films of my boyhood were Star Wars, which showed me where I wanted to be, and Close Encounters of the Third Kind, which showed me a possible means of getting there.

In preference to Ampthill, Bedfordshire in 1979, I would have taken any technological dystopia. There was no armed rebellion against Margaret Thatcher, and even if there had been it would not have involved laser guns.

A couple of years ago, I spent three months playing World of Warcraft – partly as research for a short story I was writing, mostly because I became addicted to it. This convinced me of one thing: If the computer games which exist now had existed back in 1979 I would not have read any books, I think; I would not have seen writing as an adequate entertainment; I would not have seen going outdoors as sufficiently interesting to bother with.

Similarly, I find it difficult to understand why any eleven-year-old of today would be sufficiently bored to turn inward for entertainment.

This raises the question of how future writers will come about without ‘silence, exile and cunning’ – without the need for these things.

I was formed, as a writer, by the boredom of the place in which I lived. Philip Larkin said ‘not the place’s fault’ – but in my case, I think it was. And then, the being taken out of the first place into another place (boarding school) where I was unable to have any privacy. This developed a mania for privacy in me, which began to come out as poetry, as a diary. It’s not that I didn’t do these things before – they just became essentials for self-creating, self-preservation. That’s how I read myself, anyway.


The Reader and Technology

Literature isn’t alien to technology; literature is technological to begin with.

Literature depends on technology – a society needs to be able to do more than subsist before it produces a literature. An oral culture, yes, that is possible – but I am referring specifically to words on the page, words on the screen.

The internet connection offers all of us the constant temptation of snippets, of trivia. We don’t live, as other writers did in the past, without these particular temptations. They had their own temptations: Byron wasn’t undistracted. Yet there were greater acres of emptiness, surely. Travel took forever. Winters isolated. Boredom was there as a resource for daydreaming, trancing out.

I think writers will continue to occur but technology and its trivia will cause us to lose something, just as we lost something when we lost the classical education. We write worse because we cannot write classical prose. Yet classical prose is useless for describing the world of 2012, the world that is there – ready to buzz – in your pocket or bag.

Our perceptions outrun the sedentary sentence by much too much; just as we listen to mp3s to hear what an album would sound like were we actually to sit down and listen to it, so we skim-read the classic books to get a sense of what they would be like were we to sit down and dwell on them.

Readers more accustomed to screens – web pages, iPhone displays – will scan a page of text for its contents, rather than experience it in a gradual linear top-left to bottom-right way. This will make for increased speed and decreased specificity. These readers will be half-distracted even as they read; their visual field will include other things than just the text, because they won’t feel happy unless those things are there. A writer of long, doubling-back sentences such as Henry James will be incomprehensible to them. They won’t be grammatically equipped to deal with him. They won’t be neurologically capable of reading him. Their eyes will photograph fields rather than, as ours do, or did, follow tracks.

This scanning approach will have a bad effect on sentence structure. For these readers, the fact that they are reading bassackwardly constructed sequences of words won’t matter. They won’t even notice. As long as the content is there for them somewhere on the page, the job of writing will have been done.

Perhaps future writers will, therefore, create vague fields of possible meaning; more Charles Olson than Ezra Pound. The exact sequence of sounds, the precise inflection of grammar – these things will seem prissy. We will be back to the eighteenth century, pre-Flaubert.

Isaac Babel’s famous sentence from his story ‘Guy de Maupassant’: ‘No iron spike can pierce a human heart as icily as a period in the right place.’ – prissy.


The Novel and Connectivity

The people novels have conventionally been written about are gradually ceasing to exist.

Novels have always belonged to aristocrats of time; not, I say, merely to aristocrats, although they have been disproportionately represented, but to those subjects who have freedom of choice about how to act within time. The Fordist factory-line workers, performing a repetitive task all day, cannot interest the novel for more than a few moments whilst they are at work. It is only when the machine stops that the story begins. (David Foster Wallace’s The Pale King attempts to make a novel out of the dead time of insanely repetitive deskwork; and it fails, at least in the form of it he left us.)

Proposition: ‘The human race is no longer sufficiently bored with life to be distracted by an art form as boring as the novel.’

Perhaps novels will continue, but instead of the machine it will be the connectivity that stops, or becomes secondary.

What we’re going to see more and more of is the pseudo-contemporary novel – in which characters are, for some reason, cut off from one another, technologically cut off. Already, many contemporary novels avoid the truly contemporary (which is hyperconnectivity).

The basic plots of Western Literature depend on separation by distance – Odysseus separated from Penelope; the Odyssey doesn’t exist if Odysseus can catch an easyJet flight home, or text Penelope’s BlackBerry. Joyce’s Ulysses doesn’t exist if Bloom can do his day’s business from a laptop in a Temple Bar coffeeshop.

I don’t want to overemphasize this. You could imagine a similar anxiety over how the telephone would undermine fiction. Perhaps it is just a matter of acceleration. But I don’t think I am alone in already being weary of characters who make their great discoveries whilst sitting in front of a computer screen. If for example a character, by diligent online research and persistent emailing, finds out one day – after a ping in their inbox – who their father really is, isn’t that a story hardly worth telling? Watching someone at a computer is dull. Watching someone play even the most exciting computer game is dull. You, reading this now, are not something any writer would want to write about for more than a sentence.


The Future

In the Preface to Volume 15 of the New York Edition, Henry James writes about ‘operative irony’. It’s a long quote, but try to stick with it because it may contain the whole future of the novel.

‘I have already mentioned the particular rebuke once addressed me on all this ground, the question of where on earth, where roundabout us at this hour, I had “found” my Neil Paradays, my Ralph Limberts, my Hugh Verekers and other such supersubtle fry. I was reminded then, as I have said, that these represented eminent cases fell to the ground, as by their foolish weight, unless I could give chapter and verse for the eminence. I was reduced to confessing I couldn’t, and yet must repeat again here how little I was so abashed. On going over these things I see, to our critical edification, exactly why – which was because I was able to plead that my postulates, my animating presences, were all, to their great enrichment, their intensification of value, ironic; the strength of applied irony being surely in the sincerities, the lucidities, the utilities that stand behind it. When it’s not a campaign, of a sort, on behalf of the something better (better than the obnoxious, the provoking object) that blessedly, as is assumed, might be, it’s not worth speaking of. But this is exactly what we mean by operative irony. It implies and projects the possible other case, the case rich and edifying where the actuality is pretentious and vain. So it plays its lamp; so, essentially, it carries that smokeless flame, which makes clear, with all the rest, the good cause that guides it. My application of which remarks is that the studies here collected have their justification in the ironic spirit, the spirit expressed by my being able to reply promptly enough to my friend: “If the life about us for the last thirty years refuses warrant for these examples, then so much the worse for that life. 
The constatation would be so deplorable that instead of making it we must dodge it: there are decencies that in the name of the general self-respect we must take for granted, there’s a kind of rudimentary intellectual honour to which we must, in the interest of civilization, at least pretend.” But I must really reproduce the whole passion of my retort.’

In the future, all novels will invoke a kind of operative irony; post-Twitter, post-whatever-comes-after-what-comes-after-Twitter. Who are these ‘supersubtle fry’, your characters, who have all this time in which to become rich, deep selfhoods? Where do you find these interesting subjects of yours?

Or, as Henry James appears to us, so we will appear to the readers of the near future: existing in a different, slow-flowing time that they will need to make an extreme effort of deceleration to access.

I think – as a result of all this – there will be great nostalgia for the pre-trivial age, not even to mention the pre-genetic manipulation age.

Literature can accommodate nostalgia, but only as a houseguest; if nostalgia becomes the landlord, architect and psychoanalyst, literature will have to evict itself.

