Thursday, November 21, 2013

Pynchon's Bleeding Edge: Life Lessons in the Age of Google

“Our interest's on the dangerous edge of things.
The honest thief, the tender murderer,
The superstitious atheist, demirep
That loves and saves her soul in new French books--
We watch while these in equilibrium keep
The giddy line midway: one step aside,
They're classed and done with. I, then, keep the line
Before your sages” – Robert Browning, “Bishop Blougram’s Apology” (1855)


"Imperfection, ambiguity, opacity, disorder, and the opportunity to err, to sin, to do the wrong thing: all of these are constitutive of human freedom, and any concentrated attempt to root them out will root out that freedom as well. If we don't find the strength and the courage to escape the silicon mentality that fuels much of the current quest for technological perfection, we risk finding ourselves with [...] humans who have lost their basic capacity for moral reasoning, with lackluster (if not moribund) cultural institutions [...] with a perfectly controlled social environment that would make dissent not just impossible but possibly even unthinkable." – Evgeny Morozov, To Save Everything Click Here: The Folly of Technological Solutionism (2013)



For more than 50 years, Thomas Pynchon has explored how paradigms of rationality such as the Enlightenment have entailed irreparable losses. The eponymous surveyors in Mason & Dixon – hired to settle a border dispute between Pennsylvania to the north and Maryland to the south – realize only too late that the march of progress is more militaristic and mercenary than they at first believed, as they go about:

changing all from subjunctive to declarative, reducing Possibilities to Simplicities that serve the ends of Government, winning away from the realm of the Sacred, its Borderlands one by one, and assuming them unto the bare mortal World that is our home, and our Despair. (345)

There is a grammar at work here, as elsewhere in Pynchon. An adjective that crops up with surprising frequency in Gravity’s Rainbow is ‘preterite’, which the OED defines as "pertaining to a bygone time." As a noun it is still used in some grammars to denote completed events, but the meaning Pynchon has in mind is shaded by the original metaphor. Coined by Roman grammarians from præter + itus, one possible translation would be: 'having been outrun by the march of events.' As the present becomes history, the fireworks of possibilities petrify into the scoria of the past. And if you turn your gaze back towards the eruption – when everything was in the air – you are, as Ezra Pound knew firsthand, branded “out of key” with your time. From present to past, from the subjunctive to the indicative, the novels are concerned with the collateral damages of this grammar as historical-capitalist teleology.

The geography is one of borderlands, of the soon-to-be-enclosed commons of human existence, and the novels are populated by its denizens trying to carve out a dignified life in the margins: the New York schlemiel, the luddite, the obscurant Jesuit, the doper detective living the 60s dream even though the Manson Family murders and Nixon’s war on drugs will soon spell an end to peace, love and stratospheric trips.  

The shadow of what is soon to be looms over Pynchon's narratives told in the present progressive aspect. Mason and Dixon’s bumbling beeline business is one of “changing”, “reducing”, “winning” and “assuming.” This temporality bears more than a passing resemblance to Thomas Kuhn’s notion of paradigm shifts as the prime mover of scientific progress – not turning points but drawn-out processes or crises; ‘shiftings’ rather than shifts:

Though history is unlikely to record their names, some men have undoubtedly been driven to desert science because of their inability to tolerate crisis. Like artists, creative scientists must occasionally be able to live in a world out of joint—elsewhere I have described that necessity as ‘the essential tension’ implicit in scientific research. (The Structure of Scientific Revolutions 78-79)

The new and the old exist side by side in puzzling permutations and kaleidoscopic configurations, and every scientist vows, like Hamlet, to set it right again. The crisis is resolved when a critical mass of scientific consensus has been reached – when the subjunctive of conflicting hypotheses has been inflected into the declarative of scientific laws and theories. Mason and Dixon not only find themselves on the “winning” side; with their bisection they actually define what the winning side is. Yet they realize that this comes at a huge cost. To define is to limit. All attempts to divide territory (according to demographics, language, ethnicity, culture, natural features etc.) are biased; it is a form of gerrymandering. Thus, the point is not that there may be another latitude or length of the line which is more correct or equitable; rather, it is the realization that all other possibilities are nipped in the bud. Fifty years after their line was drawn, the British critic William Hazlitt, at a loss to account for Coleridge’s ambling walk, also drew the line:

In digressing, in dilating, in passing from subject to subject, he appeared to me to float in air, to slide on ice. I observed that he continually crossed me on the way by shifting from one side of the footpath to the other. This struck me as an odd movement; but I did not at that time connect it with any instability of purpose or involuntary change of principle, as I have done since. He seemed unable to keep on in a straight line. (“My First Acquaintance with Poets”)

The connection might seem frivolous, but let us read the shifting sentiments of the good critic as indicative of a paradigmatic shift in society as a whole. What Hazlitt did not quite understand as a youth is clear to him on mature reflection as he pits digression and dilation against purpose and principle. In many of Pynchon’s novels there is a similar tension. The hero or heroine is cast as a single-minded quester, not in pursuit of the holy grail but of a MacGuffin (the elusive woman/notion V., the subterranean Trystero network, the shady dealings of the dot-com enterprise hashslingrz). Carried to the extreme, the paradigm of the straight line weaves a spider's web of the world; when fine tendrils connect everything to everything else, a single movement will make the whole shudder. Nothing happens in isolation, hidden meanings and patterns abound. But the paranoid pursuit of the characters is kept in check by the narrative spiraling out of control in centrifugal movements of encyclopedic digression. This tension is never resolved, not because Pynchon is a card-carrying Postmodernist railing against narrative closure, but because dilation and digression – the present progressive rather than the preterite – are proposed as ways of throwing powerful historical actors off script. From cartels of lightbulb manufacturers in the Weimar Republic and instigators of McCarthy witch-hunts to the NSA and Google, these all have a vested interest in us thinking in straight lines, acting on cues and following stage directions.

The novels portray brief windows of opportunity (in Against the Day there is a fascination with Iceland spar, a mineral that refracts every image twice) that will not so much be closed as shattered. In this upcoming defenestration, many of the characters will be thrown out and, unlike Mason and Dixon, find themselves on the dust heap of history. This sounds bleak, and it is easy to see in Gravity's Rainbow (or Against the Day) "a magnificent necropolis", as Richard Locke put it in his 1973 New York Times review. It is an apt metaphor for a grim picaresque in which death in its many guises – colonial past and genocidal present, the demise of free will and thought in the wake of behaviorism and book burnings – renders the German towns Tyrone Slothrop passes through as sepulchral as Conrad's Brussels. But if we, as one character in Against the Day yearns to do, withdraw "angelwise and soar high enough to see more, consider exits from", we gain a bird's-eye view and realize that the cityscape is not necessarily one of death or dying.

It is important to realize that the battle lines are frequently obscure – whether between quaternions and vectors in 19th-century mathematics or between tables and CSS in turn-of-the-millennium web design. This is precisely the point. The narrative digressions show us fault-lines everywhere; in every moment of time there will be scientific, technological and social struggles with zealous supporters on every side. When the subjunctive turns into the declarative, a sentence will indeed be passed on the loser (the scientist becomes an embarrassment for the faculty, the tech user has to scour the web for nostalgic computer fora, and the activist is deemed a die-hard reactionary or revolutionary). But the lesson Pynchon teaches us is this: when one sentence is passed, others are barely begun, still inflected in the subjunctive. We can keep the vertiginous possibilities alive and jump between the cracking ice floes of opportunity. This, at least, is the lesson to be drawn from Bleeding Edge.

A pun on ‘cutting’ edge, ‘bleeding’ edge makes the former seem conventional and anemic. This late-90s phrase was used to describe untried technology that was potentially game-changing but currently without any real demand – make or break for investors. Nothing, however, has a shorter shelf life than slang and corporate buzzwords, and today it has a curiously obsolete ring to it. This is precisely the point. A Google Ngram search reveals a phrase at the height of its popularity in 2001, the year in which most of the novel is set. It is a brilliant title that both signifies and illustrates the present progressive aspect in Pynchon’s grammar of history and paradigm shiftings; the bleeding edge of the novel spans, so the dust jacket informs us, “the lull between the collapse of the dot-com boom and the terrible events of September 11th”, when Web 1.0 was having “adolescent angst”, Google had yet to IPO and Microsoft was “still considered the evil empire” – yet more windows of opportunity that would soon be shattered. In this case quite literally, as Boeing 757s and 767s blasted into the Twin Towers and the Pentagon. One of the characters, a "professional nose" gifted with olfactory divination, senses an upcoming never-before-smelled event, comparing it to "breathing in needles." Perhaps she also smelled Guantanamo prisoners "left on the cold floor to urinate and defecate on themselves." And perhaps she followed the same trace forward in time and also smelled a rat – the same rodent a young NSA contractor would smell some ten years later. In Bleeding Edge, the events of 9/11 are used symbolically as the first link in a chain of causality that would bring repressive governmental and commercial interests into a dangerous liaison. But it is also a very real tragedy, vividly, though briefly, portrayed from the point of view of confused New Yorkers. Are we at war, mom? The "Wolf Blitzer guy says so."

The mom in question is the protagonist Maxine Tarnow – an Upper West Side “defrocked” fraud investigator who, since her license has been revoked, no longer needs to think twice before packing a Beretta in her Kate Spade handbag as she goes investigating shady financial dealings. On top of that, she is doing her best to spare her boys, who are on the verge of teenagehood, her cynicism. In the course of her investigations into the goings-on of hashslingrz, a dot-com company investing in fiber cable and buying up start-ups like it's 1999 all over again, she comes across a Canadian wunderkind hacker. Felix Boïngueaux used to program and sell phantom-ware (programs to skim credit cards and evade automatic tax reporting) to unscrupulous vendors. When Maxine meets him, however, he has switched sides and is now developing phantom-ware detection software for the IRS: “We build it, we disable it. You’re frowning. We’re beyond good and evil, the technology, it’s neutral, eh?”

Felix is what is known as a “black hat” hacker – someone who commits computer crime for personal gain – and has few scruples about snitching on former friends or working legally as a consultant if the paycheck happens to be fatter. He is merely a better coder than the pimply script kiddies downloading ready-made rootkits to steal credit card information, or wreak havoc on websites to gain the respect of their peers. They are emblematic of a new brood of hackers for whom the rallying cry of the free software coders of the 80s and 90s has little meaning. Why would information want to be free when it can be sold to the highest bidder? Even those of the older generation, steeped in West Coast counterculture, are under intense pressure to sell out. Gabriel Ice, for example, is eager to lay his hands on DeepArcher (read: departure) – a community-built 3D reality world, and the brainchild of Californian transplants Justin and Lucas, who now face the “same old classic dotcom dilemma, be rich forever or make a tarball out of it and post it around for free, and keep the cred and maybe self-esteem but stay more or less middle income.”

The times they are a-changing. In the wake of the crash comes the realization that the just-because-it-can-be-done ethos of the first dot-com era would no longer cut it. The start-ups and scattered survivors still had the trappings of cool, hip and friggin awesome, but this was coupled with solid business plans and hordes of Ivy League MBAs on the payroll. As Microsoft exploited its monopoly and got entangled in antitrust lawsuits for pushing its browser on every Windows user, there was a new kid on the block who stood up to the bully with the unofficial motto: Don’t be evil. Two Stanford PhD students had developed an algorithm for ranking the importance of websites according to how many pages linked back to them – and according to how important those linking pages themselves were. This holistic approach was a game changer. Previous search engines did not take the dynamic relations between different pages into account (and many of the most popular ones were just searchable indices of cataloged webpages). Google.com was launched in 1998 with a mission statement that could have been penned by Diderot: “to organize the world’s information and make it universally accessible and useful.”
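To make the recursion concrete, here is a minimal sketch of the PageRank idea in Python – a hypothetical three-page toy web, not Google's production system (the damping factor is the value suggested in the original 1998 paper):

```python
# Minimal PageRank sketch on a hypothetical toy web: a page's score depends
# recursively on the scores of the pages linking to it (power iteration).

damping = 0.85  # damping factor suggested in the original 1998 paper

# toy web: each page maps to the pages it links to
links = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}

ranks = {page: 1.0 / len(links) for page in links}

for _ in range(50):  # 50 rounds is plenty for convergence on a graph this small
    new_ranks = {}
    for page in links:
        # rank flowing in from every page that links here
        incoming = sum(
            ranks[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_ranks[page] = (1 - damping) / len(links) + damping * incoming
    ranks = new_ranks

print(ranks)  # "home" scores highest: both other pages link back to it
```

The detail that matters for the story told here: rank is not a property of a page's own text at all, but of the whole web of relations around it.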

Information wants to be universally accessible and free! Had the hacker dream finally come true? If this, truth be told, had been the only guiding light, the company would not have survived the burst of the bubble, much less posted a gross revenue of $50.1 billion for 2012. The much-touted universal access was never a goal in itself. At least not after the Stanford campus nights of Mountain Dew-fueled coding and balls-to-the-wall tech feats had turned to early-morning meetings with investors and backers. The highfalutin rhetoric exploded after the IPO, as the company made forays into cultural and humanitarian causes – book digitization and disease eradication, to name two – but altruism had very little to do with this. Rather, the goal was to expose as large an audience as possible to ads by anticipating, and grabbing market share in, the technology of tomorrow.

Whereas the search engines of yesteryear ranked pages according to how exactly the search term matched the text on the web page, Google’s top hits are determined by cash. The AdWords and AdSense programs form a highly profitable marketplace between corporate interests – advertisers paying for exposure, publishers getting paid to show ads – with Google as the middleman cashing in on the exchanges. Personal emails are eavesdropped upon and searched for specific terms to customize ads (an "ordinary business practice," it seems), and search queries that are too specific or idiosyncratic to be pigeonholed are automatically, and sometimes very discreetly, rewritten so that you can be directed to a web site generating revenue for Google. Behind the ever-refined algorithms lies the idea that consumer (or indeed human) behavior is predictable – the philosophical corollary being that the world is finite and computable. In fact, the hip rhetoric about innovation translates into the jaded cynicism of Ecclesiastes: "there is nothing new under the sun." Google Translate regurgitates tidbits from previous human translations; if someone once clicked on an ad after reading a certain phrase in an email, it is assumed that you might be in the market for the same product if you use or come across the same phrase... "If I know your sect, I anticipate your argument,” Emerson once claimed. To him groupthink conformity was anathema to self-reliance and personal innovation. In contrast to this, and for all its talk of global empowerment, Google's business plan hinges on conformity and predictability. In the brave new world they envision, everyone is a 'sectarian' whose argument can be algorithmically predicted along the lines of: “If I know where you are (and I do know that), what pages you have been visiting (yes) and what you write in your emails (check!), I anticipate your desires. Think of all the trouble it saves you!"

Evgeny Morozov notes how Silicon Valley's favorite slogan has silently changed from "Innovate or Die!" to "Ameliorate or Die!" Click on the search result, the ad, the "I agree" button and everything will be taken care of. Buckle up and enjoy the ride (in a self-driving Google car) to the picturesque countryside of data farms and clouds to which your mental faculties have been outsourced. No need to get all sentimental about it. Once back home, a humanoid Jeeves is at your service, ready to lend a sympathetic ear to your concerns – should you have any – as he irons out the creased newspaper of yesterday into a starched digital copy. It does not stop there, of course. In fact, it doesn't stop until your digital valet has ironed out every single snag of your existence. Ah, how beauteous technology is! But as writers from Dostoevsky to Pynchon have tirelessly explored, there is a flip side: Enlightenment logic does not always go hand in hand with human logic. One of the most poignant illustrations of this disconnect is found in the 1873 autobiography of John Stuart Mill. Raised as a prodigy by his Benthamite father, who believed that utilitarianism would lead to a perfect society, Mill suffered a mental breakdown in his 20s after posing himself the following thought experiment:

Suppose that all your objects in life were realized; that all the changes in institutions and opinions which you were looking forward to, could be completely effected this very instant: would this be a great joy and happiness to you? And an irrepressible self-consciousness distinctly answered, "No!" At this my heart sank within me: the whole foundation on which my life was constructed fell down. 

Having been force-fed political-science tracts since he was barely out of diapers, he now realized that there was a fatal flaw – something incomplete, a je ne sais quoi – even to the most well-intentioned scheme. Poetry, which he now read for the first time, provided him with a way out of the crisis and gave him a more nuanced understanding of the human condition. His attempts to better it remained unflagging, but he came to reject the idea of a panacea, a one-size-fits-all cure: "If I am asked, what system of philosophy I substituted for that which I had abandoned, I answer, No system: only a conviction that the true system was something more complex and many-sided than I had previously had any idea of." It is unlikely that Mill will ever make it to a Google Doodle, but he ought to be required reading for every glib TED talker. Not in order to throw a damper on ideas that might prove beneficial, but to give a new perspective on things, to allow him to "soar high enough to see more" than the paradigm of perfectibility.

In Bleeding Edge there is a sanctum of sorts: the Deep Web, where the DeepArcher servers are, is still beyond the reach of search engines. Here it is possible to get "constructively lost," but the risk of being found and dragged from its shady alleys and colonnades into the open marketplace looms over this anarchic space, home to social activists, cyber criminals and the occasional flaneur. In the quest for universal accessibility (and a limitless market) all blank spots on the map must be discovered and colonized. Though not mentioned explicitly in the novel, Usenet is a case in point. Once a lively discussion network with tens of thousands of groups specializing in everything from anarcho-syndicalism and sexual paraphilia of every stripe and color, to the arcana of 70s game shows, it is now largely defunct. Excavated from the unruly realms of the internet and turned into a museum, it exists today as Google Groups. Posts from the early 80s onward are now fully indexed and searchable, but to gain entrance and participate in the discussions you must be an ad-abiding Google member. Incidentally, the takeover was finalized in 2001. If we are looking for turning points, this one is as good as any. The fledgling company with its anti-bullying ethos would soon take a hawkish turn – aiding and abetting the NSA as the corporate entity with the second highest market capitalization in the United States. It is probably against this specific backdrop that we should see the more metaphysical struggle between anarchy and enclosure, between the undercurrents of the Deep Web and its water lily surface, in Bleeding Edge.
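Tellingly, the only thing standing between the spiders and such a sanctuary is an honor system. Here is a minimal Python sketch of that convention (the domain and path are hypothetical placeholders): a well-behaved crawler consults a site's robots.txt before indexing, but the file is a request, not a barrier.

```python
# Sketch of the voluntary robots.txt convention, using only Python's
# standard library. The domain and path are hypothetical placeholders.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetch and parse the site's stated crawling rules

# can_fetch() asks: may a crawler with this user agent index this path?
if parser.can_fetch("*", "https://example.com/deeparcher/"):
    print("fair game for the spiders")
else:
    print("asked to stay out - though nothing enforces the request")
```

Whether any given crawler honors the request is entirely up to the crawler.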

Much like Mason and Dixon, "winning away from the realm of the Sacred, its Borderlands,” automated web spiders (read: Google's webcrawlers) are "itching to corrupt another patch of sanctuary for their own far-from-selfless ends." Given the back story Pynchon tells, Edward Snowden's revelation that major tech companies are "serving the ends of Government" by being bedfellows with the NSA is hardly surprising. Their goals might be miles apart, but their means are the same. If you are pushing either the neo-liberal agenda of everyone being a potential consumer, or a more repressive one in which everyone is a potential terrorist, and add state-of-the-art technology to the mix, you get an odd sort of cocktail (and here Felix Boïngueaux's quip about technology being neutral gains an eerie resonance). In both cases you have a vested interest in analyzing the behavior patterns of individuals from the flow of big data, and in finding out what the contents of their communications or the web pages they visit reveal about their political affiliations or consumer preferences. Maxine's father, an old school Marxist, offers a bleak glimpse of an Orwellian future in which freedom is slavery and it is no longer possible to tell Google and the NSA apart:

Call it freedom. It's based on control. Everybody connected together, impossible anybody should get lost ever again. Take the next step, connect it to these cell phones, you've got a total Web of surveillance, inescapable. You remember the comics in the Daily News? Dick Tracy's wrist radio? It'll be everywhere, the rubes'll all be begging to wear one, handcuffs of the future.

This is a canny flash-forward to the author's present. The nascent field of wearable technology (i-prefixed watches and spectacles) is hyped as the greatest thing since, well, if not sliced bread then at least the tablet computer. But even though Edward Snowden has shown that this prediction of a dot-com--dot-gov technocracy (though exaggerated) is not necessarily off the mark, it would still be a mistake to see in the old curmudgeon a spokesperson for the septuagenarian author. The verve of the narrative, its delight in 2001 ephemera, pop culture and godawful puns, not to mention lyrical interludes of heartbreaking beauty – these all serve to suggest that all is not doom and gloom. Readers primed for a Postmodern tour-de-force, a ditty of self-reflexive riffing (read: acoustic feedback that never gains resonance from or reaches out into the world) will turn a deaf ear to this, but Pynchon is actually trying to teach us how to live with preserved dignity and sanity in a world in which governmental repression, technology and consumer free choice have converged. Bleeding Edge will never be shelved in the self-help section, but let us imagine an Alain de Bottonian "How Pynchon Can Change Your Life". The chapters might look something like this:

Get Lost! Get Constructively Lost!
When Maxine first descends into DeepArcher she is at a loss as to what to do. The surface web with its "farmland, subways, expressways" recedes on the screen to make way for a beautifully rendered lounge with passengers waiting for trains or buses. She is not quite sure which. There is no music score to help set the tone, only the ambiance of "a thousand train and bus stations and airports." What is the point of it all? A series of dialogue boxes assures her that her confusion is all right, it is "part of the experience of getting constructively lost." Soon she finds herself "wandering along, clicking on everything, faces, litter on the floor, labels on bottles behind the bar, after a while interested not so much in where she might go than in the texture of the search itself." Maxine's discovery in DeepArcher parallels the one Alice makes after encountering the Cheshire Cat:
 
'Would you tell me, please, which way I ought to go from here?'
'That depends a good deal on where you want to get to,' said the Cat.
'I don't much care where—' said Alice.
'Then it doesn't matter which way you go,' said the Cat.
'—so long as I get SOMEWHERE,' Alice added as an explanation. 
'Oh, you're sure to do that,' said the Cat, 'if you only walk long enough.' 

Being constructively lost, Maxine and Alice both realize, is a mindset conducive to serendipitous discovery. This oxymoronic state, and the idea (so unpalatable to Hazlitt) of a long walk without goal, with no recourse to maps and street signs, and without even staying on the same side of the road, cuts to the quick of Google's business plan. The more you stray from the first page, the less profitable your clicks will be. We must not only be forced to keep to a straight line, but also be convinced that that is what we wanted all along. Close-up magicians talk about "card forces"; we are misdirected by sleights of hand and irrelevant patter into believing that we freely choose one of the cards from the deck, but end up with the ace from up the sleeve. Google's PageRank algorithm works very much like that. A recent study by the online advertising network Chitika shows that 34% of all users stick to the first match – the one shoved upon us, so to speak; fewer than one in twelve take the trouble even to look beyond the first page of search hits. It is as difficult to blame them as it would be to criticize a member of the audience for becoming a mark, and playing right into the magician's script. And yet there is much to be gained from turning heckler, and joining the party of Coleridge, Alice and Maxine. By clicking on the "Next" button and browsing through hits deemed much less important, we will still get "SOMEWHERE", but the destination will not be predetermined by the company spending the most money on search hit optimization, and neither will it be final. DeepArcher is more important than arrival, and once we click on a page we will continue our hyperlink journey, "passing from subject to subject."

The amount of personal information we unintentionally reveal when we are online is mind-boggling. Perhaps cyber-savvy canines could once pass as human beings on the internet, but that was 20 years ago. Nowadays the search engines and social networks would not only know your species, breed and pedigree, but also have your favorite chow and chewing toy down to a tee. All based on your browsing and clicking history. This is no exaggeration. A 2013 study from Cambridge University shows that Facebook Likes "can be used to automatically and accurately predict a range of highly sensitive personal attributes including sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation and gender." In the words of the Victorian poet Robert Browning (see epigraph), it is all too easy to be "classed and done with" by those around us. And we do the same thing for convenience's sake; we file away friends and acquaintances, public figures, and even ourselves, under different categories in a mental "No Surprises Here" Rolodex. This happening on a grand scale is of course nothing new; census taking, direct marketing, Gallup polls and swing-state campaigning hinge on the analysis (and sometimes exploitation) of demographics and public opinion. But there are two major differences between this and the way we are classed and done with as we surf the web. First, individual data in a poll is only used in aggregate form; it is impossible to recover what any individual has answered. The search engines, social networks and federal agencies, on the other hand, are interested not so much in public opinion as in private opinions – individualized ads for your browsing pleasure, and dossiers with graphic details of your porn habits for defamation purposes. And secondly, when we fill out a questionnaire we know that we are being polled. Never before in the history of mankind have hundreds of millions of people been individually profiled by corporate and governmental entities – in most cases without them even knowing it.
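The mechanics behind such predictions are hardly exotic. Here is a toy sketch of the general approach – an off-the-shelf classifier trained on fabricated Likes, not the Cambridge team's actual pipeline – just to show how little machinery is needed:

```python
# Toy sketch: predicting a binary trait from binary "Likes" with logistic
# regression. All data below is fabricated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_users, n_likes = 1000, 50
likes = rng.integers(0, 2, size=(n_users, n_likes))  # 1 = user liked page j

# Fabricate a trait that leaks through a handful of pages, plus noise,
# mimicking the premise that Likes reveal personal attributes.
signal = likes[:, :5].sum(axis=1)
trait = (signal + rng.normal(0, 1, n_users) > 2.5).astype(int)

# Train on 800 users, test on the remaining 200.
model = LogisticRegression(max_iter=1000).fit(likes[:800], trait[:800])
print(f"held-out accuracy: {model.score(likes[800:], trait[800:]):.2f}")
```

The unsettling part is not the algorithm, which is undergraduate fare, but the data: given enough Likes, even this crude setup classifies well above chance, and that is all a profiler needs.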

Is this pigeonholing inevitable? Another Victorian, Charles Babbage, who in close collaboration with Lord Byron's daughter Ada Lovelace laid down the principles behind modern computers, was asked twice by Members of Parliament what would happen if you put the wrong figures into the machine. Would the right answer come out? The anecdote, as recounted in his 1864 autobiography Passages from the Life of a Philosopher, prompts the following reflection: "I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question." The answer is of course no. "Garbage in, garbage out," in contemporary programming lingo. But what if there are no right figures? What if the cookie crumb traces we leave behind are, from the perspective of Google et al., garbage? In the model the Cambridge study authors propose, the correlation between predicted and actual traits is disturbingly high, roughly 90%. But the corollary of this is that it is still possible to live on "the dangerous [perhaps even Bleeding] edge of things", and aspire to join the inscrutable 10% who (for every trait) refuse to be open books scanned by proprietary algorithms, and who escape the clutches of automatic demographic, racial, sexual and political profiling. So let us relish "the texture of the search itself," click on everything, ignore the as-you-type suggestions and tap in search terms for the heck of it, and not because we are in the market for Xbox consoles or Victoria's Secret lingerie. Chances are we will be ostracized from our comfort zone, find ourselves on the "wrong" side of the footpath and tumble down rabbit holes.

Take Your Plants out for a Walk
In a staggeringly beautiful passage (I have yet to come across anything like it in 21st-century fiction) a humdrum night out on the Upper West Side turns into an epiphany of cosmic solidarity as Maxine experiences the frisson of everyday life:

evening rush hour, it’s just starting to rain…sometimes she can’t resist, she needs to be out in the street. What might only be a simple point on the workday cycle, a reconvergence of what the day scattered, as Sappho said some place back in some college course, Maxine forgets, becomes a million pedestrian dramas, each one charged with mystery more intense than high-barometer daylight can ever allow. Everything changes. There’s that clean, rained-on smell. The traffic noise gets liquefied. Reflections from the street into the windows of city buses fill the bus interiors with unreadable 3D-images, as surface unaccountably transforms to volume. Average pushy Manhattan schmucks crowding the sidewalks also pick up some depth, some purpose – they smile, they slow down, even with a cellular phone stuck in their ear they are more apt to be singing to somebody than yakking. Some are observed taking houseplants for walks in the rain. Even the lightest umbrella-to-umbrella contact can be erotic. (my emphasis)

The ear-splitting soundscape of traffic horns and yakkety-yakking dissolves into liquid. We can read this in two ways. On the one hand, we see the scientist's wet dream realized. Far from random, the static proves to be subject to the laws of fluid dynamics. This is the Eureka moment when chaos and white noise turn into choreography and music. But this melting and surface-to-volume embodiment is also an epiphany for Maxine as she realizes that the passers-by she would normally not give the time of day (at least not without a healthy douche of New York interjections) are both unique and intriguing. Every single one. There are a million "pedestrian dramas" being enacted in front of her eyes and she is part of them all, part of the flow.

In fact, if we look closer at the metaphors in the passage, we see that Pynchon is cannily juxtaposing two very different notions of 'discrimination'. Let us call these the extrinsic and the intrinsic one. In the former, the mathematical-statistical model, the human beings Maxine encounters are visualized as nodes in a network graph or as points in a scatter plot or data matrix, perhaps by telcos and governmental agencies using cellular triangulation ("paranoia's the garlic in life's kitchen, right, you can never have too much," she muses at one point). The intrinsic notion of discrimination, on the other hand, is modeled on Walter Pater's aesthetic theory, as outlined in his 1873 work Studies in the History of the Renaissance. In his "Conclusion" Pater, much like Pynchon, describes the moment of perceptual wonder in terms of liquefaction and synesthesia, the taking-in of a million ongoing dramas: "While all melts under our feet, we may well grasp at any exquisite passion [...] any stirring of the senses, strange dyes, strange colours, and curious odours [...] Not to discriminate every moment some passionate attitude in those about us, and in the very brilliancy of their gifts some tragic dividing of forces on their ways, is, on this short day of frost and sun, to sleep before evening."

His is also a vocabulary of analysis, of discrimination and division, but the act of distinguishing between different hues and gradations (of colors, wines, ideas) is first and foremost an internal act. He does not urge us to study the people about us, pen and stopwatch in hand, and have them classed and done with. Rather, he wants us to turn our acute perception towards ourselves and in our confused dreams and desires see beauty. In our cognitive dissonance hear choral harmony. And then we are asked to turn our gaze from our navel to our fellow man and woman, and in them see the same beauty. "We shall hardly have time to make theories about the things we see and touch," he goes on to write. Not theories in the declarative, no, but the "tragic dividing of forces" is still a call to ask questions. What are we, for example, to make of the New Yorkers taking their houseplants out in the rain? Are they environmentally conscious, preferring not to waste tap water? Perhaps they are lonely, dotty or downright delusional. According to the model of extrinsic discrimination, however, an intrigued smile and a "perhaps" will not do. Instead, erratic behavior patterns are to be put into flowcharts or fed into algorithms – the DSM-5, say, or why not a piece of classified DHS software used to detect and foil eco-terrorist plots.

While the extrinsic (or top-down) approach discriminates between individuals in order to "make sense" of them by picking up on consumer patterns or zooming in on irregularities, the intrinsic one also makes fine distinctions, but only in order to lay bare the rich patchwork of dashed hopes, fulfilled dreams and passionate attitudes intrinsic to that pretty amazing thing we call life; it shows that the "million pedestrian dramas" are far from pedestrian, but stops short of making theories. William Empson, the great champion of ambiguity in life and poetry, wrote that we must maintain "ourselves between contradictions that can't be solved by analysis." The New Yorker-houseplant couplings are walking contradictions, dignified and ridiculous at the same time as they strut and fret in a tragicomedy without script. Peculiar indeed, but if we subscribe to the intrinsic notion, the question we must ask is the one Lord Byron once posed when accused of mixing gravity and levity in Don Juan: but is it not life, is it not the thing? As for the men and machines of extrinsic discrimination, well, if the umbrella touches, on further data analysis, do not seem to indicate a conspiracy, and if the DHS software draws a blank when it comes to the plant peccadilloes, they would not be able to make any sense of this.

If the disjunction we have been working with sounds too abstruse, let us enlist language for some serious deabstrusification. The collocation "make sense of" is extrinsic; there is a judge and someone or something being judged. But making sense – isn't that what we are all doing when singing into our cell phones or inadvertently touching someone's umbrella with our own (let us not forget that 'sense' and 'sensual' are cognates)? What the passage suggests is that it is possible to keep "the giddy line midway" between making perfect sense – to ourselves and to others – and being made sense of by agencies and algorithms. So run out into the rain and let some of it fall into your mouth, recite Sappho to the Evening Star, burst into an aria when the telemarketer calls... And, in all honesty, when was the last time you took your geraniums out for a walk?

Show Your Silly and Sophomoric Side
Decried by Dr Johnson as "the lowest form of wit", the pun has something a tad immature about it. When we cannot resist the urge, we feel embarrassed and try to excuse our gaffe with a sheepish smile and a palm-up gesture that seem to say "terribly sorry about that, it won't happen again." In Moab Is My Washpot, Stephen Fry gives a trenchant account of how those who revel in language (and sometimes, god forbid, use it as an end rather than a means) are publicly perceived and pun-ished:
 
I was always encouraged to believe that cleverness and elegance with words obscured and twisted decent truth [...] To the healthy English mind [...] there is something jewy about verbal facility. George Steiner, Jonathan Miller, Frederic Raphael, Will Self [...] how often that damning word clever is attached to them, hurled like an inky dart by the snowy-haired, lobster-faced Garrick Club buffoons of the Sunday Telegraph and the Spectator.

By all accounts, Pynchon is a clever author. From the zany, hilarious and laugh-out-loud to the ridiculously strained, the pun-to-text ratio in Bleeding Edge is off the charts; a Queens strip joint is called "Joie de Beavre", a character "mistakes" Pokémon for a West Indian proctologist, a Scooby Doo episode in which the heroes battle a Colombian drug cartel ends with the punch line: "I'd've gotten away with it too, if it hadn't been for those Medellin kids." A pun, literary scholar Matthew Bevis writes (and he is on to something important here), gives the words "time off from business as usual." It's like the standard definition of "meddling" has, for a brief moment, disappeared from the dictionary, leaving us with an Out of Office message reading: "Gone to Colombia. Will be back shortly." But why, in Bleeding Edge, this preoccupation with words that seem constitutionally unable to do even a day's work?

The Deep Web, we recall, is under the constant threat of being brought to order, winched up to the surface web where everything is organized and indexed in semantic databases for the benefit of the user who can enjoy "relevant" ads based on the content. Maxine knows what is coming as she tenderly watches her boys, Ziggy and Otis, stand in the "precarious" light outside of the virtual city Zigotisopolis they have created in DeepArcher: "still safe from the spiders and bots that one day too soon will be coming for it, to claim-jump it in the name of the indexed world." They will soon grow up, and in much the same way the Internet will come of age and leave its Trotzphase truancy and "adolescent angst" behind. The parallelism is exquisite. Ziggy and Otis are precocious kids exploring a medium that will soon reach adulthood, and the city name they come up with reflects this teetering on the brink – at once sophomoric and remarkably sophisticated. Signifying more than a "well done, bro!" high-five, it is also flirting with "zygote" (their city is, after all, designed and populated by humans rather than bots).

As this example recalls, words are never "sufficed at what they are." They have phonetic properties and linguistic textures quite distinct from their semantic meaning; they also trigger associations, recall homophones and carry echoes and allusions. What puns show us is that even the denotative meaning of a word is not set in stone; it sometimes moonlights as something else by buying a flight ticket to Colombia or turning into a beaver. As children we are fascinated by this linguistic fluidity. The first jokes we laugh at – not because we realize that the context calls for an amused response, but because we get them – are based on puns and wordplay. Language learning is a constant exercise in compartmentalization; we must be able to tell verb from adjective, genus from species, a dog from a cat, a border collie from a golden retriever. But while these distinctions are vital, we also need some time off for recess, to play our own games rather than follow instructions: "How come dalmatians are terrible at hide and seek? Because they're always spotted!" Since we spend so much time constructing a world out of mental Lego, it is no surprise that we sometimes relish the prospect of tearing it apart and throwing the pieces in the air.

The spiders intent on organizing "the world's information" are, on the other hand, all work and no play. When they creep into the sanctum, they will drag Zigotisopolis up to the surface web. Every location in DeepArcher will be alphabetically ordered and spatially mapped in relation to other virtual cities. In fact, every word ever uttered by its denizens will be indexed and rounded up for 24/7 drudgery. When all is classed and done with, however, chances are that the not so obvious features, the pun on "zygote" for example, will be lost as possibilities are reduced to simplicities in the relational databases. A Dewey Decimal classification is not a novel, a Köchel number not a symphony. In our brief visit to the virtual city, still safe in the no-space of DeepArcher, we catch a plangent chord from Pynchon's Requiem for a Possibility. But let us not forget that in a Pynchon novel, requiems and elegies are always counterpointed by the honky-tonk of puns and silly songs. And here we can learn something from the narrative tension. In the same way as digression is pitted against the straight line as a way for the individual to keep her dignity, and (if only for a brief moment) throw historical actors off script, the sophomoric pun is an up yours to the adult world of Web 2.0 with its regime of predictability and strict codes of conduct. "You didn't see that coming, huh?" It is as if the individual words stage a wildcat strike against their employer, throwing spanners in the works. This insubordination will not do, and they are issued an ultimatum: "Did you mean cosmopolis?" No, siree. Zigotisopolis is Zigotisopolis is Zigotisopolis. Of course it is also "Ziggy-Otis-Ville", "Zygote City" and what have you. Childish cleverness and self-assertion are all it takes to foil classificatory schemes and keep possibilities alive.
 
So, no need to look sheepish or be ashamed of your jejunosity. No need to click on the link that leads back to the fold. Why not just relish the silliness of it all? Pun and be punned with! And don't let anyone else – least of all a machine – tell you what you really meant.  


Live on the Edge 

Tuesday, November 19, 2013

Literary Studies: Planets or Bookcases?


History has many cunning passages, contrived corridors
And issues, deceives with whispering ambitions,
Guides us by vanities. – T. S. Eliot, "Gerontion"

When I was a kid I devoured the Swedish comic book adventures of Bamse – the strongest and kindest bear in the world whose superhero fix was not Kryptonite but bee’s honey prepared by his granny. His sidekick Skalman, an anthropomorphic turtle sporting a yellow hat, was my first hero. He was a bundle of endearing contradictions: an Enlightenment spirit and Aspie, a ridiculously vain socialist, and the Taylorist inventor of an alarm clock telling him exactly when to eat and nap. 

A true Renaissance turtle, he would often lecture his friends on subjects as diverse as Rembrandt’s later self-portraits, why we pay taxes (“to help each other”), Christian zeal vis-à-vis the gang rape and murder (though not necessarily in that order) of the pagan philosopher Hypatia in 5th-century Alexandria, and the difference between astronomy and astrology. The gravitational pull from Jupiter, he explained in a passionate debunking of pseudo-science, is about the same as that of a book at a distance of a few inches from you. “You do not become who you are” (at this point his salivary glands were working overtime) “because of the constellation of the planets at the time of your birth; their influence is as negligible as that of a bookcase in the room.”
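Skalman's arithmetic, for what it is worth, roughly checks out. Plugging round numbers into Newton's law – a hypothetical 1 kg book at 10 cm, Jupiter at its closest some 600 million kilometres away – gives:

$$a = \frac{GM}{r^2}, \qquad a_{\text{Jupiter}} \approx \frac{(6.7\times10^{-11})(1.9\times10^{27})}{(6\times10^{11})^{2}} \approx 3.5\times10^{-7}\ \mathrm{m/s^2}, \qquad a_{\text{book}} \approx \frac{(6.7\times10^{-11})(1)}{(0.1)^{2}} \approx 6.7\times10^{-9}\ \mathrm{m/s^2}.$$

Jupiter actually out-pulls the paperback by a factor of fifty or so, though a heftier volume held closer (a 3 kg Critique of Pure Reason at 5 cm pulls about ten times harder than the 1 kg book) narrows the gap; either way, both pulls are less than a ten-millionth of Earth's own 9.8 m/s², which is the debunking point.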

Here I am afraid I have to part ways with my childhood hero. How could he – the most erudite turtle to walk the earth (at least on two legs); someone who spent his summer days under the foliage of an oak reading Kant’s Critique of Pure Reason – how could he so underestimate the pull of books? While I do thank my lucky stars for being born in a house full of them, this, I reckon, is more of a metaphorical gesture. My gravitating towards them was, on the other hand, something real. On a heap next to my wicker-work armchair, misfits, schlemiels and beautiful losers sprang to life and mingled: the homeless albino moose who lost his happy stomping grounds to deforestation and had to drown his sorrows in apple wine while babysitting, bumped into Fiver running away from the destruction of his warren. And they were horrified to hear Sybil’s tale of woe – that 70s cause célèbre who (if we are to believe the psychoanalyst who cashed in on her story) developed a severe dissociative disorder after being sexually abused by her mother from the day she was born.

I think I might need to pull a Holden Caulfield now, lest you think this is turning into a rather bizarre Bildungsroman. No, I am only trying to make a point about the current status of literary studies; bear with me. Fast-forward some years and I find myself leafing through the books my dad bought (but probably never read) from a bargain box at a seaside second-hand bookstore during our summer vacation in Berwick-upon-Tweed. The green cover caught my attention. It looked somber. The guy looked even more somber in a consumptive kind of way:

I should like to lie still
As if I was dead; but feeling
Her hand go stealing
Over my face and head, until
This ache was shed.

I caught myself misreading the last line and tried again. But every time ‘shed’ turned to ‘shared’. Of course I was the speaker, and of course I had someone specific in mind. I hardly remember her name now, but who can forget the Sturm und Drang of adolescence? The more I read the stanza the more I realized that there is (there must be!) a dramatic irony at play here. The speaker is as deluded as any teenage lover, thinking that the beloved is more than a figment of his or her imagination. But every first lover is a Pygmalion. We carve gods or goddesses out of our dreams, and lose our marbles in the process. If the speaker and the girl ever met, all that could come out of the tryst would be a shared pain – the realization that the relationship is impossible. While both the speaker and I were blissfully unaware of this, the poem somehow knew. Or maybe we knew but were metaphorically sticking our fingers in our ears with the infuriating “nanananananana” of a toddler?

I thought of holographic images that change when you tilt them, and reread the lines for the umpteenth time. This was amazing! It was also a sobering experience. Not only did I realize that my consuming desire was far from unique; it was a commonplace, something every pimply man and woman had experienced since… well, I didn’t know, but I would eventually work my way backwards via the “joly wo” and “lusty sorwe” of Chaucer’s Troilus to Catullus’s "odi et amo" and Sappho’s amorous blackout. “A single word even,” Shelley writes in his Defence of Poetry, might be “the spark of inextinguishable thought”, and that little unassuming “shed” in the D. H. Lawrence poem did it for me. To live is to suffer cognitive dissonance. And behind the tantalizing ambiguities and oxymora of poetic language lies the realization that this is quite all right. We are not alone.

Articles in literary studies don’t come with disclaimers. If a medical research team is funded by Evil Pharma, on the other hand, that is a conflict of interest (between the disinterested nature of science and the not-so-hidden interests of said company) and must be stated as such when they publish. The medical community will consequently exercise caution when interpreting the findings (perhaps to the effect that Thalidomide is a great cure for bipedalism). I thought I’d break the ice by issuing the following disclaimer: I am not a disinterested critic. My writings have everything to do with me and my childhood reading. The ne'er-do-wells mingling in front of my wicker-work armchair taught me how to make connections. Tenuous? Yup. Idiosyncratic? Guilty as charged. But the realization that, not bound by the covers, literary characters can travel through time and space was earth-shattering. They taught me the beauty of the underdog, from Lawrence Ferlinghetti's canine trotter to the mutt in Paul Auster's Timbuktu. They taught me that Everyman and Everywoman are [expletive] geniuses. And much later I would not laugh at Leopold Bloom when he thought that I.N.R.I. was short for "Iron Nails Ran In" or went looking for the back orifices of the Greek goddesses in the National Museum in Dublin, but with him! And every time Homer Simpson surprises me I form my lips into a Molly Bloomian YES! But more specifically, my writings on poetry have everything to do with that green Penguin volume I found in my parents’ bookcase when I was sixteen. Everything. The poets I relate to and write about have all had similar experiences; they are all interested in cognitive dissonance and the awesomeness of a language that can convey these states of mind. When Lord Byron was accused of mixing gravity and levity in Don Juan (a poem that features the seasick protagonist entreating his absent lover in the following couplet: “Beloved Julia, hear me still beseeching! / Here he grew inarticulate with retching.”) he sent the following letter to his publisher to answer the critic:

who objects to the quick succession of fun and gravity, as if in that case the gravity did not (in intention, at least) heighten the fun. His metaphor is, that 'we are never scorched and drenched at the same time'. Blessings on his experience! Ask him these questions about 'scorching and drenching'. Did he never play at Cricket, or walk a mile in hot weather? Did he never spill a dish of tea over his testicles in handing the cup to his charmer, to the great shame of his nankeen breeches? Did he never swim in the sea at Noonday with the Sun in his eyes and on his head, which all the foam of Ocean could not cool? Did he never draw his foot out of a tub of too hot water, damning his eyes and his valet's? Did he never inject for a Gonorrhea? or make water through an ulcerated Urethra? Was he ever in a Turkish bath, that marble paradise of sherbet and Sodomy? Was he ever in a cauldron of boiling oil, like St John? or in the sulphureous waves of hell? (where he ought to be for his 'scorching and drenching at the same time'). Did he never tumble into a river or lake, fishing, and sit in his wet cloathes in the boat, or on the bank, afterwards 'scorched and drenched', like a true sportsman? 'Oh for breath to utter!' - but make him my compliments; he is a clever fellow for all that - a very clever fellow. (Byron’s Letters and Journals VI:207)

In response to another Don Juan critic he asked rhetorically: “it may be profligate – but is it not life, is it not the thing?” It is. We are scorched and drenched all the time, if not physically then at least metaphorically. If I had not found that D. H. Lawrence poem, however, I might have been as clueless as Byron’s contemporary critic or as modern critics of M. H. Abrams’ ilk, who are deeply uncomfortable with the way Byron “speaks with an ironic counter-voice and deliberately opens a satirical perspective on the vatic stance of his Romantic contemporaries” (Natural Supernaturalism 13). This is the end of the disclaimer, but there is more to be said. Like the madcap 17th-century librarian Robert Burton, I “confusedly tumbled over divers authors” and turned into the reader, critic and armchair poet that I am today. The gravitational pull from the planets in the solar system did not make me who I am, but (pace Skalman) my parents’ bookcase certainly did.

The planets and the bookcase represent two conceptual views on how to interpret art and literature. According to the planetary notion, a work of literature has less to do with the volition of the author than with the constellation of external forces exerting their pull on the work. These are not necessarily planets, but are commonly referred to as the literary epoch during which the work was written (the assumption being that all works from the same period have family resemblances or share cultural determinants in the form of a certain epistemological or ontological outlook) or, as in orthodox Marxism and its more attenuated developments, as the material base of society or its ethos. Often sociological, ethnographic and demographic categories are thrown into the mix. The planetary approach is highly useful, but sometimes inadequate. Saying that works of art are in no way determined by material exigency would be laughable. The bookcase in our living room would most likely not have been as well stocked if we had not been middle class, and if I had grown up in a non-European or non-Western country, its contents would have been different. But beyond this I fail to see how my haphazard readings and trans-generational friendships could be explained by any planetary categories.

The bookcase approach, on the other hand, emphasizes serendipity, anachronistic affinities, and the Siren song of books heard in dusty attics, on seldom-visited library shelves and in the nooks and crannies of used bookstores. It is concerned not with the planetary configurations at the moment of writing or reading but with the hidden paths (textual and psychological) that lead, to quote Harold Bloom, from poem to poem. It can be seen as an attempt to reduce what Richard Rorty saw as the perennial tension between writer and philosopher/critic, between "an effort to achieve self-creation by the recognition of contingency and an effort to achieve universality by the transcendence of contingency." Are the frequent allusions to Middlemarch in Virginia Woolf’s Jacob’s Room indicative of modernism’s ironic reconfigurations of Victorian literature, or due to the fact that Woolf quite liked Eliot’s novel? Wary of endowing planets or epochs with agency, we bookcase supporters would probably opt for the latter. That is not to say that the approaches are mutually exclusive; they are in fact mutually illuminating. Let us not forget that Skalman claimed that the books in the bookcase exert the same pull as the biggest planet in the solar system. That pull, as we have seen, is far from negligible.