On Specialist Realism: Infinite Summer Post #2
When James Wood coined the term "hysterical realism," he angered or irritated a large number of readers who found his judgments tone-deaf and his tastes narrow. Yet he also clearly struck a chord with an unusual number of people. At the very least, the name stuck.
I think the received story for that essay's success goes something like this: Wood's essay was the diagnosis a lot of people were waiting to hear; (some) readers had intuited a general malady in fiction, some were even making the right connections, but "hysterical realism" put a sharp term on this plague, asserting that there was a common origin for the multitude of symptoms.
A lot depends on that name, which I've always found to be inexplicable, probably because "hysterical" has so little analytical value. "Hysterical," besides having an ignominious history as a term of contempt for "eccentric" women, is merely pejorative; it doesn't explain, it doesn't clarify, it merely accuses, shames, castigates. Wood does make a number of observations about what this type of novel does, but these strategic arguments are almost unrelated to the tactically unmatchable brilliance of a catchy name. All you're left with is a kind of neologistic abracadabra; most people find it difficult to remember an essay-length argument, and boy, do nomothetic fallacies sell.
I think hysterical realism is a lousy name for a lazy generalization, but I do think that the fact that people obviously responded to a singular name for a "genre" encompassing writers like DFW, Pynchon, DeLillo, Rushdie, and Zadie Smith is worth following up on. Although I don't think renaming "hysterical realism" will improve anyone's ability to analyze these actually very different authors and their books, I think a new term might help the folks who really like these books talk to people who don't; it might at the very least allow us to talk about why these books might engender highly conflicting judgments and feelings.
Wood starts his harangue by sneering at "storytelling":
The big contemporary novel is a perpetual-motion machine that appears to have been embarrassed into velocity. It seems to want to abolish stillness, as if ashamed of silence. Stories and substories sprout on every page, and these novels continually flourish their glamorous congestion. Inseparable from this culture of permanent storytelling is the pursuit of vitality at all costs. Indeed, vitality is storytelling, as far as these books are concerned… Storytelling has become a kind of grammar in these novels; it is how they structure and drive themselves on.
I don't think anybody seriously objects to too many stories or "substories" in novels; isn't this what we praise in a television show like The Wire—the network-narrative proliferation of stories and substories? Not to mention the fact that this emergent fabulist malignancy that Wood wants to irradiate is actually a much more historically central mode for the novel than the staid alternative of well-manicured plots and coolly distanced narration: think of Boccaccio, think of Chaucer, of Rabelais, of Sterne, of the picaresque, of any series of novels—incessant proliferation of (or cycling through) micro-narratives is not a shoddy new wing—it's a cornerstone of the novel.
So storytelling is almost surely a red herring. Wood also turns to the idea that the characters in these novels are all caricatures: there aren't "people" in them. But Wood also makes clear that being unpeopled hasn't really stopped a large number of authors from finding (very enthusiastic) readers. No, what I think Wood finally gets to (more so in his review of The Corrections than in the "Hysterical Realism" piece) is the idea that too many authors know too much about stuff. "[C]ontemporary American fiction, whose characteristic products are books of great self-consciousness with no selves in them; curiously arrested books which know a thousand different things—How to make the best Indonesian fish curry! The sonics of the trombone! The drug market in Detroit! The history of strip cartoons!—but do not know a single human being." Back in "Hysterical Realism," he puts the point more simply: "Information has become the new character."
I give a little more credence to the idea that something like this is actually behind a good deal of any frustration with someone like David Foster Wallace. Wood's specific problem with the pleonastic plenitude of information in a book like Infinite Jest is mainly that it's bottom-line futile: the world's always going to have the jump on you, information-wise. I don't know how much of Bellow's occasional criticism Wood's read, but his objections sound a lot like the following passage from an encyclopedia entry Bellow wrote in 1963 for a series called The Great Ideas Today:
realistic verisimilitude (of the O’Hara sort) has become burdensome and difficult, and that it requires a degree of special knowledge which only a small number of fanatical devotees can attain. In an era of specialization such as ours, even a botanist, studying plant hormones, let us say, will not know what a colleague in plant ecology is doing. Literally to know what he is writing about would impose an impossible strain on the most dedicated realist. The most informational of novelists can no longer adequately inform us. The world is really too much for the realist to cope with.
What's highly amusing about this quote is that, nearly 25 years later, Bellow would write a novel (More Die of Heartbreak) featuring a botanist character, and would provide the reader with some semi-recondite data about botany. Time makes a self-contradiction of us all.
At any rate, what I think is important is that this objection to detail is, more specifically, a gripe about detail that has obviously been arrived at by research—as opposed to details that are the result of mere observation. In either case, a certain density is the objective, one that is meant to signal that the artist is, in some sense, a specialist, willing to undertake extensive yet minute pains and labors to get all the details right, whether that's the way light plays on a woman's hair or the way drugs affect human physiology. The specialist realist is someone who believes, heart and soul, Carlyle's dictum that "Genius… means the transcendent capacity of taking trouble."
But the detail arrived at by pure observation (for some strange reason) often gets a pass; lyricism and le mot juste are seen as somehow more natural to the novel than highly technical nerditude. Perhaps it's as simple as some ineradicable "two cultures" idea—a notion that science and math are inherently alien to the word world. Which is why a writer like Joseph O'Neill, in a book like Netherland, isn't really expected to go into much detail about a fairly substantial aspect of his protagonist's life: his career as an energy trader. O'Neill's vividly observed detail is, for many, the proper mode of novel-writing. Or consider perhaps the most famous evasion of science/technology in a novel: Henry James's absolute refusal to specify the industry and product that has made the family's fortune in The Ambassadors.
Yet many very popular books—technothrillers, historical dramas, period pieces of all kinds—are totally crammed with obviously researched detail, and so it seems strange that its presence in literary fiction would disgruntle. However, I think if we look at some specific instances of specialist knowledge present in Infinite Jest, we can get some idea of what might rankle or put off at least a few of its readers.
Specialized knowledges pervade the book—tennis, recreational drug use, optics, burglary, even punting (surely the most narrowly specialized position in football). But one of the more (in)famous elements of "research" in the novel is the filmography Wallace includes in endnote 24. In the age of IMDb, we might be apt to forget that the filmography is (or was) actually a highly specialized and intensely laborious feat of archival research, but the almost eight-and-a-half pages of James O. Incandenza's collected works should surely remind us that a filmography is actually the product of research, and not Googling.
Yet there was, of course, no research necessary for composing this "artifact"—having no basis in reality, everything in it is a pure product of imagination. But Wallace never seems comfortable simply acknowledging that the imagination that produced it is his own. In just about as many ways as possible, Wallace continually disrupts the filmography with secondary or tertiary commentary to let us know that he's looking at it from the outside too. I kept waiting for that click where the self-distancing irony would drop away and, as with Borges or Pynchon or Bolaño or even (especially) Auster, you get a real note of dread or mystery where the author seems to have been finally convinced of the reality of his artifice. Even in the last entry, which is about The Entertainment itself, there are three skeptical footnotes embedded.
And this type of thing occurs many times in the text: consider the phrase, "Goethe's well-known 'Bröckengespenst' phenomenon38" (88). If it's so well-known, why the hell does it need to be footnoted? This feels like Wallace simply can't decide how to be authoritative: does he want to be assholically authoritative ("well-known"), learnedly authoritative (using the German term in the first place), or helpfully authoritative (sticking in a footnote)? If the confusion is simply an attempt to undermine the idea of authority in the first place, then it needs to be decisive confusion: subversion can't be done lackadaisically, and self-subversion even less so.
The perfect example of this indecisive subversion comes twenty pages before, in the first section about poor Kate Gompert: "Something was almost too overt about the pathos of the posture: this exact position was illustrated in some melancholic Watteau-era print on the frontispiece to Yevtuschenko's Field Guide to Clinical States" (68). "Something… almost… some…"—these are words that aren't even tactically indecisive—they're too quotidian really to be noticed, similar in effect to throwing in a "like" every few pages of narration. So they don't truly subvert the over-done specificities of "overt… pathos… exact… Watteau-era… frontispiece to Yevtuschenko's Field Guide to Clinical States." They don't really ironize the position of authority taken by someone who would be this specific so much as they peel Wallace away from fully occupying it. It's an approximate deconstruction of authority, and I think that approximateness pisses some people (including me, some of the time) off.
Most of Infinite Jest, I think, does not do this approximate deconstruction act; the bulk of it is what can be defined as specialist realism—which I think is actually a broadly popular mode of writing. I don't think very many people mind writerly ostentation by itself: there are simply far too many popular authors who are grossly ostentatious for this to be the case. And readers of all kinds are capable of showing enormous patience with heavily-detailed and at times rather tedious passages of questionable importance to the overall novel. "Specialist realism" is not terribly problematic to most readers, and is often even considered enjoyable. (Consider, here, Wallace's enthusiasm for Tom Clancy: there is not as great a distance between the two as one might think.) This mode of writing, however, sometimes slips into a different mode of writing that is indecisively subversive—a lukewarm irony that I think turns nearly everyone off. This is present, too, in Infinite Jest, and in order to have a conversation among people who really like the book and people who can't get through it, I think it's necessary to begin by separating this lukewarmness from the specialist realism that actually makes the novel so captivating.
Wallace may have had very well-thought-out, very theoretically smart reasons for trying to have things both (or more) ways, for trying to be indecisive, but there are lots of things that are theoretically well-grounded and simply annoying all the same. I'm sure there are folks who think that the lukewarm ironical mode is really brilliant, maybe even the most brilliant thing about the novel. I'd be happy to hear those arguments, but I want to make clear that I don't really find this lukewarmness all that much of an obstacle to enjoying the book. So please don't mistake any of this for an attack on Wallace or "hysterical realism" or any of that stuff.
"The Oxford English Dictionary defines..."
One of my biggest pet peeves in scholarly prose is the habit of advancing an argument by citation of the dictionary definition of one or more of your key terms.
Is this really all that analytically valid, or just rhetorically decorative, maybe even a little fetishistic? Maybe even a little insecure? It's a non-structural buttress in almost all cases, present merely for the appearance of greater support.
Could it be a hold-over from the older, more philological orientation of literary studies? I'm inclined to think not because it is used so rarely in that mode. One of the few adequate uses of this rhetorical device I've seen recently came in Jenny Davidson's Breeding, where she actually traces the word through a succession of dictionaries. Diachronic uses (reminding the reader of an archaic definition would be another good use) are entirely reasonable, but the synchronic is just word-dressing.
I think that the O.E.D. (it's almost always the O.E.D. that is used) is quite simply a sort of name-check authoritative reference that gets dropped in rather like a flavoring particle in German, identical in function to the frequent footnoted references you see to big-name figures that take the form "Foucault makes a similar point in regards to the panopticon…" These aren't so much ways of building up the argument as showing that your argument is neighborly with someone important. This is talismanic, not analytical.
Because really, how often is the precise articulation of the denotative essence of a word a revelation? Aren't we as readers usually capable of evaluating whether a scholar is using her terms in a manner consonant with standard definitions? If there is a specific aspect of the word's definition which needs highlighting, can't the scholar simply define her own terms, and then we as readers can figure out if she's using it legitimately? If the term is itself so vexed that an O.E.D. citation is "required" to pin it down in univocal terms, maybe the scholar should actually talk through that complexity rather than pre-empting it by citation.
Actually, I think it's just a case of scholars being too tentative, feeling like they can't begin working with specific terms unless they're drawing them from somewhere else—they need someone else to say all the words they want to work with before they can begin working with them. You come up with a set of terms that you want to play with throughout your article or your monograph, and you need some way to introduce them, and you feel awkward just saying "Here's what I'm going to do with 'empire'" and letting the reader decide if you're making sense.
Sorry, I was just flipping through an introduction to a book that I'm really eager to read, and the author feels it necessary to use the O.E.D. to define two of the words in the title in order to draw out an opposition or a paradox that could simply have been stated flatly. As I said, it's a pet peeve—I don't disagree with the definitions, or with the paradox the author foregrounds between them; I just don't find it necessary to wield the O.E.D. in such a way, and it's a little insulting to both of our intellects.
The Culture of the New Capitalism, by Richard Sennett
Scott McLemee has described Richard Sennett in the following terms, which I think are accurate and to the point: "Sennett’s work, if not Marxist, is at the very least grounded in some notion of mankind as the species that creates itself through the labor process."
I think this specification of what Sennett (minimally) owes to Marx is valuable because, although it is relatively simple, this notion of self-creation-through-labor is often what seems to go missing when people go hunting for Marxism or Marxian ideas around the Left, looking for ammunition either to smear leftists or to reinvigorate them.
The idea that humankind creates itself through the labor process is not, I suppose, a terribly radical notion, and for that reason, perhaps, it is not very central to many of the most illustrious Marxist or Marxian projects of our day. It disappears behind "fidelity to the event" within Badiou, and gets constantly screened out by Žižek's Lacanianism. I haven't read enough of Negri/Hardt to say for sure, but their conception of the multitude, from what I understand of it, seems to focus on autonomy in a way that very well may be antithetical to this notion.
I think Sennett addresses this fact obliquely within his own personal history: he begins these lectures (the material of the book was originally presented as The Castle Lectures at Yale) by reflecting on his eager participation in the New Left, and notes that the principles espoused most concretely in the Port Huron Statement have, in a sense, been fulfilled, but with results quite opposite to those wished.
The goal for rulers today, as for radicals fifty years ago, is to take apart rigid bureaucracy.
The insurgents of my youth believed that by dismantling institutions they could produce communities: face-to-face relations of trust and solidarity, relations constantly negotiated and renewed, a communal realm in which people became sensitive to one another's needs. This certainly hasn't happened. The fragmenting of big institutions has left many people's lives in a fragmented state: the places they work more resembling train stations than villages, as family life is disoriented by the demands of work. Migration is the icon of the global age, moving on rather than settling in. Taking institutions apart has not produced more community. (2)
Sennett doesn't press too hard on this point—the New Left's values foreshadowing those of the New Capitalism—though he uses very similar terms to those above in conclusion, returning to this historical irony. (Some of the work of making this irony more concrete is provided by Thomas Frank in The Baffler and The Conquest of Cool, although I wonder if Sennett might see the congruence of the New Left and New Capitalism on this question as less collusive than Frank tends to paint it.)
At any rate, I think it is important to keep in mind that Sennett is using the New Left as a point for pushing off; these lectures are, in part, arguing that what the New Left failed to account for in their critique of bureaucracy is precisely what must be salvaged—the value of narrative (as in personal narrative), usefulness, and craftsmanship. It is these three values which he believes are most threatened by the new capitalism, as they are being purposely driven out of the workplace under three challenges:
The first concerns time: how to manage short-term relationships, and oneself, while migrating from task to task, job to job, place to place. If institutions no longer provide a long-term frame, the individual may have to improvise his or her life-narrative, or even do without any sustained sense of self.
The second challenge concerns talent: how to develop new skills, how to mine potential abilities, as reality's demands shift… The emerging social order militates against the ideal of craftsmanship, that is, learning to do just one thing really well; such commitment can often prove economically destructive. In place of craftsmanship, modern culture advances an idea of meritocracy which celebrates potential ability rather than past achievement.
The third challenge follows from this. It concerns surrender; that is, how to let go of the past. The head of a dynamic company recently asserted that no one owns their place in her organization, that past service in particular earns no employee a guaranteed place. How could one respond to that assertion positively? A peculiar trait of personality is needed to do so, one which discounts the experiences a human being has already had. This trait of personality resembles more the consumer ever avid for new things, discarding old if perfectly serviceable goods, rather than the owner who jealously guards what he or she already possesses. [I'd say it more resembles a fantasy baseball manager—or an armchair stockbroker.]
What I want to show is how society goes about searching for this ideal man or woman… A self oriented to the short term, focused on potential ability, willing to abandon past experience is—to put a kindly face on the matter—an unusual sort of human being. Most people are not like this; they need a sustaining life narrative, they take pride in being good at something specific, and they value the experiences they've lived through. The cultural ideal required in new institutions thus damages many of the people who inhabit them. (4, 5)
That (long) passage acts as a very good prospectus for the rest of the book; Sennett bears down on this question in an extremely orderly manner, and takes his time filling in the picture of this culture. At times, much of what he says is extremely familiar—both in the way of description and in the way of critique. Yet it is not over-familiar, and it is so well-stated and so well-organized that the value becomes not so much the insight as the articulation—the book is indispensable because it accumulates all the stray thoughts and angles and observations, seeing clearly the larger shape which each one partially illuminates.
Which is not to say that Sennett's insights aren't powerful and frequently new; I think that his chapter on politics as consumption adds some particularly original thinking to a familiar theme, and some of his points have become much more interesting (and ambiguous) since the last election.
I like this book a lot; I think it is tremendously instructive, and it illuminates some extraordinarily complex formations as purely as anything you're ever likely to find.
***
Speaking of The Baffler, Twitter tells me that it's being revived!
"I am in here." Infinite Summer Post #1
I know virtually nothing more about holography than what Wikipedia tells me. (Yes, pun intended.) It comes up twice, though, within the first ten pages of Infinite Jest: once in the title of one of the "nine separate application essays, some of which of nearly monograph-length" ("The Implications of Post-Fourier Transformations for a Holographically Mimetic Cinema," p. 7) and once in reference to Dennis Gabor, the inventor of holography, whom Hal says he believes "may very well have been the Antichrist" (p. 12).
I think this paragraph from Wikipedia is the most relevant to what follows:
Though holography is often referred to as 3D photography, this is a misconception. A better analogy is sound recording where the sound field is encoded in such a way that it can later be reproduced. In holography, some of the light scattered from an object or a set of objects falls on the recording medium. A second light beam, known as the reference beam, also illuminates the recording medium, so that interference occurs between the two beams. The resulting light field is an apparently random pattern of varying intensity which is the hologram. It can be shown that if the hologram is illuminated by the original reference beam, a light field is diffracted by the reference beam which is identical to the light field which was scattered by the object or objects. Thus, someone looking into the hologram 'sees' the objects even though it may no longer be present. There are a variety of recording materials which can be used, including photographic film.
There are two other notable repetitions within this first section: Hal uses the word "lately" (intentionally rhyming with Gately, no doubt) on the very first page, and he asserts "I am in here" on both p. 3 and p. 13. Well, on p. 13, he says, "I'm in here."
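A quick gloss on that "it can be shown," since the detail matters for what follows (this is my sketch, not Wikipedia's, and it assumes idealized monochromatic beams): write O for the light wave scattered off the object and R for the reference beam. The film can only record intensity, but the intensity of the two beams' interference smuggles in O's phase:
$$I = |O + R|^2 = |O|^2 + |R|^2 + O R^* + O^* R$$
Re-illuminate the developed hologram with R alone and you get
$$R \cdot I = \left( |O|^2 + |R|^2 \right) R + O^* R^2 + |R|^2\, O,$$
and that last term is the object wave again, scaled by $|R|^2$: the viewer receives light identical to what the object scattered, even though the object itself is absent.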
I think what is being expressed is a certain fear of the mediation of existence, a feeling of being antecedent to (and trapped behind) an after-image or reconstruction of the self. Hal feels that he is "no longer present" at or in the moment he is perceived, which is why "lately" is repeated and why he needs to assert that he is "in here"—he is in the hologram that the deans are seeing and interacting with.
I don't want to get ahead of myself here, but in some notable ways I think this fear is linked to the concerns Wallace expressed in his essay on television and U.S. culture, "E Unibus Pluram" [pdf]. In particular, I wonder if we can't think of irony itself as a sort of holographic image of culture, at least as it is used in the ways Wallace critiques.
This post is meant more as a sort of opening up of one line of thought I believe I'll be returning to over the course of my reading, rather than as a full exploration of this idea. Clearly, we're not very far into the novel, so doing more than remarking on a theme which may recur would be premature. Erdedy's obsession over the cartridges of the Interlace viewer and his answering machine's message in the next section already return to a certain paranoia about the nature and consequences of recording. Obviously, there will be others.
What about you—first thoughts, anyone?
***
I'm pretty certain that should be "post-Fourier transforms," right? I mean, that's more common/more standard.
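(For anyone keeping score at home, the object in question is
$$\hat{f}(\xi) = \int_{-\infty}^{\infty} f(x)\, e^{-2\pi i x \xi}\, dx,$$
and "the Fourier transform of f," as far as I know, is the standard name both for this integral and for the operation that produces it; "transformation" mostly shows up when people are being grand.)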
Søren Kierkegaard, "The Rotation Method"
Starting from a principle is affirmed by people of experience to be a very reasonable procedure; I am willing to humor them, and so begin with the principle that all men are bores. Surely no one will prove himself so great a bore as to contradict me in this.
I sort of think of Kanye West ("We all self-conscious / I'm just the first to admit it") as the Kierkegaard of hip hop. And I've always wanted to extend this analogy to other artists, but nothing ever really comes to mind.
At any rate, I'd like to take advantage of one of Google Books's new features and begin a series on this blog that I'll try to keep up every Sunday, taking its title from part of the Catholic Mass. Here's this week's Liturgy of the Word:
Unfortunately, some of the crucial passages at the very end are cruelly hidden (that is, those passages after page 239—those not displayed before then aren't really of great importance, and are actually generally misogynistic). I have a slightly different translation on hand, but here is the rest of it:
At every opportunity he was ready with a little philosophical lecture, a very tiresome harangue. Almost in despair, I suddenly discovered that he perspired copiously when talking. I saw the pearls of sweat gather on his brow, unite to form a stream, glide down his nose, and hang at the extreme point of his nose in a drop-shaped body. From the moment of making this discovery, all was changed. I even took pleasure in inciting him to begin his philosophical instruction, merely to observe the perspiration on his brow and at the end of his nose.
The poet Baggesen says somewhere of someone that he was doubtless a good man, but that there was one insuperable objection against him, that there was no word that rhymed with his name. It is extremely wholesome thus to let the realities of life split upon an arbitrary interest. You transform something accidental into the absolute, and as such, into the object of your admiration. This has an excellent effect, especially when one is excited. This method is an excellent stimulus for many persons. You look at everything in life from the standpoint of a wager, and so forth. The more rigidly consistent you are in holding fast to your arbitrariness, the more amusing the ensuing combinations will be. The degree of consistency shows whether you are an artist or a bungler; for to a certain extent all men do the same. The eye with which you look at reality must constantly be changed…
The arbitrariness in oneself corresponds to the accidental in the external world. One should therefore always have an eye open for the accidental, always be expeditus if anything should offer. The so-called social pleasures for which we prepare a week or two in advance amount to so little; on the other hand, even the most insignificant thing may accidentally offer rich material for amusement. It is impossible here to go into detail, for no theory can adequately embrace the concrete. Even the most completely developed theory is poverty-stricken compared with the fullness which the man of genius easily discovers in his ubiquity.
By the way, just because I've labeled this a liturgy, don't think I also take it as gospel.
Savage Excel
I'm a huge fan of the use of Microsoft Excel for literary ends, so I think this is absolutely awesome: a friend of mine over at We Understand How Markets Work But We Read Good Too has made a few spectacular graphical representations of the middle section of Roberto Bolaño's The Savage Detectives—the one with the many different narrators.
Like Joyce and DFW, Bolaño is one of those authors that inspires meticulous exuberance.
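In that spirit, here is a minimal sketch of the kind of chart I mean, in Python rather than Excel (mine, not my friend's actual method; the CSV of testimony numbers and narrators is a hypothetical hand-made tally of Part II):
# Plot which narrator delivers each testimony in Part II of The Savage
# Detectives. Assumes a hand-tallied CSV with a header row:
#   position,narrator
#   1,Amadeo Salvatierra
#   2,Perla Aviles
#   ...
import csv
from collections import OrderedDict
import matplotlib.pyplot as plt

positions, narrators = [], []
with open("savage_detectives_part2.csv", newline="") as f:
    for row in csv.DictReader(f):
        positions.append(int(row["position"]))
        narrators.append(row["narrator"])

# Give each narrator a stable horizontal track, in order of first appearance.
order = list(OrderedDict.fromkeys(narrators))
tracks = [order.index(n) for n in narrators]

plt.figure(figsize=(10, 6))
plt.scatter(positions, tracks, s=12)
plt.yticks(range(len(order)), order, fontsize=6)
plt.xlabel("testimony, in order")
plt.ylabel("narrator, by first appearance")
plt.title("Who speaks when in The Savage Detectives, Part II")
plt.tight_layout()
plt.show()
Each narrator gets a horizontal track, so the recurrences (and the long disappearances) jump right out.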
Flight of the Old Dog and "Bomb Iran"
In his column today, Glenn Greenwald asks how so many of the same people who have spent months advocating a significant bombing campaign against Iran can now be so exercised on behalf of the Iranians whose lives would be seriously jeopardized by such a campaign:
Imagine how many of the people protesting this week would be dead if any of these bombing advocates had their way -- just as those who paraded around (and still parade around) under the banner of Liberating the Iraqi People caused the deaths of hundreds of thousands of them, at least. Hopefully, one of the principal benefits of the turmoil in Iran is that it humanizes whoever the latest Enemy is. Advocating a so-called "attack on Iran" or "bombing Iran" in fact means slaughtering huge numbers of the very same people who are on the streets of Tehran inspiring so many -- obliterating their homes and workplaces, destroying their communities, shattering the infrastructure of their society and their lives. The same is true every time we start mulling the prospect of attacking and bombing another country as though it's some abstract decision in a video game…
The notion that we would have harmed Iran's nuclear capabilities with our bombing attacks without killing substantial numbers of Iranian civilians is a fantasy comparable to the claim that we could remove Saddam Hussein in a quick and easy war, with few civilian casualties, and in the face of a grateful population. Except where there is a single target, that isn't what happens when you bomb countries. Large numbers of civilians die, and the advocates of these campaigns -- today masquerading as crusaders for the welfare of the Iranian People -- were well aware of that result and (at best) were indifferent to it.
(Links to evidence/details of his claims about the potential destruction caused by even the most modest of "Bomb Iran" plans are peppered throughout his original post.)
The hypocrisy and indifference of these hawks is breathtaking, but I don't have much to add on that score; Greenwald says it well. I do want to pick up on the fact that Greenwald chooses a video game as the most likely model that neoconservatives may have in mind when conceiving their ideal strategies/fantasies. It's a throw-away line, to be sure, but it's not an uncommon one—video games are the most common example of political unreality (especially when it involves violence) infecting calculations of political reality.
I don't know much about the consumption habits of any of the hawks Greenwald cites, but I'd have to imagine that Giuliani hasn't logged too many hours on any gaming system, and that it isn't video games that might serve him as a resource for political wish-fulfillment.
Because novels are not typically thought of as "technology," they are usually not the first things to come to mind when thinking of models of what highly technological warfare is and looks like. Although techno-thrillers (Tom Clancy, Larry Bond, Dale Brown) can be extremely detail-oriented about the mechanics and logistics of combat, it's more likely that we consider film and video games to be the dream factory of war.
Yet the fantasy that Greenwald mentions—the single target, drop-the-bomb-and-fly-home scenario—isn't just the climax of Star Wars: A New Hope; it's also the plotline of Dale Brown's 1987 best-selling techno-thriller Flight of the Old Dog. Plot details are behind that Wikipedia link. I remember enjoying reading that book—and a number of other Dale [not Dan] Brown books—when I was in high school. They're potent and extremely tech-y.
I'm not saying Giuliani or John Bolton or Norman Podhoretz is any more likely to have read Dale Brown's book than they are to have played Ace Combat 6: Fires of Liberation. But I do think books (and techno-thrillers in particular) play more of a role in shaping our ideas of the dynamics, possibilities and strategies of modern combat than they are often given credit for. The remarks made by these morons are probably in most cases pure political opportunism, but they're also playing to the ideas of what combat is and how it can be conducted that exist and predominate in public consciousness. And I think it'd be a good idea to remember that writers like Tom Clancy have done a lot to set the limits on those kinds of questions.
Have You Seen..., by David Thomson
There are quite a number of books that people buy, I think, mainly for the lists at the back. 1,000 Places to See Before You Die doesn't sound like much of a travel guide, and I don't really care too much what the guy who recommends 1001 Books You Must Read Before You Die thinks about the novels he includes: he included them, so you'd think he'd like them. What else did you want?
David Thomson's book is closer to these books than it is to a Harold Bloom or Jonathan Rosenbaum canon—both Bloom and Rosenbaum choose their items out of a quasi-metaphysical devotion to the idea that a certain set of works must, at all costs, be preserved. Thomson's 1000 film list isn't about preservation so much as conversation: as he says, people tend to ask critics what they should watch. This is his response. He doesn't like all of them, but someone interested in film will probably find all of them worthwhile in some capacity, and a casual movie-watcher will probably enjoy most.
Thomson's had great practice at writing short, dynamic, fearlessly opinionated entries about film while composing and compiling his Biographical Dictionary of Film (and its revision/expansion). This book is no different, although a little more relaxed.
Thomson organizes the entries alphabetically, but the list at the back is ordered chronologically. His taste grows a little less canonical almost with each year that passes, so that there were actually more films I hadn't heard of from the 90s and 2000s than from the 40s and 50s. Which is not to say that he ignores the legitimate and more or less inarguable masterpieces of the past decade or so, but he adds in some truly obscure (mostly British) work. Here are the films from 2006 and 2007, the last years the book covers:
- The Lives of Others
- Longford
- The Queen
- Twenty Thousand Streets Under the Sky
- The Diving Bell and the Butterfly
- 4 Months, 3 Weeks and 2 Days
- Eastern Promises
- No Country for Old Men
- Sweeney Todd
- There Will Be Blood
- You, the Living
Thomson's Wikipedia page also gives the Top Ten he submitted to the Sight and Sound poll, which is conducted every decade and will be done again in 2012. I don't think Thomson is likely to insert any of the films of the past six or seven years (i.e., since 2002, the last Sight and Sound poll) into his top ten, but his lists got me thinking about which films possibly could make it, at least for me.
The Italian miniseries The Best of Youth (La Meglio Gioventù) is certainly a contender, as are There Will Be Blood and Children of Men. I really liked Syndromes and a Century, by the Thai director Apichatpong Weerasethakul; The New World, I'm Not There, and Hirokazu Kore-eda's Nobody Knows would all be in the back of my mind.
What about you—what films released since 2002 should or could end up on a critic's all-time top ten list? If anyone says The Dark Knight, I'm banning them from ever commenting again.
The Lonely Londoners, by Sam Selvon, and A Passage to India, by E. M. Forster
Early on in A Passage to India, Ronny Heaslop, the British magistrate, becomes rather delighted to get a glimpse of what Dr. Aziz really thinks of the British:
"So you and he had a talk. Did you gather he was well disposed?"There are many interesting things going on here; most obvious, I think, is the curious assertion that "Nothing's private in India." While Forster's novel does not actually include more than a few chanted words in Urdu or Hindi, he does depict a couple of conversations among the "natives" that aren't translated or even summarized. Whether speaking in Urdu constitutes privacy in the novel (or in the Raj) is an open question: the conversations Forster depicts are irrelevant to the novel (mostly, we can assume, gossip among the servants), and can the irrelevant really be private?
Ignorant of this question, she [his mother, Mrs. Moore] replied, "Yes, quite, after the first moment."
"I meant, generally. Did he seem to tolerate us—the brutal conqueror, the sundried bureaucrat, that sort of thing?"
"Oh, yes, I think so, except the Callendars—he doesn't care for the Callendars at all."
"Oh. So he told you that, did he? The Major will be interested. I wonder what was the aim of the remark."
"Ronny, Ronny! You're never going to pass it on to Major Callendar?"
"Yes, rather. I must, in fact!"
"But, my dear boy—"
"If the Major heard I was disliked by any native subordinate of mine, I should expect him to pass it on to me."
"But my dear boy—a private conversation!"
"Nothing's private in India. Aziz knew that when he spoke out, so don't you worry. He had some motive in what he said. My personal belief is that the remark wasn't true."
"How not true?"
"He abused the Major in order to impress you."
"I don't know what you mean, dear."
"It's the educated native's latest dodge. They used to cringe, but the younger generation believe in a show of manly independence. They think it will pay better with the itinerant M.P. But whether the native swaggers or cringes, there's always something behind every remark he makes, always something, and if nothing else he's trying to increase his izzat—in plain Anglo-Saxon, to score. Of course there are exceptions."
The other thing I think is interesting about the exchange above is Heaslop's assurance that the subjugated natives intend for all their actions and remarks to be decoded by the English, that they not only expect but participate in having their customs and their behavior translated into "plain Anglo-Saxon." Aziz knew full well that his remark would be interpreted correctly by some English person, even if it wasn't Mrs. Moore.
Of course, Forster's novel stands as a rebuke to both the idea that there is nothing private in India (the novel hinges on the terrible privacy of memory, a privacy whose faults cannot be corrected) and the idea that India is a land of mystery awaiting the interpretive work of the clever British. There is a very important passage about midway through the book that demonstrates this latter point with great grace:
Miss Quested saw a thin, dark object reared on end at the farther side of a watercourse, and said, "A snake!" The villagers agreed, and Aziz explained: yes, a black cobra, very venomous, who had reared himself up to watch the passing of the elephant. But when she looked through Ronny's field-glasses, she found it wasn't a snake, but the withered and twisted stump of a toddy-palm. So she said, "It isn't a snake." The villagers contradicted her. She had put the word into their minds, and they refused to abandon it. Aziz admitted that it looked like a tree through the glasses, but insisted that it was a black cobra really, and improvised some rubbish about protective mimicry. Nothing was explained, and yet there was no romance.
That last line would make a terrific epigraph.
***
The status of the West Indian immigrants in London in Selvon's novel could not be more different from that of the East Indians in Forster's: their dialect does not require interpretation (and so no one treats it as a mystery to be decoded), yet it is also "plain Anglo-Saxon" enough to be used in the creation of both public and private spaces.
The entire novel is written in the dialect (though there are sections with few obvious dialect markers), giving it a consistent feel of immersion. Yet this decision does not seem to have realism at its back, but poetry. On a literary level, the dialect is far suppler than the stiff formalities of Forster's dialogue; it has a greater range and much more expressive force. If The Lonely Londoners were narrated in anything close to Forster's primness, it would still be very good, but it would only be a slice-of-life guided tour. Selvon makes it impossible to maintain any distance from the characters, makes it impossible to take them either for a mystery or an irrelevance.
Here is a passage near the end of the book (no spoilers, though, I promise). If you love living in a city—or wish you did—well, enjoy:
What it is that a city have, that any place in the world have that you get so much to like it you wouldn't leave it for anywhere else? What it is that would keep men although by and large, in truth and in fact, they catching their royal to make a living, staying in a cramp-up room where you have to do everything—sleep, eat, dress, wash, cook, live. Why it is, that although they grumble about it all the time, curse the people, curse the government, say all kind of thing about this and that, why it is, that in the end, everyone cagey about saying outright that if the chance come they will go back to them green islands in the sun?
In the grimness of the winter, with your hand plying space like a blind man's stick in the yellow fog, with ice on the ground and a coldness defying all effort to keep warm, the boys coming and going, working, eating, sleeping, going about the vast metropolis like veteran Londoners.
The Lonely Londoners is a quick book—both in length and in pace. I think it's a little difficult to get here in the States, but it's worth checking for. Inspired last year by reading Junot Díaz, I set myself an agenda this year to read some other Caribbean writers (Naipaul and Carpentier so far, Danticat and Chamoiseau likely to come) and have in all cases been really delighted by my reading, Selvon very much included.
A Bit More on Kunkel
I want to pick up on my post over at Conversational Reading about Benjamin Kunkel's essay on the Internet. There were a few things I left out because the post was already much too long, and a few things that (I hope) have been clarified by the excellent comments Richard and LML have made.
My reading of Kunkel's argument centered on trying to define his idea of experience as it was made clear by the claims he made against the Internet (or, more broadly, against a distracted life mediated by technology). I feel that his claim that the "sensuous poverty" of Internet content should be a "reminder" of the more harmful effects of the Internet is not in itself very significant, but is indicative of the way he wishes to conceptualize experience, particularly when it is coupled with the magnetism he attributes to the Internet. His setting off the physically impoverished but nevertheless alluring Internet against the self-discipline-demanding physical beauty of non-Internet content is an example, I believe, of a broader conceptualization of experience as a sequence of alternatives, structured in the form of repeated micro-tests of discipline, taste, and/or will. This particular ethic is not aesthetic; it originates in a very different organization of human life and experience: "distributing our attention" is Kunkel's term for these micro-tests, and that is surely an economic phrase, and more specifically, a capitalist one, as it means, basically, "investment."
I'm not going to hide that I'm taking the next part straight from Weber; although The Protestant Ethic and the Spirit of Capitalism has many historiographical faults, it also has significant successes. Among these is, I think, Weber's success in tying the particular world-view of Puritanical testing to the bourgeois sensibility of accumulation as a sort of proof of inner virtue: accumulation is evidence that you have passed numerous tests, each one bestowing upon you a small trophy. This is what I was trying to describe as Kunkel's ethic of experience, only I feel that Kunkel (as reflected in his novel Indecision and, I think, demonstrated by some of the language of his essay) has heroized this ethic further (in Weber, it is already somewhat heroic) by grafting it onto the Bildungsroman structure. It's not just the accumulation of correct (self-disciplined) choices—Proust over YouTube—that is necessary (and threatened) but also the activation of something internal ("whatever it is in me") that can only be accomplished by encounters with the beauty proper to "poetry, philosophy, history," a beauty which Kunkel asserts is alien to the Internet. The need for this activation reinforces the need for self-discipline, which (according to him) the Internet erodes. In other words, the Internet is an existential clusterfuck for the Bildungsroman hero/bourgeois intellectual.
My objections to this ethic don't really require much elaboration: it comes down to the privilege necessary to achieve and then maintain it. That privilege is written all through the n+1 pamphlet What We Should Have Known, in which Kunkel played a part. I have written about that pamphlet on this blog before, in an entry more broadly addressing the role of regret in one's reading life. Regret, which is effectively the subject of that n+1 pamphlet, is the preeminent expression of this ethic: it structures one's past choices as good or bad investments ("why did I spend so much of my adolescence reading Star Wars books?"), as time managed well or poorly, as tests of will and self-discipline passed or failed. It is also, conversely, a way of reinforcing the idea that your ability to reflect and to understand that you have, at times, failed is a promise of future success: you know enough to correct your mistakes and make better investments in the future. You know enough to bring yourself in line with the Bildung curve that one can derive from Joyce, from Stendhal, from Flaubert, from Goethe, et al.
At the end of that post, I noted how the contributors to the pamphlet all disparaged their undergraduate years and their "short twenties"—the period between college graduation and the thirtieth birthday. They referred to college as "summer camp" and effectively bragged about how their various grad programs were either useless or at best stopgaps in trying to figure out what they wanted to do or be; despite a fairly shameless infatuation with n+1, I was basically appalled. Yes, I too didn't read all the right things in college, and yes, I too am not reading the right things right now, still pretty much at the beginning of my short twenties. But goddamn if I am going to write either off in the manner they did or blind myself to the privileges that allow me to be where I am and study literature as I do. Access to books is hardly universal, much less the kind of access to any book that is so often taken for granted (greatly facilitated, I hardly need to add, by the Internet), much less the funds to buy them or the time to read them. "Distraction" is an incomparably class-based complaint. And while I took far too many words to get to this point, that time—the time that you have spent (dare I say invested) reading this and the time I have spent writing it—is a terribly dark underlining of this simple assertion. You may have been distracted in the reading of this, or in the reading of my earlier posts, checking your e-mail or Twitter or browsing something else, but the ability to be distracted in this manner is simply not a universal experience. And that's all, I guess, I really wanted to say.
***
A related point: in his comment on the original post, Richard questioned whether I was treating technology as if it were neutral. I don't think I made a very good answer to him, or to LML, regarding the very different experience of reading on the Internet as opposed to sitting with a book. While they aren't making the same point, I'm hoping my answer covers both their points somewhat satisfactorily. LML says in a second comment:
Reading online is often a positive experience. It's sometimes better than the old model. But often, good or bad, it's dizzying--not only is there no "bildung-formation," there's no evident connection to the motives that I sat down in front of my computer to satisfy. This dizzying effect is very powerful, much more powerful than the distractions (flies buzzing, lights flickering, eyelids lowering) that assail me with book in hand, and the effect is so powerfully built into the system that even those of us with the concentration to read Proust find ourselves unequal to the task of using the web in a disciplined manner. I don't find Kunkel's attempt to sort this out laughable--neither do I think it's the final word on the subject--and I think you're overselling the extremity of his point of view. His essay appeared in a web-only book review, no?
I don't find Kunkel's attempt laughable, just fairly class-blind. But what I meant to do (and I'm hoping this speaks to the question of whether technology is neutral or not) is to shift the focus to what this particular form of distraction actually is. While I do think that the experience of being distracted by technology is not class- or education-dependent, I think that this bourgeois/Bildung formation of distraction quite obviously is. The technology which seems to create it is simply integrating (extremely well, I should add) with older patterns of distinction and class/education-enabled consumption. Basically, I'm saying that YouTube isn't rotting your brain by keeping you from reading Proust; it's the privileged position you occupy that allows you to conceive of your experience with culture as existing along this kind of distinction.