Yesterday I rode my bicycle past a BP gas station. Almost every pump station was occupied by a hulking SUV, an enormous pickup truck, or a minivan. Weirdly, and fittingly, all of the vehicles were painted oil spill black. My first thought was outrage: How can these people buy gasoline at a BP station? Have they no shame? Was that Joe Barton behind the wheel of a Dodge Ram 2500?
My anger was misplaced, I knew. BP doesn't own the gas station in my home town, or any gas station anywhere, having abandoned the retail gas-selling business a couple years ago because it wasn't profitable enough. Furthermore, because oil is a globally traded commodity, there is no easy way to confirm that the gasoline sold at a BP station was extracted or refined by British Petroleum.
Short of going down to the Gulf Coast and scrubbing oil off a pelican, what can an ordinary person do? Trading in a Toyota Sequoia on a Prius is a start, I guess, but how many people will feel moved by the plight of Gulf Coast birds and fishermen enough to do so? Can a global corporation really care about small people? Is there such a thing as a shared ethical dimension based on empathy? Matthew Taylor, the Chief Executive of the RSA and formerly a political adviser to Tony Blair, thinks so. In fact, he argues that a "reassertion of the fundamentally ethical dimension of humanism" is a cornerstone of what he calls a 21st-century Enlightenment. Following Jeremy Rifkin's The Empathic Civilization: The Race to Global Consciousness in a World in Crisis, Taylor argues humans can empathize not only with other people, but the biosphere as well. Empathy is hard-wired into our brains, neuroscience and evolutionary psychology tell us, although scientists have not yet verified that empathy exists in Tony Hayward's brain. Taylor believes empathy can lead to a public discussion about how we share out natural resources. Which, of course, is a better solution than our current process, which limits the discussions to those people who are sleeping, literally and figuratively, with the US Minerals Management Service.
So far, the 21st-century Enlightenment has gotten off to a slow start: just a pamphlet and last night's lecture by Matthew Taylor. There are problems with Taylor's idea of a new Enlightenment, as he would be the first to admit. The path from empathy with a teary fisherman to support for a $1 a gallon gasoline tax is fraught with difficulty, to say the least. The first Enlightenment was about transcending the body and particular circumstance, while the updated version embraces the body and particular circumstance as the ground of ethics. But if there's a way out from under the grip of implacable fanaticism and global hyper-capitalism, then we should all pay attention.
It's now been scientifically proven that the Internet totally destroys your attention span, so I will keep this Fun Friday post as brief as possible because I may have already lost your attention.
You see, Web surfing actually rewires your brain so that you can't help but focus on crap instead of, say, an entire Dickens novel or any of the biographies of his overstuffed, antic, distracted life. Some people say having the attention span of a firefly allows us to use up all the time people used to devote to watching television, which, as we all know, broadcasts nothing but crap. Even when they show something worthwhile, television networks manage to reduce it to crap, such as when Fox News edited out the applause during Obama's speech at West Point. Our attention spans will be further reduced once we all start wearing computer screens sewn into our clothes in a few years. Then we can use the time we used to spend watching Lost to Google ourselves obsessively. Freed from the constraints imposed by desktops, laptops, even iPads, you can roam around like a flâneur as your location is geo-tagged and mapped so that the surface chaos of the wireless crowds reveals deeper structures.
As advanced reproductive technologies become more pervasive, inducing new levels of distraction, buildings will accommodate our eccentric gazes. We may overlook little gems of world literature and lose interest in questions of how we see other cultures through the cinema. We won't notice that the end of the world as we know it may already be here.
The latest issue of MAS Context is called ENERGY. The theme is playfully broad, covering everything from nuclear power plants to food to water dripping from a block of ice. ENERGY looks at how we generate and consume energy. The issue also demonstrates how our thinking about energy has changed over time.
To my mind the key text in the issue is "A Is for Atom," a short film produced by the General Electric Company in 1953. The film opens with a mushroom cloud. "The atom age was born!" the announcer declares, no doubt scaring the wits out of a generation of American school children. He frankly acknowledges the dangers associated with atomic energy. "There's no denying that since that moment the shadow of the atomic bomb has been a shadow over all our lives." After its terrifying opening, the film hastens to introduce its main character, a cartoon scientist with an atom for a head. Sure, atomic power could snuff out all life on the planet, but aren't atoms fascinating? Scientists couldn't resist tinkering with them. Potential Armageddon was simply an unfortunate side effect of an ingenious quest to modify the building blocks of matter.
Scientists in these period films from the Prelinger Archive work to unlock the mysteries of nature with the focus and intensity of border collies. Before the atomic age, science produced "enduring miracles" that promised to "give us all what we need and want." Splitting the atom, however, produced the first terminal miracle. This loss of innocence pervades all of the articles in ENERGY.
Recently a photographer named Mitch Epstein tried to snap a picture of a power plant in West Virginia. The FBI materialized and dragged him away for questioning. In his conversation with Iker Gil about American Power, his series of photographs of power plants across the US, Epstein reports that he was detained several times by law enforcement and utility officials for photographing power plants. He tells Gil, "I carried with me this constant underlying anxiety, 'Am I going to be questioned? What will be the consequences of that questioning?'" There are clear hints of a dark conspiracy at work, although the harassment may also have had something to do with security concerns about a stranger poking around the American power grid.
One constant in our discourse about energy is the delight and wonder at the ingenuity of people working to solve our energy problems. Elizabeth Redmond invented a piezoelectric device while in design school at the University of Michigan. Her device is a sort of floor mat that generates electricity when people walk across it. Redmond claims her POWERleap (pictured above) can generate 10 Watts per square meter per hour. It's a fun solution with all kinds of applications: sidewalks, airport terminals, my children's bedrooms. Too bad no one walks anymore.
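The "per hour" in that claim muddles the units (a watt is already a rate, so "10 Watts per square meter" would stand on its own), which makes it worth doing the arithmetic ourselves. Here's a back-of-envelope sketch in Python, using my own made-up figures for energy harvested per footstep and for foot traffic, not anything Redmond has published:

```python
# Back-of-envelope estimate: daily energy from one piezoelectric floor tile.
# All three inputs below are hypothetical illustrative values.

JOULES_PER_STEP = 2.0    # assumed energy harvested per footstep
STEPS_PER_MINUTE = 60    # assumed foot traffic over a one-square-meter tile
HOURS_ACTIVE = 12        # busy hours per day

steps_per_day = STEPS_PER_MINUTE * 60 * HOURS_ACTIVE
joules_per_day = steps_per_day * JOULES_PER_STEP
watt_hours_per_day = joules_per_day / 3600  # 1 watt-hour = 3600 joules

print(f"{steps_per_day} steps -> {watt_hours_per_day:.1f} Wh per day")
```

Even at those generous numbers, a busy square meter of sidewalk nets about 24 watt-hours a day, roughly enough to run one compact fluorescent bulb for a couple of hours. Fun, yes; a power plant, no.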
That Redmond is a designer rather than a scientist or engineer illustrates how our approach to solving problems has changed since the dawn of the atomic age. In "Oil for Aladdin's Lamp," a Shell Oil production from 1949, scientists were guys who fiddled with beakers and scribbled furiously on notepads in order to make better toothpaste. Now we have designers clutching computer mice, teasing new forms out of industrial products. Scientists produced miracles; designers produce beauty.
Sean Lally, founder of WEATHERS, an environmental design office, creates crystalline blob forms that can be used as stools or temporary greenhouses. They are intended to be used within nature, rather than as a tool to master it. Ecosistema Urbano's Ecoboulevard was built in Madrid in 2007. I'm not entirely clear what it is or what benefits it offers, but I wish one would be built in Chicago. José Antonio Martínez Lapeña and Elías Torres designed a huge photovoltaic canopy for Barcelona. It's a bold reinterpretation of alternative energy's ugly duckling. While the Barcelona solar panels are entirely practical, realities:united's Powerplants preen rather uselessly over the streets of Pasadena, California. The project is on hold, not surprisingly.
The importance of beauty in energy solutions is illustrated, in a backhanded way, by Marcel Wilson's account of the transformation of the Hawaiian island of Lanai from a pineapple plantation to the power plant for the entire state. Once owned by the Dole Corporation, the island is now owned and operated by Castle & Cooke, a real estate development company. (Lanai must mean "monopoly" in Hawaiian.) Castle & Cooke wants to erect 200 windmills on the island's northern shore. Resistance to the idea comes from both the island and beyond, not because windmills are more unsightly than pineapple plants, but because no one fully trusts a large American corporation. There's no innocence left in the topic of energy anymore. There's only beauty, and not enough of that to go around.
Writing in his excellent Sentences blog, Wyatt Mason reminds us that intellectual labor has always been hard work, even in the days before Google and Twitter. Digging into the Harper's Magazine archives, he finds an 1882 article on Emerson's seminal essay "Intellect," in which Emerson describes the arduous process of articulating a thought. He says, "you must labor with your brains, and now you must forbear your activity, and see what the great Soul showeth."
Mason comments,
Emerson did not have email, could not tweet, did not date online, could not stream video from his favorite strippercam. Emerson was not distracted, therefore, in the modern way. Whatever did, in his era, stand between him and his setting his mind to that hardest task, the problem of focus is surely nothing new, despite the novel methods we’ve lately heard about its treatment.
The novel methods Mason refers to are the neuroenhancers popular among college students. Just as steroids were (and probably still are) most popular with marginal baseball players trying to hold on to a major league roster spot, neuroenhancers are generally used by middling students trying to balance academic performance and intense socializing.
Neuroenhancers are useful for cranking out a report at 2 AM, but no one claims they can sustain the kind of extended intellectual labor Emerson spoke about. They're an imperfect response to a problem everyone, myself included, encounters more and more: difficulty concentrating on a single task. But as Mason suggests, the problem isn't really a new one; it's simply that the distractions have become more attractive. Salon's Laura Miller points out that we're hardwired for distraction. The early humans who learned to constantly scan their environments for dangers were the ones who survived because they saw the poisonous snake lurking under the leaves.
But there is another way to think about distractions.
Jacques Derrida was once asked on French television to comment on Seinfeld. Derrida shifted in his seat; clearly he'd never seen it. He responded by snapping at the camera, "you should be doing your homework and not watching television!" He may have regretted this answer. It strikes me as inconsistent with Deconstruction. The problem with the "Google is making us stupid" argument is that there's no center to our mental life. Ask any graduate student in English studying for a comprehensive exam what it's like to focus exclusively on reading. He or she will tell you the distractions are within the field of intellectual labor, not outside of it. No matter what book one is reading, there's always another book that's more essential, closer to the center of the issue, than the book one is reading now. You can be a third of the way through Dombey and Son before a crisis emerges: really, Vanity Fair is a better representative of mid-century British fiction. No, that's too central, too canonical: I should be reading The Tenant of Wildfell Hall! And so on.
Hence the beauty of Facebook: we know it's a waste of time, so whenever we leave it we know we're doing something meaningful. Without Facebook or Twitter on the margins of what's important, we can't identify what we should be doing.
Walter Benjamin was interested in what he called the physiognomy of the thing world. He used one of his most distinguished critical facilities, his "microscopic gaze" into the forgotten remains of the past, in order to construct an image of a society suspended in a collective dream. Small things revealed the intersection between technological and social progress on the one hand, and on the other, unresolved collective wishes that have calcified into myth. Most commonly, Benjamin argued, these mythic fantasies revolved around a classless society of endless abundance. The movie Wall-E is a sort of Benjaminian parable about discarded commodities and our barely conscious but shared desire to have our possessions serve us, rather than the other way around.
Particularly fertile ground for Benjamin's physiognomic analysis were toys. During his 1927 visit to Moscow Benjamin bought a number of Russian folk toys, which he regarded as remnants of a disappearing folk culture, products of a cottage industry about to be subsumed under mass industrialization.
I think he would have also been interested in toymaker Thames & Kosmos's Power House: Green Essentials Edition. Kids can construct a model house and furnish it with all the latest green technology appliances, including "a lemon battery," whatever that is. The Power House is a combination of an old chemistry set, a Lost episode, and much else besides. Kids are supposed to experiment with the appliances while they "read the diary entries of a group of young explorers who are learning to live a sustainable existence on an island. To survive, they must implement real-world versions of the projects you are doing in the kit."
The old chemistry sets were about self-improvement; they were supposed to turn you into a scientist. Now kids have to save the whole planet before they can save themselves. Or maybe the message is exactly the opposite: Your suburban neighborhood of neo-Tudor houses can't be redeemed, so it's best to move to an island and start all over again. In any case, the project is straight out of Robinson Crusoe: inside every one of us is a perfectly rational, orderly republic that can be reproduced anywhere, under any circumstances. The Lost narrative is just a more accessible version of the Crusoe story, which is also a rescue story. As perfect as Crusoe's world is, he still needs to escape from the madness of isolation.
The Power House's fusion of science and narrative is its most meaningful element. The combination indicates how far green technologies have moved beyond merely lessening our dependence on foreign energy sources and reducing greenhouse gases, especially now that green technologies are such a prominent part of Obama's stimulus package. Green technologies are moving further into the realm of myth. Specifically, they're no longer just about lowering our electricity bills, but about redeeming us from our collective follies.
NB: The photograph of the Power House was taken by Allison Coffee at PrairieMod, where I first came across the Power House.
"We are annihilating melancholia," Professor Eric G. Wilson warns. Reading his essay, "In Praise of Melancholy," excerpted from his book Against Happiness: In Praise of Melancholy, we discover, to our surprise, this is a bad thing. At first glance, Wilson's book seems like more definitive proof that people will complain about anything, especially if they're given a book contract to do it.
The evidence is chilling: according to a Pew survey, 85% of Americans are happy. Apparently, none of those 85% live in a state that's held a presidential primary so far. The presidential candidates have discovered that each primary state has its unique set of gripes. So who's to blame for rampant contentment across America? Scientists. Wilson traces the conspiracy specifically to happiness studies, which, last time I checked, had found that we feel happiness and sadness in equal measures. In their determination to brighten our moods scientists have also come up with anti-depressants for doctors to foist upon an unsuspecting public. Never mind that anti-depressants can help with the unbearable anxieties our environments provoke. Nietzsche didn't take anti-depressants, so neither should you.
You know the rest of the argument. Drugs are blunting us from feeling depressed, a fundamental, and useful, human emotion. We suffer, although we don't know it because we're on anti-depressants, and our artists become insipid, for without depression we'd no longer have the stereotype of the suffering artist. Wilson says we should be very worried:
I for one am afraid that American culture's overemphasis on happiness at the expense of sadness might be dangerous, a wanton forgetting of an essential part of a full life. I further am concerned that to desire only happiness in a world undoubtedly tragic is to become inauthentic, to settle for unrealistic abstractions that ignore concrete situations. I am finally fearful of our society's efforts to expunge melancholia. Without the agitations of the soul, would all of our magnificently yearning towers topple? Would our heart-torn symphonies cease?
The link between madness and creativity is as old as the Romantic poets. Before that, we should remember, artists were depressed a lot, but they were depressed in a way ordinary non-artists could recognize and experience themselves. In other words, "the agitations of the soul" as the (perhaps sole) source of creativity is an ideology linked to the changing social position of the artist.
Then again, joie de vivre is an ideological belief, too, and undoubtedly a more pernicious one as well. Almost twenty years ago Phillip Lopate published Against Joie de Vivre in which he dismissed the whole self-help bromide to live in the moment as self-defeating narcissism. Jacques Lacan, with more theoretical élan, darkly warned against Anglo-American psychiatry and its "cult of the normal man," in which any anti-social behavior was hunted down and eradicated, by surgery if necessary.
Freud himself said, "the intention that man should be happy is not included in the plan of Creation." The whole post-Freudian European philosophical tradition has taught us to think of happiness as a trivial pursuit for the Oprah generation, a Shangri-La perpetuated by self-help gurus. Greatness of soul has always been linked to a clear-eyed stoicism. Lincoln's law partner, W.H. Herndon, once observed that Lincoln, prone to bouts of depression throughout his life, "crushed the unreal, the inexact, the hollow, and the sham." Lincoln's "fault, if any," Herndon said, "was that he saw things less than they really were." What Herndon is describing here is a kind of depressive realism, in which depression can stem from fundamentally accurate perceptions—a worldview that, in some situations, can be an advantage.
Depression, like wine and butter, is good for you in small doses. Depression can be the first step to changing one's life. Misery is a recognition that not all is right in one's life, and in this sense depression is a part of happiness. I don't think we have a chronic shortage of depression in this country, as Wilson wants us to believe. But cutting ourselves off from a human emotion makes us less human. People have jokingly speculated what Dostoevsky would have been like on Prozac. We should wonder how people on Prozac read Dostoevsky, or how they truly understand the world around them.
Back when I had spare time--i.e., before I had children--I sometimes played SimCity. As the game grew more realistic with each edition, SimCity grew more engrossing and more frustrating. I became exasperated with my lazy Sims' refusal to walk more than a couple of blocks to see a doctor. I proved to be as inept at managing city budgets as I was at managing my own budget. Whenever a coal-burning power plant collapsed in a toxic heap, I was invariably short of funds to build a cleaner one to replace it, forcing me to raise taxes during brownouts. My department heads were supposed to help me avoid this sort of problem, but their contradictory, at times nonsensical advice induced the same sputtering incoherence as Mayor Daley is exhibiting these days. I kept playing, though, spurred on by the game's combination of whimsical humor and authoritarian power. The feeling of omnipotence is a nice complement to the cloistered effect of being on a PC for long periods of time.
Yesterday Electronic Arts launched the latest edition of the SimCity franchise, SimCity Societies. SimCity has always included pollution as a factor in its games, but it's never been as important as placing a health clinic on every corner. The latest version ups the environmental ante, making environmental factors like rising sea levels a much more important element in the game. Because the game was created in eco-friendly San Francisco, you can bet you'll have to disperse windmills throughout your city and make sure your waste water is recycled, or you'll hear plenty of loud complaints from your Sims--and Sims are a cranky lot. Supposedly, in this edition Sims can be induced to walk to work, but in my experience Sims wouldn't leave their houses if you issued them flying carpets.
If you can stand the whining, I would think creating a putrid metropolis would be most people's first impulse. The temptation to hasten global disaster is all part of the pleasures of apocalyptic thinking. The Western tradition is rich with apocalyptic brooding, and that strain of thought is just below the surface of some neo-conservative and Christian fundamentalist rhetoric. For this reason one would think these people would have been among the first to embrace Al Gore's message about global warming. Anyway, as any veteran of SimCity knows, after one has completed the sewer system and built the airport, you start to hope for a tornado or an alien attack to give you an excuse to clean up some of the less successful parts of your city. (Sims never die, but their houses burst into pieces very elegantly.) In short, disaster is all part of the pursuit of social perfection.
It's interesting that the game makers continue to carefully limit themselves to a single city, even when playing with global ecological disaster. A player is polluting one small corner of the earth, or constructing an oasis so rigorously green you can drink out of sidewalk puddles. This seems entirely consistent with our current approach to global warming: the solution rests solely in personal consumer choices; we're not going to force our national politicians to do anything about it, no matter how many hectoring columns Thomas Friedman writes. I once played the SimCity franchise's first foray into eco-politics, SimEarth. The game wasn't successful, in part because of its totalized nature. The Gaia theory upon which the game was based dictated that one oil spill meant your whole planet was doomed, and, since there were no department heads to blame things on, it was your fault alone. It was dispiriting. On the other hand, the rewards for keeping a healthy planet weren't exactly obvious. In my ecologically sound world, antelope munched on grass. Rabbits hopped around. It was boring. I longed to create one fetid corner of my world--one Gary, Indiana in paradise, just to keep things real.
SimCity Societies may end up teaching us the limits of our desire to clean up our world. Speaking for myself, I have a limited capacity for good acts that don't have an immediate payback. I've noticed, for instance, that the more I recycle the less I floss. If I buy the game and use up my thirty minutes of free time a week while both my children nap, I know I'll probably create a low-tax sinkhole and try to distract my Sims with stadiums and plenty of health care clinics. And this time I'll ignore my department heads.
Cult films are a distinctive feature of post-modern cinema. In an earlier post I argued that Napoleon Dynamite is a recent example of the phenomenon. One could list others, of course, but one in particular stands out: the Wachowski Brothers' The Matrix. The cult film phenomenon involves, among other things, appropriating a public text for private, yet still shared, means, sometimes far beyond what the original filmmakers may have envisioned. The Matrix has inspired all kinds of speculation on the nature of reality--some of it interesting, some of it silly. The filmmakers themselves supposedly based the series on a misreading of a philosopher with a cult following, Michel Foucault.
Now the Matrix-as-metaphysics idea has come full circle. John Tierney reports in today's New York Times on an Oxford philosophy professor named Nick Bostrom who argues there's a good chance that we may be living in a computer simulation. Tierney explains,
This simulation would be similar to the one in “The Matrix,” in which most humans don’t realize that their lives and their world are just illusions created in their brains while their bodies are suspended in vats of liquid. But in Dr. Bostrom’s notion of reality, you wouldn’t even have a body made of flesh. Your brain would exist only as a network of computer circuits.
Bostrom's theory is unprovable, but Tierney goes so far as to claim he has a "gut feeling" there's a better than 20% chance that our world is just a computer simulation. I guess it takes a more sensitive gut to detect this possibility than the one I have, because my gut has no inklings about being trapped in a computer simulation.
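For all its sci-fi flavor, Bostrom's argument is ultimately bookkeeping about observers: if even a small fraction of civilizations survive to run ancestor-simulations, simulated minds quickly outnumber "real" ones. Here is a minimal sketch of that arithmetic in Python, with illustrative numbers of my own choosing, not Bostrom's:

```python
def simulated_fraction(frac_posthuman: float, sims_per_civilization: float) -> float:
    """Fraction of all observers who are simulated, assuming each real
    civilization contributes one population of observers, and that a
    frac_posthuman share of civilizations each run sims_per_civilization
    ancestor-simulations with populations of the same size."""
    simulated = frac_posthuman * sims_per_civilization
    return simulated / (simulated + 1.0)

# Even long odds of simulators swamp the single real population:
print(simulated_fraction(0.01, 1000))  # 10 simulated populations per real one -> ~0.91
print(simulated_fraction(0.0, 1000))   # nobody simulates, nobody is simulated -> 0.0
```

Note how little the conclusion owes to anyone's gut: once the two input assumptions are granted, the fraction follows mechanically, which is exactly what makes the argument so slippery to argue with.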
It's probably just a coincidence that Tierney raises the virtual world question during a financial crisis in which vast sums of money were made based on the fiction that the housing market would expand well past Americans' means to pay for housing. Still, if one wants to conduct a thought experiment about the nature of reality, then this is the direction I'd head toward.
Although Foucault is well known for his musings on the constructed nature of reality, Jean Baudrillard is our most systematic theorist of simulated worlds. His most famous concept is the simulacrum, i.e., the endless repetition of copies with no originals. Contemporary culture, according to Baudrillard, consists of the free exchange of signs without any referents. In earlier stages of Western culture the place of the referent was occupied by nature--raw materials and direct industrial production (e.g., turning raw rubber into tires), as well as artisan and craft modes of production. Now cultural products refer to nothing more than the circulation of commodities in late capitalism.
The recent housing boom saw a new phenomenon: flipping a house. At one time a private home was a middle class person's last tie to a specific territory, a small, but very specific slice of nature. During the housing boom the home became just another commodity to be bought and sold on a global scale. The "California Dream" is now the LA housing market, where one's mortgage starts off as a signed contract but quickly ends up as a chit in some vast investment portfolio in New York, Paris, Frankfurt, or Tokyo. The tangible reality of the home, which Bachelard tells us houses our daydreams, is like the bodies suspended in liquid in The Matrix: just a husk, its intrinsic value determined in some obscure and complex marketplace few people truly understand. Life in today's real estate market is a gut-wrenching experience that may very well account for the intuitive sense that somebody out there is controlling our lives, and doesn't really care what happens so long as the lindens pile up.
Die-hard Monty Python fans will remember the spoof of the Icelandic sagas in which the name of each character is accompanied by a long list of his forebears: "Ethelridge, son of Barfleby, son of Clem the Meek, son of Clem the Destroyer," and so on until the dawn of creation itself. The joke is that the list is so long the hero (Michael Palin, if I remember correctly) falls asleep. Once the list is over, the hero snaps awake, hops on his horse and goes off to kill somebody. The Monty Python skit is a joke, of course, but it conveys the gist of pre-modern identity. The epic hero was all his forefathers rolled into one, and his actions were always sanctioned, even the bloody slaughters, because of the continuities of kinship and the unities of the social order. You don't see Beowulf (son of Ecgþeow, grandson of Hreðel, king of the Geats) fretting about killing Grendel, Grendel's mother, and a dragon, an endangered species. When Beowulf dies, no one suggests he may have had unresolved issues with his mother, causing him to be aggressive toward women. The very suggestion is absurd: you would have to psychoanalyze all his forebears, even his entire culture.
Juxtapose this with a patient lying on a psychiatrist's couch recounting some humiliation in high school and trying to figure out how that incident may account for his current timidity before his boss. We tell these kinds of stories all the time, and not just to psychiatrists. We tell them because we have to, because we're not sure who we are and, more crucially, how we got to be whoever it is we are. We also tell life stories because we'd rather be somebody else. American psychology has just discovered how stories integrate our stable sense of self with our everyday lives. Yesterday the New York Times reported on this hot news.
Every American may be working on a screenplay, but we are also continually updating a treatment of our own life — and the way in which we visualize each scene not only shapes how we think about ourselves, but how we behave, new studies find. By better understanding how life stories are built, this work suggests, people may be able to alter their own narrative, in small ways and perhaps large ones.
"When we first started studying life stories, people thought it was just idle curiosity — stories, isn't that cool?" said Dan P. McAdams, a professor of psychology at Northwestern and author of the 2006 book, "The Redemptive Self." "Well, we find that these narratives guide behavior in every moment, and frame not only how we see the past but how we see ourselves in the future."
Well, better late than never. For a century literary studies, philosophy, even some brands of Continental psychology have understood that these acts of narrative self-fashioning have been going on since Shakespeare's time. There's even a technology that has been developed since the 16th century to allow us to see ourselves as actors in a drama that is at once coherently plotted and open-ended. That technology is the novel.
In The Theory of the Novel (1920) Georg Lukács notes that Don Quixote--generally considered the first true novel--appeared exactly when "the Christian God began to forsake the world; when man became lonely and could find meaning and substance only in his own soul, whose home was nowhere; when the world, released from its paradoxical anchorage in the beyond that is truly present, was abandoned to its immanent meaninglessness."
In contrast with the epic hero who always feels perfectly at home wherever he is, the novelistic hero always feels a gap between his inner and outer selves, between what he thinks and how the world behaves. In a world of uncertainties and partial truths, the novel offers complete stories at the end of which everything (usually) makes sense. But unlike the heroes of the ancient epics, the hero or heroine of a novel has to learn what's possible in the real world and what's not. He or she must then reconcile themselves to those reduced possibilities. At the same time, the crucial innovation of the novel is that this compromise is freely chosen and serves as an act of self-definition. Think of Twain's Huckleberry Finn, in which a boy's dreams of adventure are transformed into a moral education, which in turn sets him on the path of a mature and autonomous adulthood. At the beginning of the story, he is a crude being trying to defend his own life. By the end, he is capable of making free choices--and the "right" choices, as defined by his particular place and time. By the end of his narrative, he is a recognizable person. He's a subject in the modern world. He may, however, have to undergo some therapy to get over whacking his father in the head with a shovel.