How I Spent My Summer Vacation (which lasted two years, apparently)

American Popular Culture in the Era of Terror

Hello! It’s been a very long time since my last post, which was October 2013. That’s at least 14 years in blog years. While I’m not coming back to regular blogging–not yet, anyway–if there’s anyone out there who remembers me, I wanted to share some great news.

I recently published a book, American Popular Culture in the Era of Terror: Falling Skies, Dark Knights Rising, and Collapsing Cultures (Praeger, 2015). It’s in some ways a vast elaboration of some of the topics and cultural criticism that I spent two years exploring on this very blog. Although, and I can’t emphasize this enough, it took far longer than an hour to write.

Here’s the description:

Bringing together the most popular genres of the 21st century, this book argues that Americans have entered a new era of narrative dominated by the fear—and wish fulfillment—of the breakdown of authority and terror itself.

Bringing together disparate and popular genres of the 21st century, American Popular Culture in the Era of Terror: Falling Skies, Dark Knights Rising, and Collapsing Cultures argues that popular culture has been preoccupied by fantasies and narratives dominated by the anxiety—and, strangely, the wish fulfillment—that comes from the breakdowns of morality, family, law and order, and storytelling itself. From aging superheroes to young adult dystopias, heroic killers to lustrous vampires, the figures of our fiction, film, and television again and again reveal and revel in the imagery of terror. Kavadlo’s single-author, thesis-driven book makes the case that many of the novels and films about September 11, 2001, have been about much more than terrorism alone, while popular stories that may not seem related to September 11 are deeply connected to it.

The book examines New York novels written in response to September 11 along with the anti-heroes of television and the resurgence of zombies and vampires in film and fiction to draw a correlation between Kavadlo’s “Era of Terror” and the events of September 11, 2001. Geared toward college students, graduate students, and academics interested in popular culture, the book connects multiple topics to appeal to a wide audience.

Features

  • Provides an interesting new framework in which to examine popular culture
  • Examines films, television shows, and primary texts such as novels for evidence of cultural anxiety and a preoccupation with terror
  • Offers insightful and original interpretations of primary texts
  • Suggests possible conclusions about cultural anxiety regarding breakdowns of tradition and authority

You can read more about it at Praeger’s website, or you can go to good ole Amazon.

As it turns out, I miss writing the blog. And I have an idea for the next book, and some of those ideas should work well as the kind of short explorations that blogs are known for, with the plan to revise and expand in book form later. Here’s hoping–for me, anyway, and maybe for you?–that I’ll be able to get that project underway and that, a few years down the road, it will lead to another book.

Sorry about the long internet silence, sorry in advance for some more silence to come, and here’s hoping that 2016 is a big year for American Popular Culture in the Era of Terror as well as the beginning of the next project.

Cheers!



This: The Popular Culture Studies Journal!

PCSJ-Cover-791x1024

OK, Hourman needs to go on hiatus for a little while.

But in the meantime, I’d like to share something that I–in my secret identity as Jesse Kavadlo–wrote that took significantly longer than an hour.

In the brand-new, just launched Popular Culture Studies Journal, I have an essay titled “9/11 Did Not Take Place: Apocalypse and Amnesia in Film and The Road.”  I’m very happy with it, and the other articles in the journal are excellent and more accessible than the average academic journal.

So happy reading, and when Hourman returns, probably in about two months, look for an interesting new direction.  I’m thrilled to have broken 600 followers and 65,000 views, which I do not take lightly.  The readership and response to the blog have far exceeded my expectations, and I’m looking forward to getting back to it soon.  Thank you.


The Ethicist Who Wears the Black Hat


There is no way that I won’t read a book by Chuck Klosterman.  Still, that sentence’s double negative reveals my ambivalence.  In some ways, CK and I are doppelgängers: we’re the same age; we moved cross-country in our adult lives (me: born and raised in Brooklyn, then four years in Minnesota and now St. Louis; Klosterman: born and raised in North Dakota, now living in Brooklyn); we grew up on and still defend heavy metal when other aspects of our lives would seem to suggest—even demand—more highbrow predilections (such as the use of the phrase “highbrow predilections”).   Certainly this blog is indebted to Klosterman’s groundwork as that rare writer who is a popular culture specialist while also being firmly a part of popular culture itself.  Yes, he sells way more books than I do, but I is a English professor.

Yet Klosterman’s writing is also sometimes exasperating, including his current gig as the New York Times Ethicist and his new book, I Wear the Black Hat.  And they are exasperating for opposite reasons.  In his Ethicist column, Klosterman prevaricates and dithers for most of the response before finally settling on an ethical verdict—one that often seems shortsighted at best and just wrong at worst.  Klosterman’s cultural analyses, on the other hand, are consistently overconfident and make sweeping generalizations—Klosterman would have written this paragraph’s topic sentence this way: “What is so weird is that they are always exasperating for exactly opposite reasons.”  Although often, he also has a good point.  In their approaches, tone, worldview, and conclusions, the Ethicist and the author of I Wear the Black Hat seem to be two completely different personae of Chuck Klosterman.

Or, better yet: two different Chuck Kloster-men.


Let’s look at Kloster-man A, the Ethicist.  Sometimes, he is just wrong, such as his response to a former college student who writes that he would “sometimes write a single paper that would satisfy assignments in more than one course. For instance, I once wrote a paper on how ‘The Love Song of J. Alfred Prufrock’ expressed satire; I submitted it for assignments in both my poetry course as well as my completely separate satire course. I did not disclose this to either professor.”

As usual, the Klostethicist dithers for a surprisingly long time:

As I read and reread this question, I find myself fixated on the idea that this must be unethical, somehow. I suppose my knee-jerk reaction could be described like this: Every professor is operating from the position that any assignment she makes is exclusive to that particular class, even if she doesn’t expressly say so at the onset (in other words, it’s simply assumed that work done for a specific class will be used only for that specific class). It’s as if you were breaking a rule that was so over-the-top obvious it may not have been overtly outlined. But you know what? The more I think this over, the more I find myself agreeing with your position. I don’t think this is cheating. I wouldn’t say it qualifies as “genius,” and it might get you expelled from some universities. Yet I can’t isolate anything about this practice that harms other people, provides you with an unfair advantage or engenders an unjustified reward.

I look at it like this: You were essentially asked two questions that shared a common answer. The fact that you could see commonalities between unrelated intellectual disciplines is a point in your favor. Some might call your actions self-plagiarism, but the very premise of stealing your own creative property is absurd. You’re not betraying the public’s trust. It seems strange only because the assignments involve a degree of creativity. If this had been a multiple-choice physics test you failed to study for — yet were still able to pass, based on knowledge you acquired from an applied-math class taken the previous semester — no one would question your veracity.

It’s possible to argue that you were “cheating yourself” and wasting your own academic experience — but that’s not an ethical crossroads. That’s more of an existential dilemma over the purpose of a college education that (in all probability) you paid for. In the abstract, the notion of using the same paper twice feels wrong — and if you contacted your old school and told them this anecdote, it would most likely cite some rule of conduct you unknowingly broke. But fuzzy personal feelings and institutional rules do not dictate ethics. You fulfilled both assignments with your own work. You’re a clever, lazy person.

In other words, Verdict: ethical. Or not unethical. What Klosterman does not acknowledge, however, is that beyond “cheating himself” (which is apparently not unethical), this person’s self-plagiarism dupes instructors—the former student did not ask permission, knowing that he was breaking the rules.  But OK, why is it a rule? Because it cheats the instructor, who wants original work, and more importantly, it cheats his classmates, who, through no fault of their own, did not have the luck to land two assignments similar enough for the same paper.  The students in those other classes did all of their work, essentially twice as much as the letter writer did.  This person did half of it, for the same credit, at the expense of his teachers and peers.  Despite the dissembling, the answer was still wrong.

Here,  a person writes that he or she volunteers “for a program that serves homeless and at-risk American Indian people.”

He or she continues:

I sometimes sort and distribute their mail. In a separate community role, I advocate for infant and maternal health, because infant mortality rates in the Native community are three times higher than average. While distributing mail, I found an “introductory” infant-formula package for a Native mom. My first instinct, knowing the proven health advantages of breast-feeding, was to toss the package into the garbage, which seemed unethical. But it seems more unethical, given the higher infant mortality rates, to give her formula marketing materials without providing her the information that breast-feeding is better for her baby.

The Ethicist’s response–more evasion:

While the solution to this particular dilemma is straightforward, the broader question it raises is not. You have two unrelated jobs — mail delivery and advocating for infant health. So what do you do if the requirements of one contradict the responsibilities of the other? My advice would be to consider the worst case within each ethical framework and ignore whichever system has the least damaging real-world potential. Throwing away someone else’s mail is absolutely unlawful. (In this case, it’s defined as obstruction of mail and would be treated as a misdemeanor.) On the other hand, there’s obviously nothing illegal about failing to tell someone that formula is less healthful than breast milk. But can anyone objectively argue that the upside of upholding a man-made law regarding the improper disposal of unsolicited mail is greater than the downside of placing an already at-risk child in a potentially amplified position of peril? It’s not as if you’re making this judgment arbitrarily; as someone holding both jobs (and presumably trained to do so), you are in a valid position to decide which edict matters more.

So eventually, Klosterman decides that reading a stranger’s mail is OK, but only if you’re going to hector her about her personal life decisions, even if you don’t know anything about what might be going on in that person’s life, or what the person has even decided to do, if anything, with the formula:

In the specific scenario you cite, however, your two volunteer jobs are not really at odds. Give this woman the formula that was mailed to her, but not before urging her to consider the value of breast-feeding. Use the opportunity to educate her about how these nutritional methods are different, and let her decide what is best for her and her baby. In this way, you’d be performing both of your duties simultaneously.

This seems to me a clear case of Don’t Interfere with Other People’s Mail–or Personal Life Decisions. Ten unambiguous words.

In another ethical quandary regarding another’s mail—in this case, email—Klosterman again equivocates. Here’s the letter:

I sent my wife an angry e-mail. An hour or two after sending it, I was working at our shared computer and saw my e-mail, unread, in her in-box. Feeling regretful, I deleted it. Was this unethical?

And here’s the evasion (God, I’m running out of synonyms), before finally suggesting that the husband cannot ethically delete the email he sent to his wife, followed by a completely hypothetical caveat:

This is a situation in which our current relationship with a specific technology obfuscates the essence of the problem: who owns information, and when does that ownership start?

Let’s say you dropped a physical letter in the mailbox, walked up the block and suddenly regretted the decision to send whatever was in the envelope. Reaching into a public mailbox to retrieve that letter is unlawful (and complicated). But if the only letter you want to grab is the one you deposited, would the impulse be immoral? What if you regretted the decision not because of what the letter contained but because you realized it was incorrectly addressed? And what if the mailbox wasn’t public? What if it was the private mailbox in front of your suburban home (but you’ve already raised the box’s flag, signaling to the postal employee that the letter is now available to be delivered)?

It’s difficult to definitively declare when a physical letter no longer belongs to the person who wrote it. It could be argued that the moment a letter is placed inside an envelope and the recipient’s name is scrawled on the outside, the contents become the recipient’s property. But this, somehow, feels incomplete; you could hold onto an addressed, stamped envelope for years, and no one could stop you. What makes e-mail different is that this philosophical haze is technologically eliminated by the lack of a middleman: the moment a user hits the “send” button, the question of ownership is moot. But that shouldn’t dictate the ethics.

The reason I would classify what you did as unethical is that you shouldn’t be directly accessing your wife’s e-mail account. The fact that you saw this unread e-mail was possibly unavoidable, as that’s always a risk with a shared computer. But you should not manipulate or examine the contents of her in-box, regardless of where those contents are from (there are theoretical exceptions to this rule, but they’re so rare that they can almost be disregarded from the discussion — if your wife was missing, for example).

I will, however, say this: had you remotely deleted your own unread e-mail after it was sent, I would not classify the act as unethical. If someone wrote an ill-advised e-mail in haste (or inadvertently sends a message to 100 co-workers instead of one) and used an “unsend” feature to destroy it before it could be opened, I would support the act (although it should be noted that the current technology for doing so isn’t very practical — not everyone has it, and my e-mail system only allows for a 30-second annihilation window).

Now, I realize this presents a logical contradiction. As the writer of the e-mail to your wife, you could claim you’re being reprimanded for manually doing something that would somehow be acceptable if it were done remotely, even though the outcome is identical. The difference, however, is this: the first situation involves rooting through someone’s nonphysical mailbox, which we’ve collectively agreed is off limits. The second situation involves pre-emptively extracting something that — in my view — is still partly your property. That distinction is minuscule and certainly debatable. But that’s what makes this a good question.

So…  it’s ethical to read a stranger’s mail if it leads to meddling in her personal life, in which you have nothing at stake; but it is not OK to delete an email you sent to your wife, even though married couples often share computers, often leave their computers open, and seldom sign out of accounts, making such a deletion less of an intrusion than reading the stranger’s mail; and it’s wrong even though you yourself wrote the offending email—and deleting it would preserve marital harmony.  In Klostermanian ethics, deleting an email that you yourself sent from your spouse’s account is a worse violation than egregiously hurting her feelings.  Perhaps the email had been lecturing her about breastfeeding, so it’s OK.  Of course, a husband who is sending his wife nasty emails probably has bigger issues in the marriage, which is apparently a less important point to raise than a “distinction [that] is minuscule and certainly debatable.”

Yet for all the, um… synonym… fudging in those answers, the other Klosterman, in Black Hat, is sure as shit and right as rain: The Eagles “are the most unpopular super-popular entity ever created by California…  I know this because everybody knows this….”  Beginning a book about villains with the Eagles is counterintuitive, but it helps to reveal an interesting idea—that people are capable of vilifying even the blandest, most innocuous stuff—that then becomes smothered by the high-stakes hyperbole.

And: “There is no greater conundrum for the sports-obsessed historian than the relationship between Muhammad Ali and Joe Frazier.” Look, I’m not a sports guy at all, but I find it hard to believe that sports-obsessed historians agree on anything.

When Klosterman gets to comparing real-life subway shooter and vigilante Bernhard Goetz with Batman, then I’m interested.  But again, Klosterman’s strange need for absolutism and Manichaeism leads to pronouncements like this: “But oddly—or maybe predictably—most of these comparisons [between Goetz and Batman] are primarily occupied with why everyone still loves Batman (as opposed to why everyone stopped loving Goetz). They start from the premise that vigilantism is indisputably wrong. The core question is always some version of ‘Why are actions unacceptable in life somehow acceptable in fiction?’ But this seems like the wrong thing to worry about. That answer seems self-evident.  I more often wonder about the reverse: Why are the qualities we value in the unreal somehow verboten in reality?”  He goes on to suggest that “Batman is a beloved fictional figure who would not be beloved in a nonfictional world… He would be seen as a brutal freak, scarier to the public than the criminals he captured.”

It could be such an interesting comparison. But the insistence that the first question is self-evident and that the second question is somehow better and opposed to the first seems wrong-headed.  Klosterman’s second question in fact seems far more self-evident: because real people get hurt in real life.  And the follow-up ignores that the recent Batman relaunch—the one that grossed a hundred gazillion dollars—is in fact primarily concerned with the very question of what Batman would be like in a less cartoonish, less fictionalized fictional world.  What began as a good set of questions seems undermined by smug certainty and cherry-picked examples.

Look. I liked the book. I like the topic. I like that Klosterman actually talks about Mr. Bungle (whom I love, unlike CK).  But I can’t get past the fact that I Wear the Black Hat is written in the same overheated rhetoric as the quotations above.  Here’s the repetition breakdown (with a quick script for reproducing this kind of tally after the list):

Always: 68 times

Never: 78 times

Inevitable/inevitably: 22 times

Everyone: 38 times

No one: 32 times

Certainly: 23 times

True: 36 times

All: a whopping 92 times

The book is only 199 pages.
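
For the curious, a tally like this is easy to script.  Here’s a minimal sketch in Python, assuming a plain-text copy of the book saved as black_hat.txt (a hypothetical filename); it counts whole-word matches only, so “all” doesn’t also count “ball” or “allow”:

```python
import re

# Hypothetical path to a plain-text copy of the book.
TEXT_PATH = "black_hat.txt"

# The absolutist words tallied above.
WORDS = ["always", "never", "inevitable", "inevitably",
         "everyone", "no one", "certainly", "true", "all"]

with open(TEXT_PATH, encoding="utf-8") as f:
    text = f.read().lower()

for word in WORDS:
    # \b marks a word boundary, so only whole-word matches are counted.
    count = len(re.findall(r"\b" + re.escape(word) + r"\b", text))
    print(f"{word}: {count}")
```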

Perhaps, in the end, the two Klostermen can come together.  Perhaps the Ethicist can achieve some of Black Hat Klosterman’s insight and moral clarity—less wishy-washy, more decisive. And BHK can approach the world in a way that’s more relativist (in a good way), examining his subjects in a way that acknowledges that not all things, everyone, or no one certainly always believes or behaves in the ways he prescribes.

We would have a more ethical Ethicist, and more readable cultural criticism that acknowledges the ambiguity of his subject matter.  After all, metaphorically speaking, most characters and people wear gray hats.

Time: Over! 90 minutes


Fall: Verb, Noun, Season, Metaphor


Although I’m facing a late summer heat wave, and fall is still about three weeks away, the beginning of school makes me think it’s here.  It’s a strange word, “fall”: really a verb—action word!—technically also a noun.  Kids can recite “person, place, or thing” in a heartbeat, but fall is not any of these, not even exactly a thing.  Ideas are also nouns, but fall is not quite an idea.  Yes, in most parts of the world the temperature and weather literally change.  But seasons are also metaphors, and the idea of fall is the most powerful one.

Many people say they love spring.  But spring is a cliché.  Even the name “spring” sounds too eager to please, too self-helpy, archaic slang that should have gone the way of “keen” or “corking” or “moxie.”  Warmer weather, longer days, shorter clothes, life in bloom, fertility symbols like bunnies and eggs [1], school almost out, and, if you’re into that sort of thing, resurrections.   What’s not to like?  Spring ahead, fall behind.

It takes a special person to love fall.  Trees sense the cold and pull back into themselves, sacrificing their own expendable body parts for the upcoming months of darkness to save the whole, like trapped animals gnawing off their legs.  The leaves sacrifice themselves for the greater good, tiny reverse lifeboats abandoning ship, each a desiccated little martyr and hero.

We imagine that it’s the leaves that do the falling.  But people retreat in winter as well: into more interesting clothes, and into the interiors of home and self, all the more comforting knowing that it’s getting cold and dark outside.  And some of us like the feeling of falling.

Our language reflects fall’s pleasant equivocality.  We speak of falling asleep as something that happens almost by itself, pleasantly passive even as millions actively take medication and work hard to achieve it.  You’d think falling would be easy.  Then, once we do satisfyingly fall asleep, many of us have recurring nightmares. About falling.

Warning: this is not a metaphor!

We fall in love, the language itself shaping our understanding of life’s most delicate/confusing/overwhelming/important/wonderful/terrible feeling.  Fall suggests the suddenness of love at first sight, the helplessness, lack of control, and even danger.  I fell for her so hard.  Sounds painful.  Sometimes it is.  Unlike real falling, but like falling asleep, trying to fall in love will probably prevent it.  What would happen, though, if we did not fall in love but, say, flew in love—or settled in love?  Floated in love, or ran in love?  Poured or drew or brewed or even stewed… in love?  Crashed in love?  When I met her, we didn’t dance in love right away, but gradually danced closer as we got to know each other.  Once we fall into a metaphor, we lack the imagination to get back up.


Few of us have fallen in any serious way in real life, and if we did, it was likely a horrifying accident, not something we would wish for.  And if we’ve not just literally fallen, but fallen in something, it’s even worse.  What, other than love, can you fall in that’s not terrible? And why fall in love at all?  Even if I try to change the image, love is still, metaphorically, something to be in, a container, at best; an abyss, at worst.  But most of us pine to fall in love.  Sometimes it feels good to fall, as so many amusement park rides simulate.  And, in the words of Jeff Bridges’s character in Crazy Heart, “Sometimes falling feels like flying/For a little while.”

In some ways, though, the idea of the fall has shaped our views of the moral and mortal world.  Last semester, when I taught Paradise Lost, students were struck by the sadness, but also the hopefulness, of Adam and Eve’s fall, their expulsion from Eden.  Yes, the fall is bad.  But, as the Angel explains,

This having learnt, thou hast attained the sum
Of Wisdom; hope no higher, though all the Stars
Thou knew’st by name, and all th’ ethereal Powers,
All secrets of the deep, all Nature’s works,
Or works of God in Heav’n, Air, Earth, or Sea,
And all riches of this World enjoy’dst,
And all the rule, one Empire: only add
Deeds to thy knowledge answerable, add Faith,
Add Virtue, Patience, Temperance, add Love,
By name to come called Charity, the soul
Of all the rest: then wilt thou not be loth
To leave this Paradise, but shalt possess
A paradise within thee, happier far.
(XII.575–587)

That’s precisely what’s better about fall than spring.  The happiness is internal, not just external.  It allows for paradise within.  Besides, you can’t have spring without fall, can’t regain paradise without losing it, can’t love or sleep without falling, and you can’t fall in something that’s not already deep.  Spring—even Paradise—eschews fall’s depths.

The sunshine spring lovers love?  It’s carcinogenic.  The renewal of life? Life is a sexually transmitted disease with a 100% fatality rate.

Happy Fall!

Time: 60 minutes.


[1] And egg-laying bunnies. I shudder to remember the Cadbury Egg commercials showing a rabbit laying a chocolate egg.  KIDS: if you see this in real life, IT IS NOT CHOCOLATE.


I Bet You Think This Blog is About You: Blurred Lines and the Problem with Direct Address


Who are you?

Or, I guess, who are “you”?

More accurately, if less grammatically, who is “you”?

“You” has been very busy, at least going by song lyrics.  Other genres—including a lot of poetry, even though people think of lyrics and poetry as the same thing—stay away from using “you” as the dominant pronoun.  You can count on one hand the number of novels written in second person.  (Bright Lights, Big City; something by Italo Calvino…  OK, on two fingers.)  Instruction manuals, and their snooty siblings, self-help books, sure, “you” yourself away.  Nonfiction—and blogs—use direct address as an occasional rhetorical device (“You can count on one hand…”).  But every song is about You.  Here’s a rundown of some song titles that begin with “You”:

You, Breaking Benjamin

You, REM

You & Me, Dave Matthews

You and Your Friend, T-Ride

You Are Not Alone, Michael Jackson

You Are the Everything, REM

You Are the Girl, Cars

You Belong With Me, Taylor Swift

You Better Run, Pat Benatar

You Can Call Me Al, Paul Simon

(You Can Still) Rock in America, Night Ranger

You Can’t Always Get What You Want, Stones

You Can’t Get What You Want, Joe Jackson

You Can’t Kill Michael Malloy, Primus

You Can’t Kill Rock & Roll, Ozzy

You Can’t Stop Progress, Clutch

You Could Be Mine, Guns n Roses

You Don’t Have to be a Prostitute, Flight of the Conchords

You Don’t Know Me at All, Don Henley

You Don’t Know What Love Is, White Stripes

You Drive Me Ape, The Dickies

You Dropped a Bomb on Me, Gap Band

And that’s just the “You D–”s, with more than 40 more You-first titles, not including “You” contractions.  And this is all just from my iTunes library.  (Yes. Taylor Swift.)  Go pull up your own playlists and see for yourself (a quick script for doing so follows this paragraph; go ahead and post favorite or significant titles in Comments).  And obviously this list can’t include all the songs that revolve around “you,” since that would be nearly all of them.  “You” had to be a big shot.  Who is “you,” and how do you have so much time to do everything?
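
If you’d rather let the computer do the pulling, here’s a rough sketch in Python. It assumes you’ve exported a playlist as a tab-delimited text file (in iTunes, File > Library > Export Playlist) saved as library.txt, a hypothetical filename; iTunes text exports are UTF-16 encoded and include a “Name” column:

```python
import csv

# Hypothetical path to an iTunes playlist export (tab-delimited text).
PLAYLIST_PATH = "library.txt"

# iTunes text exports are UTF-16 encoded; titles live in the "Name" column.
with open(PLAYLIST_PATH, encoding="utf-16") as f:
    reader = csv.DictReader(f, delimiter="\t")
    you_titles = sorted({row["Name"] for row in reader
                         if row["Name"].startswith("You")})

for title in you_titles:
    print(title)
```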

All of this is a way of getting to the Song of the Summer, in caps, Robin Thicke’s Blurred Lines.  Yes, it’s crazy popular, and you can’t listen to the radio for 10 minutes without hearing it.  But it’s controversial, because the lyrics have been declared rapey, a word that fills an important vocabulary niche but that still sounds, meaning aside for the moment, like the name of a cat.  A lot of it comes down to this line:

You know you want it.

It’s pretty damning out of context, especially the way I used sinister italics, sitting on the page like that creep on the public bus. And I’m not here to defend the song. (Hourman hates controversy.)  But a few things are interesting about it.  First, its context is not the page but rather a pretty sweet R&B song, which is melodic, playful, and even a little corny (musicians: it’s all in the flat 7s, the corniest of all intervals).  How else could Robin Thicke, Jimmy Fallon, and the Roots pull off an all-kiddie-instrument version of the song?  (Of course, I am conveniently ignoring that OTHER video.)

And the song sounds and feels nothing like what I think of as the rapiest song of all, Stone Temple Pilots’ raucous Sex Type Thing (intentionally so, to make fun of that sort of thing, according to the band; Thicke said something similar about his own song).  Featured lyric: “You wouldn’t want me have to hurt you too, hurt you too,” totally worse than “You know you want it.”  (Musicians: the main riff revolves around a flat 5 interval, which everyone knows is the devil in music.)

Maybe the song could deflect the accusations better if it were one of those He Said/She Said songs that allow for more than one point of view and point of “you”—think Don’t You Want Me’s first verse, sung by the Guy:

You were working as a waitress in a cocktail bar
When I met you
I picked you out, I shook you up
And turned you around
Turned you into someone new

Followed up in the second verse by the Girl:

I was working as a waitress in a cocktail bar
That much is true
But even then I knew I’d find a much better place
Either with or without you

Balanced, dueling “you”s.  Or, more recently, Gotye and Kimbra’s Somebody That I Used to Know, where we get the sense that both the man and the woman are hurting over the breakup, not that one is right and the other wrong. First Gotye sings this:

Now and then I think of when we were together
Like when you said you felt so happy you could die
Told myself that you were right for me
But felt so lonely in your company
But that was love and it’s an ache I still remember

Later followed by Kimbra’s POV:

Now and then I think of all the times you screwed me over
But had me believing it was always something that I’d done
But I don’t wanna live that way
Reading into every word you say
You said that you could let it go
And I wouldn’t catch you hung up on somebody that you used to know

No assumptions about what the woman wants, since the song allows her to tell us.

It’s also interesting to compare Blurred Lines to the runner-up song of the summer, the maybe even catchier Get Lucky by Daft Punk.  Here’s the chorus:

She’s up all night to the sun
I’m up all night to get some
She’s up all night for good fun
I’m up all night to get lucky
We’re up all night to the sun
We’re up all night to get some
We’re up all night for good fun
We’re up all night to get lucky
We’re up all night to get lucky
We’re up all night to get lucky
We’re up all night to get lucky
We’re up all night to get lucky

No “you” at all!  Instead, the song uses “she” and “I” before settling on “we.”  If the repeated line had been “You’re up all night to get lucky,” a la Blurred Lines, it wouldn’t sound so sex-positive:

You’re up all night to the sun
You’re up all night to get some
You’re up all night for good fun
You’re up all night to get lucky
You’re up all night to get lucky
You’re up all night to get lucky
You’re up all night to get lucky
You’re up all night to get lucky

 Now it sounds so accusing, kinda shamey.  Now, it’s not a story about two individual people, She and I, who together comprise We, but rather the lyrics’ male speaker looking at and judging the behavior of an unnamed woman.

The same thing happens with the next line in Blurred Lines: “You’re a good girl.”  It’s the singer’s assessment of the woman, not necessarily what she thinks of herself.  Contrast it with Tom Petty’s Free Fallin’, which begins with the exact same line but in 3rd person:

She’s a good girl, loves her mama
Loves Jesus and America too
She’s a good girl, crazy ’bout Elvis
Loves horses and her boyfriend too

And with 3rd person comes the feeling of objectivity, which may be at the heart of the Blurred Lines—and, for me, the “you”—controversy.  “She’s a good girl” sounds like an omniscient narrator.  It means what it sounds like, or at least doesn’t call attention to its own possible ambiguity.  “You’re a good girl” sounds subjective—who are you to say or know whether she/I/you is/am/are a good girl?  Blurred lines indeed.  Whether we find the line—and “You know you want it”—offensive or not boils down to whether we believe the singer.  If the singer—he—is reliable, and she—the recipient of the song’s words—is a good girl, and does want it, and the blurred lines of the title represent the internal conflict within the woman herself, then the song is seductive, which I take as Thicke’s—and every lyricist’s—intention.  But if we doubt him, and hear situational blurred lines—he thinks that she wants it, but she doesn’t—well, that’s rapey.

But it’s up to you to decide.

Time: Over time, about 80 minutes, since I didn’t keep track that well, with double apologies for going italics crazy.


Everyone who believes in books, or has (or has been) a child, should read Andrew Solomon’s Far From the Tree


A quarter of Americans read zero books per year.  The Onion, as usual, put it best: “Print is Dead at 1803.”  I know this is a blog. You’re reading it on a screen.  And I like blogs, and websites, and Facebook. (Twitter, however, is too scary. Mean people.) I read articles and sites online every day, sometimes for hours.  I teach online classes and collect and respond to student papers, even in face-to-face classes, electronically.  But books are different, and special.  People need to read more of them.[1] Andrew Solomon’s book Far From the Tree: Parents, Children and the Search for Identity demonstrates precisely what a book, and no other form or medium, can still do.

Greatly exaggerated

What it’s about: children who are different from their parents.  That, of course, would be all children, but a simple recitation of the chapter titles alone reveals something of the book’s scope and depth: Son, Deaf, Dwarfs, Down Syndrome, Autism, Schizophrenia, Disability, Prodigies, Rape, Crime, Transgender, Father.  The first and last chapters refer to Solomon’s personal experiences and bookend the other chapters.

Of course, it takes nearly a thousand pages to cover the material.  Solomon frames his discussion of seemingly disparate groups in two main ways.  First, he talks about parents and children as having both “vertical” and “horizontal” identities.  Parents default toward the vertical—that is, what is the same between parents and children, and what is passed down (the language itself suggesting verticality) directly from parent to child.  Hearing parents, heterosexual parents, cisgender parents expect and assume their children will be the same as them.  But often, children are radically different, instead having what Solomon calls horizontal identities, therefore becoming part of a new, horizontal community outside of the biological family—the deaf community, the dwarf community, the disabled community.  And sometimes, there is only the identity without the community: prodigies, children born of rape, children whose disabilities prevent them from any form of communication, who, unlike other groups, have not coalesced into an identifiable horizontal identity.

But even the idea of identity itself is complex, which brings Solomon to his other framework. Drawing upon his own experience as a gay man and the cultural trajectory homosexuality has taken during his lifetime, Solomon suggests that his subjects can each be understood as operating on a kind of spectrum.  On the one end of the spectrum is illness, which requires intervention: homosexuality, and various kinds of disability, were believed to necessitate cures, treatment to establish the vertical identity of the parents.  But on the other end of the spectrum is identity: a meaningful difference that is not perceived as undesirable, one not to be taken away, pitied, or, for that matter, admired.  Where different syndromes and orientations fall on this spectrum, however, is subject to contentious debate.

Not surprisingly, the book is exhaustively researched and extensively documented: over 100 pages of notes alone, so it felt nutritious—I learned more on every page.  But it is not just a synthesis of academic articles, or the more than three hundred interviews that Solomon conducted himself. The tensions between these ways of understanding children who are not like their parents—vertical/horizontal; illness/identity—inform each chapter, and my summary cannot do justice to the overall intelligence, nuance, morality, and warmth that comes through.   It is a long book that easily moves back and forth between individual case studies—no, not case studies, people, since “case study” sounds more clinical than the human, and humane, portraits that emerge—and academic analyses spanning literature, psychology, history, and medicine, navigated and negotiated by an author who places himself, and his well-informed beliefs and ideas, on the page.  By the time I was done, I felt as though I had been through something, and gotten to know, and love, Andrew Solomon himself.  I didn’t agree with everything I read, but I considered everything I read.  Nonreaders are quick to create a false dichotomy between books and life, but they are wrong. The best books provide a deep, meaningful life experience for the reader.  Books, like births, create horizontal communities and identities as well.

One of the few 1-star reviews on Amazon.com, for me, helps explain the book as well as one of the many 5-star reviews: “The author talks 2 much- and he is super boring and actually sounds like he just took a class in college and is repeating what the professor said- very disappointed.”  This reader, unwilling to put in the time, retreats into the worst cliché, boredom.  (The part about “repeating what the professor said” baffles me, though, as though Solomon somehow didn’t write it.)  Reviews like this help me to understand the zero books per year number.  A good book, unlike other popular forms of reading, to say nothing of other forms of entertainment, makes the reader work, but feel as though the work is worth it. Even if I did not work as hard as Solomon, who took over ten years to write Far From the Tree.

I don’t know how he finished it so quickly.

Time: 50 minutes, not counting reading the book.

In Comments, feel free to share a book that you felt to be a meaningful life experience.  While Far From the Tree is of course nonfiction, any genre is fine.


[1] I’m not going to get into the electronic vs print book issue here, except to say that I still read books only in hard copy, and I can’t imagine having read this one on a screen.


Breaking Bad; or, the Superhero Uncertainty Principle


I am several years late to the Breaking Bad party.  I tried watching it two years ago but lacked the fortitude to see how Walt and Jesse were going to dispose of the dead body and get themselves out of trouble in just the second episode.  But having spent the past three weeks catching up—I want to use the word “binging”—on Seasons 1 through 4 (so no Season 5 here), I’m struck by the ways in which the show—about how down-on-his-luck high school chemistry teacher Walter White, having discovered he has late-stage lung cancer, turns to cooking meth to provide for his family once he’s dead—thoroughly borrows from, and just as thoroughly subverts, all of the stale ingredients of the superhero story to cook something new and powerful.

There’s the basic Superhero 101 stuff: Walter White has an alliterative name: Clark Kent, Peter Parker, Reed Richards, ad infinitum; he has a sidekick who is younger and physically smaller, Jesse Pinkman, whose own name is superheroic, although The Adventures of Pinkman may not appeal to the target demographic.  (Jesse also has a sketchpad full of superhero drawings, each, according to his late girlfriend, a version of himself.)  Walter has an identifiable vehicle (although, like Pinkman, it’s not exactly awe-inspiring—it’s an Aztek), a secret lab (with a ’60s-style Batcave entrance—a secret staircase behind a secret door), a disguise (hat and sunglasses count), and most importantly, a dual identity: Heisenberg, the nom de guerre he takes that, like Batman, reveals something important about who he is to the viewer but somehow not to any characters—Batman’s legend of the bat flying through the window as a way to inspire fear; Heisenberg, as one of the key thinkers in quantum physics but known in the popular consciousness for the Uncertainty Principle, which could have been the name of Breaking Bad itself.  And like Batman, Heisenberg has no superpowers, just his superbrain and whatever gadgets and plans the brain can come up with.


But what BB really borrows from the superhero story is less the outer trappings than the inner workings of the dual identity conceit.  In a show obsessed with secrecy, it’s not surprising that Walter has more in common with Superman than the newest version of Superman himself (except for the good and evil thing, which I’m getting to).  Instead, what Walt is hiding is neither the meth nor the money, but something that harkens back to the earliest symbolic and dramatic appeal of superheroes themselves: that there is something special, wonderful, and necessarily hidden about Walter that only he and his closest confidants—including the viewers—know about him.  The Walter that the world knows is a regular guy at best and a bit of a loser at worst. In devising a cover story for a multiple-day disappearance, Walter lays out for a psychologist what he knows he looks like to the world (and here I paraphrase from memory): having seen all of his peers surpass him and make millions, Walt now makes $44,000 a year, has a disabled teenage son, a baby on the way, and a terminal disease.  Ouch.  But secretly, he is fearless, awesome, and superhumanly capable—everything he is not on the surface.  He synthesizes the best crystal meth ever, improvises explosive and poisonous chemicals, recharges his RV’s dead battery out of the pocket change lying around, and takes on and takes down crime kingpins.

Like Superman’s Clark Kent, the Walter White that the world knows, and who he used to be, becomes the hapless alter ego, the disguise of normalcy he wears for protection so that no one knows who he really is. Even Hank, his DEA brother-in-law, so often superheroic in his own cop instincts, cannot fathom that lame ol’ Walter is Heisenberg, just as Lois Lane, star reporter, cannot connect that Clark is Superman.  Despite staring them in the face, the notion is too preposterous to take, even when Walt jokes, on several occasions, that he is a super criminal. “Got me,” he says to Hank, who laughs, and to the audience, who laughs for entirely different reasons.


Which takes me to the other significant superhero trope that Breaking Bad appropriates: dramatic irony mixed with suspense.  That is, the audience, but almost none of the characters, knows all about Walter.  We know what Walt knows, which means that we can see how the tensions between his identities and secrets will play out.  It’s a great device that seems to have fallen out of favor—witness Man of Steel’s jettisoning of the classic Clark Kent/Superman/Lois Lane triangle of dramatic irony, as well as the many excellent movies of the last decade—the Bourne movies, Eternal Sunshine of the Spotless Mind, Memento, and more—that use what used to be the tired trope of amnesia to reverse the very premise of dramatic irony (undramatic irony? Dramatic sincerity?).  Instead of knowing more than the characters, we know as little as they do and learn as they do.  It’s interesting and maybe fun, but it can be exhausting.

Yet even though we know what we know, one of the show’s addictive qualities for me is the suspense, even back to that second episode of Season 1 that almost put the brakes on the Bad for me. We know Walt is the smartest, most resourceful, and most desperate guy in the room. We know he has to get out of whatever craziness the particular episode focuses on—disposing of dead bodies, disposing of live bodies, getting out of a trap, luring someone else into a trap, breaking into one building, breaking out of another—and whatever Walt has now gotten himself into, he somehow has to get out of it.  Until the very last episode—sadly, coming up soon—we know that Walt somehow has to walk away mostly unscathed.  (Unlike in, say, Game of Thrones.) But again and again, we need to see how.  In a form pioneered by superhero comics, the show continues the best tradition of the serial narrative.  It has a larger, longer, season-wide arc that shifts and varies, but also a single-episode, smaller arc that never changes: Walt gets into trouble, Walt gets out of trouble, seeming to restore the status quo, but the getting out must somehow create newer, even worse trouble for next time. It’s ’60s Batman with a meth twist.

The big question, then, is the moral one.  Aren’t superheroes the good guys?  Isn’t Walt really a villain, not a hero?  The bald head he decides to keep post-chemotherapy, not to mention the way that Bryan Cranston is able to change his face from fake kind to real evil like it’s a special effect, puts him in firm Lex Luthor territory (sorry, Professor X).  It’s been the perennial post-Sopranos TV problem.  Walt is a lot like a combination of Tony, or a dad version of Nancy from Weeds, a regular-guy version of Jax from Sons of Anarchy, or, at times, Dexter.  And since my time is up, I’m not going to resolve the idea of narrative sympathy, subjectivity, or evil here (which I talked about at greater length for Game of Thrones anyway), as much as to say that it reminds me of a large-scale version of a dopey old Jerry Seinfeld routine:

I love these nature shows, I’ll watch any kind of nature show, and it’s amazing how you can always relate to whatever they’re talking about. You know like you’re watching the African Dung Beetle and you’re going “Boy, his life is a lot like mine.” And you always root for whichever animal is the star of the show that week — like if it’s the antelope, and there’s a lion chasing the antelope you go, “Run antelope Run! Use your Speed, Get away!” But the next week it’s the lion, and then you go “Get the antelope, eat him, bite his head! — Trap him, don’t let him use his speed!”

But instead of a lion and the antelope, we root for whoever is on screen.  Go, Walt! Get away from Hank! Hank, you can get Walt! He’s right there! Walt, get away from Gus! Gus, kill the cartel guys who killed your old partner! Jesse, get back at Walt! Walt, stay away from Jesse!  We are simply suckers for the point of view characters, morality and uncertainty be damned.

Time: 80 minutes. Darn.

BONUS HOURMAN!  It’s been a while since a major show had a character named Jesse (which is my name).  The Dukes of Hazzard, Full House, and Rick Springfield ruined my childhood, but Breaking Bad seems not to have had any effect, other than the weirdness of hearing my name so many times on TV. In Comments, feel free to post about your own experience sharing a name with someone or something famous or in the media.


Man of Steal

The S stands for Hope. Shope. S’hope.

Man of Steel, the movie that dares not speak its name, uttering the S word only once[1], opens in a CGI sci-fi universe reminiscent of Avatar.  No giant Smurfs, but plenty of bizarre creatures and vaguely cloud-forest images.  Russell Crowe shows up, reprising his weird fake English accent[2] from Les Mis but now playing a Jedi, including requisite Prequel Mullet, and before long the movie looks like Star Wars by way of Alien, a kind of PG-13 HR Giger, biomechanical but desexualized, down to the Kryptonian asexual reproduction, even as everything on Krypton also looks like a phallic symbol.  (They’re obviously sublimating their sexual frustration.)

Cute little Kryptonians!

Then Michael Shannon shows up, and you know he’s a bad guy because of the shape of Michael Shannon’s head.[3]  Krypton blows up on cue, Kal-El is launched in another phallus, and before long, Clark Kent is a grownup on Earth—33 years old, a portentous age that the movie does not fail to point out to us.  Then we’re in X-Men territory, as the heavily muscled and even more heavily chest-haired[4] Henry Cavill drifts, just as the heavily muscled and equally hirsute Hugh Jackman did as Wolverine over a decade ago, trying to understand his place in the world, the charm on his necklace again his only clue.  Cue “Seasons,” the depressing Chris Cornell acoustic grunge song from the movie Singles, as the Artist Formerly Known as Superman swipes a conveniently flattering flannel shirt from a clothesline and hitchhikes to the next identity a la David (not Bruce) Banner in the TV show The Incredible Hulk.  If only the movie had played that music instead.

The movie is cut with flashbacks to young Clark’s childhood, where, rather than having super abilities, he’s treated, and behaves, more like a child with disabilities. It’s an interesting metaphor that the movie doesn’t do much with—Smallville, the TV show, did it much better.  Ma and Pa Kent show up, although Kevin Costner’s Jonathan isn’t what I associate with the role. Rather than teaching Clark to celebrate who he is and always do what’s right, he warns him that he has to hide his true self.   Again, shades of X-Men, which I always read as a reversal of the Superman story. While classic Superman is a wonder of assimilation, cheered and welcomed by humanity for his differences, the X-Men are feared and suspected for their differences, and in Man of Steel’s revision, Superman is not only an alien but alienated.

Christopher Nolan co-wrote and produced the film, and he brings his rebooted Batman sensibilities to the project—Superman is dark and brooding, not just orphaned, like Batman, but orphaned twice, by both Jor-El and Jonathan Kent.  Before long, General Zod’s mean-shaped head is back and threatening to TAKE OVER THE WORLD, at which point the movie takes its cues from War of the Worlds, down to the giant tripods, and Cloverfield and other 9/11-influenced films, all shaky handcams and masses of people fleeing the dust, wreckage, and debris of falling buildings.  Meanwhile, a Transformers-like cityscape CGI battle ensues for, I don’t know, like an hour.  Superman wins! Yay! And kisses Lois Lane, even though Metropolis looks like it was hit by a hundred 9/11s.  No matter. In the final scene, the Clark Kent we know and love—glasses!—shows up in a miraculously restored Metropolis (although it took over a decade to put up a single new tower at Ground Zero), and we’re ready for the next adventure.

Quick! Sneeze on them!

Martians! I mean, Kryptonians!

Look.  I don’t want to be a jerk here.  But I took my boys, ages 11 and 15, to see this movie, hoping for—for what?  The way I felt when I saw Superman with Christopher Reeve, I guess. Or Star Wars, or Indiana Jones, or the many movies that I can honestly say felt like a formative childhood experience.  I’m not one to wax nostalgic.[5]  And there’s nothing exactly wrong with the picture, as the discrepancy between the fan ratings (largely positive) and critics’ reviews (negative to lukewarm) suggests.  But in borrowing from, let’s recap here, Avatar, Star Wars, Alien, X-Men, Hulk, Smallville, Batman, War of the Worlds, Cloverfield, and whatever I left out, director Zack Snyder and Nolan seem profoundly embarrassed by Superman himself.  Superman thrives on the dramatic irony of Clark Kent’s nebbishy persona, the one that Reeve did so well, the one that is as absent here as Superman himself is.  We know who he really is, and we’re special for it.  But there is no Clark Kent here, and no Superman.  Nolan’s Batman movies got to the core of that character, a man pushed by tragedy to the brink of psychosis, living in a noir nightmare, neurotically and impotently trying to avenge and atone for his parents’ deaths.  But Superman is not Batman, and Man of Steel does not get to the core of Superman.  In trying to reboot him, it abandons what I liked about the character–his contrasting personas, his simplicity, his good nature, his fun.  It should be awesome to be Superman.  We don’t need to learn that [spoiler?] he himself is somehow responsible for luring Zod to Earth, or [spoiler x2?] for not saving Jonathan, that he struggles with who he is, that humans fear him.  (The only human who used to fear Superman was Lex Luthor.)  In the end, Man of Steel is a perfectly adequate summer special effects extravaganza.  It is not Superman.  Which is a shame.

Time: 55 minutes  


[1] “Superman.” What S word did you think I meant?

[2] Usually British accents come easier to Aussies and Kiwis.  Not so Crowe.

[3] Michael Shannon will make phrenologists of us all.

[4] I will admit that I was happy to see the chest hair.  I’m not only a member of the Chest Hair Club for Men; I’m also its president.

[5] Or wax anything. See: chest hair.


The Three Movies that Traumatized Me


For some people, it’s Bambi.  For my brother Al, it was ET and Pee Wee’s Playhouse—he must have had a psychic intuition about that Pee Wee Herman guy.  But everyone can look back on childhood and recall—sometimes even fondly reminisce about, as I suppose I do—the Movies that Ruined Their Lives.  (In the comments, go ahead and mention the movies that traumatized you. It’s fun!)  It’s not that I hate the movies or think that they’re bad.  As Facebook would say of my relationship with them, it’s complicated.

1. The Shining


I remember the day that Brendan, Michael, and I watched The Shining at Irving’s house, I guess at some point in elementary school.  Irving had the only VCR and, obviously, the most neglectful parents.  I think they were going through something.  Supposedly, kids figure everything out and know what’s going on, but I was a confused, oblivious child.  Danny, the boy with the title’s power, seemed roughly our own age, and when he talked to his hand decades before talk shows would implore people to do the same, and called his pointer finger Tony, then spoke in a raspy voice as Tony, it didn’t seem funny, or campy, or kitschy, or cheap.  It was fucking horrifying.  So was the “REᗡЯUM” in lipstick on the bathroom door, which spelled out “MURDƎЯ” in the mirror, something that at 10 years old (maybe?) I DID NOT SEE COMING AT ALL.  And that was nothing compared with the terrifying twin dead ghost girls.  Like regular twins aren’t scary enough.  And of course, the Naked Lady in the bathtub, who begins as beautiful (not that I noticed; see: oblivious) and turns into a shrieking, droopy-breasted hag as she chases Jack Nicholson down one of the million hallways in the film.  The later scenes, involving Jack going crazy, hacking poor Scatman Crothers to death with an ax, and subsequently menacing and attempting to murder his wife and child, had little effect after the powerful childhood magic of Tony, REᗡЯUM, the girls, and especially the Naked Lady.  Either that or I had no more unconscious recesses left in my brain to ruin.  As Psycho must have done for a previous generation, The Shining made me scared to go anywhere near a bathroom for, like, a year.  And for many years after, Michael and I would yell “Naked Lady!” to each other, a phrase which for other kids may have evoked laughter, or titillation. But for us it was like screaming Boo! times a million.

I watched The Shining again about a decade later. I was an English major in college and wanted to see what all the fuss in my head had been about.  This time, the movie was hilarious, a black comedy about writer’s block and isolation, less about Danny and bathrooms than Jack Nicholson’s madcap persona and the ridiculous haunted house conventions that had been beaten into everyone’s heads a hundred times by then. A hotel built on an Indian burial ground? Really?  I laughed at the film, at Jack, at Jack’s stupid, frozen face at the end, and at myself, for misreading the movie so badly.

[Image: Jack frozen at the end of The Shining]

And then I watched it again about six years ago. I was teaching a class about conspiracy and paranoia in literature and film and wanted to pair Diane Johnson's excellent, underrated novel The Shadow Knows with a movie. And it was scary all over again, for new reasons. This time, I hardly saw anything supernatural or monstrous about it. Instead, it seemed a harrowing psychodrama about the loss of masculinity and domestic abuse, the not-at-all-funny ways in which women and children are most threatened by, most likely to be murdered by, husbands and fathers, their supposed protectors and providers. Without society or any kind of social arrangement, Jack has nothing to keep his rabid unconscious in check. I was disturbed all over again. Maybe I wasn't as oblivious a child as I thought.

2. The Fly

[Image: The Fly poster]

Not the 1950s Vincent Price classic, although I did see and love that movie as a child. No. In 1986, a few years after The Shining, I was at an in-between movie age and faced a choice: to see the Transformers (the cartoon movie that no one wants to talk about these days, featuring Orson Welles's last role. Ah, cruel fate) or David Cronenberg's remake of The Fly. Later in life, I'd grow to love many of Cronenberg's films. But this one: Jeff Goldblum/Seth Brundle's revolting and horrific transformation—no easy head-switcheroos here; the way Brundle snaps a man's wrist arm-wrestling in a bar; the way the mutated Brundle-Fly uses his fly vomit to disintegrate a man's limbs; the way Geena Davis's push dislodges Brundle-Fly's jawbone and, with it, his last vestige of human resemblance; Brundle-Fly's like-nothing-else-ever appearance at the very end, after he accidentally goes through the teleporter alone, failing in his Shining-esque plan to use the machine to merge his own DNA with Geena's and their in-utero child's; and how he points the gun at his own head but in his hideously deformed state can't pull the trigger, and Geena has to do it for him. OH MY GOD. I can't believe I ever saw another movie again. Or slept again. Or had children. But YOU WILL NEVER GET ME IN A TELEPORTER. This plot summary was written from memory and without IMDB or Wikipedia. Although I have not seen this movie in over 25 years, its images are burned into the internal plasma screen of my psyche. Unlike The Shining, I do not expect to see The Fly again.

3. The Elephant Man

[Image: The Elephant Man poster]

Now, here’s the catch: not only have I not seen the Elephant Man since I was a child; I NEVER saw The Elephant Man. Although I added to my Netflix queue over a year ago in a failed attempt to cure myself through immersion therapy.  Which counts for something, I guess.  Even before The Shining, I saw a short clip of The Elephant Man on TV.  The clip I saw, which, again, I remember vividly although it was over three decades ago, features John Merrick, as he was known in the film, wearing a pillowcase over his head and fleeing a mob, which rips his mask off only to shock themselves into stunned murmurs.  Suddenly emboldened, Merrick bellows, “I am not animal! I am a man! A human being!” before collapsing from the exertion.  Then I saw a Ripley’s Believe it nor Not (or something like that) episode featuring Elephant Man reenactments, although the disfiguring makeup was far cruder than the film’s and, if I remember right, kinda purple. No matter. I become obsessed with The Elephant Man, reading all I could about him while strenuously avoiding any pictures of him, or John Hurt in the movie, which was not easy.  Even at the time, I had no idea what I was scared of.  Was I going to run into him somewhere?  I was kind of scared that I would, although obviously the odds of, say, being killed by Jack Nicholson were far greater.  Would I turn into him?  Um, no.  I didn’t know what I was scared of.  I still don’t, although the fact that I felt terrorized and traumatized by the clip is, as far as I can ascertain without having actually seen it, the exact opposite point of the film itself, which seeks to re-humanize, rather than dehumanize, the Man, not the Elephant.  I should really watch it.

But I won’t.

Honorable Mention: Snoopy Come Home. In 1976, Snoopy, one of my childhood loves, ran away from Charlie Brown. Or something like that. Did he run away, or was he left behind? Was it a misunderstanding? If you need to know, go check Wikipedia (http://en.wikipedia.org/wiki/Snoopy,_Come_Home), which, unbelievably, has a significant entry on it. I haven't seen this one again and don't plan to. And unlike the others, I hardly remember it. Call it traumatic amnesia. All I know is that Snoopy was gone for like an hour and a half, and everyone is crying and crying and crying those big Peanuts teardrops from the sides of their eyes like water hoses, and then, five minutes before the end, after everyone gives up, Snoopy Comes Home and it's all OK. Well, Charles Schulz, it WAS NOT OK. The ending could not fix the feelings of loss that, when I close my eyes and psychically look back, I may not have gotten over yet.

Time: one re-traumatizing hour.  

[Image: Snoopy Come Home poster]

Tagged , , , , , , ,

Commencement

[Image: commencement]

This graduation season, you've almost certainly sat through one of the worst literary genres, the commencement speech.[i] Yes, David Foster Wallace achieved greatness with his "This Is Water" speech.

And there is always Kurt Vonnegut's "Wear Sunscreen" speech. But most speakers are shackled by the genre's conventions.

They begin with a list of thank-you’s:

I want to thank all of the students, the parents, the professors, the college president, the board of trustees…

With a little self-deprecation…

…for letting me have this opportunity to speak with your class.  You’re a great audience, especially since you can’t go anywhere!

Followed by the story: narrating a personal obstacle that the speaker overcame…

…I may be the CEO of CEO Industries now, but it wasn't always that way…

…in order to laud the role of education in that success…

…In fact, when I first came to college, I still didn’t know what I wanted to do with my life.  I struggled with finding…

…while being optimistic, preferably with some Speech 101 rhetorical flourish:

…But I did know that I wanted to make a change. A change for the better. A change for the future. A change for myself.  A change for the world.[ii]

And, of course, a quotation from someone famous to wrap:

Because after all, as Gandhi famously said, “Be the change you wish to see in the world.”

Except much longer. You’re welcome.

Once in a while, someone makes news by violating the tacit agreement that these speeches need to stay positive, like last year's "You are not special. You are not exceptional" speech by David McCullough Jr. But a commencement speech seems to me an inopportune time to lay too much on the caps of the newly minted graduates.

For me, the problem may be, as usual for Hourman, time.  We keep thinking of commencement as  “the ceremony of conferring degrees or granting diplomas at the end of the academic year.”  

But it’s easy to forget that commencement means beginning.  Not end.

Commencement has turned into a phantonym, one of those words, like inflammable, that means one thing but seems to mean its opposite. Of course, we want to mark the end of college, the completion of the degree, even though many students have expressed some ambivalence about the ceremony when they know that they're set to start graduate school almost immediately after finishing college.[iii]

So for many students, it’s not an end at all.  But is it a beginning?  What is it the beginning of, exactly?   For cynics who think that school is not real life, ending the year means entering the real world. But that never seemed right to me, given how much real life so many students have already experienced.  It’s not entering adulthood, which in many ways has also already begun for them, even as many people don’t see college graduation as the mark of official adulthood anyway, preferring marriage, or children, or, in my case, the purchase of real estate, which seemed more difficult to get out of than either of the others.

So let’s have two cheers for commencement, even commencement speeches.  We need to impose all sorts of beginning and endings to portion our time: day and night, even though they start at different times for different people in different parts of the world and year; the year itself, although it too is an arbitrary marker; the seasons, although they are cyclical and, this year, totally inconsistent.  We want to imagine that time, like the seasons, is consistent and linear—time flies like an arrow[iv], straight and in a single direction, when the way time and life[v] feel is more amorphous, scarily circular, or even sometimes unchanging, so that once in a while I’m surprised to see my older-than-24-year-old face uncannily staring back at me in the mirror. 

Without the decorative signposts and pit stops—our commencements to celebrate what we would love to think of as the beginning of post-collegiate life, or the end of pre-collegiate life; the candles taking up more room on the cake each year; a wedding and its subsequent anniversaries—life becomes a series of one damned thing after another. A grim death march. No wonder we're implored in commencement speeches to see life as being about the journey and not the destination. We don't want to go there.

Because in the beginning, and in the end, there is only one real beginning and one real ending, and we can't remember either of them. Let's celebrate the rituals we have, not in spite of the clichés but because of them. The speeches are trite, but maybe they're the right ones for the occasion. And maybe, ideally, they even contain some truths. Unlike Kurt Vonnegut's sunscreen speech, which he didn't write and which was never a real speech. Unlike Gandhi's famous quotation, which he never said.

[Image: graduation caps]

Time: 65 minutes. Wasted too much of it looking for links.


[i] Having attended thirteen graduation ceremonies that I can remember, I believe I'm in some position to evaluate them.

[ii] This one is anaphora, about the most basic rhetorical flourish there is.

[iii] I didn’t attend my MA ceremony for that reason. Then I didn’t attend my PhD ceremony for a different reason.

[iv] But fruit flies like a banana.

[v] Not the magazines.

Tagged , , , , , , ,