
The Many Masks of The Dark Knight Rises

Here is Tom Hardy, who plays Bane in The Dark Knight Rises:

A mouth!

I had no idea that I already knew the actor from Inception and Warrior. Yes, there’s the new bulk, the shaved head, and the costume. But mostly, I didn’t recognize him because of the headgear. And the mask has raised the ire of two of my favorite movie critics. Anthony Lane writes in the New Yorker that “Bane wears a crablike mask over the lower part of his face—a disastrous burden for Tom Hardy, whose mouth, sensual and amused for such a tough customer, is his defining feature. Via this device, Bane declaims his bold, anarchic sentiments; at least, I think they were anarchic. Given that I could make out barely a third of them, he may well have been reciting from ‘Clifford the Small Red Puppy.’”

And in Slate, Dana Stevens laments “the film’s disappointingly uncomplex villain, the bald, hulking, pitiless arch-terrorist Bane, as played by Tom Hardy.” She continues:

 Hardy obviously put an enormous amount of work into preparing for the role, bulking up his body and developing a strange, swooping voice that promises to give rise to a thousand late-summer Bane impersonations. But the choice to clamp a leather-and-metal mask over 60 percent of Hardy’s face for the entire movie means that, for all practical purposes, the actor’s diligent iron-pumping was in vain. Since we can’t tell whether the person producing that sound actually resides in that body or not, Nolan might as well have cast an already-huge body double and just had Hardy dub in the voice. Most of all, though, the mask is a mistake because we never get a good look at Bane’s face. With nothing to work with but a pair of darting eyes, Hardy can’t endow Bane with motivation enough to make him more than a generic bogeyman.

Of course. It makes sense. And yet, I can’t help but think that Bane’s mask makes a perfect visual and symbolic foil to Batman’s cowl—and scowl. This blog entry’s opening image, one of the most common promo shots, depicts the contrast and symmetry perfectly: Bane’s face is a kind of negative, a reverse mirror image, of Batman’s; what is exposed on Batman—the mouth, the jaw, the chin—is concealed on Bane. Batman’s head and eyes are disguised, whereas Bane’s are open. Batman’s guttural voice is an affectation; Bane’s is the real result—in a major revision of the comic-book character—of the mask he cannot remove without dying. When we first meet Bane at the beginning of the movie, he is hooded, but removing one mask only reveals another. Covering the mouth, which even more than the eyes is the source of his humanity, forces Hardy to act entirely kinesthetically; together with Batman’s costume and mask covering 95% of his own body, the choreographed fight scenes, seemingly graphic, instead become a version of Japanese Noh drama, in which the masks themselves embody the characters’ distinctiveness and personalities, freeing the actors to use their bodies, rather than their faces, as their sole vehicles of expression. When Bane finally breaks Batman, his final humiliation is removing Batman’s mask. In doing so, he does not reveal Batman’s true identity—he takes it from him.

All of the faces of the Dark Knight movies have been masks. The face of Harvey Dent—Two-Face—is crucial to the new film, in that Gotham is presented only with his good side, his dark side hidden, an omission that Commissioner Gordon and Batman consider a necessary fiction but one that is inevitably revealed. Two-Face is like Bane, a reversal of Batman’s face, but divided exactly vertically rather than horizontally. His perfect split represents both his fractured psyche and his Manichaeism, a division that proves unstable within himself as well as within Gotham.

Bane’s mask-in-a-mask revelation was also used to introduce Heath Ledger’s Joker in the previous movie—one clown mask removed to reveal another beneath it.  But unlike Batman, even unlike Bane, who gets a few seconds of backstory revealed in the end, the Joker has no secret face, and no secrets.  His mask is his face and his face is his mask; he is exactly as he appears to be as well as a complete walking fiction.  He is his own shadow, his own mask.

And what I thought from the advance images to be Catwoman’s mask turned out to be her goggles flipped up onto her head, the only whimsical, lighthearted mask in the film. Cat suit and cat burglar aside, Selina Kyle of Dark Knight Rises is not the comic’s Catwoman at all, not even in name, as “Catwoman” is never said. She wears a thief’s domino mask seemingly to hide herself, but we discover that the one thing she truly desires is to be free of her identity, not to protect it at all.

But what lies behind Christopher Nolan’s mask? What is his political ambition? His artistic aspirations for the films seem clear enough: big sound and bigger spectacle. But Batman himself, like the Riddler, remains an enigma: a hero and an anti-hero; a cautionary tale of unchecked, out-of-control ego—Super Ego!—but also the need for order; the 1% given everything but also the self-made man; a right-wing borderline fascist or a left-wing critique of same. The film blows up Gotham City, looking more like New York than Chicago this time around, continuing the previous film’s imagery of 9/11. Police officers are trapped beneath rubble; we see a geographically vague Middle East and detention centers. The film seems to reference the War on Terror, Occupy Wall Street, the language of homegrown class warfare and New World Order conspiracy, symbolic pits with real walls to be scaled, the French Revolution, Kafka-esque (or Lewis Carroll-esque?) courts, a Fight Club-like Project Mayhem no longer content to blow up empty buildings, and a genuine allusion to Dickens’ A Tale of Two Cities. But what does any of it mean?

Now that Lucius Fox has granted Batman “The Bat,” a great chiropteran hovercraft, we finally have a way to begin grasping the Dark Knight trilogy’s political import: it is a series of what Claude Lévi-Strauss termed “floating signifiers,” and what Roland Barthes amended to a “floating chain of signifieds”—that is, like a mask itself, it means exactly, and only, what people see in it, whether everything, or nothing. And now, in the aftermath of the July 20 Aurora, Colorado, mass murder, the cinematic gunfire, mayhem, bloodshed, and masks (shooter James Holmes wore a gas mask during the massacre and had a Batman mask in his house when police searched it) inevitably take on darker new meanings.

Unfair?  Of course.  But Nolan’s brand of sustained ambiguity, something I am usually so quick to celebrate, has its own dark side.  

Here’s William Butler Yeats’s poem, “The Mask”:

“Put off that mask of burning gold

With emerald eyes.”

“O no, my dear, you make so bold

To find if hearts be wild and wise,

And yet not cold.”

“I would but find what’s there to find,

Love or deceit.”

“It was the mask engaged your mind,

And after set your heart to beat,

Not what’s behind.”

“But lest you are my enemy,

I must enquire.”

“O no, my dear, let all that be;

What matter, so there is but fire

In you, in me?”

Like Dark Knight Rises, it too is an exercise in sustained ambiguity, in the challenge of determining desire or deceit, who is a lover or an enemy, and what is or isn’t behind the mask.  It seems to mean a lot of things, or, if poetry isn’t your thing, nothing.  Yet one meaning that I take from it is the notion that we need to stop worrying about what’s behind each of our masks—that the face we put forward is our real face, even when it is just a mask.  It sounds nihilistic, like the Joker. But it’s also, in many ways, all we really have. So perhaps Nolan’s sound and spectacle are all there is.  

And they’re enough.  

Time: 90 minutes. I knew this was going to be a long one before I started.


No Fun

“Anhedonia”: the original title of Woody Allen’s Annie Hall, a motif in Jonathan Franzen’s novel The Corrections, a word I felt immediately. Literally, it means “without pleasure” (an- + hēdonē), and it expresses something like the inability to enjoy things. According to experts, it’s associated with clinical depression, depressive disorder, endogenous depression, and major depressive episodes. I don’t feel depressed, or in denial about depression. I would even say that I am a happy person, give or take some seasonal affective disorder and depending on how well I avoid cable news. But I frequently question why so many people find certain things pleasurable when I can’t. Pleasure, joy, amusement: these terms are obvious in the abstract—by definition, everyone likes “fun”—but they’re problematic in the particulars. Especially for me.

Technically, I don’t have anhedonia, since it’s associated with a loss of pleasure in things that one used to take pleasure in, and there’s too much that I never enjoyed in the first place. No Code Red Mountain Dew, KFC Double Down, Cool Ranch anything. No “Two and a Half Men,” “[Anything] with the Stars,” “Bridalplasty.” No “Hey, Soul Sister,” “Tik Tok,” the double down of “Glee”’s cast singing “Don’t Stop Believin’.” Maybe these are easy targets. Maybe I’m elitist. Maybe my age is showing. But everyone else seems to like them, and I like other popular entertainment, and I would never have liked them, even as a kid. Especially as a kid. On the contrary, I like to think I’ve grown remarkably tolerant and mellow.

I can’t listen to a human voice on the radio unless it’s singing. Without Autotune.  Or has a British accent on NPR.  I can’t tolerate movies featuring talking dogs, especially if they depict real dogs in digitized lip synch. I have never watched a game of professional baseball on television except long enough to change the channel.  I have never participated in any competitive sport, spending every high school phys ed class sitting in the bleachers talking to Tommy about Metallica. Mr. Arbuse didn’t care because I was wearing my gym uniform, as I’ve chronicled before. I now exercise only so that I may eat more ice cream. I have never sent a successful text message.  I prefer not to talk on the phone. I don’t really like to drive. When I finally took my kids to Disney World, they—and my wife—loved every second of our eleven-hour days in the park. As I carried the backpack of water, extra clothes, and a camera while occasionally pushing the stroller through the crowds, I endured only by picturing soldiers, waist-deep in the quagmire, rain sheeting down in cacophonous chime on their helmets, under threat of enemy fire, fifty pounds of gear on their backs, arms straining to keep their guns above their heads. Later I felt sheepish, and guilty, about comparing my three days in Disney, the Happiest Place on Earth, with War, which Is, according to trusted sources, Hell. But it got me through the week.   

At the risk of sounding like a personal ad, I like to play with my kids in a green, sunny park that doesn’t charge admission. I like complicated foods with simple, pronounceable ingredients. But I also like every breakfast cereal. I like to watch TV if the shows involve any two or more of the following: conspiracies, plot twists, glorification of dubious ethical behavior, foul language expressed in creative combinations, good-looking supernatural creatures.  I like abrasive music by brutal musicians.  I read as much as I can, preferably great, depressing novels where the main characters die. But I also like every magazine, and science for non-scientists, and superhero comics, where no one who dies ever stays dead. I eat pints and pints of Ben and Jerry’s ice cream but refuse all lesser brands. I can’t eat breakfast.  I like to play the blues on the guitar.  I love doing anything, or nothing, with my wife. I look forward to going to work. I write, not because I like to, but because I like to read what I wrote. 

Did not stay dead

Not dead.

Dead? No. And no.

I don’t, in the end, have anhedonia, even if there’s much that I can’t—or refuse to—take pleasure in. With literature, writing, and the blues, it feels good to feel bad. Or maybe more people should feel bad for feeling good. Or perhaps the measure of life should not be pleasure at all—not the lack that is anhedonia, nor its linguistic opposite, hedonism, where enough is never enough. Rather than “fun,” yet another thing to have, perhaps we can instead substitute “contented,” something to be. And I am.

At least sometimes.  

Time: I wrote this a little over a year ago for my college literary journal and felt like revisiting and revising it for the blog. I’ve written one or two of these a year for the last eight years, and at the time these short personal essays usually also took a little over an hour. They were, in retrospect, proto-blog entries.


A Cultural History of Spider-Man’s Web Shooters

Just point ‘n’ shoot!

As much as I love superheroes, I can’t say that the new Amazing Spider-Man movie needs to exist. First, as long as it was being remade, time to drop the hyphen—just “Spiderman.” It’s cleaner. Second, the movie reminded me of seeing a high school play: “Aw! So cute! They’re doing Spider-Man!” When Sally Field showed up as Aunt May, I thought, “Aw! There’s Sally Field pretending to be Aunt May!” And then when Martin Sheen showed up as Uncle Ben, I thought, “Aw! There’s Martin Sheen! I love that guy!” before quickly remembering that he’s a dead man walking, to be gunned down before the second act ended so Peter could learn his lesson about power and responsibility. This must have been how medieval audiences reacted to seeing Jesus Christ show up in the passion plays: “I can’t believe he’s gonna get killed AGAIN.”

But crucially, the movie revises, updates, and, for many fans, corrects what turned out to be a huge comic controversy of the 2002 Spider-Man.

Namely, the mechanical, wrist-worn webshooters (single word, no hyphen) are back. The organic vs. factory debate deepens.

This is a BFD.  When Spider-Man (hyphen for historical accuracy) debuted in 1962, bitten by a radioactive spider, proportionate strength and speed etc etc etc, he invented the synthetic webbing and pressure-sensitive webshooters himself:

 

Peter Parker as misfit, scientist, and genius is crucial to the early stories. It’s not enough to get spider powers. Much of his early success as a hero stems from the use of his pre-bite intellect and his own diligence and hard work, as opposed to mere accident: “So they laughed at me for being a bookworm, eh? Well, only a science major could have created a device like this!” And so his identification with his audience of bookworms is complete. Spider-Man, as Stan Lee, in his usual overwrought, avuncular, carnival-barker voice, introduced him earlier, is a hero like… You! So he needs to have something comic readers can pride themselves on having; Spiderman is about smarts and perseverance, not just a lab accident. Later comics elaborated upon the original idea:

But while 1962 Peter Parker, as a non-sidekick, picked-on teen, was unlike any of the other superheroes of that time—more like, of course, a stereotypical comics reader—he was also very much like most of the other 1960s heroes who believed in Better Living Through Chemistry. Sputnik had been launched a few years earlier, the Space Race was on, kids began working with their chemistry sets in their rooms, and comics followed, whether to embrace the post-war American dream or just because the hero/scientist opened up new character and narrative possibilities. Until that point, THE SCIENTISTS HAD ALL BEEN BAD GUYS! Suddenly, Professor X (who had to open his own school to receive tenure, apparently), bald and in a wheelchair just like Superman’s first supervillain, the Ultra-Humanite (hyphen?), and looking like Lex Luthor, was leading the X-Men! Reed Richards took the Fantastic Four into space, then into crime-fighting! Bruce Banner started off as a nuclear physicist specializing in gamma radiation before going green as the Hulk. Over at DC, the Flash’s Barry Allen—usually thought of as ushering in the Silver Age—was reimagined as a police scientist; the new Green Lantern was test pilot/astronaut proxy Hal Jordan, whose power ring (two words) got a science fiction makeover from the previous incarnation’s magic origin. Spiderman’s invention put him at the center of the new wave of super-science police.

Fast-forward forty years to the first big film, though, and to a changed world. The idea that teenaged Peter Parker could invent the webs himself suddenly didn’t seem realistic. The dream that the brilliant kid in his bedroom could do what millions of dollars in government and industrial research and development couldn’t? Ridiculous. Just as important, the early 2000s saw a sudden upswing of anti-technology cultural forces—technophobia brought to the surface by Y2K, a wave of anti-factory-farming sentiment, the Fight Club-style anger at the techno-corporate world, left-wing distrust of surveillance and electronic voting machines, and right-wing fears of a technologically driven New World Order. Stan Lee and Steve Ditko had devoted all of two panels to Peter inventing the webshooters. Could a multimillion-dollar movie really be that casual and still be credible? So the webs became a part of Spiderman’s new powers, his body generating them organically, leaving the film open to hundreds of snarky commentators noting that spiders don’t fashion webs from, um, that part of their anatomy. Taken together, we see a nice example of Samuel Taylor Coleridge’s famous dictum about the suspension of disbelief: audiences could suspend disbelief long enough to imagine that a bite from a radioactive (make that genetically altered[i]) spider could spontaneously generate natural webshooters, but not that Peter Parker could have invented the ’shooters himself—broke, without a lab, and alone in his Queens bedroom. The dream of technological progress was over.

My hands are making what?

But only for a decade. Today, Andrew Garfield, playing Tobey Maguire playing Peter Parker, indeed invents his webshooters again, as if Kennedy’s in the White House and it’s 1962. Yet unlike Classic Peter, he doesn’t quite invent them by himself. While it’s all a little hazy (damn you, montage!), what Nu Peter seems to do is closer to what contemporary techies get. Instead of opening his chemistry set, he draws from preexisting technologies—some prefab Oscorp tensile-strength web fluid here, some, um, other mechanical movie-looking parts and gears and awesome LEDs and stuff that looks like machinery there. 2002 was too soon to imagine the day when every kid would not just own a smartphone—as Peter plays games on his phone to kill time while waiting for the Lizard to emerge in the sewer—but that more than a few teens would also be savvy enough to jailbreak them, invent their own apps, and create original graphic art, digital music, and code, alone in their rooms. The basement chemistry sets of the early 1960s have given way to the new tech mythos of Steve Jobs in his garage, not inventing the computer but rather remaking and improving it based on previous iterations of the same ideas that Xerox and IBM used but somehow didn’t really get. C. 2012 Peter’s genius isn’t that he invents the webbing and webshooters a la 1962, but rather that he recognizes that the technologies for them already exist, and he makes them work together. Only a post-millennial (not just a science major) could have created a device like this. We love technology again, but in a remix, mashup, sampling, collage kinda way.

So it’s fitting that, in the Tobey Maguire version, Natural-webbing Spidey fights techno-corporate Green Goblin/Norman Osborn, who relies on the worst of tech R&D: metal mask and body armor, disintegration grenades, and deadly projectiles; in Spiderman 2, Doctor Octopus recalls the 1940s and ’50s Scientist Gone Wrong, becoming a crazed metal-armed cyborg, while again Natural-webbing Spidey has to set him right and destroy the dangerous incursion of technology into the human realm. Lots of other fantasy movies of the early 2000s shared this pro-natural, anti-tech spirit: The Lord of the Rings pits the sylvan elves and pastoral hobbits against Saruman’s metal hammers, metal towers, bio-engineered monsters, and willful destruction of trees. In the Harry Potter movies, technology is shunted aside entirely, unable to coexist with magic at all. In Phantom Menace, those stupid Jar Jar-looking aliens use natural weapons… ah, I can’t even continue; I hate that movie so much.[ii]

Yes, the Lizard is a bit of a retread of Doc Ock, in that he’s a scientist whose attempt to do good results in the potential destruction of New York again, his mind altered by a biotech transformation. But when Dr. Connors emerges transformed into the Lizard, he sheds his lab coat and his humanity, symbolically and visually the worst kind of natural—slimy, scaly, swampy, primitive, lizard-brained. New Tech Spidey is web savvy (har har) and smart, using his—and Gwen Stacy’s—head to configure a quickie technological solution to New York City’s new alligators-in-the-sewer problem. OK, technology may have created the problem, but, unlike earlier incarnations of superheroism, technology can also solve it. Call it Web 2.0.

So when the techno-pendulum swings back, expect to see some other new version of the webshooters for the inevitable 2022 reboot.  And when we do, will someone please get Uncle Ben a bullet-proof vest this time?

Or the cynical explanation: you can’t sell organic webshooter toys.

Time: 90 minutes. Over, but this piece is pretty long, and I even spent at least 10 minutes cutting tangents. Plus I managed not to make any Marc Webb (!!!) puns.  It’s also funny that my conclusion—2000s Spider-Tobey is natural and fights techno-bad guys, while 2012 Spider-Garfield is technological  and fights a natural bad guy—came to me in my sleep two nights ago. Call me 24-Hour Man. 


[i] The radioactivity concomitant with the early ‘60s Cold War was replaced by new wishes and fears of genetic modification for the 2000s. But that, Dear Reader, is the subject for another exciting post! Excelsior!

[ii] Irony alert: these seemingly anti-technology movies could not have existed without recent advances in digital technology.


Water and Fire: Metaphors I Blog By

Contrary to Marc Prensky’s popular binary, I don’t see myself as a digital native, or a digital immigrant. Rather, I am a reluctant, reformed Luddite, washed gasping onto your shining silicon shores of technology because the formerly lush pre-technology terrain has ebbed and eroded beneath my feet. So I used a laptop as a life preserver and floated across the digital divide, trying not to drown. No, I am no digital immigrant, one who came here by choice following the dream of electric sheep and your Statue of Technology’s gleaming beacon, a flickering iPod held aloft.

I am a digital refugee. 

I don’t speak the language. 

I plead digital asylum. 

But now that I’m here, I’ve come to discover that, just as there are activities that thrive in the face to face world—or, worse, “F2f,” the shorthand for what used to be called interacting, talking, or being human—there may also be opportunities that technology creates that are not pale imitations of personal contact or just more expensive versions of previous, now obsolete technologies like paper, paint, or vinyl.  Rather, there may be whole new avenues to travel, channels to explore, waters to drink. 

Two weeks ago I wrote about the things I learned after six months of blogging, focusing on how it felt to get page views and to see how readers viewed me. And that was interesting and enlightening for me in a kind of techno-sociological way, my time-traveler’s view of my strange new home in the future. So on the surface, the least that blogging has done is show me the ways in which I can now easily and frequently incorporate images, video, and links into posts. It’s plenty fun and entertaining for me (and, I hope, others), which I do not denigrate.

But it has also helped me to learn more about the creative process, something I was very interested in well before six months ago.  I started this project with the hourman concept—one topic covered in sixty minutes of writing, and, as I’ve said, I’ve mostly stuck with it.  But what I haven’t discussed is what I’ve done with that writing time.  It has occasionally been linear, the way students are forced to write essay exams in school, or the Alice in Wonderland approach: “Begin at the beginning…  and go on till you come to the end: then stop.”  But mostly, while I may spend the hour composing, I spend the day, or sometimes week before, composting, to borrow the metaphor of writer’s writer Natalie Goldberg.  Before I even sit down, and before I start the clock, I already have my topic, my angle, even if it’s vague, and preferably, my way out.  I’ve always believed in the importance of endings—one of the things I try to emphasize to my writing students is that you can’t tack on a conclusion.  Perfunctory, fake conclusions sound like this: “In conclusion, here’s what I just said.”  But now, I take them even more seriously.  Like a good war, a good piece of writing needs to plan its exit strategy before it even begins.  

But I also now build the link and image searches into my writing process as well, so that I’m not simply writing for an hour, then looking for apt and entertaining images or videos, or deciding in the editing and posting process which terms or ideas would benefit from or be bolstered by a missing link. Instead, I Google as I go (possibly sung to “Whistle While You Work”?), and often enough, something that I see online gets me rethinking what I’m working on right then and there. Blogging allows for a less hermetically sealed approach to writing: not the frustrated, isolated Artist on a mountaintop, quill and parchment in hand, awaiting divine inspiration—nothing that I’ve written would merit that kind of pretension anyway. But rather, writing online, using online tools, for online readers, has challenged the digital native/immigrant/refugee metaphor’s very foundation. John Donne knew that no man is an island. But every link, piece of writing, image, reader, and writer can become part of a vast digital island chain, a sweeping archipelago connected by legions of lightspeed Google ferries.

In addition to challenging the pseudo-Romantic cult of the lone writer, blogging has also challenged my romantic idea of creativity. Too often, we imagine writing can be blocked, as though it were a physical and terrestrial thing.  But if creativity is water, it flows and resists blockage.  Yet water may not be the best metaphor now, since water can indeed be dammed.    And while people do refer to writer’s block when they can’t produce, I don’t think that blockage is really the best metaphor for creativity or lack thereof either.  Nonwriters don’t get blocked; only writers do.  So what writers mean is that their creative process is like agriculture: it is capable of being grown, harvested, and exhausted.  We can overfarm and deplete our imaginary crops or clearcut our creative forests, leaving a fallow period of, we hope, restoration and germination.  We hope the ideas will come back, but we never know.  So when I committed to one blog post per week, I wondered how soon I might, shifting to another familiar metaphor again, burn out.  But instead I’ve come to think of the writer’s ideas as fire.  Yes, Plato, Prometheus, and Jesus beat me to this metaphor, but I think it’s a crucial one: rather than thinking of ideas as blocked vs. flowing, or developing vs. producing, we can think of them as a flame.  When we take from the fire, it does not get any smaller.  With the right conditions—air, kindling—it can perpetuate itself indefinitely, producing and reproducing at any rate.  You can’t put out a fire by taking from it; rather, that’s how you make it grow.  Creativity can operate in this way, too.  It does not need to burn out at all.

Yet even the fire metaphor falls short in describing what I’ve learned.  The commitment I’ve made to writing this blog—a commitment that has no obvious benefits, no product to push, no money to make, no political agenda, and no foreseeable purpose at all—is a reminder of the cliché about life being about the journey and not the destination. 

A little trite, though, so let me update it: life is about the journal and not the desperation.   

Time: just under an hour.  And I didn’t have this ending planned at all—it came as I wrote it. So much for what I’ve learned.

 
