Doug Baldwin Sabotages the Seahawks, or “How I Lost the Super Bowl by Pooping”


The Seattle Seahawks loss last night will live in infamy for as long as NFL football is played. But while wide receiver Ricardo Lockette will be remembered—at least on video—for being the target of quarterback Russell Wilson’s awful pass that snatched defeat from the jaws of victory, it is fellow wideout Doug Baldwin who deserves to be castigated by football historians for a single, senseless, selfish act that changed the course of a game decided by the flimsiest of margins.

Doug Baldwin’s stat line in Super Bowl XLIX is short but not insignificant: one reception, a 3-yard touchdown. But you have to read the play-by-play to grasp the full context of his contribution. After scoring his TD, Baldwin’s endzone celebration drew a 15-yard penalty for unsportsmanlike conduct. Without that penalty, it’s quite possible that this morning the Seahawks would have woken up as repeat champions, rather than the most memorable losers the big game has seen.

Baldwin’s penalty forced Seahawks placekicker Steven Hauschka’s ensuing kickoff to be taken at the Seahawks’ 20-yard line, rather than the 35. As a result, Hauschka had no chance to kick the ball into the Patriots’ endzone for a touchback, which would have forced the Pats to start the ensuing drive at their own 20. Instead, Hauschka’s kick reached only the Pats’ 14-yard line and was returned by Danny Amendola to the Pats’ 35.

At this point in the game there was 4:48 left to play in the 3rd quarter and the Seahawks led 24-14. So when the Pats went 3-and-out and punted the ball into the Seahawks’ endzone, it seemed Baldwin’s douchery was inconsequential. And but for chaos, it might have been.

In its most simplified terms, chaos theory tells us that in any complicated system even the smallest alterations can bring about unforeseeable, and potentially huge, consequences. The most famous formulation of chaos theory may be “the butterfly effect”: the theoretical possibility that a particular flap of a butterfly’s wings on one side of the world plays into a causal chain resulting in a tornado on the other side of the world, whereas a different flap would have played into the causal chain differently—perhaps resulting in a future without that particular tornado, or with a slightly different version of it. In a highly complex system, the theory goes, it’s impossible to predict exactly what effects will result from a given action.
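To make the butterfly effect concrete, here’s a minimal sketch—my own illustration, nothing to do with football per se—using the logistic map, a standard toy model of chaos. Two starting values that differ by one part in a million track each other for a while, then end up nowhere near each other:

```python
# Minimal illustration of sensitivity to initial conditions using the
# logistic map x -> r*x*(1-x), which behaves chaotically at r = 4.0.

def logistic(x, r=4.0):
    """Advance the logistic map one step."""
    return r * x * (1 - x)

a, b = 0.400000, 0.400001  # a one-in-a-million difference "at kickoff"
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: a={a:.6f}  b={b:.6f}  gap={abs(a - b):.6f}")
```

By step 40 the gap is as large as the values themselves: the tiny initial difference has swallowed any ability to predict where either trajectory ends up.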

While it seems that Baldwin’s endzone celebration, which consisted of a mime of his pulling down his pants, then squatting over the ball (laying an egg? shitting?), had a negligible effect on the game, there is really no way to know. What is certain is that what happened from that point forward would not have happened exactly the same way had Hauschka kicked off from the 35. If he had, the Pats would probably have started their next drive from a different part of the field. This might have led to the Pats calling different plays, either by original design or by quarterback Tom Brady’s pre-snap changes at the line of scrimmage, choices made on the spot based on the quarterback’s perspective in the present moment.

Because the Pats went 3-and-out, it seems the causation resulting from Baldwin’s penalty might have benefitted the Seahawks indirectly, since 3-and-out is about as good a result as a defense can force on a particular drive. But who knows? Without Baldwin’s penalty, maybe Brady throws an interception that gets returned for a TD, giving the Seahawks seven more points. Or maybe the difference is less dramatic. Maybe the Pats still go 3-and-out but aren’t able to punt the ball away into the Seahawks’ endzone, an alteration in the flow of the game that precipitates a long Seattle drive that pays off in points a few minutes later. 

As it happened, the Seahawks never again scored, while the Patriots put up two more touchdowns. Then came what everyone will remember: the final drive, the fluky catch by Jermaine Kearse, and what may be the worst pass in Super Bowl history.

But none of this happens exactly that way without Baldwin’s penalty. Maybe the Seahawks wouldn’t even have been in a position to get the final score they seemed sure to get had they run Marshawn Lynch from the 1. Maybe they wouldn’t have needed it. The only sure thing is that the ending would have been different.

Immanuel Kant offers a fairly straightforward way of deciding how to act in a chaotic world, a world in which the consequences of a given action are not always predictable. In the realm of Kantian ethics, one is duty bound to do the right thing in the moment, without regard to what will ensue. How far he goes with that belief is made clear in his essay “On a Supposed Right to Lie from Altruistic Motives,” which asks whether one may lie even to a would-be murderer about his intended victim’s whereabouts. His answer is that there is no such right, at least if you believe (like Kant) that telling the truth is “a sacred and absolutely commanding decree of reason.” You do the right thing in the moment, Kant says, come what may.

Doug Baldwin is a pedestrian NFL player, but even pedestrian players know that you’re supposed to help your team win. In principle, unsportsmanlike conduct penalties, which set your team back 15 yards a pop, don’t do that. They’re called “penalties” for a reason. They hurt. An NFL equivalent of a Kantian categorical imperative would be something along the lines of Avoid penalties whenever you can, particularly penalties such as unsportsmanlike conduct, which can in no way be rationalized as necessary in the moment (as opposed, say, to a defensive back grabbing a receiver who’s beaten him so as to prevent a likely touchdown).

But Baldwin didn’t care. Or he cared more about getting someone’s goat (he didn’t say whose) than not hurting his team. After his one big moment on the NFL’s biggest stage, he felt the need to pretend to pull down his pants and poop a football. It was a gesture guaranteed to draw a flag, and thus to alter the course of the game.

Maybe the difference was negligible. But in a game that came down literally to the last seconds, a game that culminated in one particular decision made at a specific moment in response to one particular set of circumstances, there is almost no way in which the game does not end differently if Baldwin keeps his pants on.

So while everyone’s blaming Pete Carroll and Russell Wilson and maybe even Ricardo Lockette for the greatest gaffe in Super Bowl history, save a little blame for Doug Baldwin. Because but for Doug Baldwin, there might be a helluva party happening in Seattle right about now.

Love and formic acid,
G.I.ant

(Image: Matt Slocum)


Is the Universe a Hologram Even in Brooklyn?


Early in Annie Hall, young, Brooklyn-born Alvy Singer is depressed because he’s just learned that the universe is expanding, meaning that everything is being pulled ever farther apart from everything else. It was a reasonably new concept in the WWII era of Alvy’s childhood, as it was only in 1929 that Edwin Hubble made the measurements that proved that distant galaxies are receding from us Milky Way folk, and we from them.

Alvy grasps that this matters only at the galactic level, but he’s still depressed by the idea that the universe will, as he puts it, “break apart, and that would be the end of everything.” His mother, though, is more pragmatic. “What has the universe got to do with it?” she kvetches. “You’re here in Brooklyn! Brooklyn is not expanding!”

But what would Alvy’s mom say were they to replay their parley a half-century later, with Alvy falling into a funk over the idea that our universe is just a hologram?

That idea came to us by way of a succession of superstring theorists—Charles Thorn, Gerard ‘t Hooft, Leonard Susskind, et al.—working on a method of reconciling inconsistencies between general relativity and quantum physics. But it was Juan Maldacena who in 1997 proposed the concrete realization of what’s come to be known as the holographic principle, which Nature describes as “an audacious model of the Universe in which gravity arises from infinitesimally thin, vibrating strings could be reinterpreted in terms of well-established physics. The mathematically intricate world of strings, which exist in nine dimensions of space plus one of time, would be merely a hologram: the real action would play out in a simpler, flatter cosmos where there is no gravity.”
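(A quick aside for a toehold on why physicists take this seriously: the seed of the holographic idea is the Bekenstein–Hawking result—standard black-hole thermodynamics, not anything from the papers discussed below—that a black hole’s entropy, roughly its information content, scales with the area A of its event horizon rather than with its volume:

\[
S_{\mathrm{BH}} = \frac{k_B\,c^3\,A}{4\,G\,\hbar}
\]

If the information a region of space can hold is capped by the area of its boundary, it starts to look as though the boundary is where the real bookkeeping happens.)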

Now comes the culmination of a series of papers by Yoshifumi Hyakutake and his team of researchers at Ibaraki University, which provides what Nature calls “compelling evidence” supporting the holographic principle.

What is that evidence? Look, I’m just an ant, and ants don’t get nearly enough math in school to follow the argument. But let’s pretend that not only is Hyakutake’s math good, he has stone cold proven that our universe is indeed a hologram. What does that mean for us here in Brooklyn?

It might mean determinism is definitely the case. Consider Brian Greene’s statement that if the holographic principle is fact, “the universe’s most fundamental ingredients […] would reside on a bounding surface and not in the universe’s interior. What we experience in the ‘volume’ of the universe—in the bulk, as physicists often call it—would be determined by what takes place on the bounding surface, much as what we see in a holographic projection is determined by information encoded on a bounding piece of plastic. The laws of physics would act as the universe’s laser, illuminating the real processes of the cosmos—processes taking place on a thin, distant surface—and generating the holographic illusions of daily life” (The Fabric of the Cosmos, pp. 482–483).

Naturally, a deterministic universe means we’ve got no real freedom here in Brooklyn. But determinism’s disconcerting implications exist only if we’re bound and determined to wrap ourselves up in concepts of past and future and to break up our world into separate parts, just like NYC is broken up into boroughs, a way of seeing that can for some purposes be useful but should be seen as a manmade construct and not an inherent reality.

In 1980, when holography was still in its infancy, theoretical physicist David Bohm was already using the hologram to make just this point. As he explains in Wholeness and the Implicate Order, “The key new feature of [the hologram] is that each part contains information about the whole object (so that there is no point-to-point correspondence of object and recorded image). That is to say, the form and structure of the entire object may be said to be enfolded within each region of the photographic record. When one shines light on any region, this form and structure are unfolded to give a recognizable image of the whole object once again” (pp. 224–225, italics in original).

For Bohm, the upshot of this consideration is a new notion he called “the implicate order,” by which “one may say that everything is enfolded into everything. This contrasts with the explicate order now dominant in physics in which things are unfolded in the sense that each thing lies only in its particular region of space (and time) and outside regions belonging to other things” (p. 225, italics in original).

What that means in Brooklyn is that there is no Brooklyn, at least not as far as that Brooklyn is (t)here or now. It’s all how we perceive it, “we” being just another of those perceptions that falsely break up the whole.

Naturally, to some degree Alvy, his mom, and the rest of us can’t help it. But only to some degree. As Bohm points out, groundbreaking infant research by Jean Piaget

has made it clear that a consciousness of what to us is the familiar order of space, time, causality, etc. (which is essentially what we have been calling the explicate order) operates only to a small extent in the earliest phases of life of the human individual. [… F]or the most part infants learn this content […] One reason why we do not generally notice the primacy of the implicate order is that we have become so habituated to the explicate order, and have emphasized it so much in our thought and language, that we tend strongly to feel that our primary experience is of that which is explicate and manifest. […] This then contributes to the formation of an experience in which these static and fragmented features are often so intense that the more transitory and subtle features of the unbroken flow (e.g., the ‘transformations’ of musical notes) generally tend to pale into such seeming insignificance that one is, at best, only dimly conscious of them. Thus, an illusion may arise in which the manifest static and fragmented content of consciousness is experienced as the very basis of reality[,] and from this illusion one may apparently obtain proof of the correctness of that mode of thought in which this content is taken as fundamental. (pp. 261–262)

The potential payoff of such considerations, Bohm says, is no less than a paradigm shift in how humans relate to our existence:

[W]idespread and pervasive distinctions between people (race, nation, family, perception, etc., etc.), which are now preventing mankind from working together for the common good, and indeed, even for survival, have one of the key factors of their origin in a kind of thought that treats things as inherently divided, disconnected, ‘broken up’ into yet smaller constituent parts. […] When man thinks of himself this way, he will inevitably tend to defend the needs of his own ‘Ego’ against those of the others; or, if he identifies with a group of people of the same kind, he will defend this group in a similar way. He cannot seriously think of mankind as the basic reality, whose claims come first. Even if he does try to consider the needs of mankind he tends to regard humanity as being separate from nature, and so on. […] If he thinks of the totality as constituted of independent fragments, that is how his mind will tend to operate, but if he can include everything coherently and harmoniously in an overall whole that is undivided, unbroken, and without a border (for every border is a division or break) then his mind will tend to move in a similar way, and from this will flow an orderly action within the whole. (pp. xii–xiii)

So yeah, it turns out that one way or another Brooklyn is a hologram, in that it’s a projection—by us, from the brane, whatever. But we see it, we feel it. And whether or not we really have free will, as far as we’re concerned we seem to have some say in how we Brooklynites put together the whole shebang.

So why not make it more harmonious? That’s probably how it really is, anyway. What have we got to lose?

Love and formic acid,
G.I.ant


Why Much of the Backlash Against the Whiteness of the Oscar Nominees Is Prejudice


prejudice (n) 1. preconceived opinion that is not based on reason or actual experience (From the Latin prae, “in advance,” + judicium, “judgment”)

That word, prejudice, perfectly describes much of the backlash against the Academy of Motion Picture Arts and Sciences for its nominating White people almost exclusively for Oscars this year. It was a foregone conclusion that the Academy was going to face charges of racism if Selma, the Martin Luther King Jr. biopic nominated for Best Picture, didn’t receive a boatload of nominations—and it will be every year that non-Whites aren’t among the nominees for the top prizes. For many there is a simple, prefabricated equation: no nominees of color = racism.

Of course, they may be right. We know that the Academy is disproportionately male (77%) and White (94%!), with a median age of 62. And while it’s a form of prejudice—as in racism—to presume that old Mr. Charlie will always favor “his own kind,” it doesn’t stretch the imagination to conjecture that a lack of diversity among the Academy members might indeed feed into a lack of diversity in the films they favor.

Then again, it isn’t like the Academy simply refuses to recognize the work of non-Whites. The same Academy that this year is getting called on the carpet for not recognizing enough non-Whites last year gave 12 Years a Slave Best Picture and John Ridley the Oscar for Best Adapted Screenplay. Steve McQueen was nominated for his direction, with the award going to Alfonso Cuarón for Gravity. In the acting categories, while Lupita Nyong’o won Best Supporting Actress, Chiwetel Ejiofor and Barkhad Abdi also garnered nominations. That means three of the 20 acting nominations, or 15%, went to Blacks—roughly proportional to the percentage of Black faces in American society.

And get this: aside from Nyong’o’s, none of the winners deserved their statuettes. Dallas Buyers Club and especially Her are better than 12 Years a Slave, Richard Linklater and company’s adaptation of characters from his earlier Before… films for Before Midnight is more compelling than Ridley’s work, and Cuarón’s directorial turn in that piece of shit Gravity is laughable.

This, of course, is a matter of taste (even if I feel I can make a near-objective case for what garbage Gravity is), and that’s part of the point: De gustibus non est disputandum. There is no disputing about taste. Who’s to say who is worthy of being nominated and winning awards and who isn’t? Well, when we’re talking about the Academy Awards, it’s the Academy, stupid.

Does the Academy have stupid taste? To my mind the answer is and always has been “yes!” You can find travesties of taste almost anywhere you look in the Academy’s history, travesties that have nothing to do with ethnicity. My favorite is 1990, when the Academy not only awarded Best Picture to Dances with Wolves over Goodfellas, but even gave Kevin Costner the Best Director Oscar over Martin Scorsese, despite the latter’s having turned in what may be the single greatest work by a director in cinema history. Was it because the Academy is biased against Italians? No: it’s because they have shitty taste.

Sure, sometimes they get it right. In 2007 No Country for Old Men deservedly raked in the awards for Best Picture, Best Director, Best Supporting Actor, and Best Adapted Screenplay, while Daniel Day-Lewis got Best Actor (in a fantastic field!) for his work in There Will Be Blood, another stone-cold masterpiece that garnered Best Picture and Best Director nominations. Hell, they even got Best Documentary right by giving it to Taxi to the Dark Side. Okay, so they bit the big one by giving Best Animated Feature to Ratatouille over Persepolis. But hey, you can’t win ‘em all.

Usually their choices are terrible almost all the way across the board, and on occasion those terrible choices include snubbing deserving White men. Remember when The Hurt Locker got Best Picture, Best Director (Kathryn Bigelow), and Best Original Screenplay (Mark Boal)? Well, James Cameron was right—not about Avatar‘s being more deserving than The Hurt Locker, but that the awards went to the wrong film. The film year was 2009, the year of Quentin Tarantino’s glorious Inglourious Basterds, which outclassed The Hurt Locker in all three categories so clearly that one could argue the only reason The Hurt Locker, a rather pedestrian film, won anything at all was that Bigelow is female—especially considering all the hype around her gender in relation to the film at the time.

But that didn’t stop people crying “sexism!” three years later when Bigelow’s Zero Dark Thirty—another pedestrian film—was shut out. Is Zero Dark Thirty better work than Argo, which won Best Picture and Best Adapted Screenplay? Maybe. But it wasn’t in the same league as Michael Haneke’s Amour (or Beasts of the Southern Wild, for that matter). Haneke also got robbed on the Best Director front when the Academy gave the award to Ang Lee for Life of Pi. (The two Andersons, Wes and Paul Thomas, got jobbed even worse, because Moonrise Kingdom and The Master are magical, yet the only nomination between them was a Best Original Screenplay nod to the former.)

The point of this little side trip into Academy injustice is that in many areas of life—such as the Academy—sometimes the inferior is favored over the superior, and sometimes it happens for reasons that have nothing to do with racism.

Sometimes they do, though. A very probable example occurred in 1992, when Al Pacino won Best Actor for his ridiculously exaggerated performance in the silly Scent of a Woman over Denzel Washington’s pitch-perfect tour de force in Malcolm X. Not that the Academy has been unwilling to cast their ballots for Washington: he’s got two Oscars sitting on his mantle and four additional nominations, including for Malcolm X. But could racism or other negative feelings vis-à-vis Malcolm X have kept Washington from an Oscar he so obviously deserved? It doesn’t seem out of the realm of possibility.

MLK is less likely to be the target of such White ambivalence. But might Selma’s portrayal of LBJ as more nuanced than a simple civil-rights hero have struck a nerve at the Academy? Again, it doesn’t seem out of the realm of possibility. But that doesn’t mean it’s why David Oyelowo didn’t make the cut.

Of course, to make a fair determination—subjective as it may be—of what three to ten films, performances, etc., should get award consideration in a given year, one must actually see all of the films released during that twelvemonth. That’s a lot of time at the cine. In 2013, for example, the British Film Institute counted 698 films that opened in theaters.

Have I seen all of the 600+ films that came out last year? Don’t be silly! Have I seen all of the Oscar nominees? Not even close. Did I at least see Selma? I did not. So am I in a position to know whether the Academy’s nominations, along with the concomitant omissions, are good ones? Obviously not. But neither are most of the people complaining. Rather, they see Selma‘s cast snubbed while White faces fill most of the nominee slots and figure there must be something wrong here.

There may in fact be something wrong here. But most of the people saying so—in social media, for example, where the topic has blown up—have no idea what they’re talking about. You can’t judge a book by its cover, even when that book might look dubious at first glance. Yes, the Academy is damn near all White, which seems pretty hard to defend. But we can’t presume that, were the demographics of the Academy the same as American society at large, the 2014 Academy Award nominees would look different—and we can’t presume that if they did, it would be deservedly so. Winnowing 600+ films and all the performances therein down to the top few is a matter of taste, and it’s far from statistically impossible—especially in a society that is 63% White, which is certainly reflected in the casting of film (maybe overly reflected, but that’s another issue)—that one might end up with an all-White list of nominees for reasons that have nothing to do with prejudice.
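For what it’s worth, here’s a back-of-envelope version of that statistical point—my own illustrative numbers, treating the 20 acting slots as independent draws from a pool whose White share is assumed, which is of course far cruder than how nominations actually work:

```python
# Back-of-envelope only: p is an *assumed* White share of award-caliber
# performances, and treating the slots as independent is a simplification.
for p in (0.63, 0.80, 0.90):
    print(f"White share {p:.0%}: P(all 20 acting nominees White) = {p**20:.2%}")
```

At the raw population share the odds of an all-White slate are vanishingly small; but if White actors are overrepresented among the roles that attract awards—and the casting of films suggests they are—the odds stop being negligible.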

That doesn’t mean that’s what happened in 2014. I certainly don’t know. But a lot of the people complaining don’t know, either. That, in itself, is a problem, even if it’s not the only one plaguing this year’s Oscar buzz.

Love and formic acid,
G.I.ant

(Image: detail from the “85th Oscars” poster. Licensed under fair use via Wikipedia – http://en.wikipedia.org/wiki/File:85th_Oscars.jpg#mediaviewer/File:85th_Oscars.jpg)


Nine Little Wishes for the New Year


Like every year, 2014 was a mixed bag. And as George Santayana tells us, “Those who cannot remember the past are condemned to repeat it.” So with memories of 2014 still fresh in my little ant mind, here’s a modest wish list of phenomena we shouldn’t repeat in 2015.

 

DON’T LET ANYBODY INHIBIT OUR FREEDOM OF EXPRESSION.

The Interview is not a film I’d ever see, but the fact that it never played at a theater near me is a major bummer. By kowtowing to the will of North Korea or whoever threatened us over the release of that stupid film, Sony Pictures, along with the nation’s biggest theater chains, showed that perhaps the only thing that can trump profit motive in American capitalism is spinelessness, thus giving ammunition to every nutjob out there who can hack a computer. However well intended, it was a cowardly and dangerous move. And despite Sony’s bending to pressure to allow less cowardly members of the motion-picture community to screen The Interview, 2015 is sure to come with a boatload of similar threats against all manner of forms of expression, simply because said nutjobs have been sent the message that threats work. The sad truth is that when someone is committed enough, it’s not all that hard to kill people. Thanks, Sony, Regal Entertainment, AMC Entertainment, Cinemark, Carmike Cinemas, and Cineplex Entertainment, for encouraging the practice. Let no-one follow your example in 2015.

 

HOLD POLICE TO ACCOUNT FOR EXCESSIVE FORCE.

Body cameras—haute couture for police in 2015—will help, but all the body cameras in the world won’t matter if the judicial system automatically immunizes police officers from prosecution even when they’re caught murdering people, as was the case with Eric Garner. Okay, let’s tone it down a notch and say Daniel Pantaleo manslaughtered Garner. It was nonetheless another needless death at the hands of cops, and it’s got to stop. And yes, Whitey, this kind of thing really does happen far more often to people of color.

 

DON’T LET ISRAEL DO WHATEVER IT WANTS WITHOUT REPERCUSSIONS.

Sanctions against a country—particularly when imposed by the U.S.—are a reasonable move when you want to pressure that country (as opposed to forcing it) not to do bad stuff. They don’t always work, but you impose them under the principle that bad behavior shouldn’t go unpunished. It’s why we just imposed new sanctions against North Korea for the Sony hack, and why we imposed sanctions on Russia for its incursions into Ukraine. Not only was 2014 another year in which Israel continued its land-grab in the West Bank, but by its own estimates—wildly conservative, according to independent monitoring agencies—the Israeli military killed over 1,000 civilians in response to missile attacks by Palestinian extremists that killed a total of three Israeli civilians. The Israel Defense Forces destroyed schools, hospitals, and even UN shelters, drawing the ire of human-rights organizations worldwide, such as Amnesty International, which criticized the IDF for its “callous indifference” to civilian casualties. While the Knesset talks peace, under Prime Minister Benjamin Netanyahu (if you don’t know, think about what would happen if George W. Bush and Vladimir Putin had a baby) the best Israel can hope to get is unrest. That’s why the U.S. would actually be doing Israel a favor by imposing sanctions until Israel relents from settlement-building. (Really the settlements should be torn down, but one step at a time.) In 2014 Secretary of State John Kerry made it clear that the U.S. considers the settlements “illegitimate”—as does the entire rest of the world; and even some within the Knesset, such as Israeli Justice Minister Tzipi Livni, recognize that continuing on this course can “only distance us from the ability to recruit the world against Hamas.” These are the right sentiments. But as Dostoyevsky loved to say, “Assez causé!”

 

VACCINATE YOUR BABIES.

There is no place in the world more modern or technologically advanced than Southern California. And yet currently the area is in the throes of an epidemic of whooping cough, with 10 times more cases reported in 2014 than there were way back in 1976. The reason is simple: vaccinations have gotten a bad name. Thanks, Andrew Wakefield. Thanks, Jenny McCarthy. And thank you, you too-stupid-to-breed-responsibly parents. Because of you, diseases that were all but extinct have gotten new life over the last few years. Let’s hope this year’s new parents reverse the trend.

 

BUY ALBUMS OTHER THAN TAYLOR SWIFT’S.

Nothing against Ms. Swift, but the fact that her 1989 was the only million-selling album in 2014—and that it surpassed that total in its first week of release!—is a pretty sad statement about the record-buying public. And I’m not talking about the teenyboppers who drive the Top 40, I’m talking about you, Mr. Hipster, Mr. Audiophile, Mr. I Dig Serious Music. The thing is, you’re not supporting those great artists who are out there making albums that are good from start to finish, that provide an integrated listening experience. The Flaming Lips, The Black Keys, Brian Jonestown Massacre, Beck, Röyksopp, Pixies, Mogwai, Phish, Aphex Twin, and Broken Bells all came out with new material. Okay, the Pixies’ album sucks. But Conor Oberst? The War on Drugs? Real Estate? Spoon? Ray LaMontagne? Caribou? Suburban Serfs? If you didn’t buy an album in 2014, it’s just because you didn’t try very hard. (Then again, vinyl sales surged, so the news wasn’t all bad.)

 

DON’T ALLOW TORTURERS TO GO UNPUNISHED JUST BECAUSE THEY’RE AMERICAN.

We’ve been making this mistake for years now, but with the release last month of the Committee Study of the Central Intelligence Agency’s Detention and Interrogation Program, perhaps finally there will be enough momentum to restore a bit of the U.S.A.’s credibility with the rest of the world, which is pretty sick of our failures to practice what we preach. Probably not, though, with the Republicans in power. See, the torture happened under the George W. Bush administration (see Taxi to the Dark Side, the 2007 Oscar winner for Best Documentary, to get details on how), and so most Republicans not named McCain either continue to defend the program—even though among its results was the torture of at least 26 innocent people!—or keep sweeping what happened under the rug. I wonder what the Republicans would be saying if the torture had happened on Obama’s watch. Hmmmm.

 

STOP TORTURING PEOPLE.

The truth is, on Obama’s watch we’re still at it. Maybe not the way we were a decade ago, but Guantanamo Bay is still open for business, and part of that business in 2014 was force-feeding detainees. Have you been force-fed? Me, neither, but those who’ve had the displeasure of being strapped down and having a tube shoved up their nose and down into their stomach—without the benefit of anesthetic, of course—call it torture. Obama and company would say otherwise, but they’ve been doing all they can not to let you see videos of the practice so you can make up your own mind. I wonder why. Hmmmm.

 

DON’T BE SQUEAMISH ABOUT USING VIOLENCE AGAINST ISLAMIC STATE AND THE TALIBAN.

Say what you will about the Vatican, but in today’s world they’re not pro-war. Nevertheless, when it comes to Islamic State, even the Vatican realizes there is no way other than force to deal with them. “When all other means have been exhausted, to save human beings the international community must act. This can include disarming the aggressor,” said Archbishop Silvano Tomasi, the Vatican representative to U.N. agencies in Geneva, in August. Archbishop Giorgio Lingua, the Vatican nuncio to Iraq, went so far as to say that U.S. airstrikes against Islamic State “had to be done, otherwise [the Islamic State] could not be stopped.”

Gandhi was a mahatma, but in advocating nonviolent resistance against the Nazis he was naïve, because the Nazis would have unrelentingly exterminated or subjugated everyone who opposed them. Islamic State and the Taliban are like that. They have no qualms about murdering whomever they see as infidels—which simply means anyone who doesn’t believe as they do. And not just the infidels, but their children, as we saw last month when the Taliban methodically killed more than 140 people—mostly children—in a targeted attack on a school in Pakistan. Islamic State, meanwhile, so openly advocates raping women and taking slaves that they promote such practices in Dabiq, a glossy magazine. In dealing with people like this, nonviolent resistance is futile. Almost no group, no matter how angry or oppositional it appears to be from across the trenches, is beyond engagement. Islamic State and the Taliban are two of those sad exceptions. While talking about the former in August, Archbishop Tomasi recalled the international community’s failure in 1994 to take up arms to stop the slaughter of Tutsis at the hands of the rampaging Hutu majority. “People met, but did nothing, and we have mourned the Rwandan genocide ever since,” he said. “[… The aggression of Islamic State] is not a religious issue, it is not a matter of Christians defending Christians, but it is a call for the defense of human beings by all human beings.” Amen.

Love, formic acid, and a happy new year,
G.I.ant


The X Factor in Xmas, or Why Christ Doesn’t (Necessarily) Matter


They call him “the reason for the season.” They are Christians, and they’re talking about Christ. But the truth is that you don’t need to give a good goddamn about Jesus to love Christmas, because the true spirit of Christmas is in no way necessarily tied to religiosity.

No disrespect to the man or his most devout fans. Jesus was a real cool cat, a rebel rebel. He was no elitist. He cared about the disenfranchised. He challenged the oppression of the establishment. He disavowed the value of money and property.

He also hated hypocrisy and had no use for outward displays of so-called piety. “When you pray, do not be like the hypocrites, for they love to stand and pray in the synagogues and on the street corners so that they may be seen by men,” we hear him preach in Matthew 6:5–6. “Truly I say to you, they have their reward in full. But you, when you pray, go into your inner room, close your door and pray to your Father who is in secret, and your Father who sees what is done in secret will reward you.”

The point he’s making is less about reward than sincerity. Transport Jesus to these times, and you can almost hear him saying, “Don’t talk the talk, but walk the walk.”

Today’s Christians who make a big show of whining about Christmas’s being a secular holiday would have annoyed the shit out of that guy. For starters, he would point out, it’s silly to say Christmas is a celebration of Christ’s birth, since scholars agree that, although the exact day or even time of year cannot be determined from the accounts, the one thing that’s clear is that he wasn’t born in winter. Whatever the significance of December 25th, the birth of Christ isn’t part of it.

But that’s trivia. What really would have steamed old INRI is that those whining about the secularism of Xmas are ignoring what’s most important: that it evokes more than enough warmth and charity to roast chestnuts by.

Obviously there are many ugly aspects to Christmastime in these United States, most all of which are tied to the crass commercialism we see in TV ads for all kinds of crap and people storming stores for Black Friday deals to get it. But let’s be honest: that ugliness is more about American-style capitalism (or perhaps something far more deep-seated in Homo sapiens sapiens) than the holiday season. Such greed is always present; we’re just not always so on the lookout for it.

What Xmas brings that is unique comes at the other end of the intentional spectrum. Consider all those fab holiday gatherings, both large and intimate, this time of year. Somebody is holding them, and they’re using Xmas as an excuse to get us together in groups—a tradition sorely needed in a world where technology, for all its ability to connect us, makes it increasingly easier to isolate oneself from face-to-face interaction.

On the individual level, many of us get more generous as the calendar becomes just one page. I know I’m a case in point. Not to imply that I would be nominated for the Nobel Prize for Generosity were I wealthy, but it’s the financial facts of my existence that dictate I’m rarely standing a round at the pub. But come Christmastime I’m all about giving gifts despite my modest means. It’s not out of a sense of obligation: it’s that Xmastime is my excuse to loosen the purse strings a bit. I enjoy the process of trying to come up with something useful or enjoyable that I can afford for my loved ones. I don’t know whether it’s better to give than to receive, but it’s pretty damn good.

And it seems I’m not alone, and not just when it comes to personal gifts. One recent study finds that 30% of all annual online giving occurs in December, while another finds that 18% of all charitable donations are made in the same month.

Even our simple house decorations shed light on the subject. Sure, we do it for ourselves, but it’s also for others to enjoy. Even in sunny Southern California, during this subseason the landscape changes, with lights and other festoonery at every turn. It’s beginning to look a lot like Christmas, everywhere you go….

Face it, zealots:  you can take the Christ out of Christmas, and the holiday is no worse for wear.

That’s no dig on the way you celebrate. An idea crystallized for me by Richard Rorty, my philosophical hero, is that a belief matters only insofar as it makes a difference in practice. Some of my best friends are Christians, as are some of the worst people I’ve known. Whether or not you believe Jesus to be the Son of God and that Christmas is about celebrating His birth doesn’t contribute inherently to, nor is a necessary indicator of, how happy a person you are, how kind, how generous, how good a friend or neighbor or citizen. And it’s those things that matter to me. If tying Christ to Christmas is helping to make you a happy and giving person, good for all of us. But not all of us need that reason.

I love Christmastime because on balance people are celebratory. Together, we rejoice, we revel in the season. Some of us, at least. I feel sad for those who find it a stressful time or who are soured by their own cynicism. Like so many things under heaven, Xmas is what you make of it.

And like so many things, the what of Xmas matters more than the why. So I’m offering this simple phrase for kids from 1 to 92 (and beyond): may you have a good what this Christmas. Find your own reason for the season, and join the rest of us who have already found it. Because Xmas is truly a case of: the more, the merrier.

Love and formic acid,
G.I.ant


Underappreciated Movies – 1

I would’ve made a list about books instead but who really reads these days?! Not me anyway. And this is a response of sorts to G.I.ant’s post. He got a few things right, like Miller’s Crossing, City of God and Lone Star. How someone with usually impeccable taste can get anything wrong on a diminutive list like that has us scratching our shiny shell-like pates. But we’re not fascists at tired ant; we’re not about forcing others to think the “correct” way. They’ll get there eventually! ;) Here’s what you should be watching (again, if applicable) on a Saturday night when you prefer a little alternate reality. A mix of terrific entertainment and some artsy-fartsy shit to elevate your filmic vocabulary. Yeah, an arbitrary list, in other words. Maybe when I’m not so tired I’ll actually comment substantially on each flick. Don’t count on it. —tired ant


My Man Godfrey – they don’t make ‘em like they used to. “sniff”

Election – hey it’s American politics in microcosm with Reese Witherspoon playing the part of a much smarter Sarah Palin!

Searching for Bobby Fischer – watch A Civil Action also. Zaillian’s a smart cat. I mean, ant.

The Prestige – a movie with Bowie playing Tesla is worth appreciating.

Raise the Red Lantern – Gong Li at her height.

Little Voice – Jane Horrocks!

In Bruges – maybe not for a re-watch but charming the first time.

Rounders – ‘cause, y’know, life’s a bad beat.

Europa – a rather tame but effective Lars von Trier.

Dr. Strangelove – Kubrick’s pretty much on every list.

Ghost World – where have you gone, Terry Zwigoff?

Sweet Smell of Success – a quintessential American movie—directed by a Scot.

The Coca-Cola Kid – Greta Scacchi at her height.

Contact – never mind the “faith” issue.

Being There – sure the book’s better but this list is about movies!

Midnight Run – DeNiro sometimes makes us laugh—see King of Comedy.

Mad Dog and Glory – genius Bill Murray.

Gun Shy – this (token) picture is added for list diversity.

Man on the Moon – actually, watch the YouTube vid of Kaufman doing Mighty Mouse instead.

The Master – yo Joaquin can act!

Dead Ringers – doppelganger deluxe. Ok, they’re twins but you get the idea.

The King of Comedy – ain’t life a joke?!

Ghost Town – Gervais makes it work. Barely.

Force of Evil – we like the look.

Map of the Human Heart – y’know, for your date night.

The Last Picture Show – that little speech by Johnson is worth the price of admission.

My Own Private Idaho – Gus Van Sant at his height. Don’t know why but I like it. 

Eternal Sunshine of the Spotless Mind – hey, who doesn’t like it?

Man Bites Dog – a movie with subtitles that Americans can love!

Syriana – it’s a drone eat drone world.


Being Politically Correct Shouldn’t Make You Stupid


Political correctness gets a bad rap it doesn’t always deserve, because often there is a valid point at its heart. The epithet ‘retarded’ is a case in point. But a recent Huffington Post piece on the subject shows just how stupidly some people trip over themselves in their haste to get themselves—and the rest of the world—on the right side of the PC line.

Unlike the term ‘faggot,’ which was hateful from its outset as an epithet, the origin of ‘retarded’ as applied to the “intellectually disabled” (the currently accepted term) was not meant offensively. The verb ‘retard,’ which dates back to the 15th century, means “to slow or delay,” and so the logic of applying the term to people deemed to have a slower rate of intellectual development is more or less neutral (although inaccurate as regards the condition, considering that it is less the rate of development than its upper limit that is central to the label).

While “retarded” persons have been much ridiculed—as traditionally happens with anyone perceived to embody too much otherness—more often than not labeling someone “retarded” was meant simply as a descriptor:  Caucasian, American, retarded. Certainly this was the case in the medical community, where “retarded” (“mental retardation,” etc.) was simply a diagnosis.

‘Retarded’ became a term of offense much in the way ‘gay’ became a term of offense (e.g., “That’s gay” = “That’s dumb [ridiculous, etc.]”) after it became synonymous with homosexuality, with the user targeting the group in question indirectly—often unconsciously and unintentionally—by negatively impugning those who fit into the category. To call a person of “normal” intellectual development “retarded” is to put her down by letting her know that she is lesser. It’s the exact inverse of calling someone “Einstein.” It’s nice to be called “Einstein” (if it’s not meant ironically) because Einstein is superior; it sucks to be called “retarded” because retarded people are inferior.

I grew up using ‘retarded’ this way. Never in my life did I regard the intellectually challenged as lesser. Never once did I insult someone for having any sort of handicap. Doing so was actually anathema to me, and even as a young child I was appalled when I would hear others do such a thing. Nonetheless, in words I impugned the intellectually challenged, just like I unthinkingly impugned those of Polish descent—myself included—when I referred to a foul ball that cleared the backstop as a “Polish home run.” Such is the nature of (mis)acculturation.

Eventually I became deliberate enough in my choice of words to stop using ‘retarded’ in this way, and I am hopeful that ever more persons will do the same. Life is hard enough for most all of us without one’s culture consciously and unconsciously reinforcing the idea that one is innately lesser. Besides, it’s simply good to live and speak mindfully.

No doubt that was the intent of Eleanor Goldberg, editor of the Huffington Post‘s Impact section, when she recently penned a laconic article on the topic of when it’s okay to use the word ‘retarded.’ Her answer: never, in any context. And that is, to use a word I hope couches no etymological traps I’d like to avoid, idiotic.

The centerpiece of Goldberg’s article is a flow chart created by the Military Special Needs Network. And before you jump to the conclusion that the only target is the usage of ‘retarded’ to describe people, the first Yes/No fork on the chart comes at, “Is it describing a person?” Whether you take the Yes or No route here or at any other fork, ultimately you arrive at the same conclusion: “Find a different word.”
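In programming terms, the chart as the article describes it reduces to a constant function—every branch returns the same verdict, which is the tell that the branching is for show. A sketch of that logic (my paraphrase of the chart’s structure, not the Network’s actual wording):

```python
def chart_verdict(is_describing_a_person: bool) -> str:
    """The flow chart as described above: every path converges."""
    if is_describing_a_person:
        return "Find a different word"
    return "Find a different word"  # the 'No' branch lands here too
```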

“That should about settle it,” writes Goldberg, as if we’ve just been in the presence of logical greatness. But of course the chart is ill conceived, omitting usages of the term that in no way pertain to persons with intellectual disabilities. One’s emotional growth may be retarded by excessive consumption of alcohol. My progress down the street might be retarded by a limp. Your latest musical composition might retard during the bridge.

‘Retarded’ has a meaning—an original meaning—that has nothing to do with people’s intellectual capacity, and therefore there are myriad ways in which none of us should think twice about using it. And while the Military Special Needs Network might get a pass for being too focused on their special cause to be clear-headed about the semantics in play, Goldberg deserves no such consideration. Her trade, after all, is words, and she has a forum where she can make an impact. But instead of using it thoughtfully, she does so without due deliberation, blunting the point of her article by adding fuel to the fire burning in those who recoil at anything that smacks of political correctness. Many people hate the PC world not because of its main thrust, but because of its overreaches.

Goldberg writes, “As is often the case, language evolves, connotations change. And in this particular situation, the word ‘retard’ now carries a pejorative meaning that reinforces painful stereotypes.” But while she is on course in the first sentence, she goes off the rails in the next. What she fails to grasp is what Ludwig Wittgenstein taught us: language is use. Words—’retard’ and all the rest of them—don’t carry meaning. The misconception that there is something inherent to words is an essentialist hangover. Sans context, words are ambiguous at best, meaningless at worst. If I wear a T-shirt with nothing but the word ‘retard’ printed on its front, you cannot know what I am intending to communicate without asking me.

The irony of Eleanor Goldberg’s making a case against speaking unthinkingly by way of speaking unthinkingly is rich. Surely she means well—she just doesn’t do well in this case. Political correctness gets a bad rap because it is often applied moronically. (Don’t worry: moron comes from the Greek word for “foolish.”) That’s no way to win hearts and minds.

Love and formic acid,
G.I.ant

 


Is Pope Francis the First True Saint?


The descriptor saint is used way too often. “Your mother was a saint,” Sheriff Sam Deeds is told in Lone Star, the context being that it’s saintly to stay with a philandering husband, the sort of talk that accurately reflects the way laypeople use the term. Patience, generosity, warmth—pick a positive quality broadly applied by an individual, and informally it’s a mark of sainthood.

Of course, the concept of saints is far from informal. The word traces to the Latin sancire (to consecrate)—paralleled by the Greek ἁγιάζω, meaning something along the lines of being made holy by virtue of being set apart from the world—and literal sainthood lies in the province of the Catholic Church. Protestants may play sainthood’s greatest hits (John, Paul, Matthew, and Ringo…wait, not Ringo), and religious scholars have a few faves, for whom the title is used honorifically (Augustine, Aquinas), but in so-called modern times, only Catholics get all official about it.

There’s a very specific road to sainthood. An investigation must find that nothing hinders the case for your canonization. Further study of your life must find that you embodied “heroic virtues.”  And at least one miracle must be on your résumé—two if you weren’t martyred.

Of course, you have to be dead first. Usually the canonization process doesn’t begin until the would-be saint has been deceased for at least five years, but as happened with Roberto Clemente in Major League Baseball and its rough equivalent to sainthood, the Hall of Fame (which also has a five-year waiting period after the “death” of a player’s career. Coincidence?), special candidates can be fast-tracked. Mother Teresa is a case in point, as Pope John Paul II waived the waiting period.

There’s something at least a little disingenuous about sainthood in our relatively modern age. Our vastly improved understanding of the physical world and its processes, coupled with an ability to document most everything in myriad ways, leaves little leeway for claims of miracles being accepted by any but the super superstitious. The rest of us need, if not proof, at least hard evidence.

Any unbiased look at the lives of Mother Teresa (sainthood pending) or John Paul II (canonized April 2014) will find them wanting on this score. Teresa may have done much to ease the suffering of many, and John Paul II may have done yeoman’s work bringing the papacy into technological times. But miracles? Nah. And it will be the same for every John, Joan, and Regular Joe from here on out. There’s a reason that today the “miracles” that get you sainthood are no more than someone’s being gravely ill but recovering for reasons that leave their doctors stumped and saying it was because of So-and-So (as happened in the case of John Paul II). A combination of unlikelihood, ignorance, and faith passes muster. Former supposed saints earned their stripes in a world without scrutiny. But that world is no more, and the miraculous ain’t what it used to be.

But don’t put away the censer just yet, because we may have a real-life saint (relatively speaking, at least) right here in our midst. His name is Pope Francis, and he’s the right guy for the job.

Frankly (ha!), I was slightly disappointed when Jorge Mario Bergoglio was elected pope. Not because I had anything against him—I’d never heard of the fellow—but because an African cardinal was on the shortlist. On a planet where Africa is far and away the most troubled, neglected, and needful continent—in no small part due to lingering bigotry against dark-skinned peoples by the economically superior light-skinned world—a non-Afrikaner African leader of the Church might have been a game-changer for the Dark Continent.

But Pope Francis is a winner. He has taken a religion that for its entire history has held Dark Ages stances toward homosexuality and science and aligned it with enlightenment. The “Big Bang” happened and evolution is happening, he affirms. He says he is in no position to pass judgment on homosexuals.

Yes, these are ideas that almost any adolescent is capable of grasping. And for all the dramatic headlines about Francis’s comments, he hasn’t recommended doctrinal changes regarding homosexuality; and he’s not the first pontiff to be open-minded about “Big Bang” cosmology and Darwinian evolution. Then there’s his affirmation of some of the Church’s most backward-looking positions (e.g., no artificial contraception). But there is just something about this particular Bishop of Rome that makes us feel like he’s worthy of the job. That’s more than a neat trick.

Ironically, it seems Pope Francis is less widely loved within the Church itself. New York Times columnist Ross Douthat warns that although “Francis is charismatic, popular, widely beloved [… and] until this point, faced strong criticism only from the church’s traditionalist fringe, and managed to unite most Catholics in admiration for his ministry,” continuing down the progressive path he seems to be beating might “leave many of the church’s bishops and theologians in an untenable position, and it would sow confusion among the church’s orthodox adherents—encouraging doubt and defections, apocalypticism and paranoia (remember there is another pope still living!) and eventually even a real schism.”

But it’s not going to happen. Just like with cannabis criminalization, the old guard is dying off. Through the intercession of Pope Francis, the Catholic Church is becoming a holier house, filled with less ignorance, less prejudice, less hostility.

And why not? Mirabile dictu, the honest-to-God truth is that belief in the Holy Trinity does not entail a literal belief in Genesis or strict adherence to the laws of Leviticus. Barriers between Catholics and the acceptance of evolution and homosexuality are manmade, not divine constructs. The foremost experts in inflationary cosmology will freely admit they don’t know what started it all, just as the foremost experts in biology and neuroscience and psychology will say they can’t say whether we have souls.

The world at large is encouraged by Pope Francis. We like it when leaders—any leaders—have their hearts and minds in the right place. We like it when our traditions fit our times. We like to see society being reshaped along the lines of what most believers and nonbelievers alike picture when we imagine God: reasonable, merciful, just, accepting, loving, egalitarian, pragmatic; free from pettiness, disingenuousness, prejudice, and hatred.

And so we like Pope Francis. Hell, we love him. Yeah, we’re overenthusiastic. Yeah, we’re jumping the gun. But he’s humbly been going about his business for less than two years, and so far, so good. What might he do in a decade?

So, while it may be premature, if you’re thinking about popes as saints, may we present St. Francis of Argentina for your consideration?  Hey, if the halo fits.

Love and formic acid,
G.I.ant

 

(Photo credit: Korea.net / Korean Culture and Information Service)


The (com)modification of Renée Zellweger, and how blaming society misses the mark


“Maybe Renée Zellweger wouldn’t look like a complete stranger if society wasn’t such a judgmental dick,” read a female friend’s Facebook post. To me that’s a far catchier headline than “You Won’t Believe These 48 Things My Dog Did. #12 Alone Will Make You Smile Forever,” so I did a quick Google search to find out what was up with the gal many people remember most for being had at “hello.”

Even if you don’t know what Renée Zellweger has looked like for most of her public life, a single glance at a recent picture tells the story plainly: she has fucked herself up with botox and/or plastic surgery and is beginning to take on the unnatural physiognomic shrivel that will forever be a blemish on the look of American society in the decades immediately surrounding the start of the 21st century.

My friend’s post quickly attracted sympathetic comments from women similarly blaming society for Zellweger’s transformation. But they’re throwing the baby of personal responsibility out with the bathwater, as if societal standards of female beauty aren’t partly set by women, and as if those standards somehow sap all women—even those who have consciously exploited them to enrich themselves—of their agency.

Let’s call a spade a spade. Renée Zellweger may be a fine talent (her stellar performance in the brilliant film adaptation of Chicago emphatically proves the point), but her small fortune has come largely from her “50 Most Beautiful People“-worthy looks.  Scan her acting credits, and you find mostly roles built for beauty. Then there’s been all the “Look at how gorgeous I am!” photo spreads (Maxim, Glamour, etc.).  Zellweger has commodified her beauty, and done it profitably.

No-one should blame her for that. And this is one player who has enjoyed the game’s rewards despite the fact that her beauty is unorthodox. To some extent her particular pulchritude—a round and noticeably compact head, squinty face, “imperfect” skin, a mouth that perpetually seems to be sucking on something sour—enabled her to color a bit outside the lines of the Hollywood glamour template. She never fit the mold of a classic beauty; rather, she seemed more earthy, more natural. Whether she’s been as ripped as a pro athlete or as round as a healthily plus-sized model, whether her hair was lusciously lengthy or tomboy short, she’s met with approbation. She was never required to look the part of a typical Hollywood starlet; she got the treatment anyway.

But then she reached her mid-40s, and they say that film roles for older women don’t grow on trees. Certainly there is a plethora of data comparing the number of film roles for men and women, as well as how meaty or superficial those roles are. But almost inevitably those studies focus only on top-grossing films—generally the shallowest of the shallow. And if you’re surveying the lay of that fallow land, what do you expect to find?

This isn’t to diminish the fact that it really is easier for men to make a living in Hollywood than for women. But as Hyman Roth would say, this is the business Zellweger chose. And business has been good, so good that she has no need to work another day in her life.

Presumably, though, she wants to work—and not necessarily (just) for the money.  Perfectly understandable. Perhaps that’s part of why she is distorting before our eyes. Whatever procedure(s) she has undergone (there was much speculation that she could not have looked the way she did at last year’s Academy Awards without some cosmetic help beyond makeup), her choice seems to signal her desire to perpetuate an artificial model of aging, some unnatural standard of what it is to be beautiful.

That’s her prerogative.  It’s her face, after all, her career, her life.  She can do whatever she wants with it for whatever reasons she wants.  But like my Facebook friend, I feel a bit sad that she feels the need to go this route.

Where I part company with my friend is in simply blaming society for Zellweger’s choice. I agree that society is patriarchally sexist, and so I’m happy to echo my friend’s labeling of society as a dick. Society is a dick. But neither society at large nor Hollywood has fucked Renée Zellweger. If she’s fucked, she’s partly fucked herself. And while she may deserve sympathy for her insecurity or whatever other neuroses she harbors (as we all do; and here’s a news flash: aging is somewhat troubling for just about everybody), society didn’t exactly demand this bit of supposed beauty enhancement.

Not to mention the obvious but often overlooked:  society is a totality of constituent parts. Renée Zellweger has played her part in pop culture’s beauty game. She’s playing it still by being the latest in a line of women subjecting themselves to needle and knife (and in Zellweger’s case, telling the world that the difference in her appearance is simply that she’s healthier), women who play a part in setting the standard for girls who one day may do the same to themselves.

Blame society for Zellweger’s new look, if you like, but don’t forget her part in the whole business. We’re all influenced by our culture. Truly, some people are simply victims of their lot in life. That is not the Renée Zellweger story. If hers is a cautionary tale about the nefarious influence of pop culture, it’s also a narrative that touches on questions of personal responsibility.

As Mohandas K. Gandhi said, “If we could change ourselves, the tendencies in the world would also change.”  Easier said than done, to be sure.  But the more we (individuals, institutions) keep these words in mind, the better off we’ll be.

Love and formic acid,
G.I.ant


5 Ways to Improve the NFL Between the Lines


You don’t have to be a football fan to have gotten an earful lately about what’s wrong with the National Football League off the field.

This isn’t about that. This article focuses solely on what happens between the sidelines. And as great as NFL football is, several rules currently in play are hurting the integrity of what happens on the field. Here’s a list of changes that would make the great sport of NFL football that much greater.

1. Mandate that only players on the field can call timeout.

In the Week 2 matchup between the Jets and the Packers, with five minutes remaining and the Jets trailing 31-24, Jeremy Kerley made a fantastic catch of Geno Smith’s 37-yard pass for an apparent touchdown. The play was disallowed, however, when the officials ruled that a timeout had been called by Jets head coach Rex Ryan. It wasn’t Ryan, though, but Offensive Coordinator Marty Mornhinweg who was frantically trying to get play stopped and to whom the officials responded. By rule, the only person on the sidelines allowed to call timeout is the head coach. The officials got it wrong, but as they pointed out, with their eyes on the field they can’t very well turn around and confirm who’s calling timeout.

The problem is the rule that a timeout can be called from the sidelines at all. Not so long ago timeout could only be called by a player on the field—which, obviously, would be seen by the officials, since that’s where they’re looking. Just as obviously, coaches would signal to players to call a timeout—and that was even before the aid of a direct radio line to one of the 11 players on the field. Allowing coaches to call timeout from the sidelines is not only superfluous, but it has mucked up the game on occasion. Recall, for example, what happened in 2007 during the Week 13 matchup between the undefeated Patriots and the Ravens. With the Ravens up 24-20 and the Pats facing a 4th-and-1, Tom Brady was stuffed attempting a quarterback sneak, and all that was left was for the Ravens to run out the clock. But the officials ruled the play null and void because someone on the Ravens sideline had called a timeout just before the snap. That someone turned out to be Defensive Coordinator Rex Ryan. The Patriots went on to win the game 27-24 and became only the second team in NFL history to go undefeated—the first in a 16-game season. It’s an ugly blemish on the NFL’s beautiful history to think a record like that exists only because a stupid rule took away the ability for the game to be decided on the field.

Maybe Rex Ryan should start driving the bandwagon to rid the league of this rule? It’s obviously not doing his teams any good.

2. Let Al Michaels handle all replay reviews.

It’s easy to recall officiating teams getting it wrong after they’ve reviewed video, but I’m not so sure even the worst announcers in the broadcast booth ever have. I’m not sure why NFL officials have proven so bad at reviews, but history has shown that they are. And while it seems that this season’s move to have all replay reviews conducted from a headquarters in New York will be an improvement, in Week 2’s Seahawks-Chargers game the replay officials somehow missed Percy Harvin clearly stepping out of bounds on his way to the endzone, even though all scores are automatically reviewed. I know the officials got it wrong because the announcers told me so, and they provided video evidence that proved it beyond all doubt.

As with any job, give this one to the people who do it best. Since the NFL clearly can’t do this one right, outsource it. You know who’s never wrong in the world of the NFL? Al Michaels. If he’s unwilling to take on all that extra work, it’s probably okay to have network broadcast teams make the calls for their particular games. No, they’re not all Al, but this is one thing the rest of them (with the help of their production crews) almost always get right.

3. Properly differentiate between an actual pass attempt and when a quarterback’s hand is simply moving forward while holding the ball.

The infamous “tuck rule” play that propelled Brady’s Patriots to their first Super Bowl victory is one of the sport’s greatest travesties. Everyone even slightly familiar with football knew he fumbled; it took the supposed experts to muck it up with ludicrous technicalities and sin against the entire history of the sport.

Amazingly, the “tuck rule” wasn’t done away with until 2013, but there’s still something very wrong going on, as was evinced late in Week 2’s game between the Chiefs and the Broncos, as the former moved down the field to close in on what would have been a game-tying score. With mere minutes to play, DeMarcus Ware knocked the ball out of Alex Smith’s hand. But because Smith’s passing arm was angling in a forward direction—even though it was pinned to his side at the elbow and he was by no means trying to pass—the play was ruled an incomplete pass, with the broadcast team’s “officiating expert” (they all have them now) telling the announcers that this was absolutely the proper call. It was a proper fucking joke, and any official prior to 1990 who called that a pass might have found himself without a job before too long. You don’t need to go to officiating school to understand the concepts of forward pass and fumble; you certainly shouldn’t get your stripes at the expense of knowing which is which.

4. Get rid of this “defenseless receiver” bullshit.

Football is a rough sport. It’s designed that way. If a receiver goes to catch a pass, one option a defender has is to separate him from the ball. But lately the NFL has made going over the middle to catch a pass more aligned with flag football than tackle. The idea that a receiver has to be allowed to come down with a pass and take a couple of steps without being concerned with taking a hit is ludicrous. Imagine a receiver who goes high for a pass at the goal line. The defender should let him come down in the endzone unmolested?! That is the ultimate repercussion of consistently adhering to the “defenseless receiver” rule, and it has already come close to affecting the outcome of a game in 2014. During Week 3’s Bears-Jets matchup, Kerley caught a ball at the 2-yard line but was separated from it by a beautiful hit delivered to his body with the shoulder of a Bears defender. Nonetheless, officials flagged the defender for a hit on a defenseless receiver.

Once upon a time a receiver’s willingness to go over the middle to get a ball—knowing he was going to take a hit—was an invaluable attribute. Not all receivers are tough enough, physically and/or mentally. But in today’s NFL the distinction between those who will and those who won’t is fading fast, because the NFL has made it a far less risky proposition.

Don’t get me wrong. I don’t like seeing players get hurt. I’m generally okay with rule changes related to blows to or using the head. But the “defenseless receiver” concept goes far beyond this, as can be seen from the NFL rulebook, which prohibits a defender from “leav[ing] both feet prior to contact to spring forward and upward into his opponent,” as well as from in any way contacting “[a] receiver attempting to catch a pass; or who has completed a catch and has not had time to protect himself or has not clearly become a runner.”

That is not football: that is bullshit.

5. Do away with all recent rule changes designed solely to boost offensive production.

Part of the beauty of football is the balance of elements. Offense, defense, special teams.  Specialists on both sides of the ball.  A salary cap so that major markets can’t simply spend other teams into oblivion. Which wins championships: offense or defense?  Are you a passing or running team, hard-nosed or elegant?

In a shift that surely has Vince Lombardi spinning in his grave, the league he helped refine to its modern beauty has been bastardized by a fantasy-football culture populated by dilettante fans who care more about statistics than good football. Not so long ago passing for 4,000 yards in a season was a pantheonic achievement. But thanks to an increasing number of rule changes (some of which we visited above), the 4,000-yard passing season has gone the way of the 50-homerun season during the Steroid Era.

Some of this can be attributed to genuine innovation on the part of players and coaches. Teams of earlier eras could have run out of shotgun more often, offenses could have come to the line with multiple plays and audibled to the seemingly best one based on the defensive alignment in front of them, quarterbacks could have thrown more jump balls and back-shoulder fades; they just didn’t think of it.

But rule changes have given the offense multiple legs up. Why move the kickoff up to the 35-yard line, even though on the whole kickers have gotten stronger, not weaker? It’s to reduce the impact of the return game, leaving more yards on the field for the offense. Why make it so that a field goal on the first drive of overtime doesn’t end the game? It’s so you can squeeze in a few more drops of offense.

Every rule change that has been made solely to disfavor the defense should be repealed, including allowing radio communication between the sideline and the QB. Let’s keep the game on the field.

Speaking of QBs, perhaps no series of rule changes has done more to favor the offense than the progression toward what may end up as putting flags on the QB in lieu of allowing him to be tackled.

To be fair, the idea that quarterbacks are overly protected is nothing new. In 1978, Steelers linebacking legend Jack Lambert famously opined, “Quarterbacks should wear dresses.” But 1978 was a free-for-all on the QB compared to today, as many former NFL players have observed. On Week 4’s first play from scrimmage, for example, Redskins defensive end Jason Hatcher received a 15-yard roughing-the-passer penalty for what announcer Phil Simms—himself a former Giants QB—called “grazing the helmet of Eli Manning as he went by.” “A tough call,” Simms said, by which he clearly meant: not a fair deal for the defense.

The NFL rulebook is now so biased toward giving the offense free rein that “[i]t is a foul if a player initiates unnecessary contact against a player who is […] in the act of or just after throwing a pass.” In other words, by the letter of the law you could actually get flagged for touching a QB while he’s throwing a pass, let alone if your momentum carries you into him after he’s let it fly. Yes, “unnecessary” allows officials to exercise some discretion. But you tell me: what is “unnecessary contact” while a QB is in the act of throwing a pass? The logical inconsistency of such a notion highlights that this rule is all about pumping up the passing game.

***

NFL football is fantastic. But that doesn’t mean it should rest on its laurels. In some ways it’s been better. Therefore, there’s obvious room for improvement, even if parts of the blueprint can be read in the rear-view mirror.

Love and formic acid,
G.I.ant
