One
of the more interesting comments I received on a student evaluation was
in my Science Fiction Film course, with the student saying, "I'm glad I
took this course; now I understand why my parents find The Simpsons so much funnier than I do."
Good point.
Homer Simpson floating around in the space shuttle, snarfing up potato chips with schmaltzy music in the background, is mildly amusing. Homer Simpson in a shot-by-shot parody of the docking sequence in Stanley Kubrick's 2001: A Space Odyssey (1968), complete with Johann Strauss's Blue Danube waltz, is funny. And if you grew up on 2001 as a semi-sacred film "text," then a Simpsons parody of the docking sequence or (more so) Homer as the Star-Child is borderline blasphemous and very, very funny.
Similarly, one might chuckle at Lisa Simpson's conditioning Bart with electric shocks as aversion therapy for stealing cupcakes. Bart reaching up for cupcakes with frosting and red cherries with stems, desperately trying to overcome his conditioning, exactly like Alex reaching for a pair of perfectly formed model's breasts in Kubrick's A Clockwork Orange (1971) — Bart reaching for cupcakes topped with cherries pretty much where the model's nipples are: that cracked me up.
In my Satiric Film course, I showed a Reader's Digest version of the 1979 Monty Python's Life of Brian
(and for any studio lawyers reading this blog: I have long since Ceased
and Desisted in this exercise, so don't send me threatening letters).
The clips got some snickers, but only one student really laughed: the
one leftist who'd been to meetings of contemporary versions of, so to
speak, "The People's Front of Judea" — and not, definitely not "The Judean People's Front." If you've ever been to an interminable meeting of political radicals, you know how accurate is the satire of radicals in Life of Brian. Python's Brian includes
also highly effective satire of imperialists, gullible fanatics,
militant optimists, and law-and-order conservatives; and close to the
end, Brian gets to the
most concentrated, devastating critique I have ever encountered on
Liberals: Michael Palin's rosy-cheeked Roman officer (Nissus Wettus)
handing out crosses at a mass crucifixion, and trying to make everything
nice — while continuing to participate in an atrocity.
Anyway, "The first duty of a critic is to state the obvious," as I was
taught in graduate school, and we can pause here with the obvious point
that different people will get different jokes in movies depending on
which allusions and parodies they get, and whether or not they can
relate a comic bit to their own experience, eliciting (or not) "the
laughter of recognition."
From this obvious small point, however, critics go on to the larger
point that coming from different experiences, different people will see,
while watching the same movie, at least slightly different movies. And
as we get older and develop new experiences, we will see slightly
different movies each viewing.
And similarly with other works of art — and stuff more important than
art; but arguments over movies are usually, not always, but usually,
nonlethal, and familiar ("Everybody's a critic" — especially movie
critics).
So I'll continue on toward larger points with another movie from my
Satiric Film course, a film I included as one kind of "limiting case": Animal House,
by Harold Ramis et al. (1978), with the et al. including Chris Miller,
Dartmouth alum, class of 1962, member of Alpha Delta Phi social
fraternity, and author of the series Tales of the Adelphian Lodge, on which Animal House, set in 1962, draws.
Chris Miller is a satirist and a lot nastier than I am, but we are roughly contemporaries, and his AΔΦ isn't particularly like the fraternity chapter I ended up pledging but is very much like the TΔΦ chapter I walked out of during Rush Week. More to the point, I knew Doug Neidermeyer, the cadet commander of the Faber College ROTC Corps. That is, I knew a "Neidermeyer"
still in his larval state: the cadet lieutenant who was my platoon
leader my sophomore year of Army ROTC. So I laughed a laugh of
recognition with Neidermeyer and laughed a laugh of cruel satisfaction
when the "Where Are They Now?" portion of Animal House
had Neidermeyer killed by his own troops — fragged — in Vietnam. I had
also "wallowed in Watergate" and laughed at the notice of Greg
Marmalard's becoming a Nixon White House aide and ending up raped in
prison. (I am not as nasty as Chris Miller, but being a nicer person
than a satirist still leaves a lot of room for nasty; plus, when the
theater lights go down, we can all get pretty amoral.)
For my students through most of my teaching career, Animal House was a classic comedy. For my students near the end of my career, by the time one could teach a course in Satiric Film, Animal House was mostly old, politically incorrect, sexist, gross-out farce, and it took a good deal of explaining to my students ca. 2004 how Animal House in 1978 could have been mostly a sexist gross-out farce but also subversive satire.
Two final examples, from outside of classes: the first was my baptism in watching different movies while watching the same movie; the second was my final confirmation of the theory — at least before Metacritic came along and one could demonstrate the phenomenon a couple times a week.
Exhibit One: Franklin J. Schaffner, director; Francis Ford Coppola, first-listed writer; and George C. Scott, very memorable star: the 1970 movie Patton. By 1970, I had read and once or twice taught Dwight Macdonald's essay "My Favorite General" — and felt pretty sure the Scott movie had cleaned up Patton, most particularly omitting his anti-Semitism. Still, I saw Patton as a worthy successor to Shakespeare's Henry V
in giving a Machiavellian, objective view of a military winner. (If you
grew up on Laurence Olivier's World War II propaganda film Henry V, check out Olivier's cuts from
Shakespeare's script and one big addition: the Battle of Agincourt as a
romantic visual spectacular. Olivier did a very elegant script-editing
job to make Shakespeare's Henry V an unambiguous boy-scout hero.) At the
end of Patton, I gathered
up my stuff to leave the theater thinking about how beautifully the
film had shown the General to be a dangerous whack-job, but situated — destined? — to be exactly the right whack-job to lead the US Third Army
against German forces in Europe. After the War, however, Patton was
mostly just dangerous, including dangerous to important policies of the
United States (although I doubt he was assassinated for his views — however much that would make a hell of a movie).
As I started to walk out of the theater, I heard behind me what sounded
like a grandfather telling his grandson how good it was that Hollywood
had finally made an old-fashioned patriotic movie about a pure hero:
George S. Patton. That Richard Nixon might come to love the movie I had
seen didn't surprise me; but that old man and perhaps his grandson had
experienced a film very different from the one I saw and heard.
That was in Champaign-Urbana, Illinois in 1970. Much later, in 1997, I
was in Hamilton, Ohio — "The Heart of It All," USA-wise — watching Paul
Verhoeven's Starship Troopers
and eventually noticed I was the only one in the auditorium laughing.
And among the cast, only Neil Patrick Harris made it clear he understood
his character was among a group of "fresh-faced fascists" in a "twisted
space opera" that isn't "a sendup for the ages," as the USA Today
reviewer called it, but certainly a sendup. I was waiting for Harris's
young-20's Colonel Carl Jenkins to light a cigarette in a cigarette
holder, adjust his monocle, and say "Ve haf vays of makink der Bug
Brains talk …."
And we have arrived at serious issues.
I am a staunch member of "the reality-based community" and will stress that Starship Troopers
or any other movie — or anything else on the human scale — has real
existence, Out There, independent of our perceptions and opinions.
Meaning, however, is created in the interaction between a work of art
and its audience: which does not mean anything goes and all
interpretations are equal but that we have to be careful and responsible
in constructing meaning.
We have to watch and listen attentively and speak carefully.
We have to bring background information to the work and be aware of the
likelihood of our ignorance of other background.
We have to listen to other people's opinions and understand that we're
all likely to miss and to misunderstand aspects of a work.
And we have to think through our experience of the work and the experiences we bring to the work.
Or we can say, "To hell with it; that was fun, but it ain't worth
talking about." And then shut up. Or talk about the movie, but admit
cheerfully we are, as the picturesque expression has it, talking out of
our asses.
The background part (and lack thereof) is what can get scary, that and
the values people bring to movies, other art, politics. An American
audience should know when we're dealing with Fascism and the fascistic,
and Starship Troopers is
not exactly subtle. An American audience should catch on to, "Oh, I'm
being asked to enjoy a fascistic fantasy. I can do that — but I don't
approve of the upshot of the fantasy." And then, having that bit of
conflict, think about it.
There's nothing new here. Back around 1600, Elizabethan audiences
should have thought, Gee, Shakespeare's Henry V threatens to spit naked
infants upon pikes if Henry doesn't get his way with the French city of
Harfleur (see 3.3.3-43); how should I feel about that? Henry doesn't, as things work out — doesn't kill babies or massacre the people of Harfleur — but he says he would; and if, in the affairs of princes, results — as in winning — are all that count, shouldn't Henry spit a few French infants (etc.) if that's what it takes to win?
All things considered, General George S. Patton is one of the good guys
of World War II; but with this guy there are a lot of things to
consider. Indeed, should we complain too loudly if it turns out Patton
was assassinated shortly after the war? Old "Blood and Guts" was willing
to spill a lot of other people's blood to achieve US war aims in his
way. If we approve of Patton's philosophy and actions, should we be
very upset if his figurative blood was part of the cost of as much peace
as was had at the end of World War II?
All things considered with Starship Troopers,
even if our enemies are giant Bugs — should we engage in species-cidal
war against them if it's just possible some of our people provoked the
war? Should species-cide be a war aim, even if our opponents are literal
Bugs? And what if, in the real world, we come to see our enemies as
"bugs"? Is genocide OK if we are convinced our enemies are vermin?
What we bring to a work of art and "where we're coming from" will
greatly influence what we experience in the work and how we experience
it. The work, though, isn't passive in all of this: once we've
experienced the work, it is a new experience, one we bring to future
experiences. To some extent, we shape the film as we watch it; to a
lesser extent, it shapes us.
We need to be aware of how this works, or, minimally, at least that it's happening.
Literally (Word Usage: Literal and Figurative) [28 April 2013]
I'm going to start out with a kind of disclaimer. Shortly I'm going to talk a bit about a segment on The Colbert Report and note that it's kind of fan-boy esoteric. Since I don't often write about The Colbert Report, my reference below might imply that there's something unusual about a bit of allusive complexity on Colbert,
and I want to clarify that the segment I'll refer to is a tad more
esoteric than the Colbert-ian norm, but not much; indeed the one I'll
get to is barely in the same sport, let alone in the same league, as the
awesome nerdocity of the first J. R. R. Tolkien-geek smackdown between Colbert and James Franco (5 April 2011). The Colbert/Franco scherzo and fugue on Lord of the Rings
trivia reached levels of dorkoid esoterica I have never encountered,
and I am a member of the International Association for the Fantastic in
the Arts, have friends who have written books on Tolkien, and was named
first faculty advisor (as the first chapter prank) by the Miami
University Society for Creative Anachronism.
That being clear — I've hesitated to comment on misuses of the word
"literally" in part because Stephen Colbert said all that needed to be
said on the subject — for some audiences.
The occasion was one of the long series of Republican 2012 Presidential Primary Debates, one a couple weeks after President Obama announced the withdrawal of most US combat troops from Iraq. Candidate Rick Perry said in the debate that he'd return US troops to Iraq, lest we "see Iran […] move back in [into Iraq] at literally the speed of light." Colbert
sensed the figurative blood in the figurative water and figuratively
did whatever the technical term is for what sharks do when they take a
big bite out of a thrashing, wounded prey creature: “The speed of light," Colbert quoted. "Not figuratively, literally. […] Folks, forget nuclear weapons. Iran has developed the warp drive.
Those centrifuges were actually enriching dilithium crystals. And
unless we stop them, Captain Mahmoud Ahmad-kirk-ejad will soon be
getting it on with 72 space virgins.”
To truly, if figuratively, savor this moment, it helps to know that there seems to be no way that matter in our universe can move as fast as or faster than the speed of light; as one version of the T-shirt has it: "299,792,458 meters per second. It's not just a good idea, it's the law!" It also helps to know that science fiction writers fudge one way around the light-speed barrier by invoking theories of a space warp, where the shortest distance between two points in our universe can run through one or more higher dimensions, so that a spaceship at point A can get to point B at what looks like a speed faster than light.
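To put hedged numbers on the problem — a back-of-the-envelope sketch in Python, with rounded, illustrative distances of my own choosing, nothing from the film or the debate — here is how long even light itself needs to get anywhere interesting:

```python
# Back-of-the-envelope light-travel times -- a hypothetical illustration
# with rounded distances, showing why SF writers want a warp drive.

C = 299_792_458                         # speed of light, m/s ("it's the law!")
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # one Julian year
LIGHT_YEAR = C * SECONDS_PER_YEAR       # meters light covers in a year

destinations = {
    "Mars at closest approach (~5.6e10 m)": 5.6e10,
    "Proxima Centauri (~4.24 light-years)": 4.24 * LIGHT_YEAR,
    "Galactic center (~26,000 light-years)": 26_000 * LIGHT_YEAR,
}

for name, meters in destinations.items():
    seconds = meters / C                # travel time at exactly lightspeed
    print(f"{name}: {seconds:,.0f} s (~{seconds / SECONDS_PER_YEAR:.2f} years)")
```

Three-ish minutes to Mars is tolerable; four-plus years, one way, to the nearest star is what sends writers hunting through the higher dimensions.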
This idea became canonical for a generation with the Warp Drive, especially on classic Star Trek, the series featuring William Shatner as Captain James Tiberius Kirk. It made sense in the 1960s. One way to get FTL (Faster Than Light) travel was to have a chapter explaining theories of how it might be possible, as Arthur C. Clarke did in his novel of 2001: A Space Odyssey (1968). Another way was just to say, "Warp two, Mr. Sulu." If humans can break the Sound Barrier and go faster than sound at Mach 2 and 3 and all, it stands to whatever substitutes for reason while we watch a movie or TV show that going "Warp 2" or "Warp 3" will get us FTL. It's what Walt Disney called "The Plausible Impossible" (Disneyland 3.08, which is blocked on YouTube, so don't bother).
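And to quantify the Mach analogy: one long-standing fan convention — an assumption for illustration here, not anything stated on screen in the 1960s — puts classic-Trek speed at the warp factor cubed times lightspeed:

```python
# The Mach analogy, quantified with one oft-cited fan convention for
# classic Star Trek -- speed = (warp factor)**3 * c. An assumption for
# illustration only, not anything stated on screen.

C = 299_792_458  # speed of light, m/s

for warp in (1, 2, 3):
    times_c = warp ** 3                 # warp-factor-cubed convention
    print(f"Warp {warp}: {times_c}x lightspeed = {times_c * C:.3e} m/s")
```

Under that convention, "Warp 2, Mr. Sulu" buys a mere eight times lightspeed — plausible-impossible arithmetic, nothing more.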
And the Warp Drives on vessels on Star Trek were powered by "dilithium crystals," which have something to do with antimatter.
Further, the President of Iran at the time was Mahmoud Ahmadinejad, and
one set of Muslim beliefs has it that male martyrs to the faith will be
greeted and serviced in Paradise by seventy-two virgins — morphed into
"space virgins" by the old SciFi convention of "skify-ing up" a phrase
by just putting "space" before all manner of words.
Follow that?
Enough members of Colbert's audience could follow it that Colbert got a laugh, and the clip seems to be popular. So for some folks, for a while, a figurative stake has been put into the figurative heart of "literally."
It's not enough, though, and it is too much.
The Colbert routine isn't enough because of the limited demographics;
it's too much for a number of reasons, starting with all my uses above
of "figuratively."
We use figures of speech all the time, including the hyperbole of "all
the time." To say "figuratively" every time would, as the figure has it,
get old fast.
And the word in question, "literally," can get complicated.
In the section on his "[…] Theory of Symbols" in Anatomy of Criticism (1957), Northrop Frye contrasts literary symbols that are descriptive and refer to things outside the text, and those that are what he calls literal. As the Wikipedia article has it, "To Frye, literal
means nearly the opposite of its usage in common speech; to say that
something 'literally' means something generally involves referring to a
definition external to the text. Instead," in Frye's usage, "literal refers to the symbol's meaning in its specific literary situation while descriptive refers to personal connotation and conventional definition."
Uh, huh. Even among those of us who continued to like Frye after the
1960s, this usage of "literal" never caught on. But no less a serious
thinker about language than Northrop Frye used "literal" that way, and,
if you go back far enough, he had a point. The Oxford English Dictionary's
etymology for "literal" begins, "of or relating to literature
(beginning of the 14th cent.), of or relating to the ‘letter’ of a
text," — and only after the 1300s came to mean "of or relating to the
‘letter’ of a text, obtained by taking words and passages in their
primary or usual meaning" — i.e., what we usually mean by "literal" and
"literally": the words as they usually mean, not getting figurative
(metaphorical, allegorical, mystical — fancy schmancy) on their
figurative asses.
Fairly recently, but long before the 2012 elections, some rhetorically
daring people came upon the (bad) idea of using "literally"
figuratively, as in the OED's definition 6.c, noting a colloquial literally,
"Used in figurative or hyperbolic expressions to add emphasis or as an
intensifier: veritable, real; complete, absolute, utter" and giving as
their first example a quotation going back to a magazine article from
1857: "We hurried on to Baden Baden. Let no American send his son
thither if he have any penchant for the card-table or the roulette. It is a literal hell." The OED
editors immediately add that this usage is "Often considered irregular
in standard English, since it reverses the earlier sense ‘without
metaphor, exaggeration, or distortion’."
No shit?!
Baden Baden is a spa town in southern Germany, and it may've been a
very sordid place in the 1850s. Still, it wasn't hell or even,
literally, "a hell"; at worst, it could be seen as a very large
"gambling hell," in the manner of, say, Las Vegas.
Irregular usage, though, is no big deal, and no one should spend time attacking Northrop Frye for getting ingenious with "literal" or anyone still using the phrase "gambling hell." What is a big deal is "literally" as a hyperbolic intensifier. Nowadays we suffer from serious language inflation, and overstatement is becoming not just an issue of grammar but of politics and ethics.
"He's literally as bad as Hitler" is figurative language; the figure of
speech is hyperbole, overstatement — and unless that statement refers
to a mass murderer on the order of Genghis Khan, Tamburlaine, Stalin, or
Mao, the statement on its face — no argument necessary — is
overstatement that's massive and bordering on the obscene. Matthew White gives the death toll for World War II as some 66 million people and, under "Who usually gets the most blame," cites "the Axis, especially Hitler" (The Great Big Book of Horrible Things, p. [400]). Steven Pinker, in The Better Angels of Our Nature, passes along as at least plausible theories of "No Hitler, no Holocaust" and no Hitler, no World War II — at least not as the "hemoclysm" World War II became (208-9).
"As bad as Hitler" — literally? Cut the crap. Figuratively.
A Tale of Two Commercials: "1984" vs. "Agent Smith" (5 May 2013)
Super
Bowl Sunday, 22 January 1984, saw the one-time airing of one of the
most elegant pieces of cinematic art of which I am aware: Ridley Scott's
TV ad for the Apple Macintosh, "1984."
Into a gray, Orwellian, almost entirely male world runs a woman in color, including red shorts. Into a world of robotized/roboticized people and Stalinist-Modern semi-high-tech, she brings a hammer. And into a very literally Orwellian view-screen out of Nineteen Eighty-Four, she throws that hammer.
Right on, lady!
To combine a couple ads for the Macintosh, and add one widely held prejudice, the athlete in red helps introduce "for the rest of us" a computer — never named — that will free us from, say, IBM products, "So 1984 won't be like … 1984."
As anyone who knows me and/or my writing knows, I really despise advertising and urge all and sundry to turn away from commercials, and warn that every time ads get even a microsecond of our attention, that much the hucksters have won. Still, Ridley Scott's "1984" ad is freaking brilliant.
It is also, looking back — and allowing for a whole lot of contradictions and ironies and hypocrisies — a cultural marker of some significance.
Fast-forward thirty years, then back up a bit.
April 2013 saw widespread airing of what Lewis Murphy of a couple respectable ListServs describes as "the latest in a sequence of SF themed commercials from GE," that's General Electric, each featuring "various famous robots (or A.I.), including the Lost In Space robot and Data from Star Trek: TNG" — that's The Next Generation, the one with Patrick Stewart. "There is another [commercial] featuring KITT from Knight Rider."
The recent commercial appears to be titled "Agent of Good" on the GE home page as of 4 May 2013 and is listed as "GE Commercial — Agent of Good: Connected Hospitals" in its most popular YouTube incarnations. I just went with "Agent of Good" in my wiki on "The Human Machine Interface" and described it like this:
Agent Smith from the Matrix series […] pushes General Electric hospital products. Significant for recycling a major virtual/cybernetic villain into a spokesbeing who is believed by the GE advertising people appropriate to offer lollipops to a boy as Neo is offered the red pill or the blue pill by Morpheus in the initial MATRIX film: a scene of potential child seduction older viewers might find highly creepy. Smith, opening lines: "I have found software that intrigues me; it appears it is an agent of good," connecting GE hardware and software allowing virtual multiple-presence and connecting patients via data (sic) to "software, to nurses to the right people and machines." Images feature multiple Smiths (as in the later MATRIX movies) roaming a hospital, where we also see impressive contemporary medical machinery, with visible, but not stressed, GE logos. This technology, Smith tells us, is "Helping hospitals treat people even better, while dramatically reducing waiting time. Now a waiting room" — nearly empty waiting room shown — "is just a room." Title card: "BRILLIANT MACHINES ARE TRANSFORMING THE WAY WE WORK."
I commented to Lewis Murphy and the others on the Science Fiction Research Association List that General Electric's SF commercials make sense as an advertising campaign aimed at middle-age/middle-management folk who order hospital-size medical devices and at the secondary target of old-fogey doctors and patients who are uncomfortable about technological take-over.
The commercials' message, which I'll here format suitably for relatively subtle, subliminal screen titles: Machines Are Our Friends (like Robbie and Data and KITT); Even The Really Scary Threats Are Our Friends Now, or at least not too scary anymore (maybe like the comic Nazis on Hogan's Heroes).
Some
politically trivial geeks and academics aside, few people think much
about commercials. Among those in the target audiences who know and
remember Agent Smith, the warm and fuzzy nostalgia of the Matrix memories will mostly overpower any worries about the technologies that Agent Smith quietly endorses and powerfully symbolizes.
In its way, the Agent Smith commercial is as technically brilliant as Ridley Scott's "1984 (For the Rest of Us)" pushing Mac v. IBM. That Agent Smith/GE is the hero of this latest one indicates important changes, a couple of which are good.
Scott and the folk at AppleCorp had a large supply of gonads, gall, and chutzpah to appropriate George Orwell's dystopic vision in Nineteen Eighty-Four — and more gall to directly rip off the telescreen in Michael Radford's 1984 film Nineteen Eighty-Four (either that, or Radford ripped off Scott, or both followed Orwell very carefully — or there was one hell of a coincidence). Still, allowing for all those contradictions, ironies, and hypocrisies, AppleCorp and Scott were on the side of the angels in 1984 in pitting user-friendly, decentralized, and relatively democratic — in 1984 — little Macintosh against IBM-style computers and the IBM business model.
Today we know that web-based, "iTech," Little Brother technology is a major threat, but there is still much to be said in favor of attacks on Big Brother, and that is what we see in Scott's "1984"; however hypocritically, Scott's female little-David-Macintosh symbol smashes the telescreen of IBM-ish Big Brother.
The "Agent Smith" commercial supports GE — the Corporate Person and human people who gave us the politicized Ronald Reagan — and big-machine technology. The recent commercial "normalizes" technology that is omnipresent and invasive and eases us over our fears.
Medical software can indeed be "an agent of good": in a sense a "recuperated," rehabilitated Agent Smith. Big GE medical machines can also be good — and rationalizing and making more efficient the connections among the machines, software, and humans is mostly a good thing.
Mostly.
We should still fear Agent Smith and all he stands for and remain very, very cautious in dealing with the medTech wonders from GE and other huge corporate entities.
Big Brother still needs the occasional hammer thrown into his telepresent face. Agent Smith will never be unambiguously "an agent of good," someone to trust offering candy or life-determining decisions to children.
We may be forgetting such lessons, and the Agent Smith commercial may be a sign of that dangerous amnesia.
Labels: 1984, advertising, agent smith, apple, applecorp, blue pill, business, computers, general electric, ibm, macintosh, matrix, orwell, red pill, ridley scott, technology, television, video
Remembrance of Horrors Past (18 May 2013)
I'm
going to start sidling toward my topic with bragging about a relative
of mine you probably have never heard of. My cousin (of some degree) Joy
Erlichman Miller organized the Holocaust memorial in Peoria, Illinois,
and tried to make the body-count more understandable by collecting
buttons: eleven million of them. The strategy of collecting buttons is
brilliant, and, more to the point I'm slowly moving toward, the number
is correct. Humans aren't wired to understand deaths in even the
thousands, but the sight of millions of buttons can aid our
imaginations. More, having kids collect everyday items like buttons is a
good way to get them to relate to the extraordinary human costs of
slaughters such as the Nazi Holocaust.
The number, though, may also be unfamiliar to you. The Peoria committee used the figure of approximately eleven million murders, and they were wise to do so: both truthful to the best estimates, and politically prudent. Some five to six million Jews were murdered in the Nazi extermination programs, plus some five to six million Roma ("Gypsies"), Communists, homosexuals, unionists, and other "inferiors," or real or imagined enemies of the Reich. That adds up to eleven million people, approximately, not the more frequently heard figure of six million. Some six million Jews died, and even if the actual figure is "only" five million, it is a number to remember in itself and is central to the exterminations: "The Final Solution of the Jewish Problem" was the impetus for large-scale, systematic, routinized massacres. Still, if the Shoah is uniquely Jewish and unique in more than just the technical sense applicable to all historical events — if it's literally and absolutely unique, "sui generis," one of a kind — then the Shoah is of only limited usefulness for historical understanding: there aren't many lessons to be learned from a literally unique event. If it is "The Holocaust," and that is that, there is little to be learned beyond "Sh*t can really happen to the Jews." Using the eleven million figure teaches that once a program of genocide gets started, all sorts of people can be sucked in and destroyed. And that point is crucial: if the Shoah just happened to Jews, there's no reason non-Jews today should do much more than sympathize. Fitting the Hitlerian Holocaust into a larger pattern of massacres, as Hannah Arendt does in detail in The Origins of Totalitarianism, makes it historically and politically relevant for many people, and aids building "Never Again" coalitions.
Outside of the Peoria Holocaust Memorial and the reference to my cousin Joy, I expect most of what I've just said will be familiar and, with most folk who read a column by a Left-leaning Jew, unexceptional. It's also stuff I've said before (I did once teach a course titled "Massacres").
What's been getting to me lately is watching the last few episodes of the first season of The Borgias, listening a couple times each to Neal Stephenson's BAROQUE CYCLE books and to Steven Pinker's The Better Angels of Our Nature: Why Violence Has Declined — even going so far as to buy Pinker's book in paperback — plus reading and consulting hard-cover hard copy of Matthew White's The Great Big Book of Horrible Things: The Definitive Chronicle of History's 100 Worst Atrocities.
As The Borgias' first season moved toward a final tableau celebrating traditional family values, Renaissance aristocratic Machiavellian style, we meet Paolo, a definitely non-aristocratic stable hand, a nice kid who gets involved with abused-wife Lucrezia Borgia, eventually helping Lucrezia neutralize for a while her brutish warlord husband, and becoming Lucrezia's lover. Significantly here, even in the more innocent stages of Paolo's involvement with Lucrezia, he worries aloud about getting whipped, and by season's end we see and hear him whipped by his lord and get good indications that he was probably hanged shortly thereafter.
[CORRECTION (and SPOILER): Paolo was spared as it turned out, to be hanged instructively in the following season.]
Servants ca. 1500 were whipped, and the servants of an ignoble nobleman might indeed find themselves hanged for offenses less serious than conspiracy to cripple and cuckold their masters.
And not just servants had to fear the whip: soldiers and (notoriously) sailors, and to a lesser degree wives and children and, moving toward but not very far into the Enlightenment, lunatics in places like London's Bedlam Hospital. One can argue that it's not entirely progress that we beat children and the insane much less nowadays and drug them more; one can argue that the "smother love" and constant surveillance and supervision of privileged children isn't 100% superior to benign neglect with occasional brutality — but, come on! As Steven Pinker insists, in these areas we've made serious progress.
What such historical and history-based works as The Borgias and Better Angels make clear is that things have trended toward less violence over human recorded history. Reading them while thinking of what White calls the 20th-c. "hemoclysm" — i.e., blood deluge, the warfare and atrocities of 1914-1945 — one might conclude, however, and fear, that the trend is indeed toward a decrease of violence but that there can be perverse and sudden "regressions to the norm" (sic) — returns of barbarism — that can produce immense suffering.
Two points from such cogitations and my occasional fearful twitchings.
First, Stephenson's BAROQUE CYCLE covers, more or less, the period from the execution of Charles I of England in 1649 to the beginning of the Hanover dynasty in 1714. Stephenson is very clear that the worst of the human social pathologies he deals with was slavery, continuing in the East and, during this period, increasing in the Americas — with increasing British involvement. White estimates the suffering, in deaths alone, at 18.5 million for the Mideast slave trade from the 7th through the 19th centuries, and some sixteen million for the Atlantic slave trade, 1452-1807. (The two slave trades rank at #8 and #10 in White's list of our species' "One Hundred Deadliest Multicides," edging out, for instructive examples, the Conquest of the Americas and the First World War, both coming in at 15 million deaths.)
It is clear, however, that state-sanctioned and enforced slavery was the extreme of a continuum of cruelty, or, changing the image, an extreme area on a web of cruelty that permeated everyday life. Even as Jews should put the Shoah into a larger context of massacres, African-Americans should put American chattel slavery onto that continuum, or at the center of a figurative web of oppression, exploitation, and cruelty.
People can argue, and I do so, that racism developed in part to allow continued cruelty to Black people (and, later, Jews and Slavs and Roma) after it became increasingly unfashionable — war excepted — for Moderns to be cruel to people seen as people and even bad form to brutalize sympathetic nonhuman animals.
Pinker talks of "The Humanitarian Revolution" (ch. 4) that slowly came in with the Modern Era and the Enlightenment, increasing humanitarianism that included, eventually, the elimination of slavery. While slavery continued and was profitable, however, it needed justification, and it is no coincidence that racism came along to provide that justification: it was becoming increasingly "unacceptable," as we so weakly say — Not Done — to grossly abuse people (again, war-time enemies excluded); a theory was necessary that made Blacks less than people, and that theory was racism.
We need to be clear on "racism": that's an "-ism," an ideology, a theory, and one with a history. Bigotry is more or less natural to humans: a subset of the nastier parts of the "amity/enmity complex," to use Robert Ardrey's formulation; or "Let 'em all go to hell, / Except Cave Sev-enty-six!" in the formulation of Mel Brooks's 2000-Year-Old Man. To see the difference between bigotry and racism, and to date racism in England, note Thomas Rymer's argument against "Othello: A Bloody Farce" in his Short View of Tragedy (1693, ch. VII; reprinted in Frank Kermode's Four Centuries of Shakespearian Criticism [1974: 461-69]). In Othello, Shakespeare shows ample bigotry against the Moor: that "old black ram," "the thick lips." The bigotry, however, is within the world of the play. Rymer laments that the play remained highly popular with English audiences into his time in the late 17th century — popular in spite of what Rymer saw as its gross errors and absurdities. To start, Othello is not a properly Neoclassic play, but along with that error — Rymer was a militant neoclassicist — and relating to it, its hero just isn't, well, appropriate. In Othello, Othello is a general of the armies of Venice and, apparently, one of the great military leaders of Christian Europe. Rymer allows that the Venetian Republic hired foreign mercenaries for their armies,
But shall a Poet thence fancy that they will set a Negro to be their General; or trust a Moor to defend them against the Turk? With us a Black-amoor might rise to be a Trumpeter; but Shakespear would not have him less than a Lieutenant-General. With us a Moor might marry some little drab [= a whore], or Small-coal Wench: Shake-spear, would provide him the Daughter and Heir of some great Lord, or Privy-Councellor [Othello elopes with the daughter of a senator]: And all the Town should reckon it a very suitable match. […] Nothing is more odious in Nature than an improbable lye [= lie]; And, certainly, never was any Play fraught, like this of Othello, with improbabilities. (Kermode volume, page 462)
There was bigotry aplenty in Shakespeare's real-world England, from prejudice against to loathing of foreigners and others, but not enough animosity against Blacks to keep English audiences from sympathizing with "black Othello"; by the time we get to Rymer, something has changed.
That "Humanitarian Revolution" was getting started, and for screamingly obvious commercial reasons Blacks from Africa, for the consciences of many people, had to be excluded for the circle of humanity. There was a great deal of money to be made from the slave trade and from the stolen labor of slaves to produce high-profit commodities like tobacco, sugar, and rum. If we allow sugar as a "food-drug" — and no less an authority than Sidney W. Mintz says we should do so — then the institution of Black chattel slavery in the New World came about in large part because there was a lot of money to be made then, as now, pushing drugs (Sweetness and Power: The Place of Sugar in Modern History, 1985).
And, of course, before the more northern European competition got into the tobacco, sugar, and rum rackets, New World enslavement of Indians and, later, Blacks had served a lust for lucre more directly in Spanish America in the mining of gold and silver.
Most of our European ancestors most of the time dealt with horrors like the slave trade the way we Americans today deal with exploited labor in, say, the manufacturing of our clothing: they ignored it. I don't ask how I can order sweatpants online for six bucks a pair; if you had high-flown English ancestors ca. 1692, they didn't ask too many questions about the sugar in their coffee or hot chocolate (or, for that matter, ask inconvenient questions about the coffee or chocolate).
More important, though — and, finally, my point here — is the one stressed by Steven Pinker: how much cruelty for how long was not actively ignored but seen casually and just accepted as part of the "warp and woof," deeply woven into, the web of everyday life.
"The wogs begin at Callus," as the stereotypical racist Colonel Blimp's of Great Britain used to say: i.e., the range of the inferior peoples of the Earth began as soon as one crossed the English (by God!) Channel and disembarked at the carefully mispronounced French port of Calais. That attitude, however — hell, racism for some of them! — was progress: for most high-born English for most of their history, inferiors started a whole lot closer to home than France, including in one's home with one's servants or (if male) wife and (for both parents) children. One could feel downright righteous birching one's kids bloody, to beat hell and the devil out of them; and, of course, even the lower orders could watch for entertainment the torture of animals — bear-baiting, bull baiting, ratting — or condemned criminals (if you couldn't afford to pay off the hangman for a longish drop, "hang by the neck until dead" could take a long time).
So Black African slavery was horrific, but its horrors were tolerated as long as they were, not only because of racism but also because of a general casualness about cruelty.
If we are to be serious and effective about "Never Again," Jews should remind all and sundry that the Hitlerian Holocaust was emphatically not limited to Jews and fits into a larger tradition of massacres: far from unique, the Shoah is an instructive extreme on a continuum of atrocities. American Blacks and Africans should remind people that the Atlantic slave trade was part of a long tradition of murderous exploitation of Africans and others: including at one time the enslavement of just about anyone who could be taken prisoner in war or stolen.
Americans who hear of atrocities and say, "It can't happen here" forget that chattel slavery, for one very big thing, did happen here, until 1865, as did the terrorism of the Ku Klux Klan into my lifetime. And this is not frightening so much because we Americans are particularly evil but because we are well within the normal range of humanity. As Stephenson, Pinker, White, and the makers of The Borgias make clear, routine, banal, horrible cruelty is part of the repertoire of human behavior.
So is the capacity for good, those "Better Angels of Our Nature." If Pinker is right, those Better Angels, our inclinations to the good, have been trending upward since the Bronze Age; as the reintroduction of slavery in the Early Modern period indicates and the 20th-c. hemoclysm drives home, the tendency is far from inevitable. Indeed, often it's just "Sh*t happens"; but far too often "sh*t" is done to all sorts of people, and emphatically not just Jews and Blacks. So, to repeat a useful cliché, following Pinker: let us be hopeful, but also ready to aid the exploited, oppressed, and abused, not just out of decency, but because, always, any of us can be on the list — and for most of human history, most people were.
The number, though, may also be unfamiliar to you. The Peoria committee used the figure of approximately eleven million murders, and they were wise to do so: both truthful to the best estimates, and politically prudent. Some five to six millions Jews were murdered in the Nazi extermination programs, plus some five to six million Roma ("Gypsies"), Communists, homosexuals, unionists, and other "inferiors," or real or imagined enemies of the Reich. That adds up to eleven million people, approximately, not the more frequently heard figure of six millions. Some six million Jews died, and even if the actual figure is "only" five million, it is a number to remember in itself and is central to the exterminations: "The Final Solution of the Jewish Problem" was the impetus for large-scale, systematic, routinized massacres. Still, if the Shoah is uniquely Jewish and unique in more than just the technical sense applicable to all historical events — if it's literally and absolutely unique, "sui generis," one of a kind — then the Shoah is of only limited usefulness for historical understanding: There aren't many lessons to be learned from a literally unique event. If it is "The Holocaust," and that is that, there is little to be learned beyond "Sh*t can really happen to the Jews." Using the eleven million figure teaches that once a program of genocide gets started, all sorts of people can be sucked in and destroyed. And that point is crucial; if the Shoah just happened to Jews, there's no reason non-Jews today should do much more than sympathize. Fitting the Hitlerian Holocaust into a larger pattern of massacres, as Hannah Arendt does in detail in Origin of Totalitarianism, makes it historically and politically relevant for many people, and aids building "Never Again" coalitions.
Outside of the Peoria Holocaust Memorial and the reference to my cousin Joy, I expect most of what I've just said will be familiar and, with most folk who read a column by a Left-leaning Jew, unexceptional. It's also stuff I've said before (I did once teach a course titled "Massacres").
What's been getting to me lately is watching the last few episodes of the first season of The Borgias, listening a couple times each to Neal Stephenson's BAROQUE CYCLE books and to Steven Pinker's The Better Angels of Our Nature: Why Violence Has Declined — even going so far as to buy a Pinker's book in paperback — plus reading and consulting hard-cover hard copy of Matthew White's The Great Big Book of Horrible Things: The Definitive Chronicle of History's 100 Worst Atrocities.
As The Borgias first season moved toward a final tableaux celebrating traditional family values, Renaissance aristocratic Machiavellian style, we meet Paulo, a definitely non-aristocratic stable hand, who's a nice kid who gets involved with abused-wife Lucrezia Borgia, eventually aiding Lucrezia neutralize for a while her brutish warlord husband, and becoming Lucrezia's lover. Significantly here, even in the more innocent stages of Paolo's involvement with Lucrezia, he worries aloud about getting whipped, and by season's end we see and hear him whipped by his lord and get good indications that he was probably hanged shortly thereafter.
[CORRECTION (and SPOILER): Paolo was spared as it turned out, to be hanged instructively in the following season.]
Servants ca. 1500 were whipped, and the servants of an ignoble nobleman might indeed find themselves hanged for offenses less serious than conspiracy to cripple and cuckold their masters.
And not just servants had to fear the whip: soldier and (notoriously) sailors, and to a lesser degree wives and children and, moving toward but not very far into the Enlightenment, lunatics in places like London's Bedlam Hospital. One can argue that it's not entirely progress that we beat children and the insane much less nowadays and drug them more; one can argue that the "smother love" and constant surveillance and supervision of privileged children isn't 100% superior to benign neglect with occasional brutality — but, come on! As Steven Pinker insists, in these areas we've made serious progress.
What such historical and history-based works as The Borgias and Better Angels make clear is that things have trended toward less violence over human recorded history. Reading them while thinking of what White calls the 20th-c. "hemoclysm" — i.e., blood deluge, the warfare and atrocities 1914-1945 — one might conclude however, and fear, that the trend is indeed toward a decrease of violence but there can be perverse and sudden "regressions to the norm" (sic) — returns of barbarism — that can produce immense suffering.
Two points from such cogitations and my occasional fearful twitchings.
First, Stephenson's BAROQUE CYCLE covers, more or less, the period from the execution of Charles I of England in 1649 to the beginning of the Hanover dynasty in 1714. Stephenson is very clear that the worst of the human social pathologies he deals with was slavery, continuing in the East and, during this period, increasing in the Americas — and with increasing British involvement. White estimates the suffering in deaths alone of the slave trade alone at 18.5 million for the Mideast slave trade from the 7th through the 19th centuries, and some sixteen million for the Atlantic slave trade, 1452-1807. (The two slave trades rank at #8 and #10 of White's ranking of our species' "One Hundred Deadliest Multicides," edging out, for instructive examples, the Conquest of the Americas and First World War, both coming in at 15 million deaths.)
It is clear, however, that state-sanctioned and enforced slavery was the extreme of a continuum of cruelty, or, changing the image, an extreme area on a web of cruelty that permeated everyday life. Even as Jews should put the Shoah into it a larger context of massacres, African-Americans should put American chattel slavery onto that continuum, or at the center of a figurative web of oppression, exploitation, and cruelty.
People can argue, and I do so, that racism developed in part to allow continued cruelty to Black people (and, later, Jews and Slavs and Roma) after it became increasingly unfashionable — war excepted — for Moderns to be cruel to people seen as people and even bad form to brutalize sympathetic nonhuman animals.
Pinker talks of "The Humanitarian Revolution" (ch. 4) that slowly came in with the Modern Era and the Enlightenment, increasing humanitarianism that included, eventually, the elimination of slavery. While slavery continued and was profitable, however, it needed justification, and it is no coincidence that racism came along to provide that justification: it was becoming increasingly "unacceptable," as we so weakly say — Not Done — to grossly abuse people (again, war-time enemies excluded); a theory was necessary that made Blacks less than people, and that theory was racism.
We need to be clear on "racism": that's an "-ism," an ideology, a theory, and one with a history. Bigotry is more or less natural to humans: a subset of the nastier parts of the "amity/enmity complex," to use Robert Ardrey's formulation; or "Let 'em all go to hell, / Except Cave Sev-enty-six!" in the formulation of Mel Brooks's 2000-Year-Old Man. For seeing the difference between bigotry and racism, and for dating racism in England, note Thomas Rymer's argument against "Othello: A Bloody Farce" in his Short View of Tragedy (1693, ch. VII; reprinted in Frank Kermode's Four Centuries of Shakespearian Criticism [1974: 461-69]). In Othello, Shakespeare shows ample bigotry against the Moor: that "old black ram," "the thick lips." The bigotry, however, is within the world of the play. Rymer laments that the play remained highly popular with English audiences into his time in the late 17th century — popular in spite of what Rymer saw as its gross errors and absurdities. To start, Othello is not a properly Neoclassic play, but along with that error — Rymer was a militant neoclassicist — and relating to it, its hero just isn't, well, appropriate. In Othello, Othello is a general of the armies of Venice and, apparently, one of the great military leaders of Christian Europe. Rymer allows that the Venetian Republic hired foreign mercenaries for their armies,
But shall a Poet thence fancy that they will set a Negro to be their General; or trust a Moor to defend them against the Turk? With us a Black-amoor might rise to be a Trumpeter; but Shakespear would not have him less than a Lieutenant-General. With us a Moor might marry some little drab [= a whore], or Small-coal Wench: Shake-spear, would provide him the Daughter and Heir of some great Lord, or Privy-Councellor [Othello elopes with the daughter of a senator]: And all the Town should reckon it a very suitable match. […] Nothing is more odious in Nature than an improbable lye [= lie]; And, certainly, never was any Play fraught, like this of Othello, with improbabilities. (Kermode volume, page 462).
There was bigotry aplenty in Shakespeare's real-world England, from prejudice against to loathing of foreigners and others, but not enough animosity against Blacks to keep English audiences from sympathizing with "black Othello"; by the time we get to Rymer, something has changed.
That "Humanitarian Revolution" was getting started, and for screamingly obvious commercial reasons Blacks from Africa, for the consciences of many people, had to be excluded for the circle of humanity. There was a great deal of money to be made from the slave trade and from the stolen labor of slaves to produce high-profit commodities like tobacco, sugar, and rum. If we allow sugar as a "food-drug" — and no less an authority than Sidney W. Mintz says we should do so — then the institution of Black chattel slavery in the New World came about in large part because there was a lot of money to be made then, as now, pushing drugs (Sweetness and Power: The Place of Sugar in Modern History, 1985).
And, of course, before the more northern European competition got into the tobacco, sugar, and rum rackets, New World enslavement of Indians and, later, Blacks had served a lust for lucre more directly in Spanish America in the mining of gold and silver.
Most of our European ancestors most of the time dealt with horrors like the slave trade the way we Americans today deal with exploited labor in, say, the manufacturing of our clothing: they ignored it. I don't ask how I can order sweatpants online for six bucks a pair; if you had high-flown English ancestors ca. 1692, they didn't ask too many questions about the sugar in their coffee or hot chocolate (or, for that matter, ask inconvenient questions about the coffee or chocolate).
More important, though — and, finally, my point here — is the one stressed by Steven Pinker: how much cruelty, for how long, was not actively ignored but seen casually and just accepted as part of the "warp and woof" of everyday life, deeply woven into its web.
"The wogs begin at Callus," as the stereotypical racist Colonel Blimp's of Great Britain used to say: i.e., the range of the inferior peoples of the Earth began as soon as one crossed the English (by God!) Channel and disembarked at the carefully mispronounced French port of Calais. That attitude, however — hell, racism for some of them! — was progress: for most high-born English for most of their history, inferiors started a whole lot closer to home than France, including in one's home with one's servants or (if male) wife and (for both parents) children. One could feel downright righteous birching one's kids bloody, to beat hell and the devil out of them; and, of course, even the lower orders could watch for entertainment the torture of animals — bear-baiting, bull baiting, ratting — or condemned criminals (if you couldn't afford to pay off the hangman for a longish drop, "hang by the neck until dead" could take a long time).
So Black African slavery was horrific, but its horrors were tolerated as long as they were, not only because of racism, but also because of a general casualness about cruelty.
If we are to be serious and effective about "Never Again," Jews should remind all and sundry that the Hitlerian Holocaust was emphatically not limited to Jews and fits into a larger tradition of massacres: far from unique, the Shoah is an instructive extreme on a continuum of atrocities. American Blacks and Africans should remind people that the Atlantic slave trade was part of a long tradition of murderous exploitation of Africans and others: including at one time the enslavement of just about anyone who could be taken prisoner in war or stolen.
Americans who hear of atrocities and say, "It can't happen here" forget that chattel slavery, for one very big thing, did happen here, until 1865, as did the terrorism of the Ku Klux Klan into my lifetime. And this is not frightening so much because we Americans are particularly evil but because we are well within the normal range of humanity. As Stephenson, Pinker, White, and the makers of The Borgias make clear, routine, banal, horrible cruelty is part of the repertoire of human behavior.
So is the capacity for good, those "Better Angels of Our Nature." If Pinker is right, those Better Angels, our inclinations to the good, have been trending upward since the Bronze Age; but as the reintroduction of slavery in the Early Modern period indicates and the 20th-c. hemoclysm drives home, the trend is far from inevitable. Indeed, often it's just "Sh*t happens"; but far too often "sh*t" is done to all sorts of people, and emphatically not just Jews and Blacks. So, to repeat a useful cliché, following Pinker: Let us be hopeful, but also ready to aid the exploited, oppressed, and abused not just out of decency, but because, always, any of us can be on the list — and for most of human history, most people were.
Labels:
animals,
belief/religion,
business,
cruelty,
family,
feminism,
food/drink,
gay/lesbian,
genocide,
holocaust,
massacres,
matthew white,
politics,
slave trade,
slavery,
steven pinker,
television,
torture
High-Speed Car Chases (2 June 2013)
I just watched a Fast and Furious movie — it was the 2013 edition, but that's irrelevant — and then Now You See Me, a somewhat more thoughtful film, although that's not saying a whole lot.
Anyway, I enjoyed both movies, and the previews for similar films, and I got thinking about a lot of movies I've enjoyed less because I've been bored by the car chases. And then I got thinking of my life in the greater Cincinnati area 1971-2007 and about real-world high-speed police chases.
And then I got pissed off at the makers of many "action" films.
The web is unfortunately overstocked with troll-attorneys trawling for business, but try a search for actual "high-speed police chase" and accusations of "wrongful death." What popped up first in my search was the story of how "Kelly Spurlock, the widow of NASA engineer Darren Spurlock who was killed when he crossed the path of a high-speed police chase in 2008, is moving forward with her wrongful death lawsuit against the fleeing driver, the City of Huntsville [Alabama], and three officers who participated in the chase" — apparently a "high-speed chase at midday" in a busy area "where it […] put other motorists at risk."
The Cincy case that comes up early on Google is the recent one reported by the AP with the lead: "The wife of [Mohamed Ould Mohamed Sidi,] a Cincinnati cab driver killed in a crash at the end of a high-speed police chase" in March of 2011 "is suing the city." The short form of the story ends with the sentence, "Cincinnati Solicitor John Curp said the city is not responsible for the criminal acts of others." Now it is certainly true that the City of Cincinnati is not responsible for the criminal acts of its "civilian" residents, but there remains the question of the potential responsibility of the City if three of its police officers "were negligent and caused," fairly directly, "Sidi's death."
(The "civilian" isn't part of what I'm quoting; it's in "scare quotes" because I recall the chuckles when I first heard cops talk of "civilians," meaning their non-cop fellow citizens. That was during The Troubles in the spring of 1970, on a college campus in central Illinois and far, far away from the slaughter of the Vietnam War, where there were more old-fashioned varieties of civilians and combatants. This encounter with a new word-usage occurred at a meeting between police and protesters arranged by local Christians who took their faith seriously ["Blessed are the peacemakers"]. We protesters chuckled and then both groups adopted the usage: "'civilians', plural noun: neutrals at demonstrations or just in town, not cops and not protesters — folks neither out protesting nor policing the protests." But I digress, though relevantly.)
It may be hidden on the web — at least without LexisNexis — but very strong in my memory is the debate over high-speed police chases in Cincinnati occasioned by one that killed, injured, and/or endangered a mother and child. My memory is of a baby in a baby carriage as the mother pushed it across the street, resulting in legal action where the cops were cleared and so was the City — although the City, at least, undoubtedly struck some deal out of court.
The point here is the debate, which included some older police officers' complaining in public about cop cowboys and a tendency among their younger colleagues to substitute for "Serve and Protect," "Get the Motherf*ckers!"
I paraphrase, or at least I paraphrase what went into the media, but some older cops did criticize strongly a kind of macho cop culture. The accusation was general, but as the debate went on, at least one cop blamed COPS, the TV show, and nowadays I'll blame even more the techno-porn speed-worship of the high-speed car-chase movie, especially when those chases are engaged in by officers of the law. Cops and many other armed agents of the State are sworn to serve and protect; and they have at their disposal such low-key technological wonders as radio, not to mention nowadays traffic cameras and computers and devices for tracking down an escaped suspect.
Some place along the line, I want to see a movie where some sympathetic hot-shot cowboy cop buddies hit that baby carriage, kill the little girl therein (and her new puppy) and have to live with that the rest of their lives. I want to see some movies where the camera goes back to a flipped car and gets some medium-duration fairly close shots of what a real-world-style traffic disaster actually does to the human body.
I came out west mostly for the climate, but in part to live on the edge of the film industry and as much as I can whore myself to Hollywood. So far — in my usual joke — I've made it only to Chicago chippy and Toronto trollop, with some hope of getting to Burbank bimbo; and, indeed, I lust after the sort of resources that allow filmmakers to execute a high-speed chase sequence. Also I taught literature for forty years and film long enough to know that most people can differentiate quite well between real life and power fantasies.
But come on, guys!
Those chase sequences normalize policemen behaving badly, irresponsibly; they set up a kind of perverse ideal of disregard for everyday people. This fits into a larger pattern of normalizing bad behavior by cops and other sworn agents of the State — more on that elsewhere — and such normalizing (romanticizing, idealizing) is not right.
The City of Cincinnati is not responsible for the criminal acts of criminals, and moviemakers are not responsible for stupid and dangerous acts by people who don't know fantasy when they see it. The City of Cincinnati is, however, responsible at least in part for the actions of its employees and agents; American cities and film-makers, story-tellers and artists are responsible when they encourage bad behavior by peace officers by presenting as an ideal the shift from the wimpoid "To Serve and Protect" to macho (and nowadays macha), "Get the Motherf*ckers! (And if Some 'Civilians' Get Smashed, Well, They're Not Our People)."
"Words Mean": Precision, Clarity, and Royal Pudding (13 July 3013)
Royal...Pudding...
Rich, rich, rich in flavor!
Smooth, smooth, smooth as silk,
More food en-er-gy than fresh, whole milk!
Lewis Carroll's Humpty Dumpty could make a word mean just what he chose it to mean — neither more nor less. Few of us, though, have such power, and most of us, most of the time, use words to communicate and have to stick to generally accepted definitions.
Still, many were the times my students (and older folk who should know better) said, "Oh, you know what I meant!", and sometimes I did know. "Pedagogical confusion" is an old trick, where the teacher pretends not to understand what the student is saying and prods the little sod … uh, encourages the pupil into clarifying the point. Sometimes, though, no, I didn't understand; sometimes I'd been grading essays into the small hours of the morning and had trouble even seeing the words on the page. Sometimes, yeah, I understood — but I understood because I knew the material, knew the context, knew the student's dialect, and was a professional with a lot of experience and someone paid to read and figure out what authors mean; it's really arrogant for most writers to place such expectations on normal people, working for free.
Sometimes, though, people expect you to just know what they mean and not challenge them because they're passing on received wisdom and speaking in clichés. A few such folk have an agenda, as your boss or coach very well may when urging from you a "110% commitment"; that 110% is a figure of speech and a cliché, but it's also a veiled demand for a blank check for your time and effort.
Most clichés, though, are less pernicious than bosses of various sorts setting themselves up to demand way more of your time and effort than you should reasonably give. (If you give more than, say, 33.3% commitment to your job or 22.7% to a sports team, you're probably short-changing your family, community, and yourself.)
I was generally spared motivational and inspirational balderdash, and most often just got innocent lemming-thought, like a fair number of my more sentimental students' notion that Romeo and Juliet will be together "forever."
I would quote Hamlet at them and suggest that "The rest is silence" is probably the best way to think about deaths at the end of tragedies. Romeo and Juliet, as tragedies go, is highly romantic and upbeat, and ends like an Italian comedy gone horribly wrong: a new and better world coalesces around a young married couple — Verona will finally know some peace — but the newly-weds are dead. So I would repeat back to my students, "they will be together forever" and ask, where?
Romeo and Juliet are Catholics, you know, and so were most of my students; the earliest reference to Montagues and Capulets is in Dante's Purgatorio (canto 6, lines 106-08). If Catholics get the doctrine right, and Dante correctly supplied the gory details, Romeo and Juliet are together forever in the middle ring of the Seventh Circle of Hell, as trees or bushes eaten by Harpies.
If the Protestants are right, R&J are less graphically burning in the lake of fire.
Taking a more materialist or spiritualist view, the question is still, if "You can never kill love / When love is true" — as The Kingston Trio affirmeth unto this day on my old tapes — where does love live eternally? Unless you're a Buddhist, the fashionable cosmologies open to you insist on a doomed universe. "Some say the world will end in fire, / Some say in ice," but there's wide agreement that the world will end. Mythically, there's the Christian Apocalypse — possibly that "fire next time" — or Nordic Ragnarok for the ice alternative. Or the universe peters out through entropy or comes to a fiery pop out of existence with The Big Crunch. Take your pick: if you want to see love lasting forever, it has to be outside our universe.
Even a saved pair of lovers up in heaven will have their love subsumed in an infinite divine love, or they will be in a variety of hell. Think seriously for a moment on heaven as an eternal family reunion. Then wait for a Rod Serling voice-over reminding you that an eternity of anything not infinitely interesting will become torture. However much Juliet et al. think their loves infinite, eternity had better be an "eternal Now," 'cause a century or so of human time with Romeo, and she's going to try to strangle him (if he doesn't snap first).
Romeo and Juliet's love is a bright flash, tragically beautiful, tragically brief; it's of infinite value, if you like that idea and believe, but also "sudden; / Too like the lightning, which doth cease to be / Ere one can say, 'It lightens'" (2.2.119-20): it's an awesome fireworks of "fire and powder, / Which, as they kiss, consume" (2.6.10-11).
Let it go at that, kids, I'd say; it's plenty.
All of us some of the time and some of us much of the time just don't think much about what we're saying. And that's OK, so long as we note that when we get serious about, say, love and marriage, the legal formula is precisely, clearly, and with hope but minimal bullshit, "until death do us part."
In less exalted matters, and more generally, precision and clarity can still be difficult.
I recently contributed to a tribute to my colleague Brit Harwood upon his retirement from the Miami University (Oxford, OH) English Department. My comments included the point that he improved our vocabularies.
Brit was into precision, and his suggested wordings for rules and policies and reports and such were very, very precise — and sent us to the dictionary. On at least one occasion, the trip, so to speak, was worth it. Brit taught us the word "cognizable," which is a very useful word for white-collar peasants and the professorial proletariat: in an extended meaning, it's a claim that actions by one's superiors can be reviewed and potentially blocked.
I didn't know "cognizable" but do now. Similarly with a number of words Brit used, including in communicating with the Department's Promotion and Tenure Committee. Some of Brit's totally precise terms were totally unclear to your average full professor in English at a respectable university, though not a first-rate one.
Which brings us to another question of precision and clarity.
The closest thing I ever had to an argument with the Chair of my Department came when I was making claims for more appreciation of my scholarship or some such thing, and my Chair said to me, "Oh, come on, Rich! You've got a second-rate mind." To which I replied, "Yeah, but I'm at a third-rate university," an assertion with which he didn't argue.
My boss and I were both PhDs from the University of Illinois at Urbana, which is a second-rate school: usually in the top twenty-five world-wide, but not in the top ten or even up there with my other alma mater, Cornell, which ranked thirteenth in the Shanghai Rankings of 2012. If Miami University has ever appeared in the top 500 in the Shanghai Ranking, then I missed it.
Talking about Miami U as "third-rate" was precise (arguably even generous) but maybe misleading. Something like "third-rank" might be better, ranking just behind, say, the academically major Big Ten schools. Miami University is a good school and nearly a perfect "fit" for many of our students; it's a place where an undergraduate and even many graduates can get a good education, usually more than adequate for their needs and abilities.
That the whole ranking system might be legitimately characterized as having "all the intellectual respectability of a pecker contest": that's a different issue. Still, if you can't make the top 500 on an international list, calling yourself a "top-rank" institution is either really parochial — "Best School in the Miami Valley!" — or it's hype, or lying.
My favorite example in this area combines precision with obfuscation disguised as clarity, and with gleeful dishonesty.
I refer to a Royal Pudding jingle from my childhood, the one I quote as my headnote and can sing from memory, and the associated Royal Pudding ads.
The ad copy touts that Royal Pudding has "74% more food energy than fresh whole milk" in large red type and informs you that that extra energy is "in every delicious serving!" That last part is undoubtedly true, given that it'd be 74% more food energy in any size serving. The ad continues in smaller, black type, "Milk is Nature's best food — as every mother knows. It's needed for strong, sturdy bodies, for growth, for vitality. But a serving of ROYAL PUDDING gives you all the benefits of the milk you make it with plus 74% more Food Energy," and then goes on with useful and straightforward reminders that you can buy Regular ROYAL or Instant ROYAL.
The Royal Pudding people got into trouble for misleading advertising.
Regular or Instant, Royal or Jell-O or homemade, you make pudding by adding to milk a thickening ingredient, flavoring (and coloring), a touch of salt, and sugar or sugars — lots of sugar.
Sugar is the point with most desserts — well, and fat, which the whole milk supplies — and Royal Pudding delivered sugar, and with it a whole bunch of food energy.
Food energy in the US and UK is measured in "food calories" or "dietary calories" or kilocalories.
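Just for fun, a back-of-the-envelope check — my own round, modern numbers, not anything from the ad — shows how added sugar and starch get you into the neighborhood of 74%:

$$
\frac{E_{\text{pudding}}}{E_{\text{milk}}} \approx \frac{150\ \text{kcal (a cup of whole milk)} + 27\ \text{g sugar and starch} \times 4\ \text{kcal/g}}{150\ \text{kcal}} = \frac{258}{150} \approx 1.72
$$

Call it 74%, give or take; and since numerator and denominator both scale with serving size, the extra energy does indeed come "in every delicious serving!"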
Now consider a modified form of the jingle (and ad): "Rich, rich, rich in flavor! / Smooth, smooth, smooth as silk, / More food cal-o-ries than (even) fresh, whole milk!" Since the extra calories are (a) plentiful and (b) pretty much uncontaminated by vitamins, minerals, protein, fiber, or nutrients you might want in addition to high-power carbs and fat, tight-ass nutritionists and various health agencies would deride them as "empty calories" and assert that rich, sugary, smooth Royal Pudding and its successor products are pretty much just fattening.
Still, the jingle and ad might be misleading, but they are eminently defensible as precise.
Verily, even (sort of) as it is more exact to talk of "the environment" rather than "the ecology" and "rock formations" rather than "the geology," so it is more exact to talk of The Thing Itself, food energy, than the units in which food energy is measured, i.e., for most of us, calories.
Changing "energy" to "calories" would make the ad less precise.
My memory is that the ad was eventually pulled but not before it had run its course and drummed that dumb-ass jingle into at least hundreds of thousands of little minds. And frankly, I kind of admire the hucksters who developed the Roy-al Pud-ding! ad campaign. It's a rare and brilliant example of how to be precise and apparently clear while being memorable, effective, and really, really, dishonest.
"Words mean," but if you're wily enough they can mean in nastily misleading ways and get the gullible public buying a lot of product.
Labels:
advertising,
afterlife,
clichés,
cosmology,
definitions,
doctrine,
health,
humpty-dumpty,
nutrition,
politics,
romance,
romeo and juliet,
science,
semantics,
sentiment,
television
Trayvon Martin, Yoshihiro Hattori: Law and Logic ("Indirect Proof") [17 July 2013]
Mathematics and logic games can get very difficult, but their world is essentially simple, elegant, and frequently beautiful. Liars always lie; truth-tellers always tell the truth; and statements can be simply true or simply false and mutually exclusive, and that is that. For good and for ill, the real world is typically complicated, usually inelegant, and only occasionally beautiful.
In the neat world of math and logic, let's say you want to prove that a statement, a "proposition," is true or false, but you can't do so directly. Let's name it California style, Prop-1, and say you want to prove it true but can't figure out a way to do so and have at least a strong feeling that a direct proof just isn't possible. What you can do — in a world of abstraction — is note that if Prop-1 is true its negation is false, and vice versa. Let's continue California style, piss off some logicians, and call that negation Prop-2.
What you can do is assume that Prop-2 is true and proceed logically from there; and, if you're clever, you can show that accepting the truth of Prop-2 leads to an absurdity. Then you go back and check your logic. If accepting Prop-2 leads to something clearly wrong, and your logic is OK, then Prop-2 is wrong. If Prop-2 is wrong, Prop-1 is right, and you get to write "Q.E.D." at the bottom of the proof, with seventeen additional Lifetime Geek Points if you add "per Reductio ad Absurdum" ("It has been proven by a reduction to the absurd").
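In the compact notation of the logic texts — a sketch only, and assuming classical logic, that "simple world" where every statement is simply true or simply false — the rule looks like this:

$$
\frac{\neg P \;\vdash\; \bot}{\vdash P} \quad (\textit{reductio ad absurdum})
$$

Read: if assuming not-P (our Prop-2) yields an absurdity ($\bot$), you may conclude P (Prop-1). The mirror-image rule — if P yields an absurdity, conclude not-P — is even safer: it doesn't need the classical true-or-false assumption at all.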
In one variation of this game, you can assume that it's OK to divide by zero and work out from there a logical proof that 1 = 2 (or maybe it was 1 = 0). Since we can be pretty sure that 1 = 1, we can conclude that in ordinary arithmetic and algebra problems, it's a bad idea to divide by zero. (If you want to get into infinities, that's a different thing — and probably a more advanced math course.)
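Here is the classic version of that parlor trick worked out — the standard textbook chestnut, not any one author's — with the illegal step flagged:

$$
\begin{aligned}
a &= b && \text{(assume, with } a, b \neq 0\text{)}\\
a^2 &= ab && \text{(multiply both sides by } a\text{)}\\
a^2 - b^2 &= ab - b^2 && \text{(subtract } b^2\text{)}\\
(a+b)(a-b) &= b(a-b) && \text{(factor)}\\
a+b &= b && \text{(divide by } a-b \text{ — but } a-b = 0\text{!)}\\
2b &= b && \text{(since } a = b\text{)}\\
2 &= 1 && \text{(divide by } b\text{)}
\end{aligned}
$$

Every step but one is kosher, and 2 most assuredly does not equal 1; so the assumed proposition — that dividing by zero is OK — takes the blame. Q.E.D., per reductio ad absurdum.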
In literature, things are more complex than math, and there is usually more emotional content: only a small number of people get emotionally involved with math, while more people react emotionally to literature and art generally than is healthy. (If you spend time and expend a good deal of energy complaining about the crap on TV or in the movies, you probably should consider writing letters for Amnesty International or protesting at your State capital against some significant evil.) Still, most of us, most of the time, can keep emotional distance in art, even when that art bears directly on current events.
This ability of art to distance an issue allows artists to — sometimes, quietly — walk around our prejudices and other defenses and work on our heads, including working on our figurative heads and hearts logically.
In science fiction, there's the kind of narrative called "If This Goes On," named after a 1940 story by Robert A. Heinlein. In stories like these, a contemporary trend is extrapolated into the future, where we see its very bad results. That is, the science-fictional technique of extrapolation is combined with what in satire can be called — and is called by the SF writer Yevgeny Zamyatin — reductio ad finem, pushing a bad trend to its conclusion, e.g., Vladimir Lenin's going over to capitalist "scientific management" getting trashed in Zamyatin's great SF satire, We (ca. 1920). If we don't like what will happen "If This Goes On," then we should try to stop it now.
The dramatic action in William Shakespeare's King Lear pretty well undercuts any idea of natural bonds between people logically justified in terms of human society as part of some cosmic Great Chain of Being. In such a world, one can't prove that bonds of compassion exist, but Shakespeare does a nice job showing an effect of not feeling such bonds: in an action and scene often called — SPOILER! — The Blinding of Gloucester (3.7). The Earl of Gloucester is an old man and a feudal vassal of the Duke of Cornwall; he's the Duke's host, and, old Gloucester might think, the Duke's friend. But Cornwall and, maybe more so, his wife Regan, are angry at Gloucester, feel politically harmed by Gloucester, and have Gloucester in their power. So, what the hell, they gouge out his eyes. On stage. Traditionally, they "Pluck out his eyes" as graphically as the state of the art of stage effects can manage.
The scene is obscene, in the old sense of the word, and intentionally so. Done well, even with an audience raised on torture-porn Saw films, it ought to send one or two wise-ass groundlings running to the john to puke.
Old Will is saying to us, "as 'twere": okay, I cannot prove that you should — absurdly perhaps, imprudently most likely — feel compassion, but here is one logical consequence if you don't; if that turns your stomach, think about it.
The classic example, however, is Jonathan Swift's once well-known essay, "A Modest Proposal" (1729), originally published to look like an anonymously written but legitimate and earnestly-argued political pamphlet. In the early 18th century, poor Irish people were again, as often, starving, and at least part of the cause of the famine was a set of conscious economic policies by the ruling elites in England and Ireland. Swift could argue wonkish economics with the best of his time, and had a price on his head for doing so in The Drapier's Letters; but he got to very basic basics in "A Modest Proposal." In classic economics, "The value of a thing is what the thing will bring," and "thing" obviously included sub-human animals — sheep or pigs, say, or even horses — which were bought and sold in markets and whose value clearly was the market price. (Human slaves were also bought and sold, but that wasn't an immediate issue in Ireland — although it's a useful point for background.)
What's the value of poor Irish?
The "modest proposal" is that poor Irish should sell their babies at a year old to be slaughtered for food for the rich; and it is crucial that that conclusion follows rigorously from "the cold equations" derived from seeing Man as (Solely) an Economic Animal. More poignantly, Swift's Projector demonstrates that the baby-meat option is less despicable, indeed, more compassionate, than current policies wherewith weakest of the poor are "already dying, and rotting, by cold and famine, and filth, and vermin, as fast as can be reasonably expected."
Q.E.D., baby — and if you're out there puking again, consider strongly the possibility that the "Proposal" has just reduced to the grotesque, if not precisely the absurd, the idea that "The Irish are animals" or that the only value to a human being is a kind of economic price.
Got the principle?
Now consider, at least as a thought experiment, that in the summer 2013 trial of George Zimmerman for the slaying of Trayvon Martin, the six women, good and true, of a Florida jury reached a logical decision, to wit: that a large, fully adult White man could shoot and kill an unarmed Black teenager of slight build — legally.
Given Florida law on self-defense, given instructions to the jury that incorporated Florida's "Stand Your Ground" law, such a conclusion was possible.
Given the premise that any White person might well fear any noncrippled Black male, that conclusion was pretty likely.
Given the premise that fear is not just an explanation but a generally acceptable justification for even violent action, that conclusion was probable.
And given the assumption under Florida law and Southern culture that White men at least have the right to literally bear arms and go out armed — added to the assumption that such armed White men would do well to literally go out practicing "self-help justice" and look for trouble — then the "not guilty" verdict was pretty much inevitable.
As many are saying as I write, "Do not blame the jury."
Indeed.
Instead note that the George Zimmerman case is hardly unique, even recently. On 17 October 1992, Rodney Peairs shot and killed Yoshihiro Hattori in Baton Rouge, Louisiana. Mr. Peairs stood his ground and defended his castle against a 17-year-old exchange student who got the wrong house for a Halloween party. Peairs, too, was initially not prosecuted and, upon criminal prosecution, was acquitted. Mrs. Peairs had been frightened seeing the teenager come onto their property, and "The defense argued that Peairs was in large part reacting reasonably to his wife's panic." The trial took a week, and the jury voted to acquit in under four hours.
If you don't like the verdict in the George Zimmerman/Trayvon Martin case, or that in the shooting of Yoshihiro Hattori, consider that those verdicts might be logical and legal and start working to change the premises that make them that way.
We cannot insist that people be heroic; we can insist that armed men act, as the expression has it, as if they had some balls, and some brains. We can insist that laws in action lead to justice, and when they don't that those laws get changed. We can get appropriately angry and verbally attack films and TV and news shows that demonize young Black men and teenagers generally and romanticize vigilantism and Wild-West gun worship.
And if the English and, a bit earlier, the Americans, can accept the Irish as fully human, so we can start valuing all human life, even teenagers near our property.