Sunday, March 17, 2019

A Few Words on the Anti-Vaccination Movement

At least since people figured out that PLAGUE!!! is contagious, public health issues have been special cases where community safety overrides what are usually individual rights. So long as herd immunity is enough to ensure the health of the group, we should defer to people's: 
• (highly justified) suspicions of Big Pharma, 
• somewhat less justified dedication to the unsapped power and purity of their precious bodily fluids,
• desires to keep out of their bodies the artificial, "unnatural," and/or toxic or related to the toxic or pathological,
• beliefs in belief and the healing power of faith and/or Nature,
• usually true and useful ideology that their bodies are their own to do with as they will,
• usually true and useful ideology that "Freedom isn't free" and requires taking risks and the occasional literal or figurative blood of patriots and/or the innocent to figuratively water and fertilize the tree of Liberty. 
When herd immunity is insufficient, however — uh, no. Then we-all usually understanding and peaceful folk should use social pressure to encourage and if necessary State power to coerce getting the goddamn shots already.

Saturday, March 2, 2019

Climate Change / Political Changes: The Need for Historical Background

We need more books like Brian Fagan's 2008 The Great Warming: Climate Change and the Rise and Fall of Civilizations: history books, mostly, that help us understand that climate change isn't just about changes in some abstract "the environment," but about great changes in human politics.

This book is about "the Medieval Warm Period," ca. 800-1300 C.E., and its generally positive effects in, e.g., northern Europe and its utterly devastating effects elsewhere, e.g., long-term droughts in already warm, dry places such as what's now the U.S. southwest.

These Medieval data can cut different ways: as a warning to try to reduce the speed of global warming and to try to mitigate its effects — or as grounds to say that climate change ca. 900 obviously wasn't caused by human industrial activity, so we needn't reduce current economic activity, with its benefits and its irrelevant, or even beneficial, greenhouse gas emissions.

To quote for background one on-line pundit, "Climate scientists now understand that the Medieval Warm Period was caused by an increase in solar radiation and a decrease in volcanic activity, which both promote warming. Other evidence suggests ocean circulation patterns shifted to bring warmer seawater into the North Atlantic." So why should we try to reduce greenhouse gasses?

The question isn't rhetorical. For one thing, the article linked in the last paragraph notes that there's no evidence for a recent increase in solar radiation, nor, in our times, a significant decrease in volcanic activity; and I'll note that currently there's not a damn thing the human species can do about either. We can reduce emissions of greenhouse gasses, which will slow the rate of warming, and we should reduce those emissions anyway to save some easily extractable hydrocarbons for our descendants. They might find safe and efficient ways to use petrochemicals and need some — and be very pissed off at earlier generations who went and burned them. 

They may also be pissed off at a re-run, but worse, of, especially, long-term, catastrophic droughts. 

In addition to the obvious, some of the effects of climate changes indeed include the geo-political. William Grimes's New York Times review of The Great Warming has an arresting sentence indicating that what was so good for much of Europe may have had negative effects: "Although data remain sketchy, it seems probable that extended droughts dried up pastureland on the Central Asian steppe, propelling the armies of Genghis Khan westward." The career of the Great Khan (1206-1227 C.E.) achieved some impressive empire building that, in Europe, arguably set the stage for what was once called the Renaissance, but unarguably resulted in an extraordinarily high body-count. Matthew White tallies up Genghis Khan's "multicide" score at 40 million, which puts the Mongol invasions just behind World War II (1939-1945) for destruction of human life. 

And destruction of the great Muslim civilizations at the center of Eurasia — which in a twisted way brings me to where I started thinking about this post. 

I had typed out on Facebook a quotation from Introduction to Medieval Europe: 300-1500 by James Westfall Thompson and Edgar Nathaniel Johnson (New York: Norton: 1937). In an early chapter on "The Empire of the Arabs," Thompson and Johnson give Mohammed and the early leaders of the Umma credit for organizational genius and make the point that medieval Europe was a side-show as civilizations of the time went. They are far from "climatic determinists," with an Index innocent of such references. And they fully recognized the arrogance and stupidity of the remaining Roman Empire and Persia in their continuing wars and machinations. Still, Thompson and Johnson asserted (in 1937!) that "The expansion of the Arabs is best understood in the light of previous movements out of the desert [...]. These were constant phenomena, to be explained by the vicissitudes of climatic conditions, which always drove nomadic peoples outwards. [...] The [Arabian] peninsula itself was experiencing a periodic desiccation, which made life within it ever more unbearable and drove its inhabitants to seek relief elsewhere. It seems, accordingly, highly probably that what occurred would have happened even without Mohammed and Islam" (p. 166). 

"What occurred" was the end of Late Antiquity, with the final end of the Roman and Persian Empires, and Rome in the East (the Byzantine Empire) reduced to a regional power. T & J and those with similar theories may be wrong, but it's a point to consider, along with the possibility that the following round of "desiccation" in the Medieval Warm Period led to that drying out of parts of the Eurasian steppe, leading to the western thrust of the Mongol invasions and, let's say, a figurative arrow from a powerful bow through the figurative heart of "The Empire of the Arabs."


It's possible that two major changes in the course of human history had as one basic cause the long-term droughts — the "desiccation" — brought on by global warming. Mohammed and Genghis Khan count, and their decisions are important. Still, one thing Karl Marx got right was that "Men make their own history, but they do not make it as they please [...], but under circumstances existing already, given and transmitted from the past." And those circumstances include a physical environment that is both beyond human control and influenced by our actions.

Unless we want some really bad circumstances for the next generation, we'd better get right some important decisions — and soon. 

Tuesday, February 26, 2019

Democratic Authority (Mostly Small-Scale)

"We elected you, and we can diselect you." —
Member of Chicago Grammar School Club to
President of the Club (me, mid-1950s)

“And this took place in the United States, a
culture that educates its children against
blind obedience.” — Irenäus Eibl-Eibesfeldt
on Milgram obedience experiments, in Ethology:
The Biology of Behavior (1970: p. 448; ch. 18)



Part of the lore of US warfare in Iraq is that the neoCons et al. who devised it didn't plan much for the aftermath in part because they firmly believed that the default setting — the universal ideal — for human government is what we in the US vaguely call "democracy." Get rid of oppressors like Saddam Hussein or the Taliban, and voilà! soon, very soon the society is moving toward becoming Denmark or even the greatness of America. Similarly for the disintegration of the USSR and Warsaw Pact — and, for a while, it indeed did look like a number of countries would “have a new birth of freedom, and that government of the people, by the people, for the people” might actually expand. 

That Big Idea didn't hold up well, which did not surprise those who studied the development of actually-existing societies we call, still very loosely, democratic. That's mostly because the range of what we (loosely) call "democracy" does develop and has social and economic and cultural roots, roots that may not go down as deep as we believe — but it needs those roots.

I'm not going to deal much with Big Ideas, though there is an idea here: by age 20 I knew that democracy is far from natural and the general culture does not do a good job teaching it.

Back in high school Civics — and in grammar school before that — back in a time and place where one had to pass an exam on the US and State Constitutions and governments to get a grammar school or high school diploma — in Chicago in the mid-1950s, Mr. James Connelly taught us in Civics that the United States was a federal republic, where sovereignty rested in the People, who established a constitution giving authority to a government of elected and appointed officials, officials who then ran the government but served the People. That was our official ideology, our small "r" republican doctrine, and I believed it and figured most Americans believed ... except —

Except there was that memory from back with my grammar school club and the doctrinally ambiguous challenge to me, personally, "We elected you, and we can diselect you." Okay, "potestas in populo, auctoritas in senatu" in a formula I'd later learn from Hannah Arendt and have driven home in street demonstrations: as Mr. Connelly said, the People always retained sovereign power, from which they conferred authority, which they could take back. Except that my grammar-school classmate had questioned my authority precisely because it had been given to me by him and the other members of the club. The very limited authority of club officers was something he understood and figuratively owned and ... therefore, it seemed, didn't see as very binding.

Weird. We were taught and told and, well, indoctrinated that legitimate authority came from the People. The kid back in grammar school accepted — willingly and perhaps too eagerly — the authority of parents and teachers and others he had no say about, but resisted even highly limited peer authority over himself that he himself had granted.

The old “consent of the governed” bit wasn’t working out, and my fellow American youngster preferred authority over him to be built into the system and pretty much based in age and status and other criteria beyond his control. I saw that, felt it a bit as disrespect, and then did what most of us most of the time do when dealing with contradictions and what I much later learned to call cognitive dissonance: I mostly ignored it and moved on.

Mostly, but the experience stuck, and moving on included high school and college fraternities, where I served a term as secretary of each and used the office to rewrite portions of our constitutions and make sure the guys debated the matter and voted on it. Get them to "buy in," as we would later say, by exercising their power over our organizing documents; acknowledge the authority and feel the worth of the group by participating in governing the group.

My college fraternity chapter in the 1960s, though, offered additional opportunities. At least back then, and on our campus, pledges lived in the house, which offered ... well, some pretty obvious opportunities. Our pledge-training (sic) policy was laissez-faire through the class of 1965: laissez-faire combined with occasional strong punishments for screwing up ("PT," "sweat sessions"). The class of '65 had problems, and it became clear we, the fraternity Chapter, were doing things wrong.

So a few of us checked out how parts of the military handled training, and in my course work I was also studying some relevant anthropology. We went over to a system of “little things”: rules for minor behaviors, none of which individually worth rebelling against but all of which together were practice in accepting the Chapter’s authority.

It worked. 

Usually it worked, and in one case that impressed me, with a guy in the class of ’66 I’ll call Terry. 

Now, a couple of upperclassmen in the chapter were outright geniuses. Terry wasn’t, but he was brilliant, going on to Harvard Law after graduation and not long after that doing some pro bono work that established some important law. Me? Well, an eminent Medievalist, after a couple or more gin and tonics once corrected some self-deprecating remark I made with, more or less, “No, Rich; you’re bright. Not brilliant, but bright” — and that’s about right. I was also a house officer when Terry pledged, and he kind of almost sort of respected my intelligence. He was smarter than I was or am — and as ... let’s say as firm in his opinions as I — but I had more experience; and as ambiguous as we arranged for pledges to feel about their status, he could figure out I outranked him. And the one time he screwed up (under the rules we’d set up), I was the one who quietly, privately, but in some detail, clarified for him that he was less clever and generally estimable than he thought. He was furious while being chewed out, but he submitted to it. 

We became friends, and one night after he initiated, and we were talking in my room, I said I really had to get to sleep and said good night, and he responded, “Good night, Mr. Erlich” — and then proceeded to pound his fists into the walls, while I said, “We got you! We got into your head!” 

As we had: I was a house officer, and when Terry was a pledge he called me to my face "Mr. Erlich" and threw in the occasional "sir." (We had studied the military and some ideas on child-rearing of the traditional, though non-abusive, sort.)

Little rules, fairly easy to remember, very easy to obey, none worthy of rebellion — but often just there, frequently, calling for obedience and functioning to instill, figurative drop by figurative drop, some acceptance of the authority of the chapter.

I helped set up the program, but with a condition for my participation, one necessary for my integrity as someone who had issues with authority, even when I was in authority.

Between the end of Informal Initiation ("Hell Week") and formal, ritualistic initiation, the guys undergoing initiation cleaned themselves up and then had this especially liminal period — I said we'd looked at some anthropology — marked by time alone in a quiet room, sitting for their Pledge Test. The test covered the usual quasi-useful history of the fraternity and such, but had one and only one question they had to get right, and they kept taking the damn test until (sometimes with coaching) they did get it. I had insisted that they answer the question, "What is the rationale for pledge rules such as the following?", with some of the rules then listed. 

To initiate, they had to figure out that many of the rules were arbitrary and intentionally so. If they studied during study hours, that was in part because we told them to study, but also in part common sense. If they ordinarily used the back door to the house and the back stairs — that was only because we told them to do so.

Part of the goal with a fraternity (besides and along with more serious partying) is to control to a fair extent where we lived: at least being able to paint a room the color we wanted and set rules for behavior. For that we needed pledges to go from being trained to accept the authority of those above them in a hierarchy to active brothers — full citizens, so to speak — who would accept consciously the authority of the constituted group as group, and of peers they'd elected. We needed them to sit in a circle of approximate equals as a chapter and accept the authority of rules they'd helped make.

And there was nothing inevitable or all that natural about the process, and it didn’t always work even for a small fraternity chapter, with well-schooled if not necessarily educated guys, who lived in a Republic with an official policy of popular government and official democratic ideals and vocabulary.

Note the official. About the time Terry was learning to call me "Mr." and throw in the occasional "sir," Stanley Milgram was conducting his problematic experiments on Obedience to Authority and demonstrating how easy it is to get obedience where there's mystique, in the Milgram case the mystique of "Science" and an authoritarian acceptance of rank. And Milgram et al. did that even "in the United States, a culture" far less than Austrian Irenäus Eibl-Eibesfeldt thought "that educates its children against blind obedience." We are a culture that trains many in obedience, to those with real power over us — as in the ability to help or hurt us — but also to those with the right mystique.

Fraternity chapters are short on mystique. And the moral here, if you're still with me, is that one of the obstacles to achieving democratic-republican ideals is that (statistically) normal humans are like that kid in my grammar school club: with little respect for authority he understood and had granted — even if all too willing to obey people just there, over him in a hierarchy over which he had no power. N = 1 proves very little, and not much more with N = 75 or so for my fraternity chapter over a couple of years; but these small experiences were enough to get me to accept the possibility that even Americans really aren't that big on democracy or republicanism but are susceptible to confident fanatics like the Taliban, or "strong-men" like Saddam Hussein, or authoritative bullies like Donald Trump, even when those strong-men/bullies have only the most limited charisma. 

We need more teaching of Civics and teachers like Mr. Connelly. And we need more parents and teachers and administrators and coaches and other older folk more often stepping back and letting young people function in organizations of the kids, by the kids, and for the kids — even when the kids may seriously mess up. We need to provide training, starting very young, in choosing which authority and authorities to accept, and to prefer authority based in the ideal of republics with liberal-democratic aspirations. 






Thursday, February 14, 2019

Words of the Day: "Love" and "Life"

The company making and/or distributing old-fashioned Dominica bay rum aftershave has apparently gone out of business, but I was able to buy a couple bottles on e-Bay at an almost-reasonable price. I made the purchase as a "guest," and the confirmation e-mail I just got was (a) misleading — suggesting they were holding my purchase hostage until I signed up — and (b) a temptation to join e-Bay and use their services to buy items I "love." 
Okay, at least they didn't insist on an e-Bay "community" or "family," but items I love, like aftershave?! I know we have those hearts on Facebook and that at least one NPR station asks if we love their shows, but I think we've gotten to some serious word-inflation with loving anything short of puppies or kittens that I'm likely to see on e-Bay (and I assume they do not sell live mammals). Now living in California, far from the pollen-attack-plants of south-west Ohio, I still have some minor allergy issues, and I really like lime bay rum as an aftershave that doesn't aggravate those remaining allergies. I'm grateful for bay rum; I like the smell, especially with a touch of lime — but it's just a casual relationship, not love.
* * *
"All Lives Matter," a meme I received said, and people have pronounced in my hearing "All Life Is Sacred" — and in that second case I've told them not to assert the sanctity of all life while chomping on a hamburger or even a carrot. Or using your leather shoe to squash a cockroach — and then clean your hands off with a bacteria-killing hand sanitizer. Most life on Earth over Earth's history and at any time by weight and number of individuals is Archaea, Bacteria, or, among us "higher" organisms, plants and insects. The Jains come close to literally respecting the sanctity of all life, but if they're to live healthily even their bodies must kill intruding bacteria and viruses, and kill aging or over-active cells of the body itself.
If people mean "All human life is sacred" (or whatever), they should say that, or argue for a definition of "life" that excludes most creatures your high school life-sciences teachers made you study.
This last point is important.
There's a dangerous arrogance in restricting real life to human beings on religious grounds or some secular theory of human exceptionalism. And even if we arrogantly assume we humans are extra-special special, it's just silly to say that cats and dogs and even ants and worms somehow aren't alive. And it's irresponsible to kill living things and not be conscious of doing so. Personally, I've killed bacteria by the billions, and I won't apologize for that killing — but I will take responsibility, and a heavier responsibility for the laboratory work I did that involved killing dogs, one cat, a rabbit or two, and a lot of rats. I've killed my quota of mammals, I think, which is one of the several reasons I now decline to eat them.
Mammals are kin, and I draw the line there for food. I also draw a line at octopuses, which is a lot easier since I'm not even all that fond of squid. 
Octopuses are invertebrates and not in the web of life close to humans and chimps and dogs and birds. But my sister and I were once invited backstage, so to speak, at a major aquarium and got to see an injured octopus in their veterinary section, in an aquarium cage inside a larger aquarium. Anything simpler, and this octopus escaped, picking locks and going around opening others. As far as the (human) staff could figure out, the octopus got bored, and the locks and cages were a challenge. And I figured any creature smart enough to get bored and pick locks was too smart to kill casually (or have killed and eat, if there's a variation of [squid] calamari made with octopus). 
Not too far into the 20th century, there was a fair degree of agreement that humans shouldn't cause unnecessary suffering to sentient creatures, with "sentient" not in the sense of "smart and conscious" and "capable of thought" but just, well, sentient to external stimuli, able to perceive enough to suffer. Whether eating hamburgers and spare ribs is necessary is something we can argue about; cruelty to animals for sport isn't something most of us argue: it's just evil, period.
Our treatment of non-human animals will become a more pressing issue as, as the over-stated title to an article in The Atlantic has it, "Scientists Are Totally Rethinking Animal Cognition." The debate on human life will become more pressing as we in the US enter another round of debate on abortion, with a new US Supreme Court. As others and I have stated repeatedly, that conflict over abortion isn't about "When does life begin?" Life doesn't begin; it began. How do I know? The Bible and my biology courses tell me so. Life began and gets passed down. So, yes, indeed, "There's always a death in an abortion." The question is what's being killed and what status should he/she/it/they have ethically considered and under the law? A human fertilized egg — a human zygote — is a potential unique human individual or, on occasion, a set of genetically pretty much identical but still unique individuals. ("Monozygotic siblings" aren't really identical twins or triplets or whatever: They've had slightly different environments in the womb, and individuality is their birthright.) But a zygote is a one-celled creature, and you shouldn't mind killing it if you eat that hamburger or carrot without qualms or stomp that cockroach and thereby kill more "highly" developed life.
 To make a human zygote more valuable than a steer and important enough to compete with the rights of its human mother, it's most efficient to have a theory of a human soul and to argue that "ensoulment" occurs at the moment of conception. In an important traditional Roman Catholic view, you can have the zygote and succeeding embryos and fetuses as unborn babies and, perhaps more significant, unbaptized babies, who, if killed, would wind up in Limbo (at one time) or Hell. As the thoroughly Puritan Reverend Mr. Michael Wigglesworth had God say to unbaptized dead babies in "The Day of Doom" (1662; a truly awful poem):
"You sinners are, and such a share
  As sinners may expect,
Such you shall have; for I do save
  None but my own elect.
Yet to compare your sin with their
  Who liv’d a longer time,
I do confess yours is much less,
  Though every sin’s a crime.
"A crime it is, therefore in bliss
  You may not hope to dwell
But unto you I shall allow
  The easiest room in hell.”

And here we get to serious arguments on secular vs. religious ideas on what "life" means in "Choose life" and what we mean by "love," with either a God of Love who'd damn babies on a technicality or a love one might have for one's fellow human beings viewed realistically and materialistically — and when one becomes human on the way from zygote to embryo to fetus to baby to, well, a conscious, talking, definitely-a-person person. 
WORDS MEAN; and if they "mean" in complex ways, that's all the more reason to use them carefully.







Thursday, February 7, 2019

Truth, Trump, and Post-Modernism (a Repeated Post)


I'm repeating this post on the occasion of my listening to a book making many of the same points, but in greater detail, and with many, many more readers: The Death of Truth: Notes on Falsehood in the Age of Trump by Michiko Kakutani (published July 2018). I disagree with some of Kakutani's points but recommend the book highly. 


Also, there are two more quotes I use elsewhere that I want readers to have in mind approaching my argument here.


Practical men who believe themselves to be quite 
exempt from any intellectual influence, 
are usually the slaves of some defunct economist. 
Madmen in authority, who hear voices in the air, 
are distilling their frenzy from some academic 
scribbler of a few years back. — John Maynard Keynes

 “But it is very difficult,” Shan said, “to live without the notion 
that there is, somewhere, if one could just find it, a fact.”
 “Only fiction,” said Forest, unrelenting. “Fact is one of our finest fictions.”
— Ursula K. Le Guin,  “Dancing to Ganam” (1993), 
collected in A Fisherman of the Inland Sea, 1994.



============================================

Monday, March 7, 2016

Truthiness: One Root in the Sins of Poststructuralism (a Long, Kind of Academic Rant)

Things fall apart; the centre cannot hold; 
[…]
The best lack all conviction, while the worst 
Are full of passionate intensity. 
— William Butler Yeats, "The Second Coming" (1919/20)

"Way worse than the so-called political correctness that Trump assails
 is the learned helplessness of journalists, public intellectuals and 
anyone else with half a brain and access to a media platform. 
Why be disingenuous about knowledge and learning? 
Why be defensive about objective criteria for true and false?"
— Marty Kaplan "Professor, You're Fired! 
Or, the Education of a Trump Voter"
 11 Jan. 2016, HuffPost Politics Blog



Some significant swaths of the academic/intellectual Left share responsibility for the lack of conviction that W. B. Yeats complained about in "The Second Coming" in the early 20th century, and, far more so, the "learned helplessness" Marty Kaplan laments in his early 2016 blog on Huffington Post.

            Before I get to that composite accusation, though, I will make some concessions and gesture toward "full disclosure." It won't be literal full disclosure — not even a long autobiography of the tell-all confessional variety will do that — but it will give an idea of "where I'm coming from," as we used to say, and what allowances should be made for my biases. 

            To begin with a concession or two (there will be more later): some of the "Sins of Poststructuralism," of schools of cultural and critical science studies, and of the doctrine of "the Social Construction of Reality" include pernicious attacks directed at science as an endeavor and at the scientific method; on the other hand, serious critique of various sciences — "actually-existing Science" — has been long overdue. From its beginning through the 20th century, science as an endeavor has known its own sins, some stemming from the perverse universals of Platonic typologies and the arrogance of Eurocentrism, others more nitty-gritty.
            For a striking but, for readers, painless introduction to late 17th- and early 18th-century cruelty, silliness, and harm done in medicine and biology, see Neal Stephenson's BAROQUE CYCLE — plus Stephenson's arguably unfair but still instructive introduction to the cruelty, silliness, and harmfulness of Sir Isaac Newton and his pursuit of preeminence and dominance in the science (and politics) of his time. "Scientific" medicine and, so to speak, psychiatry in the early days of the Scientific Revolution were licensed barbarism, and for much of the time sick people would have been better off with the local herb woman. For more recent cruelty in "rat-running" psychology of the Behaviorist persuasion, see Steven Pinker's confession in The Better Angels of Our Nature and elsewhere of carrying out "a procedure that turned out to be tantamount to torturing a rat to death" for no particularly pressing reason, to which I'll add my own killing of a significant number of lab rats in a project my employers would have known to be doomed to failure if they had done sufficient library research before handing it over to me. Well beyond rats — "beyond and above" from a human point of view — there are the Nazi experiments on homosexual prisoners at Buchenwald, and on Jews and Roma and Russians at Dachau, Buchenwald, and elsewhere. In the United States, there was "the Tuskegee Study of Untreated Syphilis in the Negro Male," among other abhorrent abuses in the name of science of people without power; and other countries have had their share of what Salon.com, in a reduction of lists to the grotesque, reported as "10 of the most evil medical experiments in history." And to end a very brief sampling of a long list of horrors, there were the sins of "scientific racism" examined elegantly in Nell Irvin Painter's fairly recent The History of White People (2010), covering not just the usual suspects in anthropology and history, but also in economics and statistical studies. 
            And, of course, "totalizing" theories giving people the Truth, the whole Truth and nothing but the Truth — theories giving the key to history and the universe — are a justification for fanaticism and when applied to the messy complexities of human life and politics demand fanaticism to be wholly believed. (Unless we're at some radically elegant simplicity at the root of physics, any Grand Unified Theory of Everything is going to be often wrong, and complete belief will require the epitomizing fanatic move of ignoring all data that don't fit doctrine.)

So much for now for concessions as such; here's where I am "coming from."

            Back during the US "Troubles" of the long 1960s, I was a registered Democrat and a liberal at the University of Illinois at Urbana and got a telephone threat on my life from the "Minutemen" in spring of 1970 and faced wordless and less lethal — but far more credible threats — on a couple of occasions from police (at the 1968 Democratic national convention in Chicago for one occasion). Moving on to a tenure-track job at Miami University in Oxford, Ohio, I soon consulted a shrink in part because I worried about going paranoid. He asked me a couple questions about vast conspiracies — which wasn't my perception — and then said something like, "Oh, you just have this feeling there are people in the administration who want to fire you?" And I said, "Yeah," and he said, "That's not paranoia. There are people who want you fired. I've heard that over here in [NAME OF PSYCH BUILDING]. You must be getting all kinds of signals over on your side of campus. You've got political problems; it's psychologically healthy to sense real threats." Over the years I got the fuller story, and it turns out there were Right-wing people of influence at and/or operating on Miami University who tried to get me fired from the moment my appointment letter went out hiring me.
            Still, the major annoyance regarding my politics came from some members of the capital "L" Left, people into games of "More Radical than Thou" and for whom "Liberal" was a term of abuse. I was "the little Lib with glasses" in their references to me, and the nicest thing one of my colleagues of the far-Left sort called me at the U of I — when he saw me carrying Hannah Arendt's The Origins of Totalitarianism — was, "Well, I guess you're an educable Lib."
            It was a bit more than an annoyance when I thought I'd gotten accepted for publication the essay that was going to make me a full professor more or less on schedule — and not, as with my getting tenure and, later, promotion to associate (a sequence possible at Miami U. back then), a bit or more than a bit behind the rest of my cohort.
            My PhD was in Shakespeare, and my graduate training was more generally in Renaissance drama and early British literature. I taught Shakespeare and "Beowulf through Paradise Lost" pretty regularly throughout my academic career, but as of the 1980s I hadn't published much in the area and was moving into science fiction and fantasy. And you need to know for this story that my dissertation centered on Shakespeare's mature tragedies, from which I excluded and continue to exclude the transitional Hamlet. For whatever reasons, though — including the possibility of a very large recruiting mailing list — I was invited to submit an essay for the Modern Language Association's volume of essays on "Approaches to Teaching Hamlet." Neat! The volume was already slated to include essays by some major scholars from the first part of the 20th century, and I could be the new kid on the block, relatively speaking, with a Bertolt-Brechtian, Maynard-Mackian, theatre-historical close reading of the play — an approach I had used in my teaching with, I thought, a good deal of success. 
            My essay was accepted, and I did get promoted to full professor — but eventually and not because I had made a mark in Shakespeare pedagogy. The essay was accepted; the volume was put in the queue for publication — and then the MLA was taken over by its radical caucus, and the Hamlet volume was cancelled, to be replaced by a collection of Approaches more up-to-date, with it, and in line with what one might call "the hegemonic discourse in the field" — or what one might call it if one didn't know that "hegemony" is what's practiced by the forces of reaction and not by, say, the radical caucus if they take over.
            "When elephants fight, the grass gets hurt," and a few ants get trampled; or, to switch to a milder and more appropriate formulation, I ended up with a flesh-wound in my career as collateral damage in a battle over the MLA and the future of "The Profession of English" and other areas of the academy. 

            That's mostly blood long under the bridge; a more immediate occasion for my rehearsing old sins of old opponents has been living through "the faith-based presidency" of George W. Bush combined with — more relevant here — the balancing and complementing Machiavellian cynicism of Karl Rove. 
            Rove has been identified as the senior aide who had a revealing and much-quoted exchange with Ron Suskind, quoted in the New York Times Magazine for 17 October 2004:

The aide said that guys like me were "in what we call the reality-based community," which he defined as people who "believe that solutions emerge from your judicious study of discernible reality." I nodded and murmured something about enlightenment principles and empiricism. He cut me off. "That's not the way the world really works anymore." He continued "We're an empire now, and when we act, we create our own reality. And while you're studying that reality — judiciously, as you will — we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors […] and you, all of you, will be left to just study what we do." — "Faith, Certainty and the Presidency of George W. Bush"

Without knowing that Rove was the aide here, Neal Gabler in the Los Angeles Times for 24 October 2004 could note that "All politicians operate within an Orwellian nimbus where words don't mean what they normally mean, but Rovism posits that there is no objective, verifiable reality at all. Reality is what you say it is, which explains why Bush can claim that postwar Iraq is going swimmingly or that a so-so economy is soaring. As one administration official told reporter Ron Suskind, 'We're an empire now, and when we act, we create our own reality [...].'"

            Gabler was more right than he knew in citing George Orwell here. "Orwellian nimbus" refers to the Orwell of "Politics and the English Language" (1946) and the discussions of Newspeak in the text of Nineteen Eighty-Four (1949) and its appendix, a scholarly discussion of "The Principles of Newspeak." The satire of Nineteen Eighty-Four also specifically and powerfully targets ideas "that there is no objective, verifiable reality at all," no actual facts in a real world outside of human consciousness. There's a similar if more restrained critique in E.M. Forster's "The Machine Stops" from 1909, so the phenomenon may be cyclical. Anyway, every couple of generations it may be that "the best lack all conviction" because highly trained philosophical types begin with useful and legitimate ideas about the impossibility of knowing absolute Truth and the impossibility of literal objectivity — "The observer is part of the system" — and move from the social construction of reality as an excellent (epistemological) theory about the limits of knowledge to a literal (ontological) idea of constructed realities, none more true than another. 
            Modesty in making assertions is a virtue, but two points here: (1) Even for a dedicated poststructuralist professor of cultural or science studies, "There's a car coming at us from the right" is significantly different in truth value from "It's clear; pull out" if, in a good-enough-to-deal-with real world, there is a car rapidly coming in from the right. (2) If enough of us believe reality is constructed "between our ears," political reality will be determined by the guy with the gun in your ear. 

I grew up on the old Churkendoose record (1947), with the moral, "It depends on how you look at things" — which is correct, and part of the epistemological justification for tolerance and humility, or — and better — acceptance of the Other and celebration of difference. Still, more of us grew up on Hans Christian Andersen's "The Emperor's New Clothes" (1837), and it is now necessary to stress that however much "It depends on how you look at things," sometimes the emperor just is bare-ass naked, "And that's the truth": there can exist real, politically relevant facts. 

==================================

We shape each other to be human […]. 
— Ursula K. Le Guin, "Coming of Age in Karhide" (1995)[1]

            In 1966, the same year Jacques Derrida "subversively declared that structuralism was finished" and to be supplanted by post-structuralism (Lehman 97), Peter L. Berger and Thomas Luckmann published The Social Construction of Reality: A Treatise in the Sociology of Knowledge. For Berger and Luckmann, "reality," in quotation marks, is the subjective world that we humans are "biologically predestined to construct and to inhabit," a social world of self and others. For each human, this subjective, learned world "becomes […] the dominant and definitive reality. Its limits are set by nature, but once constructed, this world acts back upon nature," e.g., determining which of all the biologically possible foods people (ordinarily) will actually eat, which of all the possible sexual acts most people in a culture will actually perform. "In the dialectic between nature and the socially constructed world the human organism itself is transformed. In this same dialectic man produces reality and thereby produces himself" (183 [180]) — i.e., produces our social selves, our roles, our identities. Berger and Luckmann present an excellent and persuasive analysis of a "weak" form of the social construction of reality, but still a powerful thesis. Human beings are socialized into human identities in specific societies, and those primary socializations give us, if not our worlds, at least our initial, and very strong, worldviews: "The child does not internalize the world of his [initial] significant others as one of many possible worlds. He internalizes it as the world, the only existent and only conceivable world […]"; these primary socializations give each of us "the world of childhood […,] the 'home world'" (134, 136; ch. 3). 

------------------------------------------------------------------------------

MARXISM:

            Writing in 1909, E. M. Forster could see and foresee a world in flight from the human body and the world of matter. At the beginning of the third and last section of his far-future dystopia, "The Machine Stops," moving toward the end of the imagined world of the Machine, Forster presents "one of the most advanced" of future intellectuals, an anti-Empiricist historian who warns his audience to "Beware of first-hand ideas!" and asserts that "First-hand ideas do not really exist. They are but the physical impressions produced by love and fear, and on this gross foundation who could erect a philosophy?" Forster's crank exhorts his audience to let their ideas "be second-hand, and if possible tenth-hand, for then they will be far removed from that disturbing element—direct observation." Actually, the crank isn't all that crazy in suggesting getting a number of different points of view on a historical event — in this case, his specialty, the French Revolution. What is disturbing, though, even to those of us who want knowledge carefully located, is his idea that the farther away from events we get, the better we can judge, until "[…] there will come a generation that has got beyond facts, beyond impressions, a generation absolutely colourless, a generation 'seraphically free / From taint of personality,' which will see the French Revolution not as it happened, nor as they would like it to have happened, but as it would have happened, had it taken place in the days of the Machine," i.e., in their own time, as ideologically constructed. "Tremendous applause greeted this lecture," Forster's Narrator tells us, because it "did but voice a feeling already latent in the minds of men—a feeling that terrestrial facts must be ignored." 
            Not very much later in the century, advanced thinkers will bring back the idea that "terrestrial facts" don't even exist. 
            According to Roy Bhaskar, there is "abundant textual evidence" for Karl Marx's "simple, commonsense realism," the sort of everyday, colloquial realism that asserts "the reality, independence, [and] externality of objects": i.e., that the "real" world we see out there is really there. Most modestly stated, as a character in Ursula K. Le Guin's "Dancing to Ganam" puts it, simple realism is "the notion that there is, somewhere, if one could just find it, a fact" (Fisherman 118), maybe even many facts. There is also evidence, Bhaskar says, for Marx's "scientific realism": i.e., his belief in the reality of "the objects of scientific thought" and structures scientifically inferred — but scientific realism does not concern me here; what does is that "an entire tradition" of Marxism, i.e., much of "Western Marxism," has "interpreted Marx as rejecting" simple realism and has very influentially tended toward "some variety of epistemological idealism, normally anti-naturalistic and judgmentally relativistic" (Bhaskar, "Realism" 407, 408). 
            Bhaskar traces this tradition back to György Lukács's 1923 History and Class Consciousness, where Lukács rejects "any distinction between thought and being as a 'false and rigid duality.'" This anti-realist tradition "proceeds down to the extraordinary claims made on behalf of Marx by e.g. [Leszek] Kolakowski that 'the very existence of things comes into being simultaneously with their appearance in the human mind' (1958 […]) and [Alfred] Schmidt that 'material reality is from the beginning socially mediated' [1962]" ("Realism" 408). According to Bhaskar, no less a figure than Antonio Gramsci found "the very idea of a reality-in-itself […] a religious residue." In his Prison Notebooks (1929-35), Gramsci redefined "the objectivity of things […] in terms of a universal intersubjectivity of persons; i.e. as a cognitive consensus, asymptotically approached in history but only finally realized under communism." Pushing the matter, Gramsci, in Bhaskar's reading, holds "that human history is not explained by the atomistic theory, but that the reverse is the case: the atomistic theory, like all other scientific hypotheses and opinions, is part of the superstructure," i.e., part of the cultural superstructure raised up upon a deeper and more significant (human) reality. Suggesting that we take Gramsci on atomic theory quite literally — that atomic reactions happen because humans theorize atomic reactions — Bhaskar says that Gramsci's remark here "reminds one of Marx's jibe against [Pierre-Joseph] Proudhon that like 'the true idealist' he is, he no doubt believes that 'the circulation of the blood must be a consequence of Harvey's theory'" of the circulation of blood (Poverty of Philosophy, ch. 2, sect. 3). Bhaskar finds Gramsci and some other Western Marxists "in favour of a historicized anthropomorphic monism," maintaining that "nature, as we know it, is part of human history." 
I.e., Bhaskar says these Western Marxists teach that the world is One and that the One is centered in and made by humanity and our history ("Knowledge" 258-59) — an idea Bhaskar does not seem to like. Alternatively, we can see these Western Marxists returning, ironically, to the German Idealism of Georg Wilhelm Friedrich Hegel (1770-1831), where "The categories of human thought are […] at the same time objective forms of Being, and logic is […] ontology" — a path I will not follow (Fetscher 198). 
            Whatever its sources, it is something like Gramsci's vision of reality as the "intersubjectivity of persons" and "cognitive consensus" that George Orwell attacks in what is sometimes called «The Grand Inquisitor» section of Nineteen Eighty-Four (Part III, sections III.3-III.5). Orwell's immediate concern was the falsification of history under Hitler and Stalin (see Crick 119); still, at least for satiric purposes, Orwell in Nineteen Eighty-Four comes down squarely for old-fashioned British commonsense empiricism against the extreme idealism of Winston Smith's Inner Party torturer and instructor, O'Brien. O'Brien tells Smith,
The first thing you must realize is that power is collective. The individual only has power in so far as he ceases to be an individual […]. [I]f he can make complete, utter submission, if he can escape from his identity, if he can merge himself in the party so that he is the Party, then he is all-powerful and immortal. The second thing for you to realize is that power is power over human beings. Over the body — but, above all, over the mind. Power over matter — external reality, as you would call it — is not important. Already our control over matter is absolute. * * * […] We control matter because we control the mind. Reality is inside the skull […]. You must get rid of those nineteenth-century ideas about the laws of nature. We make the laws of nature. * * * […] Before man there was nothing. After man, if he could come to an end, there would be nothing. Outside man there is nothing. * * * […] This is not solipsism. Collective solipsism, if you like. But that is a different thing; in fact the opposite thing. (218-19; III.3) 

I doubt that Orwell's criticisms had much effect, but there was this much change in Western Marxism in the generation following Orwell: Jürgen Habermas, in Knowledge and Human Interests (1972), at least according to Bhaskar, allowed for the origin of "the human species as a purely natural process" even while seeing "reality, including nature, as constituted in and by human activity." And Theodor Adorno, a little earlier — Negative Dialectics (1966)—advised giving up trying to resolve objectivity and subjectivity "and argues against any attempt to base thought on a non-presuppositionless [sic] foundation and for the immanence of all critique" ("Knowledge" 260). 
            I'll put the matter this way: by the 1970s, there was a long-standing debate on the Left that had pretty well concluded on the impossibility of finding transcendent epistemological "Archimedean points" outside the world from which to observe the world. This meant the inevitability of "the immanence of all critique." We judge situations more or less from within them and cannot get a godlike overview; we are located at certain points in space-time, and that is that. So our judgments are relative. At least among people with a lot of academic philosophy — including those coming from very unMarxist positions — this led to a kind of crisis in epistemology. Matters soon got worse. In the words of my friend and colleague John H. Crow, as I recall them: Modernism's "Epistemological uncertainty yielded (to) ontological instability" — postmodernism. 

-------------------------------------------------------------------

LINGUISTICS/POSTMODERNISM-POSTSTRUCTURALISM/FEMINISM:

            In the 5th century BCE, in Elea, in southern Italy, the Greek philosopher Parmenides reacted against Heraclitus's theory of flux by, among other things, composing a poem "On Nature," privileging among the "ways of research" the "absolutely noncontradictory way that says only what is, Being, is really true." If you think something, you assert its existence (an idea that will get a lot of play later, with René Descartes' variation that one must first think oneself). Reality, then, is that which can be truly thought: grasped and communicated, and that which can be thought and communicated is reality. "The primal source of the Eleatic philosophy thus lies in the archaic sense of language, according to which one cannot pronounce 'yes' and 'no' without deciding upon the reality or unreality of the objects of the statements." So human language is not merely symbolic but "corresponds to reality in its structure," and it is "From the premise of the essential coalescence of language and reality follows Parmenides's theory of Being […]" (Calogero and Starkey 6.526; see also C. Benjamin 8-10). Parmenides and his school, then, introduced into philosophy ideas similar to those in magic, myth, and mysticism. 
            By the early twentieth century, the linguistic ideas of Parmenides et al. were again very fashionable in philosophy. Fritz Mauthner, one of the founders of modern linguistic analysis in philosophy, could hold that "Every attempt to tell what is true just leads back to linguistic formulations, not to objective states of affairs" — a position that "bears some affinities to the views expressed in Ludwig Wittgenstein's Tractatus Logico-Philosophicus" (Popkin 16.855), another road I decline to travel. By the middle of the twentieth century, there was a good deal of emphasis among the intelligentsia on placing reality in words, but there was some hope that "The Age of Analysis" could produce a science of "semiology," or a grounded philosophy of language, or (at least) an exposition of structures, that could serve well enough as a philosophy of the world. 
            As David Lehman tells the story — hostilely, among other things — that hope was called into question at Johns Hopkins in 1966 with Jacques Derrida's paper "Structure, Sign, and Play in the Discourse of the Human Sciences" (coll. and trans. in Writing and Difference, 1978). The key passage, according to Lehman, takes precisely the view that can easily, and I stress can, lead to a vision of social construction. The moment came, Derrida asserts, in the development of the concept of structure "when language invaded the universal problematic, the moment when, in the absence of a center of origin, everything became discourse […] that is to say, a system in which the central signified, the original or transcendental signified, is never absolutely present outside a system of differences. The absence of the transcendental signified extends the domain and the play of signification infinitely" (Writing and Difference 278-80). Lehman paraphrases this — usefully for my point here, though arguably — as "Nothing exists ahead of language or outside it; there are no things or ideas except in words." Alternatively put, "Il n'y a rien hors du texte," which Andreas Huyssen renders "there is nothing outside the text." Possibly looking to J. Hillis Miller's statement that "Language […] thinks man and his 'world' […] if he will allow it to do so" (Miller 224), Huyssen couples Derrida on the textuality of reality with the "insight that the [perceiving] subject is constituted in language" (Huyssen 259). Or in Lehman's unnuanced reading, "Words speak us," whether we "allow" them or not: "[…] we are merely passive conductors of language" (106). 
            Human beings, according to such theories, seem to be in a Westernized, secularized, language-centered version of the relationships among "the personal self and the Self that is identical with Brahman, between the individual ego and the Buddha womb or Universal Mind." For the individual human mind, the mystic Huang Po taught, to go into the state of "no-mind," one "must not try to think it, but rather permit ourselves to be thought by it" (paraphrased Huxley, Perennial Philosophy 73; ch. 4). As the comedian George Carlin could state as part of his early-1970s act, "All we have is words," except Carlin's we in this sentence may also be constructed by words, so that instead of Universal Mind thinking us, we are spoken by language. 
            In modernism of the J-P Sartrean Existentialist variety, there was no God to create us; our existence preceded any essence we might have, which was our job to create. Philosophically, the action was in the perceiving subject rather heroically making something of himself (and that's himself: Existentialism is masculinist). Roughly speaking, consciousness was king. 
            Seyla Benhabib can sum up for the (post)modern philosophers—and help illustrate a problem with technical sources in the Theoretical disciplines:

Whether in analytic philosophy, contemporary hermeneutics, or French poststructuralism, the paradigm of language has replaced the paradigm of consciousness[…]. [T]he focus is no longer on the epistemic subject [a potential knower] nor on the private contents of its consciousness but on the public, signifying activities of a collection of subjects.[…] The identity of the epistemic subject has changed as well: The bearer of the sign cannot be an isolated self — there is no private language as Wittgenstein has observed; it is a community of selves whose identity extends as far as their horizon of interpretations ([Hans-Georg] Gadamer) or it is a social community of actual language users (Wittgenstein). This enlargement of the relevant epistemic subject is one option. A second option, followed by French structuralism, is to deny that, in order to make sense of the epistemic object [something external, potentially knowable], one need appeal to an epistemic subject at all. The subject is replaced by a system of structures, oppositions, and différances which, to be intelligible, need not be viewed as products of a living subjectivity at all. (112) 
            If our (post)moderns are correct, there is no single, unified, perceiving "I"; if the crucial structure is language, perhaps there is no world outside of the discourse among many "subject positions" — the people who say "I" and construct themselves and the world(s) they talk about. Again George Carlin: "All we have is words" — literally, except the words may have us even more. If this theory is correct, all we can do (and do do) to create the «facts» of our world is tell one another fictions, stories. 

Two points here.
            First, in the quotation above Benhabib really is trying to communicate, in an essay in an anthology of essays the editor intended for a relatively wide and definitely disparate academic audience, using approaches both feminist and postmodernist and trying to build some bridges. Benhabib's paragraph is still tough going for the uninitiated, and this points to a larger problem. There were, and possibly still are, writers of "pomo" who are happy that, where the light has been seen and poststructuralism has become the norm, critical discourse has become philosophically respectable (although some of my colleagues in the Philosophy Department demurred on how much respect they'd show most poststructuralist thinkers). Indeed, Fredric Jameson's style has been admired at length by Terry Eagleton (op. cit.), and a difficult style has been presented in my presence as A Good Thing in slowing down reading and making it impossible to pin down an exact and constant meaning. Okay … but my unvoiced personal response to unnecessary difficulty is the sarcastic, "Well, thank you for being so generous with my time and effort!" My more political response comes from the memory of Carl Sagan's warning to scientists on "The Frailty of Knowledge" — which, as I remember it, included a warning about the dangers of cutting themselves off from their surrounding communities. I haven't been able to find the clip with the warning, but Sagan's authority isn't necessary. When the library at Alexandria was destroyed, one of the reasons was that the enemies of science and disciplined secular scholarship were "full of passionate intensity" and too many potential defenders "lack[ed] all conviction" and/or were too few in number to do much good.
                        The political problem of much poststructuralist writing is that it is too philosophically and scientifically respectable in being overly technical, too easily read by the initiate but unintelligible to the ignorant masses, including the unphilosophical who hold PhDs. The hard sciences can afford — if barely — being esoteric: they lie behind technologies of great power, and if that power frightens people it also impresses them: the physics that produced nuclear weapons obviously speaks a language, however unintelligible, of power.
                        The social sciences and humanities scare people less, which is both a good and a bad thing. But there are fairly large groups of public intellectuals, journalists, policy wonks, and others who for a long time were able to read in our fields and could try to assure a broader public that we in the academy were earning our keep. When our most respected in-group books and articles in criticism and critique became unintelligible to this mid-level audience, when lay readers came to suspect that scholars in fields they'd studied were putting them on and holding them in contempt — that has been an invitation to trouble. Content aside, the turgid, convoluted, Continental-philosopher style of much poststructuralist writing (and of the Structuralists before them) — writing innocent of revision for clarity and often militantly uninviting — has unnecessarily lost for the humanities and social sciences too many valuable allies.
            Content is another problem.
            If there is nothing outside texts and we're all just telling one another stories; if the two stories of Creation in Genesis give us two creation myths and Big Bang theory a third; if "Coyote created human beings by tricking the other animals" is one story and Charles Darwin's Descent of Man (1871) another, but just another — then, Houston, we do indeed have a problem. Or no problem: thoroughly postmodern Karl Rove was just right, and "there is no objective, verifiable reality at all," or at least none we really should insist on if it would be inconvenient or dangerous to do so. It's not that "the best lack all conviction," but that the answer to the question, "Why be defensive about objective criteria for true and false?" is that one shouldn't be defensive, or even make a defense of true and false, if "True" and "False" are just words in tales and sentences inside our heads.
            My one real foray until now into this part of the culture wars was a poem published in a local anthology, Ambergris 1.2 (1987), and I think I'll end where I began. The poem is called "Andersen (post)Modernized": "The Emperor's New Clothes," tweaked a bit to bring it up to date.
"The Emperor's naked!"
The little boy yelled, 
"Bare-assed, buck, stark, unclothed!"
So they grabbed him up (the earplugged men, silver-eyed)
And threw him into the Official Car,
For quick trip to the Ministry.
Sat him down to discourse philosophical—and prudential.
Let him know 
Reality is made between our ears
(While beating him and shocking him
Raping him and breaking him).
Until the Minister came in
To tell him, 
"You've done sacrilege
Finding your view privileged"
And stuck a large pistol (a .45) into his little ear
And asked him what now was real,
"Relative to Emperors, sartorialwise."
And found all monarchs fully clothed
In all Reality that is (or can be)
Inside the head of a little boy
Who's learned how worlds
Get made by human brains and
How guns dress emperors.

Older academics complaining about "Truthiness" on the American Right, or the contempt for facts from the George W. Bush administration through the election of 2016, should be pressed with the question, "And where were you / In '92?": not physically, but where their heads were at. As usual in human affairs, there's plenty of blame to go around, and Social Construction and related discourse(s) on the 20th-century Left provided rich soil in which pernicious trends of the later Right could take root and thrive. 
===========================================
===========================================


Selected Works Cited but not Hyperlinked

Benhabib, Seyla. “Epistemologies of Postmodernism: A Rejoinder to Jean-François Lyotard.” 1984. Rpt. Feminism/Postmodernism, q.v.
Benjamin, Cornelius. “Ideas of Time in the History of Philosophy.” In Voices of Time (q.v. below). 3-30.
Bhaskar, Roy. “Knowledge.” In A Dictionary of Marxist Thought, q.v. 
Bhaskar, Roy. “Realism.” In A Dictionary of Marxist Thought, q.v.
Calogero, Guido, and Lawrence H. Starkey. “Eleaticism.” Encyclopaedia Britannica: Macropaedia. 1974. 
Crick, Bernard, ed., introd., and annotations. George Orwell: Nineteen Eighty-Four. Oxford, UK: Clarendon, 1984.
Derrida, Jacques. Writing and Difference. Trans. Alan Bass. Chicago: U of Chicago P, 1978.
A Dictionary of Marxist Thought. Ed. Tom Bottomore. Cambridge, MA: Harvard UP, 1983.
Eagleton, Terry. "Fredric Jameson: The Politics of Style." Diacritics 12.3 (Autumn 1982): 14-22.
Feminism/Postmodernism. Ed. and Introd. Linda J. Nicholson. New York: Routledge, 1990.
Fetscher, Iring. “Hegel.” A Dictionary of Marxist Thought, q.v.
Huxley, Aldous, comp., commentary. The Perennial Philosophy. New York: Harper, 1945.
Huyssen, Andreas. “Mapping the Postmodern.” 1984, 1986. Feminism/Postmodernism.
Le Guin, Ursula K. “Dancing to Ganam.” Amazing Sept. 1993. Coll. A Fisherman of the Inland Sea. New York:   
      HarperPrism, 1994.
Lehman, David. Signs of the Times: Deconstruction and the Fall of Paul de Man. New York: Poseidon-Simon & Schuster, 
      1991. 
Miller, J. Hillis. “The Critic as Host.” In Miller et al. Deconstruction and Criticism. New York: Seabury, 1979. 
Orwell, George (pseud. of Eric Blair). Nineteen Eighty-Four. 1949. Rpt. 1984. New York: NAL, 1961.
Popkin, Richard H. “Skepticism.” Encyclopaedia Britannica: Macropaedia. 1974.
Voices of Time: A Cooperative Survey of Man’s Views of Time as Expressed by the Sciences and by the Humanities. Ed. J. T. 
      Fraser. New York: George Braziller, 1966.



Also note: 
Danner, Mark. “Words in a Time of War: On Rhetoric, Truth and Power.” In András Szántó, ed., What Orwell Didn’t Know: Propaganda and the New Face of American Politics. 1st ed. Philadelphia, PA: PublicAffairs, 2007.




       [1] Much of the following comes from "Transition: The Social Construction of Reality (Background for Four Ways to Forgiveness and A Fisherman of the Inland Sea)," Chapter 10 of Coyote's Song: The Teaching Stories of Ursula K. Le Guin. My handling of social construction is much more positive in that context, as necessary for understanding how seriously Le Guin played with such ideas in the linked novellas in Four Ways and the major stories in Fisherman.