Saturday, November 9, 2013

Firing Editors Is Almost as Much Fun as Firing Guns

Dick Metcalf: Insufficiently fanatic about guns.
One of the more intriguing stories of the last few days concerns now-erstwhile Guns and Ammo Contributing Editor Dick Metcalf, who was sacked by that magazine for penning an editorial in which he made the utterly reasonable suggestion that some restrictions on gun ownership are not only constitutional but possibly even a good idea:
The fact is, all constitutional rights are regulated, always have been, and need to be. Freedom of speech is regulated. You cannot falsely and deliberately shout “Fire!” in a crowded theater. Freedom of religion is regulated. A church cannot practice human sacrifice. Freedom of assembly is regulated. People who don’t like you can’t gather an “anti-you” demonstration on your front lawn without your permission. And it is illegal for convicted felons or the clinically insane to keep and bear arms….

I firmly believe that all U.S. citizens have a right to keep and bear arms, but I do not believe they have the right to use them irresponsibly….

I don’t think requiring 16 hours of training to qualify for a concealed carry license is an infringement in and of itself.
I mean, wow, right? This, like any manifestation of sanity or moderation, is tantamount to treason in the world of the echo chamber. This applies to the left and the right, of course, and the phenomenon doesn’t change based on whether or not I agree with the content of the material.

Naturally, the zealots called for blood, and, as has become a depressingly familiar reaction of late, the less-crazy promptly capitulated. Editor Jim Bequette immediately (or nearly immediately) fired Metcalf, apologized profusely to the readership, and promptly resigned as well. (He was planning to step down in another month or two, anyway.)

Let’s be clear: it is absolutely within Bequette’s rights to fire Metcalf. Guns and Ammo is a privately-owned enterprise. They’re under no obligation to provide a forum for views they don’t support. Nor can they be faulted for making a business decision to accommodate the collective will of their subscribers (barring some sort of discrimination, and there is no evidence of that). There’s nothing wrong, either, with those readers’ exercising their right to protest decisions they don’t like. And Metcalf’s response to the controversy—a column published on the Outdoor Wire website—comes perilously close to conflating the philosophical/ethical concept of freedom of speech with the constitutional/legal protection of freedom from governmental restrictions on free expression.

All that said, the story remains disappointing. First off, if Metcalf is to be believed—and I’ve seen no evidence to the contrary—he was asked to write the offending article on a page that was “intentionally designed to address controversial issues, and to invite reader response.” His intention in writing and, apparently, Bequette’s in publishing it, was to “provoke a debate” (Metcalf), or to “generate a healthy exchange” (Bequette). What they got was simply a shitstorm, which, to say the least, isn’t the same thing.

More disturbingly, Bequette apologized for what should have been a good thing: the attempt to apply a little nuance to the conversation, to assert that it is possible to be pro-gun without being doctrinaire about it. To be sure, there is nothing inherently positive about moderation, but Bequette’s servility is truly problematic:
I once again offer my personal apology. I understand what our valued readers want. I understand what you believe in when it comes to gun rights, and I believe the same thing.

I made a mistake by publishing the column. I thought it would generate a healthy exchange of ideas on gun rights. I miscalculated, pure and simple. I was wrong, and I ask your forgiveness.
To call this craven display troubling is akin to saying it can get a little brisk in Duluth in January. After all, Bequette effectively begs forgiveness for overestimating his audience, for daring to believe that the G&A readership came to their conclusions through thoughtful analysis rather than holding them as articles of faith beyond any rational scrutiny. What “our valued readers want” is red meat, not… you know… thinking. Bequette implies that anyone, including a long-time gun proponent, who suggests that recognizing judicial precedent and political realities might not be a bad idea… well, such a person is to be treated like crazy Uncle Carl. If anyone was mistreated by Bequette, it wasn’t the readership whose tender sensibilities were sore offended by the ravings of someone who agrees with them only 98% of the time. No, it was Metcalf, who was sent as a lamb to the slaughter, and then unceremoniously dumped when Bequette’s decision to publish an editorial with even a whiff of moderation blew up.

Curmie’s netpal Jack Marshall asks “Was ‘Guns and Ammo’ unfair to fire Dick Metcalf for writing a moderate and thoughtful opinion piece advocating some gun controls?” My answer is equivocal. On the one hand, Metcalf bears responsibility for his own words, and he had to know that suggesting anything other than conventional NRA-like dogma might end badly. Moreover, the whole comparison of gun ownership to car ownership may appeal to the moderately-minded, but anyone with a lick of sense knows that to conflate those ideas in something to be published in Guns and Ammo is to pour every manner of flammable liquid on an already raging conflagration. And, as noted above, G&A can hire and fire whomever they choose for whatever (legal) reason they choose. Metcalf knew that when he signed on. But yes, it is unfair that Metcalf was asked to write a piece intended to engender discussion and was fired for then doing so. Note: “unfair” is not to be confused with “immoral” or “outrageous.”

More significantly, the increasing unwillingness to admit even the possibility of compromise or even consensus signals not merely the immaturity of both the editors and the readership; it portends a crisis in democracy itself. (I wrote about this in one of my last essays in the old blog.) Jack Marshall observes that “Advocacy organizations cannot afford to let moderate positions weaken their absolute missions and credibility as champions for them, no matter how reasonable those moderate suggestions may be to objective parties. Indeed, properly used, extreme and absolute positions lead to more moderate policies.” I disagree. Refusing to consider alternative points of view runs counter to everything we in a democratic and deliberative society ought to hold dear. And the fact that moderate policies sometimes result from clashes of extreme ideologies (even then, only as the result of eventual compromise, which the G&As of the world regard as anathema) strikes me as mere consequentialism: puerile behavior cannot be legitimized by the off-chance of a positive result.

None of this, of course, means that the leadership at Guns and Ammo did anything wrong. Not unless they want to be taken seriously by anyone but True Believers, that is… and they pretty much gave up any claim to that a long time ago.

Sunday, November 3, 2013

Lexiles and the Fetishization of Quantification

A year and a half or so ago, I wrote about an inane report on how the level of discourse of our Congresscritters has degenerated in recent years. And, of course, there was “objective” analysis, which didn’t have anything to do with renewed fervor for partisan hackery, increasing disregard for the accuracy of one’s statements, or the apparently undeniable siren song of equating one’s political opponents with Hitler. No, the scourge blighting American political discourse was an insidious combination of monosyllabic words and simple sentences.

You see, there’s something called the Flesch-Kincaid score, which bases an objective (albeit worthless) ranking of the sophistication of one’s rhetoric on two factors: syllables per word and words per sentence. The F-K scores are alleged to match up with the grade level of the linguistic sophistication of the piece under consideration. So what are we to make of the fact that Martin Luther King’s “I Have a Dream” speech, one of the greatest displays of rhetorical brilliance ever created, checks in at freshman-in-high-school level? Or that the most famous line from that speech, “I have a dream today,” gets a score of 0.52? You and I, Gentle Reader, see a problem with the methodology.
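For the record, that 0.52 isn’t a misprint; the arithmetic is easy to reproduce. Here’s a minimal sketch in Python of the standard Flesch-Kincaid grade-level formula, with the syllable counts supplied by hand (automated syllable counting is its own can of worms):

```python
def flesch_kincaid_grade(words, sentences, syllables):
    """Standard Flesch-Kincaid grade-level formula:
    0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59"""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# "I have a dream today": 5 words, 1 sentence, 6 syllables (counted by hand)
print(round(flesch_kincaid_grade(5, 1, 6), 2))  # -> 0.52
```

The formula does exactly what it promises, and what it promises has precious little to do with rhetorical brilliance.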

Not so, however, the hack journalists on both sides of the political divide, who pounced on the alleged “evidence” to show that The Other Guy really is as ignorant as My Side says. Exhibit A, Fox News: “Obama’s SOTU Written at 8th Grade Level for Third Straight Year.” Exhibit B, The Inquisitr: “Congress Officially Dumber, Study Finds, Republicans in Particular Showing Marked Decline.”

In writing about this pseudo-evidence, Curmie exclaimed, he hoped rhetorically, “Seriously, is there a study that means less than this one?” Alas, someone took that as a challenge. Unfortunately, this particular round of pseudo-scientific idiocy is a lot more serious, because it actually has consequences.

The concept of “Lexiles” has apparently been around for a generation, and I think I knew that, but the first time I really registered the term was while reading a rather foam-flecked article on the New Republic site. Yes, the faux terror of “federal bureaucrats” is hyperbolic at best, and author (and University of Iowa professor) Blaine Greteman casually ignores the fact that Lexiles are only one means of determining “readability” levels. For all that, this is scary stuff, indeed.

Let’s face it. If it is even possible to begin a story the way Greteman does, the supposedly objective criteria being employed are worse than worthless:
Here’s a pop quiz: according to the measurements used in the new Common Core Standards, which of these books would be complex enough for a ninth grader?

a. Huckleberry Finn
b. To Kill a Mockingbird
c. Jane Eyre
d. Sports Illustrated for Kids' Awesome Athletes!

The only correct answer is “d,” since all the others have a “Lexile” score so low that they are deemed most appropriate for fourth, fifth, or sixth graders.
Kid stuff.
Wow. I was one of the top couple of students in my 10th grade English class when we read Huckleberry Finn, and to say it was plenty “complex” for me would be to err on the side of understatement rather than hyperbole. “Complexity” has more to it than vocabulary and sentence structure… you know, like subject matter and levels of ambiguity and symbology and… well, other things that actually matter.

Let’s face it: if you’ve got a mechanism that says that Mr. Popper’s Penguins is a tougher read than Hemingway, Steinbeck, or Graham Greene, you’ve got a system so flawed that it deserves to be promptly shit-canned. Naturally, the Common Core geniuses institutionalized it.

There’s a good discussion by Rider University prof Russ Walsh, in which he references the actual CCSS (Common Core State Standards):
Appendix A of the CCSS defines text complexity. It is very important that all teachers understand this definition because it is useful and empowering. Text Complexity is made up of three dimensions: quantitative, qualitative, and reader and task. Quantitative measures include the useful, but limited “reading level” measures like the Fry Readability Graph and Lexile scores. The CCSS has focused on Lexile scores and has recalculated them to reflect the push for greater complexity. Qualitative measures include such issues as considerateness of text (clear structure, coherence, audience appropriateness), knowledge demands, use of figurative language, vocabulary, etc. Reader and task measures concern themselves with the cognitive abilities, motivation and experience of students.
Walsh does describe the Lexiles as “useful,” but that’s because he’s nicer than me, not because we really disagree. Lexiles are also “limited” in Walsh’s world; more importantly, he “[worries] that simplistic readings of this call for greater text complexity will lead to disempowerment of teachers and inappropriate reading assignments for students.” The likelihood of that happening, from my perspective, borders on ontological certitude.

That’s because the fix is already in. Here’s Greteman again:
As the Common Core Standards Initiative officially puts it, “until widely available quantitative tools can better account for factors recognized as making such texts challenging, including multiple levels of meaning and mature themes, preference should likely be given to qualitative measures of text complexity when evaluating narrative fiction intended for students in grade 6 and above.” But even here, the final goal is a more complex algorithm; qualitative measurement fills in as a flawed stopgap….

… I believe that many educational leaders will hear the call for more complex texts and will try to force teachers to use texts that are simply harder. Remember “rigor” is not in the difficulty of the text, but in the vigor of the instruction. Texts that are too difficult for a particular reader are not rigorous; they are just hard. I also believe that publishing companies, under the guise of the publishers guidelines provided them by the authors of the CCSS for ELA, will soon be on the market with anthologies purported to “meet the standards” that will not meet the needs of many students.
Heady intellectual fare.
In other words: the CCSS folks aren’t going to budge on using their stupid rubric for grades 5 and below, and the expertise of teachers will become subservient to their divinely inspired claptrap as soon as they can develop quantifiable criteria that are merely inaccurate instead of So Freaking Imbecilic That Nobody Could Take Them Seriously. Well, nobody but the likes of The Atlantic’s Eleanor Barkhorn and whoever wrote that headline: “Teachers Are Supposed to Assign Harder Books, but They Aren't Doing It Yet.”

Implicit here is the double whammy of shoddy journalism: the assumption that anything the corporatists behind the Common Core require is legitimate, and the accompanying belief that the problem isn’t bullshit requirements, but teachers’ intransigence. Of course, teachers, principals, superintendents, and school boards have not only the right but the responsibility to reject nonsense like Lexiles and to substitute their own… wait for it… expertise. (Anyone who has read Curmie’s page for a while knows my take on the two meanings of authority.)

Of course, using completely nonsensical means to measure literature isn’t new. Well over 2400 years ago, Aristophanes concluded his long farcical confrontation between Aeschylus and Euripides in The Frogs by having Dionysus challenge them to recite their “weightiest” verses. Aeschylus wins every round because rivers are “heavier” than cattle, death than persuasion, chariots than clubs, etc. “Weight,” then, no longer means “profundity of thought or expression,” but rather, “the gravitational attraction or solemnity of the subject matter.” It’s a joke, of course, and the overwhelming majority of the audience in late 5th-century BC Athens would have caught on. Somehow, however, I suspect that there was someone named Βιλλγάτες or Αρνεδύνκαν who saw nothing wrong with the test. It was objective, after all.

The attempt to scientize art got a huge jolt in the mid- to late 19th century, with the rise of naturalism, a movement headed by Émile Zola but ultimately inspired by social Darwinism. Zola, Balzac, and their contemporaries generated some stark portrayals of life as lived, especially in urban areas during the Industrial Revolution. They also propagated some really bad theatre (and quasi-theatre): since art was supposed to replicate life, the ultimate expression of theatrical naturalism was to charge admission to watch an actual butcher carve up a beef carcass. Really. No wonder Henrik Ibsen supposedly described the difference between realism and naturalism by tersely observing, “I go to the sewers to clean them; Zola, to bathe.”

The earnest desire to make art more scientific (Curmie resists the urge to use the phrase “reduce art to the level of science”) also manifested in the work of François Delsarte, best known for his voluminous commentaries on elocution, most especially his System of Oratory. Delsarte writes expansively about virtually every conceivable vocal inflection and gesture, seeking a communication system that is efficient and transparent. In one sense, this is a noble goal; in another, it’s transcendently naïve. The fact is that in life and especially in art, ambiguity is not merely ubiquitous but desirable—and this is true whether we subscribe to the modernist model of artists’ embedding meaning to be discovered by the spectator or the post-positivist schema in which the artist’s function is to catalyze meaning which is ultimately constructed by the observer. Delsarte was a significant player in the development of modernism, he influenced a number of important figures in theatre history (Steele Mackaye, for example), and he is still highly regarded in much of the dance community. Theatre people today, however, regard him (rightfully, I’d suggest) as the quintessence of wrong-headed reductionism.

This idea that objectivity often creates precisely the opposite result of that which was intended serves as one of the themes of The Memorandum, one of the best plays by the late, great Václav Havel. Shortly after Havel’s death two years ago I dug up a piece I’d originally written for the Cedar Rapids Gazette back in 1990, and re-printed it on my other blog. I therefore quote myself from 23 years ago:
In The Memorandum (1968), the central character, Gross, heads an unnamed government agency. After authorizing the circumvention of a silly regulation, he allows himself to be half blackmailed, half cajoled, by an amoral underling into approving a new synthetic language called Ptydepe. Said to make communication more “efficient,” Ptydepe mandates that every word must differ from every other word of equal length by at least sixty per cent of its letters. Further, words are assigned to concepts by the relative frequency of their use: the more often a word is used, the shorter it will be. Hence, the shortest word in the language, “ng,” means “whatever”; the word for “wombat” has 319 letters.
I should add that in Ptydepe, vowels and consonants are used in random sequence, so a word with a half dozen consecutive consonants is not uncommon. Moreover, every variation of a conventional word, including whether it’s used in jest, or to surprise the listener, gets its own Ptydepe word—defeating the purpose of language play altogether. Eventually, of course, Ptydepe proves to be an unsuccessful experiment, so it is eliminated… to be replaced not by the perfectly workable normative language that people can understand, but by a new, totally different synthetic language with a completely different but equally inane set of rules.
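For the algorithmically inclined, the sixty-percent rule is easy enough to check mechanically, at least on one natural reading of it (letter-by-letter positional difference). Here’s a toy sketch in Python; the sample words are invented purely for illustration, since Havel never supplied an actual Ptydepe lexicon:

```python
def differs_enough(word_a, word_b, threshold=0.6):
    """Ptydepe's rule, as described in the play: two words of equal length
    must differ in at least 60% of their letter positions."""
    if len(word_a) != len(word_b):
        return True  # the rule only constrains words of equal length
    differing = sum(1 for a, b in zip(word_a, word_b) if a != b)
    return differing / len(word_a) >= threshold

# Invented examples, purely for illustration:
print(differs_enough("yb", "ko"))        # True: both letters differ
print(differs_enough("axonu", "axoni"))  # False: only 1 of 5 letters differs
```

A vocabulary designed around that constraint optimizes, of course, for everything except being usable by actual human beings.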

You see where this is going, don’t you, Gentle Reader? Objectivity for its own sake is inherently idiotic. Eliminating ambiguity provides a false efficiency, but it destroys nuance, and ultimately both language and art suffer irreparable damage. Everybody knows that. Well, everybody but the educationists. They sort of think that an educational system based on the philosophical principles advocated by the Communist apparatchiks in post-Stalinist Eastern Europe is the way to go. How very “American” of them.