Wednesday, December 13, 2023

Context, and the Effort to Pretend It Doesn't Matter

Liz Magill testifying
For better or worse, the testimony of the presidents (well, one of them is now an ex-president) of Harvard, MIT, and the University of Pennsylvania before a congressional committee a little over a week ago remains a subject of considerable commentary, particularly from those who sought to decry the alleged evasiveness of the presidents’ responses.

Specifically, the trio wouldn’t answer “yes” or “no” to questions posed by Rep. Elise Stefanik about whether calling for the genocide of Jews violates their universities’ policies on bullying and harassment.  They gave boilerplate responses, suggesting that context would be a determining factor.  Yes, they looked sort of bad, but Curmie struggled to discern how their answers were inappropriate.  Context does matter, always; terms like “pervasive,” “severe,” and “directed” may seem legalistic, but Curmie isn’t sure how one can answer legalistic questions without responding in kind.

Moreover, Curmie can find no one who advocated for the “genocide of Jews” in those terms.  There were, apparently, chants of “intifada,” and of “from the river to the sea.”  The former refers to resistance against oppression; it suggests the possibility of violence, and that’s what Hamas means by it, but the word is apparently used not infrequently by others without that implication. 

Similarly, “from the river to the sea, Palestine shall be free” is a rallying cry for Palestinians.  But, as Laurie Kellman points out in an AP article, “what the phrase means depends on who is telling the story — and which audience is hearing it.  Many Palestinian activists say it’s a call for peace and equality after 75 years of Israeli statehood and decades-long, open-ended Israeli military rule over millions of Palestinians. Jews hear a clear demand for Israel’s destruction.”

In other words (wait for it), context matters.  When used to justify Hamas’s slaughter and kidnapping of innocent people whose only offense was to be in the wrong place at the wrong time, or as a means of intimidating specific Jewish people, it means something different than simply a declaration of general solidarity with the Palestinian cause.  Yes, it may be naïve to believe that one is expressing the latter sentiment without recognizing the distinct possibility that others, particularly but not exclusively Jews, will hear something different from what was intended.  But naïveté should not be a punishable offense, or, rather, it carries its own punishment.

Let’s be real for a second.  Congressional committee hearings are seldom intended to illuminate an issue, but rather to provide Congresscritters an opportunity to grandstand for their respective political bases.  Witnesses at such hearings are generally treated as piñatas to be pummeled by smug and censorious pols blinded by their moment in the spotlight.  Stefanik succeeded admirably at getting her name in the papers (be honest, Gentle Reader, had you ever heard of her before last week?), largely by asking questions that could not reasonably be answered with the “yes” or “no” responses she so petulantly demanded.  And she got a particularly high score on the smugometer.  The link included above is from her YouTube page.  She’s pretty damned proud of herself.  Curmie doesn’t think she should be.

As expected, and no doubt intended, the press was all over the hearing.  We heard lots about the presidents’ “inability” to give straight answers.  Their performance was derided by pro-Israel pundits and parodied by Saturday Night Live (to be fair, the show pretty well skewered Stefanik, too).  Omitted from this commentary was any recognition that there’s a difference between speech which is repulsive and speech which ought to be suppressed.  Nor was there any indication that the chattering class understands that criticism of the Israeli government is not intrinsically anti-Semitic, any more than criticizing the Obama administration was intrinsically racist.  Sometimes, yes.  Always?  No.

These reports also gloat over Ross Stevens, the UPenn alum who withdrew a $100 million donation because he decided the university was doing too little to suppress what he regarded as hate speech.  This, probably more than the reaction to now ex-president Liz Magill’s testimony at the committee hearing, is likely to have been the deciding factor in her no doubt forced decision to resign.  Curmie, who would prefer that educational policy be determined by someone other than plutocrats, is less enthusiastic about this maneuver.

No one, certainly not Curmie, is countenancing the actual intimidation and harassment of Jewish students, which has in fact been occurring on college campuses across the country.  How widespread it is may be difficult to determine—Curmie has seen none of it first-hand—but it unquestionably exists, and needs to be aggressively addressed.  The cries from Muslim advocates that this situation is no different than what their predecessors endured post-9/11 are probably accurate but ultimately irrelevant.  Two wrongs…

But revenons à nos moutons; back to that testimony.  As you can no doubt discern from the commentary above, Gentle Reader, Curmie was not and is not particularly appalled by those university presidents’ answers.  Guarantees of free speech mean nothing if they apply only to non-objectionable speech.  Whereas it is true that as private universities the three schools in question are not bound by the 1st Amendment, as elite secular institutions of higher learning they have a responsibility to be at least as open to speech, even that which they may find abhorrent, as do their public counterparts.  Plus, of course, their respective mission statements promise that level of protected speech.

But the apparent unanimity of response, even from trusted sources, gave Curmie pause.  It’s clear that the line between protected speech and harassment, intimidation, or incitement is very thin and virtually indiscernible.  Might Curmie have found himself on the wrong side of that line?  Was his contempt for Rep. Stefanik’s self-righteous (and, indeed, harassing) posturing affecting his judgment on the larger issue?

Then, yesterday morning, he read an op-ed in the Los Angeles Times penned by Eugene Volokh and Will Creeley.  The former is one of the most respected constitutional scholars in the country; the latter is legal director at the Foundation for Individual Rights and Expression (FIRE).  When it comes to defining what is and what is not protected speech, we could do a lot worse than listening to what either of these gentlemen has to say.

Curmie encourages you, Gentle Reader, to read the article in its entirety, but here’s their most important paragraph: “Antisemitism on campus is a real problem, and in this fraught moment, many Jewish students are understandably scared. But if freedom of expression is to survive on American campuses — and for our nation’s vitality, it must — Magill’s original answer was right. Context does matter.”

They subsequently argue that calls for intifada are protected or not according to context.  And whereas the segment quoted above is the best encapsulation of their total argument, the part that most caught Curmie’s attention comes later, in a consideration of the ethics and morality of killing civilians in order to advance a politico-military agenda or, especially, to deter other killing.  They note that the same argument could be made by both Hamas in their self-image as freedom-fighters and by the Israeli government in terms of their willingness to allow predictable and significant collateral damage in the form of civilian casualties in their attacks on Hamas.

Significantly, write Volokh and Creeley, “a broad rule against ‘calling for mass killing’ would render this discussion subject to punishment. Indeed, it would mean students could be punished for using the same argument to defend the American bombing of Hiroshima and Nagasaki.” 

Curmie notes an even more significant example: the British decision to allow Coventry to be bombed so that the Nazis would not learn that the Enigma code had been cracked.  That decision—sacrificing not merely civilians, but one’s own civilians—was a key to the Allied victory.  It could literally have changed not merely the duration of World War II, but its very outcome.  It must have been a horrible decision to make, but it isn’t difficult to understand the reasoning.  The idea that a discussion of the propriety of that strategy would somehow be off limits undercuts the entire rationale for the existence of universities.

Here is where Curmie goes all Confucian (again) and insists upon looking at these incidents on a case-by-case basis.  Some of the words and actions clearly ought to be forbidden; others, however upsetting they may be, ought just as clearly to be allowed.  And then there are the ones in the middle, where honorable, objective, and well-intentioned people can disagree.  It would be nice to have an actual Confucius around to make those close calls, but (alas!) he’s unlikely to appear on the scene.

The goal is, and must be, to protect one group of students and faculty from actual harassment and true threats while protecting the freedom of expression of a different group of students and faculty.  Curmie, who lived his entire professional life subject to the often stupid decisions of university administrators, is less than entirely sanguine about turning such cases over to them, but there is no better alternative.  Distinguishing between that which is threatening and that which is merely offensive isn’t easy, but the job must be done, and done well, or we all lose.

Monday, December 11, 2023

"Good" vs. "Great" Teaching

Thirty-something years ago, Curmie read an article—perhaps it was in an academic journal, but he can’t remember for certain—that sought to distinguish between “good” teachers and “great” teachers.  Interestingly (to Curmie, at least), the author expressed his preference for the former: the in-the-trenches professional who successfully imparts and explains the prescribed course material while maintaining decorum and respect for authority.  This teacher is not boring, but the class is unquestionably about the material, not the teacher.

The latter, exemplified by the Robin Williams character in “Dead Poets Society,” is a showman, someone whose passion and energy are inspirational.  The author felt that such a teaching style was narcissistic or something.  Curmie remembers not being quite able to wrap his head around the idea that “good” was somehow better than “great,” wondering if the author might be a little jealous that his colleagues were more popular than he.

Anyway, something a few days ago made me reminisce a little about the great—in both the generally accepted definition of that term and the one adopted by that author—teachers I experienced in my many years as a student.  Like everyone else, Curmie had more than a few teachers who were neither good nor great.  His high school Analytic Geometry teacher was expert at the pedagogical equivalent of burying the lede, that Economics 1 prof in college could put coffee to sleep, and so on.  But Curmie was lucky—there were few of that description and a lot of the other kind.

Curmie read a few months ago that the woman who taught his 3rd and 5th grade classes had passed away at the age of 93.  It was only on reading her obituary that he learned that she had become principal of the school sometime after he’d moved away from the town.  His first thought was that whereas he was glad she got that promotion if that’s what she wanted, he was sad for those who came after him that they wouldn’t experience her in the classroom.  She wasn’t “great” according to the definitions of that author, but she was great in the mind of at least this former student. 

There were others who were certainly very good but, at least in my pre-college days, one stands out.  He was a young man, probably in his mid-20s, who taught 7th grade Social Studies.  I don’t know how it was that this Floridian came to teach in my school an hour and a half or two hours north of New York City.  But I do know that few if any people outside my immediate family have had as much influence in determining the person I ultimately became.

Imagine this, if you will, Gentle Reader: a bunch of tweens in the mid- to late-‘60s talking about the world around them.  This was the era of civil rights demonstrations, the women’s movement (then called Women’s Liberation), and the Vietnam War.  Importantly, we weren’t allowed to simply spout opinions; we had to have done research… so we learned how to do that.  I remember spending a lot of time in both the school and public libraries.   

In class we were introduced to anthropology and through it to cultures unlike our own, to the rationale both for the status quo and for dissent, to the essence of the Bill of Rights.  We debated ideas and perspectives in a way I’d never seen or heard.  We wrote research papers—mine was on environmental policy, especially as it pertained to air pollution (pretty heady stuff for a 12-year-old!).  And we were expected to cite sources appropriately (in a way few of my 21st-century university students could manage). 

His class opened my mind to the idea that education wasn’t just about memorizing facts, but about considering how facts fit together.  He taught us to walk that thin line, simultaneously respecting authority and being willing to challenge it, and I don’t think I ever as much as suspected what his politics were.  He was tough because he cared.  He was, in short, one of the best teachers I ever had.

Ah, but there was a problem, as you’ve probably already ascertained, Gentle Reader.  You see, what he was supposed to be teaching us was New York State history.  He wasn’t from New York and he believed, correctly I think, that such a course was essentially useless.  Even then, the population was increasingly mobile, and many if not most of us would end up living elsewhere.  I doubt that my own history of spending less than 10% of my adult life in New York (and that small fraction only because of grad school at Cornell) is an outlier.  Moreover, the really important stuff about New York history would appear in American History courses, and it’s unlikely that anyone, even those who never left the state, would ever care about the rest.

He did try to be “good” for a few weeks, but he sort of ran out of gas in our chronological progression shortly after John Peter Zenger.  Perhaps I’ve forgotten our section on the American Revolution, but I honestly don’t think we got that far.  There was a fair amount of history we didn’t cover, in other words!  Anyway, we learned about the Seneca Falls Convention because it marked a seminal moment in the women’s movement, not because it happened in New York State. 

For all I know, the teacher went to the administration and got permission to teach something other than New York State history.  I kind of doubt it, but I suppose it’s possible.  What I know with every fiber of my being is that I’m extremely glad I got the course I did.  But he probably was, as Dickens might have said, “the best of teachers, the worst of teachers.”

Let’s grant that permission to scrap the prescribed curriculum was probably not forthcoming, and that cashing a paycheck for doing X and not actually doing X is unethical: more so, in fact, than Robin Williams-like flamboyance.  That I, and I strongly suspect a majority of my classmates, got far more out of this man’s course than we would have by learning the details of building the Erie Canal is, of course, consequentialism.  

But what if those consequences are eminently predictable?  I was, and am, fascinated by history, but I can’t imagine a year-long course in New York State history that would interest tweens in the slightest: not then, not now.  The course was, no doubt, the brainstorm of some idiot state legislator (the usual apologies for redundancy).  From my perspective, that puts us into “any change is an improvement” territory.  And the course he did teach was wonderful. 

He was, in other words, not a “good” teacher.  He was, however, great.  He didn’t mock authority, but he certainly circumvented it.  His course was not traditional by any stretch of the imagination, but in the words of the great 20th century philosopher Frank Zappa, “without deviation from the norm, progress is impossible.”  There was nothing particularly exciting about his teaching style—he didn’t jump onto desks or start chants, and the fires he lit were purely metaphorical.  But those were substantial conflagrations, even if only in our minds.

He lasted only the one year at my school.  Chances are, he was fired, but it could be that he just didn’t like the winter cold (he mentioned that more than once after trudging through the snow) and made the voluntary choice to head south again.  Either way, I hope he had a good life, and that subsequent generations of students had the opportunity to learn from him.

There were times in my teaching career when I sought to be “great”; other times I settled for trying to be “good.”  I never sought to teach something other than what the course description demanded (well, I did do things like extend Theatre History II past the 1940s, but that was because the course description hadn’t been updated in decades, and I had the full approval of the department chair).  There were plenty of moments when I was a performer as much as a pedagogue, though, and some of my courses were idiosyncratically mine.

Did I succeed at being either “great” or “good”?  That’s not for me to say.  But, as I look back at that classroom from over a half century ago, I know that I have never been so grateful that someone else, someone I knew, broke the rules.  No, he wasn’t a John Hancock or a Rosa Parks, but his influence was direct and life-changing in an unquestionably positive way.  If I ever had a similar effect on even a handful of students, I’ll consider it a win.

Thursday, November 30, 2023

About Those Monopoly Ads...


A somewhat off-topic comment on a post on Ethics Alarms got me thinking about Monopoly, and indeed about the two recent commercials I’ve seen for the board game. 

It’s now well-established that the game was actually invented by a leftist stenographer named Elizabeth “Lizzie” Magie.  Her creation, called the Landlord’s Game, originally came with two sets of rules: one in which everyone was rewarded when wealth was created, and a monopolistic version, in which players sought to accumulate all the wealth by crushing all their opponents.

Were Curmie of a cynical disposition (perish the thought!), he might suggest that the former represents the “rising tide lifts all boats” rhetoric of the predatory capitalists at the top of the economic food chain, and the latter represents their actual practice.  Magie wrote that the game “might well have been called the ‘Game of Life,’ as it contains all the elements of success and failure in the real world, and the object is the same as the human race in general seem to have, i.e., the accumulation of wealth.” 

The latter version of the game, the one we now recognize as Monopoly, was essentially stolen—Magie’s patent notwithstanding—by a scoundrel named Charles Darrow, who made minor changes and sold the idea to Parker Brothers, making millions.  Parker Brothers, in turn, simultaneously pretended that the game was Darrow’s invention and bought the rights to Magie’s game for an absurdly low price.

In other words, the history of the game mirrors the game itself: it is a story of greed, shrewdness, and opportunism supplanting ethical behavior as a guiding principle.  Oh, and don’t forget luck.  It’s better to be lucky than good… in Monopoly and in life.  There are advantages to having a skillset in both, but it sure is easier IRL to have Daddy give you a few million dollars to start your business or in Monopoly to land on Boardwalk when it’s for sale rather than when someone else has a hotel there.

Interestingly, the reverse of this latter phenomenon happens if you go to jail.  It’s a disadvantage early because you can’t buy anything, but late in the game staying in jail means you’re not landing on someone else’s hotel-bearing property.  (One is tempted to note that a certain IRL late-in-the-game real estate mogul currently facing the possibility of jail time might actually benefit politically from incarceration by claiming to have been unjustly tried and convicted.)

It strikes me that, contrary to the arguments made by others on that Ethics Alarms thread, Monopoly stands alone (or at least virtually so—Curmie doesn’t claim to know all the other possibilities) as a game.  It is, in some ways, not about winning, but about your opponents losing.  It’s certainly not about the mutual benefit of all the competitors, as the other set of Lizzie Magie’s rules would have it. 

Even Life, the other board game from Curmie’s youth to involve “money” rather than points, usually ends with winners who have more money than their opponents, not all the money.  Yes, there’s a chance of winning at the end by literally going for broke (and heading to the poorhouse 90% of the time), but that just adds a little late-game interest to a game that is almost exclusively decided by luck.  No one suggests that the winner employed any particular skill, as there’s only one choice: go to college or don’t.

Let Curmie respond to comments by EA readers: Baseball games involve only two teams at a time, and despite the occasional shutout generally end with scores of 3-1 or 7-2 or whatever: not all the runs on one side.  And whereas there is an element of luck—the line drive right at the 2nd baseman, the bloop double, the blast that fades foul or stays fair by inches, and so on—the ratio of skill to luck is significantly higher there than in Monopoly.  Poker, except in high-stakes tournaments, lasts as long as the players want it to.  There’s no sense of an unfinished competition if the game ends at midnight or after 20 hands or whatever.  It doesn’t require all the chips to be in a single player’s possession to call it a night.  Sure, games like Risk involve the annihilation of the opposition (and strategy often involving deceit), but they’re not being marketed to primary school kids. 

For all its heritage as a critique of the landlord class, Monopoly has come to be regarded as a paean to capitalism, just as Bruce Springsteen’s “Born in the USA” has somehow been transformed in the public imagination to a patriotic anthem to be blared over the speakers to the accompaniment of 4th of July fireworks.  It’s pretty clear that Lizzie Magie knew that kind of transformation might happen: a society smitten with financial wealth will envy the “haves,” and will jump at the chance to live that life even if only in a fictive world.

So… about those ads…  Curmie has seen two of them for the standard game (there are far more variations on the theme than Curmie wants to contemplate).  The first, known as “8-Year Old Landlord,” shows, as the title suggests, a little girl in Leona Helmsley drag.  She’s owner of Park Place Apartments; we see her firing the janitor (her father), raising rents on her brother and the family dog, and issuing her pregnant mom a final notice before eviction.  The tag is “It’s all in the name of the game.  All is fair in Monopoly.”  Curmie supposes the ad is cute in a creepy sort of way, but it certainly condones if not encourages behavior we’d find objectionable in real life: the sort of legal but acquisitive and narcissistic impulses many renters see in their landlords.  Still, that focused amorality is a strategy for winning at Monopoly.  Can’t complain too much.

The other commercial, however, is far more problematic.  The title character and narrator of “Grand Theft Nai Nai” (“nai nai” is Chinese for “grandmother”) tells us that being 75 has its perks, notably that “People just trust you.  Blindly.”  In her capacity as a bank teller she steals a huge amount of money from customers and/or the bank itself, stuffing wads of bills into her clothing, even into her shoes.  We then cut to a shot of her doing the same thing with Monopoly money as she gloats about her wealth.  It may be that it’s her own money she’s stuffing up her sleeve in the image above, but then the parallelism is fractured.  The context certainly suggests that she’s taking money from the game’s bank, just as she did from the real-life bank for which she works.  Again, the tag is “All is fair in Monopoly.”

Except that it isn’t.  The young landlady of Park Place Apartments may be unfeeling and rapacious, but what she does is legal (well, the rationale for raising the brother’s rent may not be, but it goes by so fast that virtually no one will notice).  The old lady in the other ad is engaged in actual theft.  That’s against the rules in real life, and guess what, Gentle Reader?  It’s against the rules in Monopoly, too!  All is not fair in Monopoly; this isn’t Calvinball, where the rules are made up as the game progresses. 

Does this ad really encourage cheating?  Well, actually, yes.  The other commercial may demonstrate how to win at Monopoly; this one shows how to get arrested in real life, or at least how to deserve to be.  That’s not a good look, even for a game that has for years de facto conflated success and wealth.  Metaphoric, legal stealing—tax breaks on yachts and private jets, huge “loans” that miraculously don’t need to be repaid, that sort of thing—is one thing.  Literal theft is another.  Hasbro (which acquired Parker Brothers and therefore Monopoly in 1991) needs to ditch the ad, no matter how cute they think it is.

Friday, November 24, 2023

Observations on the Violence in Dublin

As those who know me personally know, Curmie has spent about four months in Ireland, mostly in Dublin.  I’ve led seven Study Abroad programs there and led walking tours from O’Connell Bridge in the center of the city up to Parnell Square, a kilometer or so to the north.  I’ve spent dozens of hours in that area—at the Gate Theatre, the Hugh Lane Gallery, the Garden of Remembrance, the (alas, now defunct) Dublin Writers Museum.  The last time I was in Dublin, on a whirlwind research visit in 2019, I stayed at a b&b a few blocks east of Parnell Square.  It was a long walk to the National Library south of the Liffey, but finances dictated that necessity.

One of the first things I saw on my Facebook feed this morning was a statement from the Abbey Theatre, which is a couple blocks east and a block north of O’Connell Bridge.  It said they were “thinking of all of [their] neighbours in Dublin 1,” but that the evening’s performance of Brendan Behan’s The Quare Fellow would go on as scheduled.  I wondered what happened in Dublin 1, went to the Google machine, and encountered a headline in the Irish Times saying that Taoiseach (Prime Minister) Leo Varadkar had estimated “tens of millions” of Euro in damages caused by riots that started in the Parnell Square area and moved south as far as the area around O’Connell Bridge. “Whaaaaaat?  Riots in Dublin?  That hasn’t happened in decades.  Surely there’s some mistake,” I thought.  Alas, no.

In the early afternoon Dublin time Thursday, moments after the Macy’s parade was starting in New York, four people, including three small children, were injured in a knife attack outside the Gaelscoil Choláiste Mhuire, a primary school across the street from the Garden of Remembrance.  As of this writing (Friday evening in the US), one little girl and the childcare worker (it’s unclear whether she’s a teacher or other staffer) who launched herself between the attacker and the children remain in critical condition; the girl is said to be “fighting for her life.”  The only good news is that the perpetrator seems to have been acting alone.

This story is horrible enough.  But somehow this attack led to riots, with bus drivers pulled from their vehicles, petrol bombs thrown at refugee centers, public transport vehicles set alight, looting of over a dozen stores, and multiple injuries to gardaí (police).  The facts—or, rather, the closest we can come to facts—enumerated here are drawn from a variety of sources: the Irish Times, Irish Independent, Irish Mirror, RTE, ITV, the BBC, the Guardian, and the Telegraph.  I’m not going to bother to try to link every statement to a specific source: this is a blog piece, not an academic article.  You can, as they say, do your own research, Gentle Reader.

The principal rioters were young—in their late teens or twenties—but they seem to have been egged on by folks of an older generation.  Hundreds of people were involved in the violence.  As I write this, arrests from the riots number in the 30s; “many more” are promised after the authorities examine CCTV footage.  Curmie is no fan of the level of government surveillance that is common in Ireland and the UK, but if regular people have to put up with Big Brother, then at least that technology can be used to arrest and convict every one of these assholes.

If government sources are correct, the riots were a coordinated effort by what Garda (Police) Commissioner Drew Harris describes as a “complete lunatic faction driven by far-right ideology.” Varadkar added, “These criminals did not do what they did because they love Ireland. They did not do what they did because they wanted to protect Irish people. They did not do it out of any sense of patriotism, however warped.  They did so because they’re filled with hate, they love violence, they love chaos and they love causing pain to others.”

Whether or not these characterizations are accurate, and indeed whether or not this kind of violence was predictable, as opposition pols have suggested, there does appear to be ample evidence that the riots can be traced to anti-immigrant animosity.

Posts on social media identified the perpetrator of the knife attack as an immigrant, and that was enough to enflame the loonies.  Trouble is, the assailant was an Irish citizen, albeit foreign-born.  He’d lived in the country for twenty years.  Oops! 

You know who really is a migrant?  The guy who stopped the knife-wielding attacker.  BIG OOPS!  Deliveroo is an Irish equivalent of DoorDash or Uber Eats.  One of their delivery drivers is a man named Caio Benicio.  He’d been in Ireland for only a year or so, trying to make some money to send home to Brazil, where his restaurant had burned to the ground.

He was riding his motor scooter past the area, saw the knife and the threat to children, and went into action.  He says he didn’t have time to be afraid; he just responded to what he saw.  He stopped his bike and ran towards the scene while removing his helmet, with which he proceeded to clobber the assailant over the head, knocking him to the ground.  Others grabbed the knife and restrained the perp.

(Side note: there’s a GoFundMe appeal called “Buy Caio Benicio a Pint.”  Folks are encouraged to donate the cost of a pint of beer at their local pub to Mr. Benicio.  At the time of this writing, the pot stands at over €300,000, or over $330,000.  It would take a lot of overtime at Deliveroo and some very generous tips to make that kind of money.  Mr. Benicio doesn’t see himself as a hero, just someone in the right place at the right time: another reason I don’t begrudge him a penny of that GoFundMe haul.  I just wish the woman who was seriously injured while protecting her young charges would receive similar assistance.)

Of course, the riots were spawned by people believing what they saw on social media, an even riskier proposition than trusting journalists.  The temptation to indulge in a little confirmation bias borders on the overwhelming.  But that’s only part of the problem.

Back in the Dark Ages when I started college as an undergraduate Government major, one of the distinctions made between traditional liberalism and traditional conservatism was that the former looked for trends involving groups of people (sexes, races, religions, etc.) and the latter centered on the individual.  There are appeals to both points of view: the former stands more explicitly against racism, sexism, and the like; the latter points out that this particular person of a “privileged” class isn’t necessarily an oppressor and that particular person from a “disenfranchised” class isn’t necessarily a victim.  (Oprah Winfrey is actually a little less oppressed than the average white male Walmart employee.)

But I become increasingly convinced of the wisdom of the horseshoe theory’s suggestion that the far left and the far right share an interest in authoritarianism that makes them more similar to each other than to the more libertarian mainstream manifestations of either philosophy.  And that means a tendency to stray from the positive attributes of those political perspectives.  (So says this civil libertarian, at least.)  In this case, it appears that the far right disregarded the individual and classed all “foreigners” as the Other.  (Curmie struggles to resist the urge to say “as usual.”)  And, of course, getting the facts right was of secondary importance, if it mattered at all.

There is plenty of cause for re-evaluating the Irish government’s policies on immigration.  The same could be said for virtually any country, including our own.  Looting department stores and setting fire to police cars would seem a rather counter-productive means of achieving that end, however.

Of course, even if there were a few hundred rioters and even if they were all right-wingers, they still represent a tiny minority of the Dublin population.  The left needs to recognize that “not all conservatives” remains as legitimate an argument as “not all immigrants.”  Who knows?  Maybe they’ll actually understand that.  Just don’t bet the rent.

Friday, November 17, 2023

Eye Black Is Not Blackface. Duh.

If you see blackface here, please leave
this page.  It’s for intelligent people.

A few days ago, I commented on a post on Ethics Alarms that the high school principal in Sherman, Texas who declared that the musical Oklahoma! contains “mature adult themes, profane language, and sexual content” “would come in third place in a battle of wits with a sack of hair and an anvil.”

Gentle Reader, I hereby retract that characterization.  It appears that Sherman Principal Scott Johnson was merely a good soldier, enforcing the dictates of a superintendent and school board that can’t decide if the Victorian age was a little too permissive.  So… Johnson appears capable of giving that anvil a run for its money. 

The good news is that the international attention this case received resulted first in a decision to reinstate the original student cast, albeit in a shortened “kids” version of the musical that would have cut the solo from Max Hightower, the trans student at the center of the controversy, and finally—when the students and parents refused to accept that utterly stupid “compromise” or the notion that Oklahoma!, of all plays, ought to be bowdlerized—in a return to the original version with the students the director cast.

More to the present point, when compared to Jeff Luna, the principal at Muirland Middle School in La Jolla, California, even the folks who did make the idiotic decisions that led to the kerfuffle would appear to embody all the best attributes of Solomon, Socrates, Confucius, Albert Einstein and Leonardo da Vinci rolled into one.  We do sorta know what Ado Annie means when she laments her inability to “say no,” after all.

I was about to say that what Luna did strains credulity, but, alas, it does not.  There are a lot of adjectives that do apply—“boneheaded,” “irrational,” and “unconstitutional” come to mind—but unfortunately “unbelievable” has no place on the list.

Last month, a Muirland 8th-grader identified as J.A. attended a high school football game, looking like he does in the photo above.  That is, he wore eye black, just as he’s seen countless football players (and not a few baseball players) do; I won’t bother you with the literally dozens of photos of players of all races doing so.  Now, whether eye black has any direct practicality is a matter for debate.  It started as a means of keeping glare out of the eyes.  I have no idea whether it actually does that, and even if it does, it doesn’t require the amount used by J.A.  But that, of course, is irrelevant. 

There’s little difference between J.A. and those fans who paint their faces red because their favorite team is the Alabama Crimson Tide or who wear “cheeseheads” to support the Green Bay Packers.  Maybe the allure is primal, maybe it’s that face-painting is linked to war paint.  But there’s no “maybe” about the fact that used as J.A. did, it’s completely and utterly harmless… not to mention the fact that, according to the Foundation for Individual Rights and Expression (FIRE), “many game attendees wore face or body paint.”

There were, of course, no incidents at the game in question, and literally no one took any offense.  That’s because most people have more than a couple of brain cells.  Not so, apparently, the Idiot Luna.  A week or so after the game, he called J.A. and his parents to a meeting, at which the boy was suspended for two days and barred from attending future athletic contests.  Why?  Because, according to the official paperwork, J.A. “painted his face black at a football game,” which qualifies as an “offensive comment, intent to harm.”  In other words, Luna would need to undergo a couple of millennia of evolution to attain the mental acumen of pond scum. 

You can read an excellent delineation of the facts in the letter from FIRE’s Director of Public Advocacy Aaron Terr.  A brief précis: 1). J.A. “emulated the style of eye black worn by many athletes.”  2). “J.A. wore his eye black throughout the game without incident.”  3). “J.A.’s non-disruptive, objectively inoffensive face paint was constitutionally protected expression.”  4). “The complete lack of disruption is unsurprising, as the sight of fans in face paint is familiar to anyone who has ever attended a football game or other sporting event.”  5). “The claim that J.A.’s face paint constituted blackface is frivolous.”  6). “Muirland Middle School has no authority to discipline J.A. for his non-disruptive, constitutionally protected display of team spirit.”

The only part of this missive with which anyone could reasonably demur even slightly is the characterization of the eye black as a display of “team spirit.”  I doubt it was necessarily that, but it was certainly inoffensive, non-disruptive, commonplace, and constitutionally protected.  Terr needs to be more polite than I do, so I’ll allow his characterization of the assertion of blackface as “frivolous,” as opposed to the more accurate “fucking ridiculous.”

But, as they say in the late-night infomercials, “Wait!  There’s more!”  The family appealed the suspension, but the appeal was denied by the San Diego Unified School District, suggesting that they, too, are cognitively impaired.  The correct response, of course, would have been to uphold the appeal and fire Luna.

It’s difficult to tell if Luna is a Social Justice Warrior run amok, or simply a sports-ignorant walking example of the Dunning-Kruger effect, like the buffoons a few years ago who decided that the universally recognized gesture to denote a 3-point shot in basketball was a gang sign.  Ultimately, that distinction doesn’t matter—willful ignorance and blinkered paranoia are all but interchangeable—but it does matter that, alas, the situation is worse than even FIRE suggests.  

Whereas it is obvious that J.A.’s actions are utterly innocuous, and that the punishment is about three steps beyond absurd, that isn’t always the case.  Racial animus does indeed exist, and it is occasionally manifested at sporting events; there was a case less than an hour from Chez Curmie a year or so ago. 

I’m not going to get into whether expression that actually is offensive is constitutionally protected on school property.  Racism and its cousins in prejudice are unethical even if legal.  Luna and the district have demeaned all attempts to protect the Others (whoever that might be in terms of race, religion, gender identity, etc.) from harassment based simply on who they are. 

What is at play here is a variation on the Boy Who Cried Wolf.  When we accumulate enough examples of utterly inane allegations being brought by authority figures, we start to discount all such claims, even those which have merit.  J.A. and his family are bearing the brunt of this outrage, but we all suffer the consequences.

This is a slightly edited version of an essay that first appeared as a Curmie’s Conjectures post on Ethics Alarms.

Monday, October 30, 2023

This But That: Musings on Israel and Palestine

Curmie’s personal Facebook page lists his political affiliation as “Contrarian,” or at least it used to do so; that piece of information seems to have vanished, and there are more important things to do than investigate further.  That self-description holds, whether it’s visible to others or not. 

I was never in my adult life more conservative than when living in the small Iowa town that hosts the largest Democratic caucus in the state; whereas there were significant right-wing voices among my colleagues on the faculty of the small college where I taught, they were definitely in the minority.  

Nor have I ever been more liberal than when living in an East Texas Congressional district that elected and kept re-electing Louie Gohmert, often by landslide margins. (He’d still be my Congresscritter except for an ill-fated run for Texas Attorney General).

What this suggests is that when surrounded by largely one-sided rhetoric, Curmie tends to see the flaws in that perspective magnified relative to its strengths.  Whether that’s an admirable exercise in critical thinking or merely a stubborn petulance at being told what to believe sort of doesn’t matter.  It is what it is.

Curmie is also, of course, trained in theatre, and has taught over two dozen sections of acting classes, most of them grounded in the Stanislavsky system of beginning with a character’s objective.  This approach is especially useful in playing antagonists or even villains: they aren’t evil (necessarily, at least), they just see the world a little differently.  The world is a complicated place, and pretending that it is otherwise leads to potentially dangerous oversimplification.

Context matters, in other words.  Always.  That doesn’t mean that context is all that matters, or that every rationale for an action (or inaction) is legitimate—logically, legally, morally, or ethically.  Nor does it mean that just because we agree with a perspective with respect to Issue X, we are unable to disagree about Issue Y.  It is possible, for example, to say that the Biden presidency has been largely unsuccessful and still think it’s better than any of the alternatives currently presenting themselves.  It’s possible to acknowledge that there is considerable corruption in Ukraine and still support them in their struggle against Russian aggression.  And so on.

It is also possible to grant that allegations that Israel’s treatment of Palestinians shares some similarities with apartheid have some merit (the fact that these screeds are generally overblown does not mean they’re entirely fabricated) while also condemning without the slightest hesitancy or reservation the recent attacks by Hamas against Israeli citizens (and those from other countries, including the US).  Rape, murder, and kidnapping of non-combatants, of children and the elderly, cannot be countenanced under any circumstances whatsoever.  Curmie won’t go into details here, Gentle Reader.  You’ve probably been following this story as closely as he has, and there comes a point at which the sheer horror of the situation defies description.

And then, of course, there was the reaction, or, rather, series of reactions.  The Israeli military launched a massive bombing attack on Gaza, dropping nearly as many bombs on that small area in less than a week as the US employed in the entirety of Operation Desert Storm.  They cut off electricity to the region.  They ordered a mass and immediate migration. 

The Israelis (both here and throughout this essay, Curmie uses this term to refer to the Israeli government, not the citizenry) claim with no little justification to be defending their territory and their citizenry.  Also (of course) they claimed to be attacking Hamas installations rather than Gaza in general.  That said, there has been no little death and destruction suffered by people who had no part in the attacks in Israel.  Are the Israelis the “good guys” here?  No.  They’re the less despicable guys. 

Another response, of a different kind, was that of both the international left and a bunch of unthinking post-adolescents.  Blaming the situation wholly on Israel is profoundly stupid, yet so many individuals and organizations have effectively become spokespeople for terrorism.  This is not, of course, the unanimous response of the left, any more than all Jews want Gaza obliterated.  As a friend of Curmie, a man well to Curmie’s left politically, posted on Facebook, “When the topic turns to Israel, I step away from the Left.”

One of the biggest sources of headlines comes from Harvard, where some 30+ student organizations signed off on a proclamation declaring Israel to be “entirely responsible” for the Hamas attack.  Making the obvious point that the Hamas attack “did not occur in a vacuum,” the statement then proclaims that “The apartheid regime is the only one to blame.”  Uh… no.

But here’s the thing: many members of those student organizations didn’t even know that such a statement was even being considered until after it had been signed in their name.  “We didn’t read or understand what we were signing” is a pretty lame excuse for an Ivy League student; “We didn’t even know it was happening,” on the other hand, is legitimate.

Some brief observations:

1. Sometimes things that appear simple are actually complex.  There’s a good and—from Curmie’s perspective, at least—balanced discussion of some of this over at TheConversation.com.

2. Does the mere presence of non-Muslim visitors at the Al-Aqsa Mosque constitute a “desecration”?  Curmie’s definition of the term would require a higher threshold, but he’s an agnostic of Christian heritage, so what does he know?

3. Policies of the Netanyahu regime in particular no doubt had a triggering effect, indirectly precipitating the Hamas attack. 

4. Sometimes things that appear complex are actually simple.  There is no excuse, none, for what Hamas did.  Context is one thing; excuses are another.

5. There’s a difference between collateral damage and specifically targeting civilians.  That does provide context (there’s that word again), but there’s also a point at which knowing in advance that the collateral damage will be catastrophic becomes relevant.

6. It is virtually certain that Hamas takes shelter among innocent civilians, so any attack on Gaza will have civilian casualties.  Both sides know this.  Neither seems much to care.

7. Hamas’s energy, indeed its very existence, is defined negatively.  It is focused exclusively on the elimination of Israel (and perhaps of Jews in general), not on improving the lives of Palestinians.  This is one of the few points on which Curmie will brook no denial.

8. Silence is not always complicity.  Taking some time to organize our thoughts—for those of us privileged enough to not be directly affected, at least—would seem a better option than saying something we’ll regret.  Not having words is not a moral or ethical failing. 

9. The same people who (rightly) argued that criticism of Barack Obama wasn’t inherently racist or of Hillary Clinton inherently sexist are now suggesting that criticism of the Israeli government is, in fact, inherently anti-Semitic.  It isn’t.  Nor is criticism of Hamas inherently Islamophobic. 

10. Lots of people in Israel don’t support their government’s policies with respect to Palestine; lots of people in Gaza don’t support brutality against Jews.

11. Kids in Gaza are just as innocent as those in Israel.

12. You can be a member of a group without supporting everything that group does.  No elected officials in a real democracy are ever supported by all their constituents, or even without reservations by their supporters.  Any American who can’t or won’t acknowledge the failures of the candidate they voted for in a recent election is not worthy of consideration.  Neither is anyone who merely parrots the party line—any party line. 

13. It is not merely illogical but unethical to blame (or praise, for that matter) all members of a group for something the majority or the power-mongers do.  American citizens aren’t responsible for what our military does just because a slight majority of voters (as opposed to citizens) elected the Commander-in-Chief.  The same logic applies both to Israeli citizens and to residents of Gaza.

14. Simply being a member of a group—a local chapter of Amnesty International or the Nepali Student Association, for example—which has hitherto not done anything stupid regarding Israel and Gaza should not lead to doxing.  People who make decisions are responsible for those decisions; people who don’t even know a decision is being made are not. 

15. We can count on some self-important CEO (the usual apologies for redundancy) to engage in preening twatwaffledom in any circumstance.  No one should accept a job with a boss who engages so freely and publicly in proclaiming guilt by association.  Curmie is looking at you, Bill Ackman.

16. We are placed in the unfortunate situation of having to believe news reports, which, given the sloth of most journalists and editors, are often misleading if not outright prevarications.  Skepticism is definitely in order.

17. To the extent that such poor journalism is the result of bias rather than garden variety incompetence, it tends to backfire.  The truth has a nasty way of finding its way to the light, and uncovering false allegations (e.g., of an Israeli attack on a Gaza hospital) tends to discredit subsequent, accurate, reporting from the same source.

18. The only hope for the situation not getting worse is the release of the hostages.  We can be shocked by Israel’s response, but we cannot reasonably condemn it out of hand.  Horrific and appropriate are not always contradictory terms.

19. Should this release occur and the assault on Gaza continue, the Israeli government bears the responsibility for the continued carnage.

20. Curmie doubts that the release mentioned in #19 is likely to happen.  Alas.

21. It is possible to support Palestinians and still condemn Hamas’s slaughter of innocent people.  How do I know?  The guy I see in the mirror does both.

Wednesday, October 11, 2023

The Revenge of the Wackadoodles

 

One of my favorite lines from the late singer/songwriter Warren Zevon is “Just when you thought it was safe to be bored / Trouble waiting to happen.” That lyric came to mind when I happened across an article in the Chronicle of Higher Education titled “Hamline President Goes on the Offensive.”  Well, that lyric and one of my most oft-used phrases, “Oh, bloody hell!” 

This rather lengthy article—over 3000 words—deserves to be read in its entirety, even though it may involve a (free but annoying) registration process, but I’ll try to hit the highlights here. The author is Mark Berkson, the Chair of the Religion Department at Hamline University. His was for a very long while the only voice, or at least the only audible one, on the Hamline campus to come to the defense of erstwhile adjunct art history professor Erika López Prater as she was being railroaded by the school’s administration on absurd charges of Islamophobia. 

You may recall the incident, Gentle Reader. Dr. López Prater was teaching a course in global art history, in which she showed images of a couple of paintings depicting the prophet Muhammad. Recognizing that there are some strains of Islam in which viewing such images is regarded as idolatrous, she made it clear both in the course syllabus and on the day of the lecture in question that students who chose not to look at those particular photos were free not to do so, without penalty. 

Ah, but that left too little room for victimhood. So student Aram Wedatalla blithely ignored those warnings and (gasp!) saw those images… or at least she says she did, which is not necessarily the same thing. Wounded to the core by her own sloth and/or recklessness, she then howled to the student newspaper and, urged on by Nur Mood, the Assistant Director of Social Justice Programs and Strategic Relations (also the advisor to the Muslim Student Association, of which Wedatalla was president), to the administration. The banner was then raised high by one David Everett, the Associate Vice President of Inclusive Excellence. (Those folks at Hamline sure do like their pretentious job titles, don’t they?) 

Anyway, Everett proclaimed in an email sent to literally everyone at Hamline that López Prater had been “undeniably inconsiderate, disrespectful and Islamophobic.” To be fair, he didn’t identify her by name, but there weren’t a lot of folks teaching global art history. Everett was just getting warmed up. He subsequently co-authored, or at least jointly signed, a statement with university president Fayneese Miller that “respect for the observant Muslim students in that classroom should have superseded academic freedom.” Not at any university worthy of the name, it shouldn’t. Anyway, López Prater was de facto fired, because destroying the careers of scholars for even imaginary offenses has become a blood sport for administrators (and, in public colleges, for politicians). 

There followed a not insignificant period in which the university administration was justly savaged for their disregard for facts or for due process, and for the hypocrisy of their actions, which clearly contradicted the university’s pretensions to upholding academic freedom. These denunciations came not only from other art historians, but also from such organizations as the American Association of University Professors, the National Association of Scholars, the Foundation for Individual Rights and Expression, the American Freedom Alliance, PEN America, and (oh, yeah…) the Muslim Public Affairs Council. It wasn’t pretty, but it was richly deserved. 

Finally, far too late and far too insincere, there was a sheepish admission that calling López Prater “Islamophobic” was “flawed.” Anyone with more cranial capacity than a turnip would say “freaking ridiculous,” but hey, it’s a step, right? 

This torrent of negative publicity no doubt also helped to catalyze a number of other faculty to evolve from their invertebrate state and join Professor Berkson in defense of López Prater’s perfectly reasonable pedagogy. The faculty, now with the courage engendered by a very one-sided national response to events at Hamline, ultimately voted overwhelmingly to demand that President Miller resign. Also, of course, López Prater sued the university. As far as I can determine, that case is ongoing. 

Like a lot of other people, I thought, despite some reservations, that it was now “safe to be bored,” to coin a phrase. The matter was now in the hands of the Trustees, and the only hope to salvage any scraps of what legitimacy Hamline may once have enjoyed would be to show Miller the door, taking Everett and Mood with her. They did not do so, of course. Instead, they behaved like Trustees (Regents, Councilors, whatever) have done at every place I ever worked in a career that stretched into six decades, from the ‘70s to the ‘20s: they couldn’t admit that they were the folks who hired Miller and that they’d made a mistake. They punted, as such craven and often anti-intellectual bodies are wont to do. 

So Miller is still in place until she retires “early” in June. Of course, she says, the events of the last year or so had no effect on the decision to retire. Oh, and “No one was let go for showing an image.” Also, that Nigerian prince is absolutely above-board. Just so you know.  (Okay, maybe that last part is made up.)

But there was, to quote Zevon, trouble waiting to happen. A couple of weeks ago, Miller and her minions presented a forum with the heady (but, of course, misleading) title “Academic Freedom and Cultural Perspectives: Challenges for Higher Ed Today and Tomorrow.” An actual discussion offering different perspectives on how to weigh the sometimes competing values of academic freedom and respect for cultural differences would be welcome. But such things almost never happen. There’s virtually always a point of view imposed from above; in my experience, this “correct” position is seldom… well… correct. 

This event, at least according to Professor Berkson, whom I tend to believe, was staged for no reason other than to excuse the inexcusable behavior of the Hamline administration. The first two people to speak: Everett and Miller. The latter declared that this was an “offensive” move. She stressed the first syllable to contrast the word with “defensive,” although stressing the second syllable would almost certainly have made her statement more accurate. 

She claimed that the real threat to academic freedom was happening in places like Florida and Texas. Well, she’s right that there are threats in those places: just because some of the allegations are exaggerated doesn’t mean there isn’t legitimate cause for concern… or, indeed, for anger. But the idea that Miller and her ilk are somehow innocent because others are guilty, too? No, that argument has no merit. Dr. Berkson is kinder than I would be when he writes that “Miller fails to see that there are many ways that academic freedom can be threatened.” 

Anyway, the two major culprits in the art history debacle served to introduce the keynote speaker, Michael Eric Dyson. Dyson is an intelligent, well-educated man (PhD from Princeton), and an eloquent speaker. He is best known for his ongoing rhetoric that blacks in this country continue to suffer from centuries of ongoing oppression. (Perhaps that’s why his net worth is estimated at a paltry $5 million?) 

His speech, writes Berkson, made some useful points, but ultimately he contributed to what Berkson calls “essentially a full-throated defense of the administration’s actions against López Prater.” “If I got Muslim students,” Dyson said, “and I know what upsets them, I got the freedom to show what I want to show, but why would you? What’s your point? What’s your intention?” Berkson, himself an authority on Islam, responds:
It is clear that López Prater had no intent to upset anyone. She was teaching an important work of Islamic art, which is part of her job. She showed concern for her Muslim students by giving them multiple warnings, in writing and orally, to avert their eyes when she showed the image if they so wanted. This is nothing like the examples — some given more than once by many speakers at the event — of Holocaust denial, flat earth theory, fomenting an insurrection, and using the N-word in the classroom. None of these absurdly inappropriate disanalogies are remotely similar to the challenge that arose in López Prater’s art history class and that many of us regularly face — responsibly teaching relevant and suitable academic content that might be disturbing to some students.
Note to Dr. Dyson: don’t ask rhetorical questions if there’s somebody ready to answer them.

There followed a panel discussion featuring three bused-in speakers: Stacy Hawkins (her bio on the Rutgers law school website features the word “diversity” a dozen times, if that gives you an idea of her priorities), anti-racist activist Tim Wise, and Robin DiAngelo (whose books include such titles as White Fragility and Nice Racism). The sole representative of Hamline faculty was political scientist David Schultz; given the fact that Miller would rather chew on razor blades than allow Berkson a forum, Schultz is a more than reasonable representative of a professoriate more interested in developing students’ analytical skills than in inculcating them with a particular perspective. 

Berkson has a lot more to say in his article—about power dynamics involving administrators, faculty (with sub-categories of tenured, tenure-track, and adjunct) and students; about the fact that none of the panelists (except Schultz) addressed the López Prater case even obliquely; about the casual assumption on the part of the guest panelists (and the Hamline administration) that faculty don’t actually care about their students’ well-being; about the distinction between freedom of speech and academic freedom; about the fact that the AAUP recognizes the delicate balance between academic freedom and respect for differing perspectives, but nonetheless describes the actions of the Hamline administration as running a “de facto campaign of vilification” against López Prater based on an “inaccurate and harmful understanding of the nature of academic freedom in the classroom.” 

I do urge you to read Berkson’s entire article if this topic interests you at all. But I’ll close with this: university students are old enough to contemplate ideas that may make them uncomfortable. Perhaps these different perspectives are grounded in race, or religion, or gender, or politics—for these purposes, it doesn’t matter. Trying new things is sometimes scary, and the intellectual terrain that must be crossed can be something of a minefield, but negotiating those hazards is imperative for faculty and students alike. 

Figuring out how to make that crossing successfully is not merely an admirable goal; it is a necessary one. Colloquia featuring different approaches are desperately needed. And there are any number of situations in which there’s a legitimate argument on both sides of an issue (e.g., if López Prater hadn’t offered students the ability to opt out). This isn’t one of them. Stacking the deck with speakers eager to defend an indefensible position isn’t helpful; it’s the equivalent of defending Lauren Boebert or Jamaal Bowman for their recent headline-making misadventures.

Professor Berkson closes by citing Professor Schultz, who said of the conversation: “Our discussion here about diversity and academic freedom ... is probably at the most superficial level that we can have. … At the end of the day, let’s have a real discussion.”  Berkson comments simply, “Amen.”  

“Amen,” indeed.

Note: as has often been the case of late, this post, or one very like it except for a couple of stylistic edits, first appeared on the Ethics Alarms page.