The Center of the Universe

Astrophysicists tell us that the universe has no center.

That the question as to where exactly in the cosmos the creation event known as the Big Bang took place some 13.8 billion years ago is meaningless, since space itself emerged from it.

If anything, that the center of the universe is “everywhere.”

I beg to differ.

Join me in the following thought experiment:

Imagine an identical copy of our present universe overlapping the original, in such a way that the two do not interact with each other.

Now, with the original frozen in place, the copy collapses back to the size of, say, a golf ball, retracing its path in the direction of the singularity from which the universe arose.

The location of this hypothetical golf ball must be the center of the universe, the very place where the Big Bang happened, must it not?

Astrophysicists ought to be able to crunch the numbers to dope out its coordinates, then calculate how far from the center of the universe our home planet Earth is located on any given day.

By golly, they should be able to calculate to within one Planck length the distance from the center of the universe to the par-3 hole on any of the president’s golf courses at any given moment of any given day!

Since the COTU (the center of the universe, that is) most likely forms the mother of all Lagrangian points, one could theoretically mark it with a “GROUND ZERO—The Big Bang happened HERE” sign, which would remain permanently parked in place by the combined gravitational forces of the isotropically expanding cosmos that surrounds it.

Instead, they tell us that the universe “has no center.”

Personally, I think astrophysicists are just too damn lazy to do the math on this.

With all due respect.

Dressed to Please

On 31 July 2018, Aeromexico Flight 2431 crash-landed in a field only seconds after takeoff. All 103 people aboard were safely evacuated in the nick of time before the aircraft burst into flames.

The following day, an aviation safety expert held forth on CNN about upping your odds of surviving so-called “survivable” crashes like this one.

Besides non-flammable clothing that should cover arms and legs, she explained, passengers ought to select footwear conducive to exiting a potentially chaotic, smoke-filled, and obstacle-strewn cabin with fleet-footed promptitude rather than in a tentative traipse.

Whereupon the CNN anchor lady vowed to never again rock high heels on a plane.

High heels, obviously, weren’t designed for speed. On the contrary: their whole raison d’être is to slow women down.

Just as a vestigial primal instinct hailing from the olden pre-affirmative-consent days, when a man would simply grab a female by her hair and drag her into his cave, still causes men to wince, overtly or subconsciously, at any woman’s stated intention of having her tresses docked—for shorter hair invariably makes for a lesser grip and a resultant reduction in dragging leverage—men, generally speaking, prefer women to be meaningfully constrained in their ability to escape.

Hence man’s predilection for high heels on a woman.

Sure, for the purpose of putting distance between her and an unsavory pursuer, she could take off her heels and attempt a barefooted getaway—but how long before such transient gain in velocity will be offset by at least one of her now unprotected soles making painful contact with a pointed rock, a rusty nail, or a shard of glass, causing podiatric injury and transforming her nimble dash into a laborious limp?

Whether one is boarding an airplane or setting out for a night of partying in the city, from a security perspective, not to mention orthopedic considerations regarding long-term skeletal salubrity, high heels rank, short of a blindfold worn while driving, among the most ill-advised articles of dress to sport for any occasion or activity.

And yet, just as Muslim women have been gulled by centuries of masculine oppression into gussying up as letter boxes, bank robbers, or folded-down beach umbrellas (the first two of these graphic descriptions courtesy Britain’s former foreign secretary), or, at the very minimum, into hijabbing up in public (something Britain’s former foreign secretary may want to consider for himself as an alternative to splurging on some hairspray and a comb), in addition to having been brainwashed into professing with the utmost sincerity and the straightest of (often invisible) faces that it is their own free will, and nothing but, to attire themselves in this way, occidental men have managed to pull the same number on occidental women when it comes to high heels.

Ask any Western woman why she’s wearing high heels, and she’ll assure you, daggers flying out of her eyes, that no freer, more independent, and less informed-by-primeval-male-fantasies fashion choice has ever been made by her or anyone else in human history—the very same response you’ll likely obtain from a lady in a burqa, a niqab, or a hijab.

In fact, demonstrating the alarming extent of the subliminal indoctrination in play here, many Western women will go one step further and insist that high heels are actually comfortable, if you can believe it. (As comfortable as being trussed up in an 18th-century baleen corset and gradually turning blue from lack of oxygen, I imagine.)

When I asked one woman who had told me so in person whether, if they’re so comfy, she ever wore stilettos around the house, she sort of backtracked and admitted that they were more about achieving a particular look than physical comfort.

In terms of evolutionary psychology, a discipline rarely simpatico with the precepts of modern political correctness, the look in question is one that telegraphs to all interested parties within viewing range that the wearer is easier to catch and impregnate than if she had on Nikes or Chucks.

Nature, mind you, couldn’t care less about 21st-century sensibilities. Nature cares about propagating the species. Humans that wish to regard themselves as somewhat more evolved and enlightened than gorillas in the wild, therefore, should take to heart, in this context as well as in a few others, Katharine Hepburn’s character’s admonition, uttered in The African Queen, that nature is “what we are put in this world to rise above.”

To which smart-alecky high-heel wearers will respond that they have risen above nature by habitually forcing their feet into a most supernatural position that produces a most supernatural gait.

Besides, the true purpose of high heels, they will add, is to make themselves appear taller, not easier to catch, so as to make up for the inequality in height between them and the taller sex.

Yet many men, no doubt, would also like to add a few inches, for a variety of reasons (and in a variety of places), such as to restore the inequality in height between the sexes, but since donning high heels would negate much of their speed advantage over high-heeled females, the dudes-in-heels thing has never caught on.

We know as Stockholm Syndrome the phenomenon of a person held hostage becoming sympathetic to her captor. Unfortunately, I cannot think of the scientific term—if you can, please put it in the comments below—for mistaking for one’s own free will the will of one’s oppressor, a condition on vivid display every time a woman cites autonomous self-governance utterly disencumbered of any inclination whatsoever to please anyone but herself as her sole motivation for ambulating about in high heels or shrouding herself from head to toe in a wearable pup tent.

And if you think I’m just picking on the girls, don’t even get me started on guys and their ties.

Yes, Me Too

It happened when I was thirteen years old, not on the way to the forum but in the foyer of a small theater in Austria.

I was bidding good-night to a group of people after a show, and when my turn came for what I expected to go down in history as no more than a pedestrian handshake with a rather toothsome 19-year-old acquaintance, she suddenly, without warning or having ascertained my consent—which, even had it been sought and obtained, would have been irrelevant on account of my having been underage even by Austrian standards—pushed her lips against mine for a second or two.

Needless to say, given my budding puberty and the pulchritude of the predator, following the incident I sojourned on cloud nine for weeks.

For decades thereafter, I would remember and cherish the experience as my first kiss of sorts.

Of late, however, in light of escalating public awareness regarding objectionable transgressions of a prurient nature, I’ve come to realize that not only does the incident described fail to fit the bill of a memory worth cherishing, but that it has the markings of sexual assault, and sexual assault of a minor (!) no less.

I am currently struggling to adjust to my yet unaccustomed role as a survivor of such.

Traumatic as it now is upon its reframing, this assault would not remain the only one I’d find myself at the receiving end of.

Fast-forward a number of years. I had matured into legal adulthood and then some, and I had rented out the bedroom of my small one-bedroom NYC apartment to a co-worker in temporary need of accommodation.

Not only because providing shelter to the needy invariably accrues toward the positive side of one’s karma ledger, but I could also use the extra cash.

And yes, the co-worker in question was young and gorgeous, and I had a major crush on her; factors that likely facilitated my decision to share my humble abode and dispense with my treasured solitude for a while.

Riddled with personal flaws and shortcomings as I may be, pushing myself upon women, no matter how attractive and no matter the circumstances, isn’t among them. If anything, I incline toward the other extreme, namely exhibiting what some members of the opposite sex may regard as an offensive indifference to their lures, or as my denying them the opportunity to exert their powers of rejection.

Whatever my underlying insecurity, in order for me to have a satisfying romantic or sexual experience, it is imperative that I feel genuinely desired, a prospect instantly doomed were I to impose myself in such a way as to arouse discomfort in my prey—for how can I feel genuinely desired if the target feels coerced, cornered, intimidated, or repulsed (or is drunk, drugged, asleep, or charging money for that matter)?

Anyhow, so I had this adorable sylph lodging at my place, and I treated her with my habitual air of courteous disinterest.

One evening, about a week into her stay, we were casually chatting at my desk in the living room—the topic of conversation, as I recall, was towels, specifically where she could hang hers to dry after showering—when she abruptly and utterly out of left field, sort of in the middle of a sentence, grabbed me by the back of my head, pulled it toward her, and mashed her mouth into mine.

Since I offered the polar opposite of resistance, we passionately tongue-wrestled for a lengthy spell, and when we finally came up for air, she hit me with the following—and, in hindsight, quite disturbing—revelation:

“I thought you didn’t like me!”

Think about that statement.

Assuming I had indeed so successfully dissembled my infatuation with her that she had sincerely believed or suspected I didn’t care for her in a romantic sense—even if she was merely uncertain as to whether I would appreciate her tongue thrusting into my mouth with zero heads-up—her aggression provides a textbook example of sexual assault from the aggressor’s perspective.

That I just so happened to more than welcome her assault (and to this day count it among my life’s favorite moments, if not the favorite one) doesn’t change the fact that that’s precisely what it was.

Criminal behavior is a function of the offender’s mindset at the time an act is committed, not of how the victim feels about it afterwards.

Nor does a victim have to deem him- or herself a victim in order to be a victim.

Unlike in civil law, it is a core feature of criminal justice, as I understand it, that the state (i.e., society), not the victim, gets to decide whether a crime was committed and hence whether a victim exists.

Just because I may have enjoyed what you did to me doesn’t necessarily mean that what you did to me was not a crime. If you take it upon yourself to incinerate my barn, you’re guilty of arson, even if I was going to tear it down myself anyway and am grateful to you for having saved me the work.

Oh, and then this other female co-worker of mine once squeezed my behind in the workplace and followed up with a flattering yet lewd and objectifying comment regarding its shape.

I didn’t mind it at the time. In fact, I liked it. She was the type that could have palpated me up and down my entire anatomy to her heart’s content without asking permission. But that’s beside the point. In keeping with ever-evolving standards of decency in our maturing society, I have re-ass-essed her grope.

Far be it from me to make light of a serious issue, but yes, technically, I am a three-time sexual assault survivor.

Now that I have at long last come to recognize this, how will I cope?

Should I take to social media and publicly out my assailants?

Or should I simply let go and move on?

I am weighing my options.

The Fundamental Right to Buy

The Trump administration has announced a rollback of the legal requirement for companies to include contraceptive coverage in their employees’ health plans.

This move is being widely denounced as an access restriction and hence as an infringement of a woman’s right to birth control.

Of course, having to pay for anything is an access restriction. My being charged eighty cents for a pack of Doublemint circumscribes my freedom to chew gum, a freedom proportional to my ability and willingness to pony up the cash.

Although reasonable people may differ on whether there exists a constitutional or otherwise fundamental right to chew gum, there does exist—love it or hate it—a fundamental right to “keep and bear arms,” which the Constitution explicitly instructs “shall not be infringed.”

Strangely, no one ever seems to decry as a fundamental-rights-infringing kind of access restriction the fact that if you want a Glock or a hunting rifle, you must pay for it out of your own pocket.

Employers have never been mandated to cover their employees’ personal firearm expenses, nor does the government issue “gun stamps” for the indigent.

So if denying or restricting birth control coverage amounts to the denial or restriction of a fundamental right, then how come the absence of a legal framework designed to provide the people with free firearms doesn’t do likewise?

Related Post: Sex and Obamacare

The Purest of Evils

Johnny Cash once recounted that when writing Folsom Prison Blues, he’d asked himself what would be the worst possible reason to kill someone.

The man in black came up with the memorable line Well, I shot a man in Reno just to watch him die.

Indeed, bad as terrorism is, there are still worse reasons for killing people.

Although the search for motives in the Las Vegas massacre is still ongoing, from the preliminary looks of it, to call this particular mass murderer a “terrorist” would be an insult to terrorists.

Herculean a task as it certainly seems to find something “positive” to say about terrorism, at least its perpetrators fight—and kill—for causes bigger than just themselves, generally speaking.

Those causes may be utterly absurd, like propitiating some deity; or arguably rational, such as drawing attention to, and attempting to redress, in the most horrendous of ways, a real or perceived earthly injustice of one kind or another.

Sure, it is often personal issues and grievances that prompt individuals to attach themselves to such a cause. But in the end, if only in their own warped minds, they’re trying to serve some greater good by way of the carnage they wreak.

Unlike the Las Vegas shooter, whom President Trump referred to as “pure evil.”

Critics argue that the president didn’t go far enough; that he stopped short of calling him a terrorist solely on account of the fact that he was a white male American, but would have tripped over the tip of his overlong scotch-taped tie dashing toward the microphone to brand a terrorist any person of Middle-Eastern heritage suspected of lobbing spitballs at passers-by regardless of motive.

Perhaps so.

On close inspection, though, being referred to as “pure evil” is hardly more flattering than being called a terrorist. On the contrary. “Pure evil” isn’t short of anything. It is the most exalted of damning characterizations available, for what could possibly be worse than—or even as bad as—pure evil, i.e., evil devoid of even the faintest potentially mitigating component?

The Mandalay Bay sniper sprayed a barrage of bullets into a crowd for no apparent reason other than his getting some sick kick out of doing so.

To call the man a terrorist would be a defense he does not deserve.

More, More, More

Being interviewed along with his BFF and presidential successor at the George W. Bush Presidential Center in Dallas last Thursday, Bill Clinton lamented that

“If you look at America, we’re only having a 2.1 replacement of our native-born population from natural births. We can’t continue to grow this economy unless we grow more diverse and take in more immigrants.”

The only good economy, we have been told ad nauseam, is a growing economy. Just as sharks must stay in motion in order to survive, economies must grow, grow, grow—but where to? What’s the end game?

Every gardener knows that, in order to prevent overgrowth, growing things must be pruned on a regular basis: lawns must be mowed, brush must be cleared, hedges must be trimmed, trees must be cut back so as to minimize the danger of their being knocked over by a storm and crashing into the house, etc.; lest, at some point, one end up with an unmanageable jungle.

So why must economies, by contrast, always keep growing ad infinitum?

Probably because populations are growing, and ever more people need ever more stuff, more energy, more resources of any kind, and—last but not least—more jobs.

In other words, economies must grow in order to sustain, or improve, the living standards of growing populations.

Soon, alas, the paradoxical situation arises in which ever more people are needed in order to grow an economy at the rate required to sustain ever more people.

In particular, ever more young people are needed to sustain an ever increasing number of retirees—young people that will eventually become the older and retired generation, in turn needing ever more young people to sustain them.

And so the growth-cycle continues.

The only thing that doesn’t grow is the planet and its store of natural resources.

A planet that, incidentally, keeps getting warmer, which leads to ever more droughts of escalating severity, accompanied by dwindling freshwater supplies all over the place.

Yet, in order to grow our economy at a sustainable rate, we need ever more people, who’ll need ever more water.

How can more resource-guzzling people be the solution to anything in this world without, in the long run, exacerbating the very problems that were supposed to be solved by population growth, whether by way of immigration or otherwise?

Your Identity, Your Sex, and Your Pelvis

Except in the extraordinarily rare case of true hermaphroditism, there comes a moment immediately following a person’s birth when the obstetrician—or the midwife, or the midwife-turned-cabdriver, or whoever happens to be on hand assisting in the delivery—announces either “it’s a boy” or “it’s a girl.”

There comes another moment, a few centuries or millennia later, when an anthropologist digs up that person’s bones and seconds the original verdict, the pelvis being “the best sex-related skeletal indicator.”

In between those two events, i.e., during his or her lifetime, that person may vehemently contest said verdict and, with utmost sincerity, identify as the other sex, assume any or all of the trappings and behaviors commonly associated with the other, and even go so far as to undergo a range of medical procedures so as to approximate the other as closely as possible.

As touched on in the previous post, why the transition from male to female in particular—i.e., the adoption of elements of a traditionally marginalized segment of society (women) by members of the traditionally dominant segment of society (men)—is to be applauded as a form of personal liberation rather than to be condemned as a form of cultural appropriation puzzles the mind.

But be that as it may.

While I grant everyone the freedom to identify as whatever they choose—or can’t help—to identify as, I respectfully reserve my freedom to call others as I see them, which may or may not square with how they see themselves.

If I’m the bouncer, your identifying as a grown-up won’t help you get into the club if it is a teenage body that houses your adult self. And no matter how sincerely you may profess to identify as an eight-year-old spirit lodged in a 50-year-old body, and no matter how authentically you may dress the part and suck on your lollipop, even if you’ve had your breasts surgically flattened or your facial hair permanently laser-removed, if you show up at my ticket booth, I’ll charge you full admission, not half price for minors.

(No grown-up would ever run around identifying as a minor, you might interject, since, in doing so, he or she would self-exclude from adult-only events and environs, like bars or R-rated movies unaccompanied by a person that identifies as an adult; as well as renounce his or her right to vote and operate a motor vehicle. Keep in mind, though, that just as transgender laws don’t say that transgender women, for example, are barred from using the men’s room and are therefore restricted to using the ladies’ room, but that they must be allowed to choose, it stands to reason that trans-age laws, too, would accord trans-age people the right to choose in given situations whether to be treated as minors, as in getting half-price admission, or as adults, as in being admitted into a topless joint and allowed to liquor up.)

Likewise, regarding a person’s sex or gender—terms I use interchangeably, with a slight preference for gender so as to preclude confusion with the act—I’ll invariably throw in with the judgment of the delivery person and the future anthropologist in assigning the appropriate locker room and selecting the set of pronouns I shall attach to that person.

So you may identify as a she, but if I am, or as soon as I become, reasonably convinced you entered this world accompanied by the words “it’s a boy”—recall the “maybe those dames ain’t dames” epiphany by one of Spats’s underlings when he spotted Josephine and Daphne scaling down that hotel wall—then in my book you’re a he, and that’s what I’ll refer to you as (and, of course, vice versa, if you were born a she that deems herself a he).

Save, perhaps, aloud in situations where I might elect to cave in to political correctness in order to avert confrontation, termination, or incarceration—Canada, for one, has just passed a law that makes it, yes, a hate crime to knowingly use gender pronouns that fail to accord with the gender identities of their referents—in which cases I’ll nonetheless continue to refer to you by your birth pronoun in the privacy of my own mind. (You can’t go to jail for what you’re thinking, as Dean Martin once crooned.)

Your sex at birth is literally the only criterion I use to determine your gender. To me, the two are synonymous. Because if they weren’t, I would categorize myself as a woman—a lesbian happily trapped in a male body, to be precise—and demand access to the ladies’ locker room, and no one could come up with a valid and irrefutable argument to deny it to me, as expanded upon in more detail here.

Now, here’s the problem with calling me a bigot:

I don’t identify as a bigot.

In fact, I very much identify as the polar opposite of a bigot (whatever the term for the opposite of a bigot may be). Delusional as this may sound to you based on everything I’ve said up to this point, I do perceive myself as quite inseparable from my subjective identity.

Ergo, if your definition of bigotry includes the reluctance or outright refusal to go along with someone else’s characterization of themselves, then, by your own definition, you’re a bigot for calling me a bigot.

In response, you may field the assertion, arbitrary though it is, that the modern societal mandate to honor people’s self-assessment doesn’t apply across the board but in certain select areas only.

Case in point, as stated above, among modern progressive circles it seems de rigueur to hold that for a member of a traditionally dominant segment of society to identify as a member of a traditionally oppressed or marginalized segment of society is to be unconditionally respected when it comes to gender, as when a natural-born male identifies as a woman, but is to be taken fervent exception to when it comes to race, as when a white person identifies as black.

The Internet abounds with hamfisted attempts to explain why the case of Caitlyn Jenner, the Olympic champion formerly named Bruce who now identifies as a woman, is fundamentally unlike the case of Rachel Dolezal, the white lady who identifies as black—the latter constituting a form of illegitimate appropriation, which the former, somehow or other, does not.

The question is this:

What panel of self-appointed and infallible sages, unencumbered by political agendas of one stripe or another, gets to sit there on their high horses and decide that to feel one was dealt a body that fails to match one’s “true” gender is not only more uncomfortable, frustrating, painful, depressing, or intolerable, but uncomfortable in a fundamentally different way, than it is to live in a body that doesn’t match one’s “true” race, age, height, capital hair density, or [insert your favorite trait not listed]?

Should identifying as a true American suffice to confer eligibility to vote in a U.S. federal election?

One headline, and minor variations thereof, that keeps cropping up of late reads along the lines of Straight Men Who Sleep With Other Men. The articles in its wake discuss the phenomenon of heterosexual males in heterosexual relationships who enjoy hooking up with guys on the side.

Which, naturally, makes one wonder as to the raison d’être for the B in LGBT—if we have the word bisexual, why substitute a clunky phrase like “straight men who sleep with other men”?

Once again, it’s identity, stupid!

As with gender, but unlike race or state of bigotry, sexual orientation appears to belong in the basket of qualities defined by self-labeling alone. If you identify as bisexual, you are bisexual. If you identify as straight, you are straight, irrespective of how many of the known sexes you bed or fantasize about bedding.

Having never come across the headline Vegetarians Who Eat Meat leads me to conclude that diet, like race, but unlike gender and sexual orientation, is subject to definition by others rather than oneself. The determination that “if your diet includes animal flesh, you’re not a vegetarian” seems acceptable in a way that “if you’re physically attracted to both sexes, you’re not straight” is not.

And to what extent are people to be taken at their own word regarding their religious identities? Is it enough to identify as a Christian or a Muslim in order to be a Christian or a Muslim?

It seems that the extent to which we are prepared to go along with the religious identities of others rides on our attitude toward a particular religion. The more distasteful we find a religion, and the more distasteful we find a person’s behavior, the more readily we’ll honor that person’s claim to be of that religion.

The dearer to our own heart a given religion, the more discriminating we’ll be in deciding who’s truly a member and who isn’t.

And so, you may well conclude that, on the one hand, every Allahu-akbar-screaming terrorist is a Muslim by definition, solely because he appears to identify as such, but that, on the other hand, no Hail-Mary-shouting and Bible-clutching terrorist could possibly be a Christian as evidenced by the nature of the atrocity he committed, for “a true Christian would never do such a thing,” wherefore he (or she) is not a Christian, regardless of self-identification.

Clearly, gender is only one of a raft of personal attributes that subject and observer may diverge upon. Official guidelines as to when it is morally imperative for us to accept another’s characterization of themselves, and when it is morally defensible to substitute our own, lack the kind of coherence necessary to distinguish them from subjective make-’em-up-as-we-go-along type of determinations like my own (which is why I might just as well go with my own personal judgments regarding who’s what).

So the headline just came out that Britain’s first pregnant man gives birth to girl.

Congratulations, but how do they know it’s a girl?

My Kingdom for a Cogent Definition

Certain concepts, it seems, were cooked up by online content creators for the sole purpose of riling up that place known as “the Internet,” as in “the Internet erupts with outrage over [insert furor du jour].”

You can spot such concepts in that they are generally ill-defined and laden with half-baked inconsistencies, leaving it to the Internet itself to iron out the kinks until, at some point, an approximation of coherence may (or may not) emerge from the muddle.

As a shining example, let’s take a look at this barrel of drugged trout—thanks to the late David Foster Wallace for the foregoing locution—known as “cultural appropriation.”

So some company releases a poster (an ad of some sort) showing three non-Native-American ladies, including a Caucasian one, rocking war bonnets.

The Internet erupts with outrage. War bonnets, it insists, belong to the indigenous, and no one else has the right to don them for any reason. Period. Now let’s all boycott the company behind the poster. Shame on these people, including the participating models—what were they thinking???—and let’s ride them out of town on a rail.

So far so good.

But then, as happened during last year’s NYC fashion week, a designer has all his female models, including non-Muslims—including, yes, white non-Muslims—traipse up and down the catwalk wearing hijabs.

The Internet erupts not with outrage but with universal praise over this most compassionate and inclusive gesture of solidarity.

What gives?

Cultural appropriation is defined, or ill-defined, or incompletely defined, as the adoption or use of elements of a traditionally marginalized (i.e., non-white or non-western) culture by members of the dominant (i.e., white or whiter) culture.

Hence, cultural appropriation is a one-way street right out of the gate. A dreadlocked Justin Bieber is perpetrating cultural appropriation by having adopted—or stolen—an element of African-American culture. Not good.

By contrast, when Beyonce straightens and blondes her tresses, she’s practicing a form of cultural adaptation, not appropriation, as she’s not doing it just for the heck of it but in order to adjust to the white-dominated marketplace.

See, Beyonce doesn’t really want to wear a white woman’s hairstyle. But, alas, to be able to compete and make ends meet, she must make an effort, repugnant to her own personal sense of style as it may be, to blend into the culture that oppresses her.

Bottom line, white people culturally appropriate for kicks, while blacks and other non-whites do it out of sheer self-preservation, in which latter case it’s not appropriation—at least not the invidious kind.

That’s the going narrative.

Which, of course, fails to explain why white non-Muslim women that wear burkinis at the beach or hijabs on the catwalk are invariably applauded rather than excoriated. And they are applauded for doing so not in spite of but precisely because Muslims are considered marginalized in our western societies.

Even the Austrian president, a former leader of the Green Party, recently fantasized aloud about Austrian non-Muslim women donning headscarves as an act of solidarity with Muslim women. No condemnation from the Internet for his having suborned cultural appropriation.

If a white president had suggested that white women get cornrows as an act of solidarity with black women, you can bet your bottom dollar that the Internet would have blown a fuse.

Speaking of women—poignantly dubbed “the 51% minority”—aren’t they, too, a historically marginalized segment of society?

You could debate whether womanhood, strictly speaking, meets the definition of a culture, but why shouldn’t it? It certainly has its own array of uniquely identifying elements of style. (Just watch the red carpet at the Oscars. If you look closely, you may detect a few subtle differences in the way women doll up for the event in terms of hair, dress, and makeup, relative to their bow-tied and tuxedoed companions.)

Plus, word is that its members, by and large, are getting a pretty raw deal to this day, like 77 cents on the dollar and so on. (The actress playing Wonder Woman reportedly made a paltry $300,000 for her part in the film, while all her male superhero colleagues were paid millions. Doesn’t exactly sound like equal pay for equal work.)

Thus, you would think that any member of the oppressive male culture that adopts female elements like skirts or high heels—be (s)he a transvestite, a transgender woman, or that guy that returned in a dress after he’d been sent home from work for wearing shorts just the other day—would be roundly run through the wringer for opprobrious appropriation from the marginalized.

Yeah, right. Quite the opposite. The Internet worships at the feet of every belipsticked cross-dresser that comes along.

So if you wear a hijab or a dress without being a Muslim or a woman respectively, you’re a hero.

If you wear dreadlocks or a war bonnet without being black or Native American respectively, you’re an insensitive clod.

To be “fluid” by adopting elements of the marginalized appears to mark the most commendable stage of human evolution in some cases, while rating the most vehement censure in others.

Should you espy any rhyme or reason in all of this, congratulations!

And then there’s the blues, an arch-black type of music—how has Eric Clapton gotten away with playing it for so long?

I wonder how the Internet would react if Slowhand got dreadlocks.

The Enemy Within

Donald J. Trump’s “travel ban,” a label he has now officially doubled down on, may well be a most ineffective anti-terrorism measure for a variety of reasons, not to mention of questionable constitutionality.

Hands-down the oddest argument against it, though, is the oft-floated prediction that such a ban would do little to prevent the escalating problem of “home-grown” attacks, since these attacks are carried out not by immigrants and refugees from Muslim nations but by the—already natural-born—offspring of immigrants and refugees from Muslim nations.

On what logic is pointing to the threat posed by the second generation a rational argument in favor of admitting more of the first?

There are, of course, rational and compelling reasons for admitting more of the first, such as basic humanity and a moral obligation to help, even at the expense of sacrificing some of our physical security by offering others far more physical security than they’re used to. It may not be entirely outlandish to ask why we, owing to nothing but the geographical birth lottery, should be exempt from experiencing an occasional taste of the kind of violence that is par for the course in other parts of the world. What makes us so special that we ought to have a right to enjoy a level of safety unattainable to others?

But let’s not fool ourselves into thinking that we are not sacrificing some of our security by extending a helping hand to a people an ever so tiny percentage of whom appear extraordinarily susceptible to the peculiar notion that blowing up a mall will take them to paradise. (Keep in mind that one eighth of one percent of, say, a million newcomers comes to some 1,200 individuals, and consider the national security challenge posed by even a paltry few hundred jihadist time bombs merrily plotting away in your country, be it the size of Germany or as large as the United States.)

In related news, earlier today, British Prime Minister Theresa May, campaigning for the general election she has called to be held on June 8th, said the most perplexing thing:

“[W]hen this campaign started [i.e., a few weeks ago] we could never have predicted the tragic turn that events would take. We could never have imagined the appalling depravity that led a cowardly and callous killer to target innocent men, women, and children in the way that we saw in Manchester two weeks ago; nor could we have envisaged the brutal attack carried out on the streets of London on Saturday evening.”

Walking the streets of Manhattan in the hours immediately following the collapse of the twin towers, I vividly recall the stunned expression of shock and surprise in the eyes of the people I passed; faces that had “Oh my God, what just happened?” written all over them.

And even back then there already had been a failed attempt to topple the towers by means of a truck bomb some eight years earlier, so the 9/11 attacks, spectacular and gruesome as they were, struck a bit less out of the blue than they would have if Islamic terrorism had been an entirely unprecedented phenomenon at the time.

But now, almost sixteen years and dozens of grisly terror attacks on Western soil later, always as depraved as they come and targeting innocent men, women, and children; and in the midst of a seemingly never-ending “war on terror,” with threat levels constantly being raised, slightly lowered, and then raised again every hour on the hour; what does the Prime Minister mean when she says that attacks of the very kind that have just been visited upon Manchester and London could “never have been” predicted, imagined, and envisaged?

What’s there not to predict, imagine, and envision?

If, in this day and age, you can’t predict, imagine, and envision that one or more nutty jihadists might blow themselves up in a crowd at any moment, or mow down people with a motor vehicle, and/or shoot or stab at passers-by at random, or cause carnage in whatever novel manner yet to be contrived; then what can you predict, imagine, and envision, if anything?

In closing, let us put the threat of Islamist terrorism in perspective, for as The Independent exposed in 2015, “White people are biggest terror threat in the US, report finds.”

The report in question states that, excluding the 9/11 attacks, from 2001 to 2015 only 25% of deadly attacks classified as terrorism (i.e., motivated by political or religious ideology) were executed “by people who claimed to be jihadist.”

A whopping 75% were carried out “by extremists who are not Muslim,” including members of “right-wing, anti-government organizations and white-supremacist groups.”

Of all terrorism-related fatalities in the U.S. during the stated time period, non-Muslims caused 65%, and Islamic jihadists caused 35%.

This indeed appears to show that American non-Muslims are vastly more terrorism-prone than American Muslims.

Unless, of course, one were to complicate matters by adding in the minor fact—which The Independent, presumably for simplicity’s sake, chose to omit—that Muslims (or, more precisely, people who identify as Muslim) make up about 1% of the U.S. population.

Which means that the segment comprising 99% of the U.S. population (including the 62% considered white) committed 75% of domestic terror attacks from 2001 to 2015, and the segment comprising 1% of the U.S. population committed 25% of domestic terror attacks from 2001 to 2015.

So you be the judge of who’s “the biggest terror threat in the US”—probably comes down to a question of overall vs. per capita.
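For the arithmetically inclined, here is a minimal back-of-envelope sketch of that overall-versus-per-capita question, taking the figures cited above at face value (25% of attacks attributed to the roughly 1% of the population that identifies as Muslim, 75% to the other 99%). The group labels and variable names are mine, purely for illustration; the numbers are the ones quoted in this post.

```python
# Back-of-envelope comparison using only the figures cited above
# (illustrative arithmetic, not a statistical analysis).
population_share = {"Muslim": 0.01, "non-Muslim": 0.99}
attack_share = {"Muslim": 0.25, "non-Muslim": 0.75}

# Attack share divided by population share gives a crude per-capita index.
per_capita = {g: attack_share[g] / population_share[g] for g in population_share}
for group, index in per_capita.items():
    print(f"{group}: per-capita index {index:.2f}")

# How the two indices compare:
ratio = per_capita["Muslim"] / per_capita["non-Muslim"]
print(f"Per capita, the smaller group comes out roughly {ratio:.0f} times higher.")
```

On those numbers, the gap works out to roughly 33 to 1 per capita, which is precisely the overall-versus-per-capita rub.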

The Barking of the Sheep

The other day, while channel-flipping my way through a commercial break, the TV screen suddenly filled up with dozens of bleating quadrupeds.

Because these critters looked so very adorable, I lingered awhile. I had come upon a documentary on free-range sheep farming.

A bearded shepherd was being interviewed about the joys and hardships of making a living off the raising of sheep in this day and age, and doing so organically, as he claimed to use neither pesticides nor herbicides on his pastures. He also talked about the emotional bond he had with his animals and how he knew each one of them individually. And about the skyrocketing rent per acre of pasture and the dark cloud this cast over the long-term sustainability of his profession.

Anyhow, cavorting among the sheep were what looked and sounded like a couple of cute dogs—I forget what breed, but some sort of herding dogs—whose job it was to keep the flock together and scare away any predators that may have set their sights on a juicy rack of mutton for dinner. The man explained that these dogs had been born and raised among the flock and therefore thought of themselves as sheep.

Mind you, he didn’t say these dogs were sheep. He said they thought they were sheep.

But assuming these dogs indeed identify as sheep—not sure how one can make such determination with certainty, but assuming one can—how does it square with modern-day sensibilities and political correctness to refer to them as dogs that identify as sheep rather than as, well, sheep?

For if a dog sincerely deems itself a sheep, it seems a bit transphobic to call it a dog, does it not?

Farewell to the Factor

Holy macaroni!!!

They really did it. They really spiked my favorite show.

The reality of it won’t fully sink in until, come Monday, a program other than The O’Reilly Factor will occupy the 8 P.M. slot on Fox News.

Some ten years ago, I briefly had a premium membership on billoreilly.com. Alas, I soon got terminated and banned for life for having supposedly violated the site’s terms of use as laid down by Bill O’Reilly.

Now the man finds himself terminated as per his erstwhile employer’s new and improved terms of service. Oh well. Karma’s a bitch.

Sadly, though, by silencing its loudest moderate voice, Fox News has taken its largest step yet on its journey toward becoming unwatchable to anyone left of Ted Nugent.

“Moderate???” you gasp, your visage contorting into a rictus of bewildered consternation?

It’s all relative. The New Yorker, for one, described Bill O’Reilly as “a stodgy conservative with some surprisingly moderate positions […] that, at times, pit him against the more extreme factions within his circle”; and Jon Stewart, guesting on The O’Reilly Factor, characterized the host as “the most reasonable voice on Fox,” who had “in some ways become the voice of sanity here” (which, Stewart proceeded to pointedly qualify, was “like being the thinnest kid at fat camp”).

In case anyone had managed to cling on to it until now, the speedy and unceremonious booting of cable news’s top-rated talent once and for all dispels the myth of job security. No such thing exists in this world. If even Bill O’Reilly can get fired overnight like your average waiter, anybody can get axed at any time. (Granted, the average waiter’s pink slip rarely comes with a $25 million paycheck for future services not to be rendered—at least none of mine ever did.)

One question, among the numerous fascinating facets of this story, is beating my brain like a hammer:

Why, instead of placing a wooden board over a ground-floor window at the Fox News building, on the inside of which was—perchance still is—mounted a giant outside-facing Bill O’Reilly poster, didn’t they simply go in and remove or replace the poster? (If you have a theory, please skip down to the comment section right now.)

Speaking of windows, let’s just hope that the charges that led to O’Reilly’s “defenestration” (word of the day—brought to you by Politico) are true and irrefutable. Because if they aren’t, modern society has a new problem in addition to its longstanding problem of sexual harassment.

On the one hand, it seems highly unlikely that Fox News would have yanked its biggest star, i.e., slaughtered its most profitable cash cow, unless the accusations against him were rock-solid and proven beyond a reasonable doubt, perhaps even far more serious than widely reported. If anything, for as long as it could get away with doing so, the network probably bent over backwards in looking the other way and hushing up the severity of the transgressions in order to avoid, or delay, having to part with the Irish-American ratings machine.

On the other hand, given the lightning speed and wildfire ferocity with which sensational and damaging allegations, be they founded or not, can nowadays spread through social media and take on a life of their own; and irrespective of whether or not this may have been a, yes, factor in O’Reilly’s unseating; what are the odds that, in this day and age, advertisers may be cowed into withdrawal by the mere appearance of impropriety bruited about ad nauseam by an online lynch mob that couldn’t care less about the veracity of the allegations it disseminates around the Web, as long as those allegations—founded, exaggerated, or baseless—might be instrumental in slaying politically disagreeable dragons?

An attorney representing some of O’Reilly’s accusers frankly admits that “the mission was to bring down Bill O’Reilly.”

Salon attributes O’Reilly’s ouster to “public pressure campaigns from liberal activist groups, using a combination of old-fashioned and newfangled techniques to put pressure on Fox News advertisers and 21st Century Fox, the channel’s parent company.”

A New York Times article titled Fears of Revolt by Consumers Felled O’Reilly states that

“[i]n an era when outrage can be easily channeled online, major brands are well aware of the risk of revolts from consumers who are increasingly savvy about hitting companies where it hurts. […] Numerous social-media-savvy groups are capitalizing on that to potent effect, both to expose where advertisers are placing their ads and to mobilize people into registering their concerns.”

Wunderbar as this novel kind of orchestrated advertiser intimidation may sound on its face, keep in mind that the outrage that can nowadays be so easily channeled online, and the ensuing fear of revolts from consumers, may rest on substantiated information or fabricated propaganda alike, and can thus be used to right wrongs just as it can be used to perpetrate them and to effectuate political ends deemed to justify all means.

Since the mission to bring down Bill O’Reilly, and Fox News as a whole, has been doggedly pursued by countless activists and rival media organizations for two decades with little success, it stands to reason that whatever genuine outrage may exist over the substance of the allegations against Bill O’Reilly takes a backseat to the mission at large.

Given the sheer magnitude of the thorn that Fox News has long been in the side of so many, much—perhaps most—of what presents itself as a principled stance against misogyny and the maltreatment of women in the workplace is likely little more than collective salivation over having, at long last, hit upon the network’s Achilles heel, and now a humongous school of exhilarated sharks are eagerly circling the bleeding whale.

For if this were primarily about sexual misconduct per se, another Bill, Bill Clinton, would long ago have been declared persona non grata by the very crowd that are now rejoicing in the toppling of Bill O’Reilly while shedding public crocodile tears about the suffering inflicted upon the purported victims. The former president, after all, boasts actual rape allegations on his resume, whereas the complaints against the former host of The O’Reilly Factor seem confined to the verbal and attitudinal realm, such as referring as “hot chocolate” to a black female receptionist and making “grunting” noises when passing her desk, or waxing palpably frosty toward women that had brushed off his advances—one lady says he omitted to help her up after she had tripped while escaping his attempt to kiss her; another claims that, in the immediate wake of her having declined to accept an invitation to his hotel suite, he insulted her purse as “ugly,” then failed to follow through on a promise to give her a permanent segment on his show.

And back in 2004, an ex-staffer of his testified he had pestered her with salacious phone calls about rubbing her down with a loofah (a word I had to look up at the time).

Of course, as Bill O’Reilly was wont to exhort others on his show, “you don’t excuse bad behavior by pointing to worse.” Making subordinates’ or would-be employees’ job opportunities, promotions, or raises ride on the granting of sexual favors is surely to be classified as pathetic creepiness, no matter the actual lengths to which a predator will go to elicit compliance. The argument that Bills Clinton and Cosby, in their heyday, may well have climbed quite a few rungs further up the creepiness ladder shouldn’t let O’Reilly off the hook.

But there’s a flip side to this coin:

What about the blackmail potential, that is, the specter of subordinates or would-be employees threatening to bring bogus sexual harassment charges in retaliation for not getting the job, the promotion, or the raise?

Moreover, the definition of sexual harassment having gotten so broad and practically unfalsifiable as to encompass looks received and vibes picked up on, it becomes ever easier for aspiring victims to find something in a person’s behavior that, should the need arise, can be interpreted or retconned as micro-harassment or covert harassment or whatever the term du jour may be for anything impossible to verify or confute objectively.

The societal stigma that these days attaches to not giving unconditional credence to allegations of sexual harassment naturally turns such allegations, and the threat thereof, into powerful weapons against which said stigma serves as a shield.

And once a person, usually a man, has been tarred with the brush of sexual harassment, no matter how valid or invalid the initial charge, he is now a sitting duck for the pile-on, as the burden of proof for subsequent accusers decreases in proportion to their numbers.

My Facebook friend Sarah weighed in on the matter. She starts off with unembellished pith:

“I’m glad that O’Reilly got busted because he’s a racist asshole.”

In my humble and unenlightened estimation, all those people—and they are legion—who tag Bill O’Reilly as “racist” either (a) possess insufficient familiarity with his work and overall presentation in anything resembling meaningful context, or (b) are the type that couldn’t tell genuine racism from a toasted jelly sandwich even if someone were to train a 10,000-watt spotlight on both. But that’s a discussion for another day.

Sarah then, perhaps surprisingly given her opening volley, concludes her commentary thus:

“But these claims of sexual harassment — at least those that have been made public — are beyond ridiculous. Of course it’s wrong for anyone to make a derogatory statement to anyone else. But, please, if these women are trying to say that they are terrified and traumatized by a little name calling, they’re making women sound weak and are giving them (us) a bad name.”

Sarah seems to have at least a partial point there, as evidenced by this curious Facebook entry by one Caroline Heldman, an associate professor of politics at Occidental College:

“I filed a complaint of gender discrimination against Bill O’Reilly with the hotline an hour before he was released by Fox News this morning.

“I appeared as a regular guest on Bill O’Reilly’s show from 2008 – 2011. In December of 2011, during a typical heated interview, Mr. O’Reilly called me “hysterical.” I responded that his use of the term was sexist. He (or his staff) retaliated by editing out just that portion of the interview and never inviting me back on his show. […]

“I am grateful to Lisa Bloom for representing me, and for the other women who have come forward to report experiences of sexual harassment and gender discrimination from Mr. O’Reilly. Young women need to see that this behavior is not acceptable or lawful.”

The ones to whom this type of paranoid drivel gives a bad name are, first and foremost, legitimate victims of sexual harassment and gender discrimination.

To call a woman “hysterical” is about as sexist and gender-discriminatory as it is to tell a man he has balls. The fact that one has negative, the other positive connotations is irrelevant on the sexism front, as both terms refer to a part of the human anatomy that only one sex possesses, the female uterus and the male testicles respectively. Good sexist, as it were, like having balls in the sense of being courageous, is no more or less sexist than is bad sexist, like being hysterical in the sense of being unable to control one’s emotions. Plus both terms are indiscriminately co-ed. Women can have balls in the sense of being courageous, and men can be hysterical in the sense of freaking out. Both men and women, of course, can also be hysterical in the sense of being hilarious.

To suggest that Bill O’Reilly would have called a guest “hysterical,” then edited the segment and never invited her back because she was a woman is, well, hysterical (in every sense of the term)—because how to explain all his male guests over the years that found themselves screamed at, showered with unflattering epithets (including “hysterical”), edited, and never invited back on the show?

In 2014, Kirsten Powers, a female liberal ex-Fox News and current CNN commentator, who frequently found herself the target of O’Reilly’s trademark tirades, countered those misdirected sexism charges in a USA Today piece artlessly titled Bill O’Reilly is not sexist. She capped her portrait of Bill O’Reilly as an equal-opportunity offender with the prescient observation that “if everything is sexist, then eventually nothing will be.”

On that note, with a heavy heart, I guess I’ll have to go find myself a new favorite show—anyone know if Rachael Ray still does her 30 Minute Meals?

United Airlines — Try Fly With Us!

Presumably, most people will agree that airlines ought to operate in a manner so as to preclude right out of the gate (pun intended) the need to ever kick ticketed passengers off a flight due to overbooking or in order to accommodate deadheading crew members, as was reportedly the case in last week’s melodrama that resulted in a few missing teeth and a broken nose for one of four spontaneously deplaned individuals, not to mention a splitting PR headache for United Airlines.

In the unfortunate event, however, that an airline’s greed or mismanagement, or unforeseeable circumstances unrelated to ethical or managerial misfeasance, do necessitate the expulsion of one or more head of cattle from their class, what to do if the search for volunteers comes up empty or short no matter how many carrots are being dangled before the traveling livestock, and if one or more of the head ultimately selected for culling mulishly resist all verbally delivered entreaties to take wing back to the terminal whence they had boarded?

Apparently, airlines have the legal right to expel passengers for a variety of reasons, from being overbooked to security concerns to immodest attire or playing Words with Friends on a digital device before takeoff—and who hasn’t yet caught on that noncompliance with a flight crew’s instructions, whatever their nature, including the instruction to extract, ranks as a federal offense?

Of course, any right without the concomitant right to enforce it is about as useful as a blackboard without chalk. Hence, for better or worse, the word enforce contains force—for what might conceivably be the point in being invested with the right to remove a person from a plane without also having the right to drag this person off by force if he or she declines to succumb to verbal persuasion?

The attorney for the family of the thus injured Asian-American physician now suing the fuselages off United Airlines put it this way:

“If you’re going to eject a passenger, under no circumstances can it be done with unreasonable force or violence. That’s the law.”

In other words, even the attorney for the plaintiff acknowledges (a) that to eject a passenger is perfectly legal and (b) that to use force in doing so is also perfectly legal, provided that said force remains within “reasonable” bounds.

This leaves us with the task of defining those reasonable bounds in cases where a passenger lawfully selected for removal, having proven impervious to all coaxing and blandishments lavished upon him, elects to go tooth-and-nail and breaks into a kicking-and-screaming-style hissy fit at first contact by a law enforcement officer’s hand.

Again, a right to remove that comes with the qualification “unless the individual in question unmistakably communicates his or her insistence on staying” isn’t much of a right to remove.

If my right to throw you out of my house doesn’t extend beyond my right to ask you to leave; or if, after my asking you nicely to absent yourself has failed to produce the desired result, my right to eject you by force ceases at the slightest bruise you may suffer not in consequence of my grabbing your arm but in consequence of your attempts at shaking loose; and if the same constraints apply to any law enforcement personnel I may summon to effect your removal on my behalf; then I effectively have no right to throw you out of my house. (Perhaps I shouldn’t have the right to throw you out, but then let’s call a spade a spade rather than pretend I have a “right” but no meaningful right to enforce it in the face of resistance.)

The most reasonable definition of “reasonable force,” it seems to me, is the minimum amount of force necessary to accomplish an objective that has proven unattainable through non-forceful means.

Physical injuries sustained by a forcibly removed individual indicate unreasonable force only if inflicted gratuitously, not if sustained solely as a result of that individual’s twisting and flailing about in his or her quest to resist and break free.

For—syllogism alert!—if only reasonable force may be applied, and if the mere presence of resistance injuries suffices to meet the legal definition of unreasonable force, then the “right” to move an unwilling subject from A to B dwindles toward nil, because now all a targeted individual has to do is signal impassioned resistance in order to compel his or her approaching captors to resort to unreasonable force, which they may not lawfully do—ergo, anyone willing to exhibit ferocious unwillingness to comply with a lawful order is legally exempt from having to comply. And that’s absurd.

All that said, as stated in the opening paragraph, the need to remove ticketed passengers should never arise in the first place. But if it does, it makes little sense to grant an airline the right to do so and use reasonable force if necessary, but then to label unreasonable even the minimum amount of force necessary to vanquish furious resistance when encountered.

In the words of Johnny Mercer, something’s gotta give.

Why the Case Against Voter IDs is a Case for Free Firearms

Presidential election day is upon us … again!

Tomorrow (as viewed relative to this post’s date of publication), for the third time this year, Austria will take a stab at holding a successful runoff election. Counting the initial election that resulted in the runoff, Dec 4th will be Austria’s fourth presidential election day of 2016, i.e., counting the U.S. election, my personal fifth.

The result of the first runoff in May, won by nominally-independent-Green-Party candidate Alexander Van der Bellen, was subsequently undone and held for naught by the Austrian Constitutional Court on grounds of multiple procedural violations during the counting of the votes. The repeat runoff, scheduled for October, was called off preemptively after it had been discovered that some of the return envelopes mailed out with the absentee ballots had failed to seal properly due to faulty glue.

Whether Austria will come through on Dec 4th remains to be seen. Either way, given that the country has been doing fine sans president since July, and the point of having one waxes more elusive by the day, no one’s in a rush to get this schnitzel breaded (if you will pardon my Shakespearean attempt at coining an idiom for the ages).

In the run-up to every federal election in Austria—so for the fourth time this year—all eligible voters are snail-mailed an official voter notification document, which contains various bits of information, such as the voter’s name, his or her individual voter number, an individualized barcode, plus the exact date and time of the election and the address of his or her assigned polling place.

On the bottom, in small print, voters are told that in order to cast their vote, they must bring this document along with—drumroll!—a valid photo ID; that they must bring both; that the voter notification card itself does not double as an ID.

While non-controversial in Austria (and, most likely, in most Western democracies), the requirement to bring a valid photo ID to the ballot box—which, of course, presupposes that one owns one—divides stateside opinion as to whether it constitutes voter discrimination. The question of how to determine whether a person’s stated identity accords with their actual identity, and hence whether they’re eligible to vote in the first place, appears to weigh much more heavily on the minds of some Americans than of others. Depending on whom you ask, the prevalence of voter fraud ranges from negligible to, at least according to one avid Twitter user, “millions” of illegitimate votes that handed the popular vote to Hillary Clinton in the most recent election.

No matter where you stand on the issue, one thing everyone agrees upon: those who, for whatever reason, do not possess IDs are more likely to be found among the less affluent and more minority-heavy segments of society, and are therefore more likely to vote Democratic than Republican. Look no further for an explanation of why, generally speaking, conservatives support and liberals object to stringent voter ID laws.

Just as Morgan’s canon states that “in no case is an animal activity to be interpreted in terms of higher psychological processes if it can be fairly interpreted in terms of processes which stand lower in the scale of psychological evolution and development,” one need not resort to high-minded ideals like fraud prevention or the commitment to standing up for people’s fundamental rights in order to figure out the liberal/conservative divide on the voter ID issue. Laxer voter ID laws simply translate into more votes for Democrats, which naturally suits the left just fine and horrifies the right. That’s all there is to it. If most ID-less people were Republicans, you can bet your bottom dollar that attitudes toward voter ID laws would be diametrically reversed along party lines.

The primary objection to voter IDs, I suppose, consists in the notion that the inconvenience and monetary expense associated with obtaining a valid photo ID make it harder for people to vote and thus present a de facto infringement of their constitutional right to do so.

This line of argument, alas, runs into trouble as soon as we recall that there exist other fundamental constitutional rights besides voting, such as the right to keep and bear arms. Obviously, if the inconvenience and expense associated with acquiring a photo ID constitute an unacceptable obstacle to one’s fundamental right to vote, then the inconvenience and expense associated with acquiring a firearm constitute an unacceptable obstacle to one’s fundamental right to keep and bear arms.

Although individuals may fancy certain rights over others, the Constitution contains no hierarchy of rights. Unfortunate as you may find this—as I certainly do—Americans don’t have more of a right to vote (or to speak freely, for that matter) than they have a right to keep and bear arms.

Ergo, to argue that Americans have a right to vote without being asked to produce a valid photo ID is tantamount to arguing that Americans have a right to free guns delivered to their homes, no questions asked—because if having to go out and purchase an ID abridges your right to vote, how does having to go out, subject yourself to a background check, and spend your hard-earned cash on a firearm not abridge your right to keep and bear arms?

Or, to analogize the fundamental right to vote to a fundamental right less incendiary than the fundamental right to keep and bear arms, the absence of a right to effortlessly available, free-of-charge writing materials (paper, pencils, typewriter, laptop, etc.) could surely be viewed as an abridgment of your First Amendment freedom of speech and of the press—for how free to write your book are you if your right to write it doesn’t come with a concomitant right to all requisite writing supplies at no expense to you?

That aside, one must present proof of identity in order to apply for government benefits available to the impoverished, so if the requirement to produce ID at the ballot box amounts to massive voter suppression, the very same ID requirement must also lead to massive food stamp suppression. And yet, googling “voter suppression” brings up hundreds of thousands of results, whereas googling “food stamp suppression” brings up exactly two.

What conclusion are we to draw from this disparity other than that ID requirements don’t suppress anything? For if they did, how could they keep eligible voters from voting but not keep eligible food stamp recipients from receiving their food stamps?

Be this my Austrian or my conservative half talking, but opposition to voter ID laws strikes me as driven by nothing save the desire to get as many non-citizens as possible to vote in federal elections.

Electoral Woes

The morning after election night 2012—Barack Obama had just secured himself a second term in the White House—then-future President-elect Donald J. Trump, in one of his many fatuous tweet-from-the-hip outbursts, apparently believing (in error) that Mitt Romney had bagged the popular vote, declared the electoral college “a disaster for a democracy.”

Following the 2016 election, which indeed saw the keys to 1600 Pennsylvania pass into the hands of the runner-up in the never-fought contest for the nationwide popular vote, the president-elect appears to have softened his position, now calling the electoral college a “genius system.” Come election night 2020, in the as-(un)likely-as-Trump-being-elected-president-in-the-first-place event that President Trump is denied a second term in spite of reaping the popular vote then, odds are he’ll revert to his 2012 assessment of the EC.

The human propensity for failing to anticipate that one day the shoe might end up on the other foot never ceases to amaze, for as sure as night follows day, our trusty electoral college is currently taking a mollywopping from those who, just a few short weeks ago, following then-candidate Trump’s refusal to pledge his unconditional submission to the outcome of the election if he were to lose, were denouncing as unpatriotic and dangerous any calling into question of the legitimacy of American democracy as practiced.

If disgruntled Clinton voters (perhaps more accurately referred to as anti-Trump voters) have a fundamental quibble with the electoral college, their misgivings could be taken more seriously had they voiced them before the election rather than waxing apoplectic over the EC only after it vaulted the wrong candidate to victory; for it is highly doubtful that the very same folks who are now busy signing and circulating a spate of online petitions to change or outright abolish the electoral college on account of its being “antiquated” and “undemocratic” would be doing so if they perceived the EC, as is, as having saved us from rather than stuck us with a Trump presidency.

Anyone who wants to tinker with, or all-out eliminate, the EC on grounds that it gave us Trump—and W before that—may want to step back and consider for a moment that in future elections, any “new and improved” system may produce equally unsettling election results that the current one would have precluded.

But it’s even more complicated than that.

Because not only is it devilishly difficult to prophesy whether, all other things being equal, a modified or abolished electoral college would facilitate or frustrate victories by one’s favorite candidates in future elections, but on top of that, all other things would not be equal, as political campaigns will always be strategically tailored to whatever system is in place at the time.

Thus, it is a fallacy to claim that if the U.S. president were elected by popular vote, Hillary Clinton would be inaugurated on January 20th, for if the president were elected by popular vote, both candidates and their camps would have campaigned very differently from the outset. For one, they’d have chosen to spend more time and resources courting voters in populous coastal states rather than whistle-stopping through flyover country hunting for the farm vote in Iowa, and they would have revised their respective planks accordingly.

To engage in counterfactual speculation and retrodict that if Donald Trump had set his Make America Great Again cap on winning the popular vote as opposed to the majority of electoral votes, he would have failed, is like saying that if football games were scored on grace and style in addition to touchdowns and field goals, the Mountaintop Catamounts never would have won the Super Bowl—well, if grace and style points were part of the scoring system in football, both the Mountaintop Catamounts and their opponent, the Greencastle Railsplitters, would have modified their game. There’s no telling how well or badly each team would have risen to a hypothetical task never set to either, just as there is no telling whether Donald Trump would or wouldn’t have managed to rustle up, rally to his cause, and coax to the polls just enough additional irredeemable deplorables from that cyclopean basket so as to score the popular vote if that had been the road to victory. (My apologies for not being much of a ballgames aficionado, wherefore it is so much easier and faster for me to make up fictional team names than look up existing ones.)

Furthermore, there’s no telling how many non-swing-state voters stayed home on election day—or decided against mailing in their absentee ballots over exorbitant postage—because they correctly expected the outcome of the presidential race in their respective states to be a foregone conclusion (and either didn’t know or didn’t care about any of the down-ballot boxes to be ticked), but who would have hauled their a$#es to the polls if the president were elected by a nationwide popular vote, in which case their individual votes may have had an actual shot at affecting the result.

In short, it is invalid to superimpose real-world election results onto hypothetical or future election scenarios, as any such scenario would have altered the behavior of voters and candidates alike. Anyone insisting that Trump never would have won the popular vote, even if that had been his goal, sounds just as astute and infallible as all those pundits who deemed it impossible for Trump to get anywhere near 270 electoral votes, which was his goal, and he overshot it by 34.

Besides quixotic calls, in the wake of Donald Trump’s traumatizing electoral landslide, for the outright abolition of the electoral college (which would necessitate a constitutional amendment and hence ranks roughly on par with fantasies about repealing the Second Amendment), a number of possible modifications to the institution are being floated, such as replacing the winner-takes-all system currently employed by 48 out of 50 states with a winner-of-the-national-popular-vote-takes-all system in all 50 states. In fact, back in 2007, Maryland proposed a plan to award all of its ten electoral votes to the winner of the nationwide popular vote. Implementation of the plan foundered over its failure to gain traction with other states. So this particular attempt at outsmarting rather than amending the Constitution went nowhere, as did the hundreds of bills heretofore introduced in Congress to superannuate the EC in one fashion or another.

The electoral college seems here to stay, which is probably best, since a direct popular vote for president would de facto disenfranchise most of the country by concentrating electoral power in but a handful of large cities and other densely populated areas, primarily ranged along the two coastlines.

Still, many voters feel disenfranchised precisely on account of the EC, primarily voters in non-battleground states—for what’s the point of voting in places like deep-blue California or deep-red Wyoming, where all electoral votes are spoken for anyway?

Personally, I would propose replacing the winner-takes-all system with a proportional one (a departure from winner-takes-all in the spirit of, if not by the same mechanism as, the congressional district method employed by Maine and Nebraska), whereby a state’s electoral votes would be awarded on a percentage basis proportional to the outcome of the popular vote in that state. So if 70% of Californians voted for Lassie and 30% for Lord Voldemort, instead of Lassie walking away with 100% of California’s 55 electoral votes, Lassie would get 70% and Lord Voldemort 30% of the 55. (Some rounding may have to be applied, but there are experts who can figure out exactly how many electoral votes out of, say, Idaho’s four amount to 59.2%; one way of squaring that circle is sketched below.)
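
For the curious, here is a minimal sketch (in Python, purely for illustration; the vote totals are made up and the candidates are, needless to say, the essay’s own) of the kind of rounding that parenthetical alludes to: the largest-remainder method, which hands each candidate the whole-number part of his or her proportional share and then deals out any leftover electors to the largest fractional remainders, so that the pieces always add up to the state’s full allotment.

from fractions import Fraction

def allocate_electoral_votes(popular_votes, total_electors):
    """Split a state's electors in proportion to its popular vote,
    rounding by the largest-remainder method so the pieces still
    add up to the state's full allotment."""
    total_ballots = sum(popular_votes.values())
    # Exact fractional share of electors each candidate has earned
    exact = {c: Fraction(v, total_ballots) * total_electors
             for c, v in popular_votes.items()}
    # Everyone first receives the whole-number part of their share
    allocation = {c: int(share) for c, share in exact.items()}
    leftover = total_electors - sum(allocation.values())
    # Any electors still unassigned go to the largest fractional remainders
    by_remainder = sorted(exact, key=lambda c: exact[c] - allocation[c],
                          reverse=True)
    for candidate in by_remainder[:leftover]:
        allocation[candidate] += 1
    return allocation

# The California example above: a 70/30 split of 55 electoral votes
print(allocate_electoral_votes({"Lassie": 70, "Lord Voldemort": 30}, 55))
# -> {'Lassie': 39, 'Lord Voldemort': 16}  (exact shares: 38.5 and 16.5;
#    the odd 55th elector falls to the candidate listed first on an exact tie)

# The Idaho aside: what does 59.2% of four electoral votes come to?
print(allocate_electoral_votes({"Lassie": 592, "Lord Voldemort": 408}, 4))
# -> {'Lassie': 2, 'Lord Voldemort': 2}  (exact shares: 2.368 and 1.632)

The virtue of this particular rounding rule, whatever the experts ultimately settle on, is that no candidate ever ends up more than one elector away from his or her exact proportional share.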

That way, the electoral college would remain intact, on paper as well as in spirit, both Lord Voldemort and Lassie voters would feel more empowered than they otherwise would, and flyover country would retain its relevance.

Now all we have to do is convince all 50 states to adopt my system.

Proof that the U.S. Election is NOT Rigged

At least not in favor of the Democratic candidate for president.

Earlier today, I stopped by my local post office with the intent to fulfill my patriotic duty as a currently exiled American and mail off my absentee ballot. The clerk placed my official election mail envelope, which contained the slightly smaller envelope that contained my ballot, on the scales and matter-of-factly announced the crushing verdict: €7.00! (That’s $7.64, as per today’s conversion rate.)

Now, if I were a currently exiled citizen of one of the battleground states like Florida or Ohio, where my vote could conceivably make a difference one way or another, it would be a different story. But as someone with an income of zero and a personal net worth of somewhere in the neighborhood of a half grand in credit card debt that I haven’t managed to pay down in over five years, I find myself disinclined to blow seven bucks and change on postage for casting a symbolic vote in a state—the Great State of New York, in my case—so deeply blue that Hillary’s bound to sweep its 29 electoral votes in a landslide no matter what anyway. So from my point of view, mailing my ballot would have been seven euros down the drain. (I was prepared to absorb the €1.70 expense I had expected, but not four times that amount, the quadruplication stemming from the election envelope’s oversize, which I had failed to take into account when budgeting my vote.)

Thus I retook possession of the missive I had intended to dispatch, returned it to the pocket of my tattered coat, and repaired to my little cave in the Vienna Woods—yes, all Austrian caves are fitted with broadband, just in case you’re wondering—somewhat reassured that the election can’t be quite as rigged against him as candidate Trump makes it out to be.

Because, more likely than not, a person in my income & personal wealth bracket—i.e., one so low he’s apt to get sticker-shocked by a €7.00 price tag—votes Democratic (or Democrat in common GOP parlance). So the fact that “the system,” by refusing either to pony up the postage for absentee ballots or to design smaller ballots and envelopes so as to bring the sender’s postage down to an affordable level, lets slip through its fingers the votes of likely Democrat(ic) voters like me does seem to hint at a healthy measure of indifference toward the outcome of this election.

If anything, judging from the absentee ballot postage situation, the system appears rigged in favor of the right.

What say you now, Mr Trump?