Wednesday, September 30, 2020

The Sizes of Stars

The universe we inhabit is full of wonders. It's filled, for instance, with hundreds of billions of galaxies, of which our Milky Way is just one, and each of those galaxies is composed of billions of stars. These stars range in size from very big to unimaginably huge, thousands of times larger than our sun.

To get an idea of the range of sizes of the stars in our galaxy, watch this video:
The word "awesome" is overused in our discourse, often employed to describe things that are really quite ordinary or mundane, but it's entirely appropriate, I think, to use the word to describe red giants.

It's also appropriate to invoke the word "awesome" when reflecting on the fact that scientists believe that the atoms that make up our earth and everything on it, including our bodies, were created long ago in the cores of stars that no longer exist. According to the scientific consensus, you are indeed stardust.

Tuesday, September 29, 2020

Amy Coney Barrett

As anticipated, President Trump has selected Amy Coney Barrett to fill the Supreme Court seat vacated by the death of Ruth Bader Ginsburg.

Much of the left will do their worst to destroy Ms. Barrett, personally and professionally, just as they tried to do to Clarence Thomas, Samuel Alito and Brett Kavanaugh, but there are some intrepid folks on the left willing to risk the opprobrium of their fellow lefties by proclaiming their belief that Judge Barrett is an excellent choice for the Court.

Harvard Law professor Noah Feldman is a prominent example. Feldman is very liberal and disagrees with Barrett on judicial philosophy, but in a recent article at Bloomberg.com he offers an endorsement that will give his fellow liberals serious heartburn.

The whole article is worth reading, but here are some of the more significant excerpts:
I disagree with much of her judicial philosophy and expect to disagree with many, maybe even most of her future votes and opinions.

Yet despite this disagreement, I know her to be a brilliant and conscientious lawyer who will analyze and decide cases in good faith, applying the jurisprudential principles to which she is committed. Those are the basic criteria for being a good justice. Barrett meets and exceeds them.

I got to know Barrett more than 20 years ago when we clerked at the Supreme Court during the 1998-99 term. Of the thirty-some clerks that year, all of whom had graduated at the top of their law school classes and done prestigious appellate clerkships before coming to work at the court, Barrett stood out.

Measured subjectively and unscientifically by pure legal acumen, she was one of the two strongest lawyers. The other was Jenny Martinez, now dean of the Stanford Law School.

When assigned to work on an extremely complex, difficult case, especially one involving a hard-to-comprehend statutory scheme, I would first go to Barrett to explain it to me. Then I would go to Martinez to tell me what I should think about it.

Barrett, a textualist who was working for a textualist, Justice Antonin Scalia, had the ability to bring logic and order to disorder and complexity. You can’t be a good textualist without that, since textualism insists that the law can be understood without reference to legislative history or the aims and context of the statute.

To add to her merits, Barrett is a sincere, lovely person. I never heard her utter a word that wasn’t thoughtful and kind — including in the heat of real disagreement about important subjects. She will be an ideal colleague.... This combination of smart and nice will be scary for liberals.

I don’t really believe in “judicial temperament,” because some of the greatest justices were irascible, difficult and mercurial. But if you do believe in an ideal judicial temperament of calm and decorum, rest assured that Barrett has it.

Barrett is also a profoundly conservative thinker and a deeply committed Catholic. What of it? Constitutional interpretation draws on the full resources of the human mind. These beliefs should not be treated as disqualifying. I hope the senators at her hearing treat her with respect.

And when she is confirmed, I am going to accept it as the consequence of the constitutional rules we have and the choices we collectively and individually have made. And I’m going to be confident that Barrett is going to be a good justice, maybe even a great one — even if I disagree with her all the way.

I want to be extremely clear. Regardless of what you or I may think of the circumstances of this nomination, Barrett is highly qualified to serve on the Supreme Court.
We might all hope that Democrat senators treat her with respect, but if history is any indication, they won't. One line of attack has been and will be Barrett's religion. Some Democrats believe that anyone who takes religion seriously is ipso facto disqualified from being confirmed to the bench, but, of course, this is silly.

Barrett herself explained why in an event last year at Hillsdale College where she was asked, "What role, if any, should the faith of a nominee have in the confirmation process?" Her answer was "None."

She went on to explain:
I mean, we have a long tradition of religious tolerance in this country. And in fact, the religious test clause in the Constitution makes it unconstitutional to impose a religious test on anyone who holds public office.

So whether someone is Catholic or Jewish or Evangelical or Muslim or has no faith at all is irrelevant to the job.

I do have one thing that I want to add to that, though. I think when you step back and you think about the debate about whether someone’s religion has any bearing on their fitness for office, it seems to me that the premise of the question is that people of faith would have a uniquely difficult time separating out their moral commitments from their obligation to apply the law. And I think people of faith should reject that premise.

So I think the public should be absolutely concerned about whether a nominee for judicial office will be willing and able to set aside personal preferences, be they moral, be they political, whatever convictions they are. The public should be concerned about whether a nominee can set those aside in favor of following the law.

But that’s not a challenge just for religious people. I mean, that’s a challenge for everyone. And so I think it’s a dangerous road to go down to say that only religious people would not be able to separate out moral convictions from their duty.
She's right about that. Why should one's religion be uniquely problematic? Why not a candidate's ideology or judicial philosophy? Why not a candidate's fervently held personal beliefs on any issue likely to face the Court? To single out religious conviction as uniquely disqualifying is both manifestly unconstitutional (the Constitution prohibits religious tests for high office) and grossly bigoted.

We should desire that the most highly qualified people be seated on the Supreme Court, people who have a deep reverence for the Constitution and who see their role as interpreting the law, not making it.

What we don't need are more justices who are little more than partisan hacks deciding cases according to their personal ideological preference, cultural fashion or political expediency.

Monday, September 28, 2020

Questions for Mr. Biden

Kimberley Strassel, in a recent Wall Street Journal op-ed, lists eight topics Joe Biden should be compelled to address in tomorrow night's debate with Donald Trump. There are others, of course, but he should be asked at least about these:
Hunter Biden. Joe Biden wants this election to be about character. Fair enough, but that requires addressing the ethical swamp that was his son’s business during the Biden vice presidency. As a new Senate report makes clear, the candidate’s son was cashing in. The vice president was warned and did nothing about it. Will this kind of influence-peddling be the norm in a Biden presidency?

The Supreme Court. Mr. Biden in June said he was compiling a short list of black women for the high court, and that he’d release it after further “vetting.” He now refuses, saying such a list could subject his picks to “attacks.” Mr. Trump has been entirely open about his vision for the court, and the names on his list have withstood scrutiny. Doesn’t Mr. Biden have a similar obligation to transparency, especially as he says the 2020 winner should fill the seat vacated by Justice Ruth Bader Ginsburg?

The Federal Bureau of Investigation. The FBI spent 2016-17 investigating a GOP political campaign, using tactics that were condemned by the Justice Department inspector general and are now the subject of a U.S. attorney’s investigation. How briefed was Vice President Biden on this probe? Why did he personally unmask Mike Flynn? Does he agree with the inspector general that these tactics were wrong? Should the public fear a Biden FBI will also investigate Mr. Biden’s political opponents?

Political norms. Mr. Biden talks about Mr. Trump’s norm-breaking, and he has a point. Will he call out his own party’s norm-breaking? Will he oppose abolishing the Senate filibuster? Will he oppose court packing? Will he acknowledge the damage of politicized impeachment proceedings?

The economy and spending. By one count, Mr. Biden has pledged some $11 trillion in new spending over a decade. Yet his tax proposal will raise only one-third this amount. Where’s the rest, and does he truly propose adding further to a spiraling deficit? Meanwhile, his tax proposal (the largest permanent increase since World War II) and regulatory regime would impose new costs on companies and families reeling from shutdowns. Spending doesn’t equal jobs. So how, specifically, does Joe intend to revive the economy?

Energy. Mr. Biden insists his fracking ban would apply solely to public lands. How does he reconcile that with past promises to get rid of all fracking? Where does he stand on coal? Does he acknowledge that his plan for net-zero carbon emissions, by necessity, requires the elimination of most fracking and coal jobs?

Administration. Mr. Biden ran as a moderate but has made significant concessions to retain progressive support. What will his cabinet look like? Will he return to Obama-era familiars? Or will we see Secretaries Elizabeth Warren, Bernie Sanders and Ilhan Omar?

Religious liberty. Mr. Biden speaks often of his Catholic faith, attempting outreach to religious voters. How does he square this pitch with his promises to roll back religious freedoms, like ending conscience protections for nuns and other religious employers?

Health. Mr. Biden would be the oldest president ever, and even some Democrats express concern about his mental acuity. Will he provide comprehensive medical records to assure voters of his fitness for office?
I'd also like to see him asked exactly what he would have done differently from the Trump administration in response to the Covid-19 pandemic, why it took him so long to say anything at all critical of the rioters in our cities, what he thinks about efforts to defund the police, and what he thinks about the Democrats' Green New Deal proposal. I'd like to hear how he reconciles his support for unrestricted abortion rights with his profession of Catholic faith; whether he supports gun confiscation, abolition of the electoral college, granting statehood to D.C. and Puerto Rico, allowing biological men to use girls' restrooms, sanctuary cities, and racial "reparations"; and whether he supports offering poor families the choice of which schools their children will attend.

His responses to such questions, were they honest, would be edifying for many voters.

Strassel closes her column with this:
To listen to the media, Mr. Biden will have “won” the first debate if he merely remains upright. But let’s set the bar appropriately. The former vice president was long known as a capable debater, and he’s supposedly spent much of the past month prepping.

Americans deserve to know exactly what they are signing up for in this election. And these questions are the bare minimum of understanding what a Biden presidency would be like.

Saturday, September 26, 2020

Justice and the Death of Breonna Taylor

Riots are again tearing at the fabric of our cities, this time because the rioters are dissatisfied with the decision handed down by a Lexington grand jury which investigated the death of Breonna Taylor in Louisville, Kentucky last March 13th.

The grand jury found no evidence to warrant indictments against the two officers who were directly involved in the shooting, although it did bring an indictment for wanton endangerment against a third officer who was outside the building. The mob has deemed this inadequate and has taken its dissatisfaction out on the community in the usual fashion.

As National Review's Andrew McCarthy observes, however, their belief that justice has been denied is simply not supported by the facts of the case.

Anyone interested in this tragic episode is encouraged to read McCarthy's entire column, from which I've excerpted the portion that describes the events surrounding the actual shooting. I've edited it slightly for clarity:
On March 13, after working, Breonna met her boyfriend Kenneth Walker for dinner. They returned to her apartment where they watched television, and she went to sleep after midnight.

At about 12:40 a.m., the police, led by Sergeant Jon Mattingly and Detective Myles Cosgrove, knocked on the door and announced themselves as police. Taylor and Walker were startled out of their sleep. Walker, a licensed owner of a nine-millimeter Glock, says he did not know it was the police at the door and speculated that it might be Breonna's former boyfriend and convicted drug dealer Jamarcus Glover breaking in.

For their part, the police expected that Ms. Taylor would be alone — they had not seen Walker enter the dwelling with her.

It was dark and there was a long hallway between the bedroom and the front door. There was screaming. Walker fired as Mattingly came through the door, striking him in the leg and severely wounding him. Mattingly and Cosgrove returned fire into the hallway in the general direction of where they believed the shooter was.

When the smoke cleared, Walker was unharmed, but Taylor had been struck six times. FBI ballistics experts eventually determined that Cosgrove fired the fatal shot.

Meantime, a third officer, Detective Brett Hankison, who was in the parking lot outside the apartment, began firing when the commotion, which he could not have seen, began. He sprayed the patio and a window with ten bullets — irresponsibly, to be sure, but fortunately without harming anyone.

Hankison, who had a spotty disciplinary record in almost 20 years as a cop, was terminated when police officials judged that his conduct during the raid shocked the conscience.
This was certainly an awful tragedy, but neither of the officers did anything that violated any law or police procedure. As a result, the grand jury, on the basis of the evidence provided by state Attorney General Daniel Cameron, an African American, declined to bring an indictment against them.

McCarthy's column goes on to describe Taylor's involvement with Glover and why the police were at her apartment in the first place and closes with this:
What happened to Breonna Taylor was a calamity. That is why the city of Louisville just paid $12 million to settle the wrongful death lawsuit her family filed, rather than trying to fight it. Obviously, the money cannot bring her back to life, and will never be adequate compensation for her loved ones’ loss.

But that could also have been said for the politicized filing of unprovable homicide charges. The legal system can only do the best it can; it cannot fully compensate for tragic loss, and its criminal processes are not equipped to address catastrophes that are not crimes.

The state of Kentucky was right not to opt for mob justice. Unfortunately, the mob has a different conception of “justice,” and it is ripping the country apart.
It's very unfortunate that so many people in our society are eager to resort to violence before they even know why they're doing so. It's almost a reflexive response to news they don't like, and they're not interested in waiting until they know the facts before they decide that an injustice has been done and that it's time to loot, burn and even kill.

Such people are either morally stunted or not very smart. Or both.

Friday, September 25, 2020

Can a Secular Society Condemn Sexual Assault?

Until the pandemic hit and dominated the news cycle, there was a lot of talk about the frequent allegations of sexual assault and other forms of abuse by men in positions of power who prey upon the women in their orbit. The tacit assumption, pretty much universal in all the discussion, was that this is despicable behavior, and it is, but there's an irony buried in this assumption.

In a secular society composed of people who have largely declared God to be irrelevant, what does it mean to say that the behavior of these men is despicable or morally wrong? Having abandoned any transcendent moral authority to whom we are all accountable, must we not also give up the traditional notion that there are any objective moral norms and obligations?

It's certainly difficult, as even most secular thinkers have acknowledged, to see how there can be a standard of moral good without an adequate objective authority whose moral nature serves as that standard, and if there is no objective standard there really is no objective good, at least in the moral sense, and therefore no objective moral wrong.

Thus, good and bad, right and wrong, if they exist at all, must be subjective, which means that they're dependent on one's inner feelings or preferences. If one person's feelings differ from another's, though, neither person is right or wrong; they're just different.

This subjectivity expresses itself differently among the three main groups of people involved in these sex scandals.

First, there are the victims who, lacking any objective standard by which to assess what was done to them, can only express their aversion and revulsion. For them, what was done to them is wrong for no reason other than that it made them uncomfortable, repulsed, harmed or frightened.

Then there are the perpetrators. Lacking any objective reference point for their behavior, they intuit that there's nothing wrong with forcing themselves on a weaker individual as long as they can get away with it.

In other words, for these men, might makes right. Others may deplore what they do, society may choose to punish what they do, but if they can get away with it they're not doing anything wrong in any meaningful, moral sense, and, if they're powerful enough to be immune from social sanctions, why should they care what society thinks? The sad truth is that powerful men often do get away with it, with the help of the next group, as the case of former president Bill Clinton illustrates.

The third group is the commentariat in the media and elsewhere who condemn what these men do, who suspect, perhaps, that there's something deeply wrong with sexual assault, but who can give no real reason for their suspicions. They may insist that people have a right not to be violated in such intimate ways, but upon reflection they may realize that such rights are simply conventions fabricated by society.

Having abandoned God they've also abandoned the ability to cite any truly objective rights. After all, what could it actually mean to say that it's morally wrong to violate a "right" if there's no ultimate accountability for what anyone does?

This is why they often employ weasel words like "inappropriate" or "unacceptable." At some level they're aware that they cannot justify using stronger language like "morally reprehensible" or "evil." On a secular understanding of the world there simply are no moral evils. These are some of the same folks who pooh-poohed the allegations of women back in the 90s who testified of Bill Clinton's escapades and predations. Clinton's defenders insisted that "character doesn't matter in a president," only competence matters.

So, for this group, right and wrong are pragmatic. Nothing's really wrong except insofar as it harms the prospects of one's political party or, more cynically, if it can be used to harm the prospects of one's political opponents. Put differently, these people believe that whatever hinders their own political aspirations is wrong and whatever promotes them is right.

So, they'll ignore the odious behavior of the Clintons, Jeffrey Epsteins and Harvey Weinsteins of the world as long as it does no harm to their party, and they'll express moral outrage at the similarly odious behavior of their opponents if they can gain political advantage by so doing.

Thus, what the growing host of offenders in Hollywood, on Capitol Hill, and in corporate penthouses are alleged to have done is wrong, for the pragmatist, only because the members of the victim group are exposing the perpetrators in such a way as to harm their respective party's prospects among the vast numbers of unenlightened voters who still believe in objective moral values and who still believe that preying on women is objectively evil.

Thursday, September 24, 2020

Lucifer?

Philosopher and novelist Iris Murdoch, in her book The Sovereignty of Good (1970), describes in vivid accents the modern man who prides himself on his rational approach to life, unencumbered by the silly superstitions believed in by gullible religious people. The modern rational man, typified in her telling by someone like the 18th century philosophical icon Immanuel Kant, is a man who ...
...confronted even with Christ turns away to consider the judgement of his own conscience and to hear the voice of his own reason . . . . This man is with us still, free, independent, lonely, powerful, rational, responsible, brave, the hero of so many novels and books of moral philosophy. The raison d’être of this attractive but misleading creature is not far to seek . . . .

He is the ideal citizen of the liberal state, a warning held up to tyrants. He has the virtue which the age requires and admires, courage. It is not such a very long step from Kant to Nietzsche, and from Nietzsche to existentialism and the Anglo-Saxon ethical doctrines which in some ways closely resemble it.

In fact, Kant’s man had already received a glorious incarnation nearly a century earlier in the work of Milton: his proper name is Lucifer.
Lucifer? Satan? Why such a harsh judgment? Perhaps because the modern, "rational" man believes only what science and his senses tell him. The rational man looks at himself and his fellows as little more than flesh and bone machines, animals, whose only real "purpose" is to reproduce, experience pleasure and avoid pain.

He regards morality as an illusion. His reason affords him no basis for caring about the weak or the poor, no basis for human compassion, no particular point to conserving the earth's resources for future generations.

Whereas Kant thought that reason dictated the categorical imperative, i.e. the duty to treat others as ends in themselves and not merely as a means to one's own happiness, the fact is that reason, unfettered from any divine sanction, dictates only that each should look to his own interests.

In practice modern man may care about the well-being of others, but he must abandon his fealty to science and reason to do so because these provide no justification for any moral obligations whatsoever.

Indeed, the purely rational man is led by the logic of his naturalism to the conclusion that might makes right. The pursuit of power frequently becomes the driving force of his life. It injects his life with meaning. It leads him to build abattoirs like Auschwitz and Dachau to eliminate those deemed less powerful and less human. It leads him to employ vast prison camps like the Soviet Gulag to dispose of the millions who hold inconvenient political opinions or who otherwise stand in his way.

Would Kant have endorsed the notion that might makes right? No, but then Kant wasn't quite in tune with the modern, rational man. Kant believed that in order to make sense of our lives as moral agents we have to assume that three things are true: We have to assume that God exists, that we have free will, and that there is life beyond the grave.

The modern man, of course, rejects all three, and in so doing he rejects the notion of objective moral value or obligation. That's why reason has led men in the 20th century to embrace ideologies that have produced vast tracts of corpses, and that's why, perhaps, Murdoch uses the name Lucifer to describe them.

Wednesday, September 23, 2020

How Many Galaxies?

Our sun and the planets it holds in thrall, including our earth, reside in a small corner of a vast galaxy called the Milky Way, a swirling mass of dust, gas and billions of stars. The Milky Way is so huge that it takes light, traveling at 186,000 miles per second, 100,000 years to get from one end of the galaxy to the other.

Yet, as huge as it is, our galaxy is just one of billions of galaxies strewn across an incomprehensibly big universe. This short video explains how we know this:
Relative to all this, our planet is an infinitesimally tiny speck, and the inhabitants of our planet - us - are even tinier. It's no wonder that some philosophers have concluded that human beings and the quotidian pursuits in which we engage have no more significance than motes of dust bouncing around in a shaft of light.

Those philosophers are right - unless the universe was intentionally created and we were purposefully put here for a reason.

Tuesday, September 22, 2020

The Next Supreme Court Justice

With the passing of Supreme Court Justice Ruth Bader Ginsburg, President Trump has the opportunity to make yet another appointment to the Supreme Court, the prospect of which has his opponents in quite a dither. Already objections have been raised to appointing a Supreme Court justice this close to an election, none of which, however, has validity.

One objection is that it's somehow unethical or unprecedented to confirm a SCOTUS justice in an election year. I wrote about this recently and won't go over the same ground here.

It's enough to say that it's neither unethical nor unprecedented. Twenty-nine times there has been a SCOTUS vacancy in a presidential election year, and in every case the president made a nomination. In 19 of those cases the presidency and the Senate were in the hands of the same party, and 17 of those 19 nominees were confirmed. In the 10 instances when the Senate and presidency were controlled by different parties, only 2 of the 10 nominees were confirmed. In fact, if the president's opponents insist that he not exercise his constitutional responsibilities in an election year, they're implicitly demanding that he limit himself to a three-year term of office rather than four.

This is ironic because the Democrats were adamant in 2016 that the Republicans give Obama nominee Merrick Garland a vote before the election. From The Washington Free Beacon:
Obama, then-vice president Joe Biden, Hillary Clinton, then-Senate minority leader Harry Reid (D., Nev.) and Senate Democrats ripped Republicans, who were in control of the Senate, for their decision not to consider Garland. Sen. Patrick Leahy (D., Vt.) said the Senate had approved judges as late as "September" in election years, and Sen. Bernie Sanders (I., Vt.) said the notion that Obama could not nominate a judge that year was "absurd."

Now the situation has reversed, as Democrats insist President Donald Trump and Senate Republicans not move on the vacancy left by the death of Justice Ruth Bader Ginsburg.
Another objection is that President Trump doesn't have time to get a replacement for Justice Ginsburg confirmed before the election or before his term is over, but this concern lacks historical support. There are 42 days until the election. Justice Ginsburg’s confirmation in 1993 took 42 days, John Paul Stevens's confirmation in 1975 took 19 days, and Sandra Day O’Connor was confirmed in 33 days in 1981.

Even if Trump loses the election on November 3rd and the Republicans lose their majority in the Senate he's still the president until January 20th and the Republicans are still in control of the Senate until January 3rd. That gives them plenty of time to confirm a nominee. In fact, every nominee of the last 45 years has been confirmed in less than 110 days.

There's nothing unconstitutional in nominating and confirming a justice this close to an election. Here's what the Constitution says about the president’s power to appoint justices to the Supreme Court, courtesy of National Review's Jim Geraghty:
He shall have Power, by and with the Advice and Consent of the Senate, to make Treaties, provided two thirds of the Senators present concur; and he shall nominate, and by and with the Advice and Consent of the Senate, shall appoint Ambassadors, other public Ministers and Consuls, Judges of the supreme Court, and all other Officers of the United States, whose Appointments are not herein otherwise provided for, and which shall be established by Law: but the Congress may by Law vest the Appointment of such inferior Officers, as they think proper, in the President alone, in the Courts of Law, or in the Heads of Departments.
That's the entirety of what the Constitution has to say about the appointment of justices. Geraghty adds this:
It [the Constitution] doesn’t say anything about how close the vacancy is to Election Day. It doesn’t say anything about whether the Senate has to hold hearings about the nominee. It doesn’t say anything about whether the Senate has to vote on that nominee; a refusal to vote on the nomination, as occurred with Merrick Garland, is a de facto rejection. It doesn’t say anything about whether the Senate can vote in a lame duck session.

This means President Trump can nominate anyone he likes up until noon on January 20, 2021, if he isn’t reelected. The Senate can choose to hold a vote on that nominee anytime it likes. Or it can choose not to hold a vote on that nominee. If the Democrats win a majority of the seats in the Senate, they take over on January 3, 2021. If the Senate is a 50-50 split and Joe Biden wins the presidency, then Mike Pence breaks ties up until January 20, and then Kamala Harris breaks ties in the afternoon.
In other words, as long as Mr. Trump has fifty votes in the Senate, which is by no means a sure thing, the nominee he announces later this week will almost certainly be the next Supreme Court justice.

Monday, September 21, 2020

Maybe He Misspoke

In these racially troubled times, those who enjoy "white privilege" are frequently adjured to acknowledge not only that every individual of the white race is a racist but also that our institutions are permeated with systemic racism. The only way to gain absolution for this sin is to confess and repent.

Eager to show that he's as woke a white person as there is and eager to expiate his own white guilt, Princeton University president Christopher Eisgruber donned the requisite sackcloth and ashes and abased both himself and his institution in a September 2nd letter to the Princeton community in which he claimed that racism and "the damage it does to people of color" remain "embedded" in the university.

He made this statement despite the university's pro forma promises to students and the federal government that it doesn't discriminate on the basis of race. Apparently those assurances have been false.

The Department of Education, apprised of President Eisgruber's abject admission of Princeton's continued structural racism under his watch and the obvious implication that the university has been lying when it claimed not to discriminate, has launched an investigation into the university for violation of the Civil Rights Act of 1964. The act states that "no person in the United States shall, on the ground of race, color, or national origin, be...subjected to discrimination under any program or activity receiving Federal financial assistance."

The Washington Free Beacon reports that,
In a letter addressed to the Princeton president, the Department of Education said that the school's self-admitted racism was grounds for an investigation. The department is concerned that Princeton—which receives millions in federal funding—has violated the Civil Rights Act of 1964.

"Based on its admitted racism, the [department] is concerned Princeton's nondiscrimination and equal opportunity assurances … may have been false," the letter reads. "The Secretary of Education may consider measures against Princeton for false … nondiscrimination assurances, including an action to recover funds."
We can assume that President Eisgruber never saw this coming when he wrote his letter to the Princeton community. No doubt he thought the letter would ingratiate him with hundreds of faculty and administrators who in July signed a letter calling on Princeton to atone for its "anti-Blackness."

That letter includes demands such as extended sabbaticals for minority faculty, removing standard application questions about previous misdemeanors and incarcerations, and acknowledging that the school was built on Native American territory.

Like Buridan's Ass, Eisgruber now finds himself in quite a difficult dilemma. He can stand by his original claims that his school is systemically racist, in which case his university stands to lose millions of dollars in federal funding, or he can recant those assertions, declare that, on second thought, his school isn't racist after all, and stick his thumb in the eye of all those hundreds of faculty and staff who insisted in July that it is.

It'll be interesting to see how President Eisgruber resolves his predicament.

Saturday, September 19, 2020

Post-Truth Nation

Denise McAllister, in an article a few years ago at The Federalist, put her finger on one of the tragedies of our postmodern age - the loss of belief in objective truth.

The present age, she points out, has come to be labeled the post-truth era, and to the extent the designation is accurate, it has been calamitous in numerous ways.

McAllister writes:
Oxford Dictionaries has picked “post-truth” as its word of the year (2016), citing that “a year dominated by highly-charged political and social discourse” was the driving force that increased the word’s use by 2,000 percent.

It defines post-truth as “relating or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” This word, publishers say, has “become ‘overwhelmingly’ associated with politics.”

.... What I find particularly fascinating regarding Oxford Dictionaries’ announcement is the reaction from liberals. Suddenly, people who hold to philosophies that actually undermine and reject objective truth are deeply concerned about emotions dictating facts.
This last sentence is important. Several generations of students have been taught by liberal postmodern professors to be skeptical of any and all truth claims. Now, with Trump's ascendancy and the evidence that his lies don't mean much to his supporters, some of the same people who were so eager to abandon the notion of objective truth in their academic work are wringing their hands over the apparent fact that truth no longer matters to a large segment of our population.

McAllister drives home the irony of this crucial epistemological development:
I’d like to cut through the fog and focus on a foundational issue that is driving this chaos.

I have to ask [all those] fretting about fake news and post-truth in this election season: Where have you been all these years as America has abandoned truth for relativism especially in higher learning (and now in all levels of education)? Haven’t you been paying attention as we have put emotion over facts in just about every sphere of society? Our nation has been abandoning objective truth for more than a century! What did you think would result?

This sudden outcry against post-truth reminds me of the vapors so many had when they heard the [obscene Trump] tape.

Suddenly, people who had been telling us there’s no right and wrong—no objective values or morality by which we can judge others—switched gears and became Puritans in a flash. This is the phenomenon I find truly amazing, and it’s just more evidence of the subjectivism that has been consuming our country for decades like cancer, eating away every part of civil society.

My response to those now worried about this “post-truth society” is “You reap what you sow.” This abandonment of objective facts for emotion is the inevitable result of our culture’s unrelenting commitment to moral relativism. This is the chaos that comes as a result. Look at it, soak it in, and maybe you’ll learn something.

Post-truth is not the fault of social media or of current politics. These are symptoms, not the disease. The disease is an American society that has closed its mind to objective truth and is now being forced to live with the conflict and chaos that ensues.

Relativism, subjectivism, and materialism are all bankrupt philosophies. Yet these are what drive our culture and politics. When objective truth and values are abandoned, there are no unifying principles of truth or morality that bind together the vast number of disparate individuals and groups that inhabit our nation.

Reason, objective morality, and principles of equal justice, truth, love, freedom for the individual, and limited government (to name a few)—these are objective realities. But we turned our backs on objective truth and embraced subjectivism, pluralism, and relativism, hiding them under the cloak of tolerance. We abandoned our principles.

Now all we have is the many, with nothing to make us “one.” America’s greatness stems from its commitment to e pluribus unum—out of the many, one. Out of the many states, one nation. Out of the many races, religions, and backgrounds, one people. To achieve that—to bind together all these subjective entities into a functioning and civil whole—we need objective principles to which we are all committed.
Of course, our abandonment of objective truth has a cause - secularism. There can only be objective values if those values are rooted in the transcendent. Eliminate all reference to transcendence and society is left with nothing in which to anchor its moral, political and aesthetic values. They simply float like untethered balloons drifting along on the breezes of social fashion.

McAllister goes on:
The fact is when everyone defines truth for himself, there are no ties that bind. If I decide something is true for me, based on how I feel, then it is true. This is all I have if I have rejected objective truth. All I have are my subjective experience, feelings, and natural impulses.

These become truth for me, just as your subjective feelings become truth for each of you. In the end, we are ruled, not by a common commitment to truths to which we all are bound or a commitment to exercise reason as we pursue truth, but by our own individual feelings.

The result is chaos and conflict, because there is no real common ground. For there to be common ground, you would have to be committed to something objective, a truth outside of yourself that is the same for everyone. This conflict can’t be avoided if society rejects objective truth and reason.

What is truly frightening is that human beings can’t remain in a state of chaos, so they look to a savior for peace, unity, and security. In a world that has abandoned a common commitment to objective truth, the only savior, the only path to unity, is an individual or group of individuals with enough power to control everyone else.

The motivation of their will to power, just as with all who abandon truth, is to satisfy their own feelings and feed their own natural appetites. Principles and truth play no role. “The object of power is power,” George Orwell wrote in “1984.”

“Either we are rational spirit obliged for ever to obey the absolute values of the Tao [objective truths and values],” C. S. Lewis wrote in “The Abolition of Man,” “or else we are mere nature [creatures ruled by emotions and natural impulses] to be kneaded and cut into new shapes for the pleasures of masters who must, by hypothesis, have no motive but their own ‘natural’ impulses.”
There's much else of value in McAllister's essay, and I urge you to read it all. She concludes with this thought:
If we truly want to make America great again, if we really want peace and prosperity, we must return to the foundational principles and truths that made our nation great in the first place. If we want unity—e pluribus unum—we have to abandon subjectivism and once again embrace objective truth and morality. If we don’t, America will be transformed into a brave new world where truth is defined by the powerful.
Her words are, if you'll pardon the pun, objectively true, and they're certainly worth reflecting upon.

Friday, September 18, 2020

Historically Unprecedented

A Wall Street Journal editorial today offered up some exciting economic facts. Here are a few salient excerpts from the column:
  • The median household income in 2019 grew a whopping 6.8% — the largest annual increase on record. While this year’s government-ordered lockdowns will erase these gains in the short term at least, it’s still worth highlighting how lower-income workers and minorities benefited from faster growth and a tighter labor market before the pandemic.
  • Real median U.S. household income last year rose by $4,379 to $68,709. In dollar amounts, this is nearly 50% more than during the eight years of Barack Obama’s Presidency.
  • Median household incomes increased more among Hispanics (7.1%), blacks (7.9%), Asians (10.6%) and foreign-born workers (8.5%) versus whites (5.7%) and native-born Americans (6.2%).
  • Median earnings increased by an astounding 7.8% for women compared to 2.5% for men.
  • Poverty fell 1.3 percentage points last year to 10.5%, the lowest level since 1959, and declined more for blacks (2 percentage points), Hispanics (1.8), Asians (2.8), single mothers (2.6), people with a disability (3.2), and no high-school diploma (2.2).
  • The black (18.8%) and Hispanic (15.7%) poverty rates were the lowest in history.
  • The child poverty rate also declined to 14.4% from 16.2% in 2018 and 18% in 2016. The decline in childhood poverty last year was nearly twice as much as during the entire Obama Presidency.
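As a quick sanity check, the first two bullets are mutually consistent: a $4,379 rise to $68,709 implies a 2018 median of $64,330, and $4,379 is about 6.8 percent of that figure. Here's the back-of-the-envelope arithmetic (my own calculation, not the WSJ's):

```python
# Check that the reported dollar gain and percentage growth agree.
new_income = 68_709                  # 2019 real median household income
increase = 4_379                     # reported one-year gain
prior_income = new_income - increase # implied 2018 figure: 64,330

growth_pct = 100 * increase / prior_income
print(f"implied 2018 median: ${prior_income:,}")
print(f"implied growth: {growth_pct:.1f}%")  # 6.8%, matching the first bullet
```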
What was the engine driving this outstanding economic news? The WSJ editors credit the economic policies put in place by the Trump administration:
These income gains weren’t magical. Policy changes mattered. The Obama Administration’s obsession with income redistribution and regulation retarded business investment and economic growth. This in turn led to slower income growth for most Americans.

Business investment and hiring increased amid the Trump Administration’s deregulation and the GOP’s 2017 tax reform that unleashed animal spirits. New business applications increased twice as much during the first two years of the Trump Presidency versus the last two Obama years.

Rising economic growth lifted all classes. The record of his first three years is why voters still give President Trump an edge over Joe Biden on the economy.
Their conclusion is crucially important for undecided voters to consider:
The pandemic will eventually end, and the labor market is recovering faster than expected. The question for Americans on Nov. 3 is what kind of economy they want to have on the other side. The Trump policy mix lifted wages for all and reduced inequality. The Obama-Biden policy mix that put income redistribution first led to slower growth and more inequality.
Many liberal Democrats and Independents insist that their primary concern is for the economic well-being and future of the nation's poor, but if they're sincere, why would they not vote to continue the historically unprecedented progress that's resulted from the policies of the last three years?

Thursday, September 17, 2020

Premature Celebration

We've probably all seen videos of pro football players racing for a touchdown who, to their everlasting embarrassment, recklessly spiked the football before they crossed the goal line. A similar premature celebration occurred among Democrats in 2016, when they were so confident on the eve of the election that Donald Trump would be consigned to political oblivion that they didn't even try to conceal their glee.

As just one instance of the political version of spiking the football before crossing the goal line here's a video of several MSNBC personalities - Lawrence O'Donnell, Joy Reid, and Rick Wilson - yukking it up at the impending humiliation of Donald Trump:
The MSNBC folks have been trying to wipe the egg off their faces ever since, but rather than be humbled by the defeat of Hillary Clinton, their contempt for Trump quickly morphed into open and unapologetic hatred and hostility.

For four years the team at MSNBC, including the same people who appear in the above video, have been working feverishly to destroy the presidency of the man who embarrassed them so badly in 2016. There's reason for concern that if 2020 turns out to be a reprise of 2016, these commentators will be pushed over the edge of sanity, much like Chief Inspector Dreyfus, whose hatred of Inspector Clouseau in the old Pink Panther movies drove him insane:

Wednesday, September 16, 2020

Human Language: An Evolutionary Mystery

Several years ago I did a post titled Human Exceptionalism in which I noted that many researchers investigating the uniqueness of human language have concluded that it defies any naturalistic explanation for its origin.

That post sent me back to one I did a couple of years ago on the last book written by the late Tom Wolfe, which was also on the mystery of human language, and I thought I'd repost it.

Here it is:

I've been enjoying Tom Wolfe's new book, The Kingdom of Speech, and heartily recommend it to anyone interested in the history of the theory of evolution and/or the history of the study of linguistics. Michael Egnor at Evolution News concurs with this commendation, and goes even further. Rather than me telling you what the book is about, I'll quote Egnor:
Tom Wolfe has a new book, The Kingdom of Speech, and it's superb. Wolfe's theme is that human language is unique and is not shared in any way with other animals. He argues forcefully that evolutionary stories about the origin of human language are not credible.

In the first chapter of his book, Wolfe describes an article in the journal Frontiers of Psychology from 2014, co-authored by leading linguist Noam Chomsky and seven colleagues. Wolfe declares that:
"The most fundamental questions about the origins and evolution of our linguistic capacity remain as mysterious as ever," [the authors] concluded. Not only that, they sounded ready to abandon all hope of ever finding the answer. Oh, we'll keep trying, they said gamely... but we'll have to start from zero again.

One of the eight was the biggest name in the history of linguistics, Noam Chomsky. "In the last 40 years," he and the other seven were saying, "there has been an explosion of research on this problem," and all it had produced was a colossal waste of time by some of the greatest minds in academia....

One hundred and fifty years since the Theory of Evolution was announced, and they had learned...nothing....In that same century and a half, Einstein discovered...the relativity of speed, time and distance... Pasteur discovered that microorganisms, notably bacteria, cause an ungodly number of diseases, from head colds to anthrax and oxygen-tubed, collapsed-lung, final-stage pneumonia....Watson and Crick discovered DNA, the so-called building blocks genes are made of...and 150 years' worth of linguists, biologists, anthropologists, and people from every other discipline discovered...nothing...about language.

What is the problem? What's the story?...What is it that they still don't get after a veritable eternity?
Wolfe provides a précis of his argument:
Speech is not one of man's several unique attributes -- speech is the attribute of all attributes!
Yet despite almost two centuries of speculation and hypothesizing, we're no closer today to being able to explain what language is or how we come to have it than we've ever been. Indeed, Darwin and his votaries tried to come up with a plausible explanation and failed so utterly that scientists gave up trying to explain it for almost eighty years. Says Wolfe:
It is hard to believe that the most crucial single matter, by far, in the entire debate over the Evolution of man - language - was abandoned, thrown down the memory hole, from 1872 to 1949.
It's also hard to believe that it's been 67 years since 1949 and still no progress has been made on this question. Egnor writes:
And yet, as Wolfe points out, Darwinists are at an utter loss to explain how language -- the salient characteristic of man -- "evolved." None of the deep drawer of evolutionary just-so stories come anywhere close to explaining how man might have acquired the astonishing ability to craft unlimited propositions and concepts and subtleties within subtleties using a system of grammar and abstract designators (i.e. words) that are utterly lacking anywhere else in the animal kingdom.
Egnor, who is himself a neuroscientist, closes his piece with these words:
I have argued before that the human mind is qualitatively different from the animal mind.

The human mind has immaterial abilities -- the intellect's ability to grasp abstract universal concepts divorced from any particular thing -- and that this ability makes us more different from apes than apes are from viruses. We are ontologically different. We are a different kind of being from animals. We are not just animals who talk. Although we share much in our bodies with animals, our language -- a simulacrum of our abstract minds -- has no root in the animal world.

Language is the tool by which we think abstractly. It is sui generis. It is a gift, a window into the human soul, something we are made with, and it did not evolve.

Language is a rock against which evolutionary theory wrecks, one of the many rocks -- the uncooperative fossil record, the jumbled molecular evolutionary tree, irreducible complexity, intricate intracellular design, the genetic code, the collapsing myth of junk DNA, the immaterial human mind -- that comprise the shoal that is sinking Darwin's Victorian fable.
The charm of Wolfe's book is that it reads like a novel, which is the metier for which Wolfe is famous. It's free of scientific jargon, it's funny and contains some fascinating insights into several of the major figures in the history of the search for an explanation for the origin and nature of language. Plus, it's only 169 pages long.

All in all a great read.

Tuesday, September 15, 2020

An Amazing Molecule

A decade ago I wrote a post on Stephen Meyer's new (at the time) book titled Signature in the Cell. In that post I wrote this:

In the controversy between Darwinian materialism and intelligent design there are four main issues over which the battle is joined. These are the origin and structure of the universe, the origin of life, the origin of species or diversity, and the origin of human consciousness.

It's interesting that despite materialist boasts of epistemic superiority they have a theory for only one of these (speciation). For each of the others the materialists have no testable, empirical, scientific explanations at all. Notwithstanding, we're constantly reminded by experts such as Judge Jones of the Kitzmiller v. Dover Area School District case that Darwinian materialism is science and intelligent design (ID) is religion.

Even the one theory that the materialists do have, common descent, often proves to be a procrustean bed for the facts that scientists uncover in their daily work, but even so, that theory is really not at issue between IDers and Darwinians. What's at issue in the debate are not the scientific facts, but the theoretical significance of those facts.

The materialist says that the universe, life, and consciousness each has a purely material explanation even if they haven't the haziest idea what it could be, and the IDer says that the scientific evidence that we have in each of these areas points most plausibly to intelligent agency.

Now comes a book by Stephen Meyer, Signature in the Cell: DNA and the Evidence for Intelligent Design, that shows the utter implausibility of materialist explanations for the origin of life. SIC may well prove to be a game-changer in the debate over whether the origin of life can be explained in purely physical terms. Surely if Judge Jones had read it he would have been hard pressed to arrive at the conclusions he did. SIC will make it very difficult for future critics of ID to get away with some of their traditional accusations and arguments.

In his book Meyer accomplishes a number of things: He demonstrates in convincing fashion the sheer implausibility of any materialist explanation for the DNA enigma, i.e. the origin in the genetic material of specified functional information. He methodically makes the case that of the possible explanations for the digital code inscribed on nucleic acid molecules - chance, physical law, a combination of chance and law, or intelligent purpose - the best is clearly intelligent purpose.

He also takes on just about all of the common objections to ID, especially those which arose in the 2005 Dover trial, and one by one shows each of them to be lacking in any real force. The complaints that ID is religious, that it's not science, that it's a souped-up version of creationism, and that it's not testable, makes no predictions, and leads to no research are all addressed and thoroughly refuted.

His argument is so thorough and so devastating to materialism that many readers will find themselves wondering how anyone could still embrace the materialist explanation of origins.

Meyer adopts an interesting format for his book. He weaves the science and philosophy together with his personal intellectual biography to trace how he came to hold the views he does.

The book is long (508 pages), and not every chapter will interest readers who lack much background in cell biology, but he makes every topic accessible even to the scientifically uninitiated. He unravels the argument throughout the book, but of special interest to me was his discussion in the epilogue of how information is not only coded straightforwardly on the DNA molecule, but how the same nucleotide sequences can code for different proteins depending on a host of complex biochemical conditions, much as a word can have multiple meanings depending on the context in which it is used.

His explanation, too, of how information is stored not only in nucleic acids but also in structures in the cell, and indeed the very structure of the cell itself, was fascinating.

To get an idea of how astonishing this is imagine a page of text. The text itself has one meaning, but if it appears on a certain kind of paper it might take on additional meanings. Moreover, if you read, say, every third word yet another meaning would come to light, or if you read the text backwards still another meaning would be revealed.

The information in the cell has this kind of complexity. Intelligent cryptologists can create this sort of code, but blind chance and physical forces have never been known to do so, nor, as a matter of probability, is there any but the most nominal chance that they could.
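The page-of-text analogy above can be sketched in a few lines of code. This is a toy illustration of overlapping reading schemes, not biology; the phrase and the reading rules are invented for the illustration:

```python
# A single string can carry distinct messages under different reading
# rules, loosely analogous to multi-layered codes in the cell.
# (Invented example for illustration only.)

text = "was it a car or a cat i saw"
words = text.split()

plain = " ".join(words)              # ordinary left-to-right reading
every_third = " ".join(words[::3])   # keep every third word
backwards = " ".join(words[::-1])    # read the words in reverse

print(plain)        # was it a car or a cat i saw
print(every_third)  # was car cat
print(backwards)    # saw i cat a or car a it was
```

Contriving even this trivial two- or three-layer example takes deliberate design effort, which is the kind of point Meyer presses: layered codes are something we otherwise observe only intelligent agents producing.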

Here's a recently produced six-minute video which beautifully illustrates just a smidgeon of the amazing complexity of the DNA molecule. As you watch, bear in mind that according to materialism these properties of this molecule are all the result of blind luck:

Monday, September 14, 2020

Critical Race Theory

President Trump recently banned the teaching of critical race theory (CRT) in all government departments in the executive branch where it had been prominent in diversity training seminars and workshops. This raised the question in the minds of many as to what, exactly, critical race theory is and why the administration would proscribe it.

An article by Bradley Devlin at the Daily Caller answers that question. Devlin, leaning heavily on a book titled “Critical Race Theory: An Introduction” by Richard Delgado and Jean Stefancic, writes that there are four key beliefs that undergird the theory.

1. The Belief That Racism is the Ordinary Condition of Society.

CRT holds that racism permeates every aspect of our society - its institutions, laws, history, values, customs and social interactions. It's inherent in the dominant (white) class and every white person is, whether they know it or not, racist.

Thus, the proponent of CRT will examine every interaction, every element of culture, to spot the underlying racism and root it out.

2. The Belief in Interest Convergence.

CRT also teaches that whites comprise the "oppressor class" in America and give people of color opportunities to be full participants in society only when it's in whites' own interest to do so. In other words, only when the interests of blacks converge with the interests of whites will whites willingly grant blacks equal treatment.

In CRT racism is an affliction solely of the dominant class in society. The oppressors are racists and can't be otherwise, the victims of oppression are not racist and cannot be.

Since whites only take up the mantle of anti-racism when it benefits them socially, interest convergence implies that the action of becoming an anti-racist is actually itself a racist act.

It's hard to see how the Civil War, Emancipation and the Civil Rights Movement of the 1960s were all in the interests of the white oppressor class, but that's what CRT teaches.

3. The Belief That Classical Liberalism Is Oppressive.

Liberalism, according to CRT, is a political idea developed by the oppressive class and is how the oppressor creates and maintains oppressive systems. Thus classical liberalism, i.e. the ideal of a free people living in a free society enjoying free markets and minimal government intrusion in their lives, is oppressive in itself.

Critical race theorists advocate a Marxist-style revolution, which, if history is a guide, is almost certain to replace one form of oppression with a much worse one, but then such details don't seem to deter the Critical Race theorist.

4. The Belief That There Are Alternative Ways Of Knowing.

Mr. Devlin explains this pillar as follows:
Critical Race theorists argue that since traditional ways of knowing — science, rational inquiry, logic — are institutions of white supremacy and how whites understand the world, other ways of knowledge accumulation must be adopted. Storytelling, more specifically, telling counter-stories (like the 1619 Project) is the primary way to challenge the dominance of traditional knowledge.
This, of course, assumes that people of color are incapable of understanding science and logic, which certainly seems to be itself a racist assumption. It sounds as though it could've been lifted right out of white supremacist propaganda.

Devlin goes on in his essay to explain how CRT is a blend of Marxist thought and postmodern assumptions about truth, and the interested reader should check out his article at the link.

Meanwhile, it's hard to imagine anything more likely to divide people along racial lines, more likely to engender racial hatred and resentments, and more likely to ensure that blacks remain mired in poverty and defeat, than the idea that whites are inherently and ineradicably evil, that the country, root and branch, is morally corrupt, and that blacks are both inherently good and at the same time incapable of competing intellectually with whites.

But that's what Critical Race Theory promotes. The Trump administration should be applauded for expunging it from our government institutions and for no longer requiring taxpayers to subsidize such a polarizing doctrine.

Now, if only we could get it out of our university classrooms we'd have a much better chance of seeing our children enjoy a racially harmonious future.

Saturday, September 12, 2020

Hispanics Are Moving Toward Trump

Giancarlo Sopo is the Trump re-election campaign's director of rapid response for Spanish Language Media, so he'd be expected to cast Republican efforts to woo Hispanic voters in the best possible light. Even so, his recent essay in The Federalist and the statistics which accompany it are quite stunning.

For instance, in 2016 Trump won 28 percent of the Hispanic vote, a total that fell below the 31 percent GOP presidential candidates have averaged since 1980, but two recent surveys show Trump currently at 35 and 37 percent of the Hispanic vote. These are the highest Hispanic totals for a Republican since 2004.

These are nation-wide figures, but state polls are not encouraging for the Biden campaign either. Some of them show Mr. Biden failing to garner the same level of support Hillary Clinton did in a losing effort to Trump in 2016. Here's Sopo:
In Florida, a survey by Democratic polling firm Equis Research found Biden running 11 points behind Hillary Clinton’s Hispanic vote margin over Trump in a state that she lost. According to a new NBC/Marist poll released this week, among Latino voters, Trump is leading in the Sunshine State with 50 percent compared to Biden’s 46.

A recent Rice University and Texas Hispanic Policy Foundation poll found Biden trailing Clinton’s Hispanic vote margin over Trump by a whopping 18 points. Meanwhile, a new Bendixen & Amandi poll found Trump crushing Biden by 38 points among Cuban Americans, which had been previously trending Democratic.
It should be noted that all this is despite the fact that Democrats have sought to use Trump's immigration policies as a hammer to pound him with among Hispanics. Nevertheless,
Latinos recognize the president is a strong leader whose policies delivered record-low unemployment and poverty levels for our communities before the global pandemic.

GOP operatives often say that Hispanics are natural GOP constituents because we are hard-working, family-centric, and pro-life. This is all true, but we also resist the left’s cultural edicts. Much of the president’s broader policy agenda — yes, including on immigration — is popular with Hispanics. This is a fact that befuddles many political observers, and some refuse even to acknowledge it.
Sopo has more for interested readers at the link, but the upshot is that the Democrats can ill-afford to lose such a large percentage of the Hispanic vote in November. Yet they find themselves stuck with a candidate, Joe Biden, who simply doesn't generate much enthusiasm in this demographic, even among those who've traditionally voted Democratic.

Friday, September 11, 2020

On 9/11 A Thought about Missionaries

On this date nineteen years ago, a group of savage fanatics launched the worst ever attack on the American mainland in hopes of ultimately destroying this nation and the values it was founded upon.

As a counter to the foolish assumptions about Western civilization that motivated those terrorists, as well as those who cheered them on, I'd like, as a kind of observance, to rerun an older VP post based on sociologist Rodney Stark's excellent book titled How the West Won.

Like all his books, HWW is history that reads like a novel. He argues in it that all of the progress the world has enjoyed since the medieval period has had its genesis in the West.

His theory, to my mind convincingly defended, is that progress originated in areas with high levels of personal liberty, low taxation, and strong property rights. To the extent these were absent, as they have been in most parts of the world throughout history, progress died in the crib, as it were.

He also argues that the crucial soil for progress was a Judeo-Christian worldview in which the universe was seen as an orderly, law-governed, rational product of a personal, non-arbitrary God. Where this belief was absent, as it was everywhere but Europe, science and technology, medicine and learning, either never developed or were never sustained.

Along the way Stark punctures a host of myths that have become almost axiomatic among progressive intellectuals but which are at complete variance with the historical facts. He makes a strong case that capitalism and even colonialism have been marvelous blessings; that the fall of Rome was one of the most beneficial events in world history; that the "Dark Ages" never happened; that the crusades were not at all the rapacious ventures of murderous Christians against gentle, pastoral Muslims we've been told they were; that historical climate change had many salubrious effects on Western progress; and that there was no scientific "revolution" but rather a continual and accelerating unfolding of scientific discovery that began at least as far back as the 13th century and probably earlier.

I urge anyone interested in history to get a copy. Stark includes a lot that he covered in earlier works, but much of it is new and what isn't new bears repeating anyway.

An example of something that's both myth-busting and new was Stark's discussion of the work of Robert D. Woodberry.

Woodberry's research makes it clear that much, if not most, of the progress made around the world is due to the work of Western missionaries who labored in remote lands a century or more ago.

Here's what Stark writes about the role missionaries played in making life better for millions:
Perhaps the most bizarre of all the charges leveled against Christian missionaries (along with colonialists in general) is that they imposed "modernity" on much of the non-Western world. It has long been the received wisdom among anthropologists and other cultural relativists that by bringing Western technology and learning to "native peoples," the missionaries corrupted their cultures, which were as valid as those of the West....But to embrace the fundamental message of cultural imperialism requires that one be comfortable with such crimes against women as foot-binding, female circumcision, the custom of Sati (which caused women to be burned to death, tied to their husbands' funeral pyres), and the stoning to death of rape victims on the grounds of their adultery.

It also requires one to agree that tyranny is every bit as desirable as democracy, and that slavery should be tolerated if it accords with local customs. Similarly, one must classify high-infant mortality rates, toothlessness in early adulthood, and the castration of young boys as valid parts of local cultures, to be cherished along with illiteracy. For it was especially on these aspects of non-Western cultures that modernity was "imposed," both by missionaries and other colonialists.

Moreover, missionaries undertook many aggressive actions to defend local peoples against undue exploitation by colonial officials. In the mid-1700s, for example, the Jesuits tried to protect the Indians in Latin America from European efforts to enslave them; Portuguese and Spanish colonial officials brutally ejected the Jesuits for interfering. Protestant missionaries frequently became involved in bitter conflicts with commercial and colonial leaders in support of local populations, particularly in India and Africa....

A remarkable new study by Robert D. Woodberry has demonstrated conclusively that Protestant missionaries can take most of the credit for the rise and spread of stable democracies in the non-Western world. That is, the greater the number of Protestant missionaries per ten thousand local population in 1923, the higher the probability that by now a nation has achieved a stable democracy. The missionary effect is far greater than that of fifty other pertinent control variables, including gross domestic product and whether or not a nation was a British colony.

Woodberry not only identified this missionary effect but also gained important insights into why it occurred. Missionaries, he showed, contributed to the rise of stable democracies because they sponsored mass education, local printing and newspapers, and local voluntary organizations, including those having a nationalist and anticolonial orientation.

These results so surprised social scientists that perhaps no study has ever been subjected to such intensive prepublication vetting....

Protestant missionaries did more than advance democracy in non-Western societies. The schools they started even sent some students off to study in Britain and America. It is amazing how many leaders of successful anti-colonial movements in British colonies received university degrees in England - among them Mahatma Gandhi and Jawaharlal Nehru of India and Jomo Kenyatta of Kenya....

Less recognized are the lasting benefits of the missionary commitment to medicine and health. American and British Protestant missionaries made incredible investments in medical facilities in non-Western nations. As of 1910 they had established 111 medical schools, more than 1,000 dispensaries, and 576 hospitals. To sustain these massive efforts, the missionaries recruited and trained local doctors and nurses, who soon greatly outnumbered the Western missionaries....

[Woodberry's] study showed that the higher the number of Protestant missionaries per one thousand population in a nation in 1923, the lower that nation's infant mortality rate in 2000 - an effect more than nine times as large as the effect of current GDP per capita. Similarly, the 1923 missionary rate was strongly positively correlated with a nation's life expectancy in 2000.
These missionaries battled every kind of pestilence, hardship, and deprivation. They were often murdered or died from disease, all in an effort to make life better for people living in miserable circumstances. Meanwhile, leftist academics sit in their comfortable, air-conditioned offices, never having made anything better for anyone, blithely and foolishly condemning those who did as "superstitious" "cultural imperialists" who imposed their values on idyllic societies that would be better off left alone.

Some might call these academics intellectually arrogant or even stupid, but if nothing else they certainly display a moral blindness.

Yes, and what of those who sought on 9/11/2001 to cripple and perhaps destroy one major fount of all this blessing? What might be said not only of them but also of the young Marxist rioters in our streets today who believe that the West has been, and is, thoroughly evil and deserves ruination?

Well, for starters, it might be said that their historical ignorance is appalling.

Woodberry's paper can be read as a PDF here.

Thursday, September 10, 2020

Political "Normlessness"

As William McGurn writes in Tuesday's Wall Street Journal, the president has frequently dismayed his opponents and supporters alike by his insouciant disregard for norms of conduct that've traditionally been upheld by our presidents and national politicians. He seems unable to abide by the conventions of deportment appropriate to his office, and he is regularly thrashed by the media for his boorish behavior.

However, as McGurn also notes, his opponents are at least as bad, if not worse, yet they never seem to receive more than a slap on the wrist from the media. One recent egregious example was the article published by The Atlantic which accused Mr. Trump, based entirely on anonymous sources, of calling our war casualties "Losers" and "Suckers."

To print such serious allegations, based on nothing more than anonymous testimony, is a flagrant violation of journalistic ethics, a dereliction The Atlantic would never commit had similar allegations concerned Joe Biden or Barack Obama. Yet most progressive media have given the magazine's editor a pass, presumably because the article wounded Trump.

McGurn writes:
The Trump-justifies-the-means rationale has been as poisonous for the nation as anything the president has done. Start with this: Has there ever been a norm violation more grievous than the way the Justice Department and FBI were politically weaponized to intervene in an election and then take down an elected president, built on a salacious Russian dossier commissioned by the Hillary Clinton campaign and lies to the Foreign Intelligence Surveillance Act court?

Not to mention intelligence chiefs such as CIA Director John Brennan and Director of National Intelligence James Clapper who publicly painted the president as a Russian agent while privately testifying to Congress they’d seen no evidence for such a claim.
He goes on to list about a dozen other examples, including Speaker Nancy Pelosi's infamous ripping up of the president's State of the Union speech, Congresswoman Maxine Waters' encouraging people to harass Trump officials in restaurants and shops, Hillary Clinton's urging Joe Biden not to concede if he loses the election, and the prolonged harassment of Mr. Trump via an absurd impeachment proceeding.

He could've also mentioned the absolutely baseless slander, repeated frequently by his opponents, that Mr. Trump, should he lose in November, will refuse to vacate the White House and will need to be forcibly removed. What evidence do they have that such a claim is true?

McGurn continues:
Where was the press skepticism about the Steele dossier or the whole Russia collusion narrative? Can anyone remember headlines about norm busting when Americans learned the FBI agent accused of altering a document for a FISA court had declared himself part of the “resistance”?

Or when a text revealed the FBI’s lead investigator into the Trump and Clinton campaigns telling his FBI lover that they will “stop” Mr. Trump from becoming president?

Against this, the standard rhetorical excesses likening the president to murderous totalitarians— Trump is Hitler, Trump is Stalin, Trump is Mussolini— look almost quaint.
Almost any breach of decorum and civility has been justified if it'll hurt the president and diminish his chances of winning in November. The media simply loathe the man, not because of his intemperate behavior - indeed, were he a Democrat they'd busily be making excuses for it - but for what he's doing to roll back the massive progress the left has made over the last fifty years toward establishing a socialist, centralized state in America.

The left can't abide the fact that President Trump has appointed hundreds of lower court judges and a few Supreme Court justices, all of whom believe the role of a judge is to interpret the law, not to impose a law of their own creation on the American people.

They resent that his low tax, low regulation policies had created, until the pandemic broke out, the best economy in living memory and that, for all the progressive talk about concern for the working poor, Mr. Trump's policies actually put more poor minorities back to work than anything any of his predecessors had done.

In other words, the left despises Trump primarily because he's showing the world that the big government nostrums they've advocated since the 1930s don't work in the real world and that free people and free markets do.

Wednesday, September 9, 2020

There Are No Facts

Of late we've been hearing that we live today in a post-fact, post-truth world. Milo Yiannopoulos, the gay bad boy loathed by campus liberals and many conservatives as well, once observed that “We live in a post-fact era and that is wonderful.” I must politely disagree, not with the first part of his assertion, which is, I say it with irony, factual, but with the second. It's not so wonderful at all, in my opinion.

Facts matter because truth matters, but the subjectivization of truth, most obvious in the frequently heard claim that "What's true for you isn't true for me," has made it difficult to hold onto the concept that there actually is any objective truth about most things that really matter. The conviction that there is one is so 20th century.

One manifestation of the loss of a belief that truth is objective, and that it matters, is the apparent eruption in recent months and years of "fake news" stories.

Daniel Payne at The Federalist lists sixteen "Fake News" stories in the major media just since Trump's election, all of which were false or misleading, but which were repeated thousands of times on social media before the truth came out.

Of course, if we're living in a post-truth era, it may largely be due to the fact that our media and our politicians, most notably Mr. Obama and, even more egregiously, Mr. Trump, seem to live in a world where facts don't matter at all. As Peter Pomerantsev, in an essay at Granta, pungently observes, what's different today is not merely that we're living in "a world where politicians and media lie – they have always lied – but one where they don’t care whether they tell the truth or not."

Pomerantsev places much of the blame on the postmodern mindset:
How did we get here? Is it due to technology? Economic globalisation? The culmination of the history of philosophy? There is some sort of teenage joy in throwing off the weight of facts – those heavy symbols of education and authority, reminders of our place and limitations – but why is this rebellion happening right now?

This equaling out of truth and falsehood is both informed by and takes advantage of an all-permeating late post-modernism and relativism, which has trickled down over the past thirty years from academia to the media and then everywhere else.

This school of thought has taken Nietzsche’s maxim - that there are no facts, only interpretations - to mean that every version of events is just another narrative, where lies can be excused as ‘an alternative point of view’ or ‘an opinion’, because ‘it’s all relative’ and ‘everyone has their own truth’ (and on the internet they really do).

Maurizio Ferraris, one of the founders of the New Realism movement and one of postmodernism’s most persuasive critics, argues that we are seeing the culmination of over two centuries of thinking. The Enlightenment’s original motive was to make analysis of the world possible by tearing the right to define reality away from divine authority to individual reason.

Descartes’ ‘I think therefore I am’ moved the seat of knowledge into the human mind. But if the only thing you can know is your mind, then, as Schopenhauer put it, ‘the world is my representation’.

In the late twentieth century postmodernists went further, claiming that there is ‘nothing outside the text’, and that all our ideas about the world are inferred from the power models enforced upon us. This has led to a syllogism which Ferraris sums up as: ‘all reality is constructed by knowledge, knowledge is constructed by power, and ergo all reality is constructed by power.’

Post-modernism first positioned itself as emancipatory, a way to free people from the oppressive narratives they had been subjected to. But, as Ferraris points out, ‘the advent of media populism provided the example of a farewell to reality that was not at all emancipatory’. If reality is endlessly malleable, then Berlusconi... could justifiably argue, ‘Don’t you realize that something doesn’t exist – not an idea, a politician, or a product – unless it is on television?’

To make matters worse, by saying that all knowledge is (oppressive) power, postmodernism took away the ground on which one could argue against power. Instead it posited that ‘because reason and intellect are forms of domination . . . liberation must be looked for through feelings and the body, which are revolutionary per se.’

Rejecting fact-based arguments in favour of emotions becomes a good in itself.
This sounds about right to me. The postmodern view of objective truth - that it's an outdated holdover from the failed Enlightenment habit of placing too much epistemological confidence in Reason - leads us to the place where a postmodern philosopher like the late Richard Rorty can assert that "Truth is whatever your peer group will let you get away with saying."

If one's peer group is the media then there's a pretty broad spectrum of things one can get away with saying, no matter how fantastical, as long as those things are critical of political opponents. Unfortunately for Rorty and his definition of truth, though, his own peer group, philosophers, didn't let him get away with defining truth that way.

Tuesday, September 8, 2020

Three Random Thoughts

A few short thoughts:

1. Sometimes we hear people say that they respect another person's opinion but disagree with it. They mean well, but it raises the question: do we owe respect to the opinions of others? In my opinion, the answer is no. We have a duty to respect others and to respect their right to hold their opinions, but sometimes an opinion is simply foolish or even evil, and it's equally foolish to tell someone that we respect it.

When an opinion is foolish we should still show respect for the person who holds it, but when an opinion is evil, its holder forfeits the prima facie right to be respected. For instance, no one owes respect to someone whose opinion it is that molesting children is acceptable.

2. One of the queerest attempts to drive wedges between the races is the resentment expressed by some at what they deem to be an unjustifiable "cultural appropriation" of certain aspects of what's considered to be black culture. The latest episode came when the white British pop singer Adele wore "Bantu knots" at a festival and was criticized for essentially purloining a hairstyle considered to be the cultural property of African Americans.

One might think that those who get upset about this sort of thing would actually take pride in the fact that others appreciate their cultural traits enough to want to adopt them, but Adele was chastised for her choice of hairstyle:

A drag queen who calls herself (himself?) The Vixen tweeted, "Twice this weekend I have seen people do backflips to defend white women in Bantu Knots. If you spent the whole summer posting #blacklivesmatter and don’t see the problem here, you were lying the whole time."

User @sadhanamoodley wrote, "Seriously Adele... You should know better."

Journalist Ernest Owens tweeted, "If 2020 couldn't get anymore bizarre, Adele is giving us Bantu knots and cultural appropriation that nobody asked for. This officially marks all of the top white women in pop as problematic. Hate to see it."

It's all very silly, and in any case, the idea that only members of the race which gave rise to a particular behavior or style or item - like hoop earrings and Bantu knots - have the right to adopt that behavior, style or item is as shallow as it is foolish. If the individuals who think this way are going to be consistent they should, unless they're white, forego the use of everything that was originally developed by white Europeans - including cell phones, computers, televisions, radios, motor vehicles, airplanes, medicine and medical technology, home heating, air conditioning, refrigeration, all electrical appliances, and on and on.

Rather than grouse about someone wearing her hair in bantu knots these folks would do better to spend a little time reflecting on how much cultural appropriation they themselves engage in every day of their lives.

3. In discussing the meaning of life, students will sometimes argue that life's meaning doesn't depend upon there being a God. They insist that we can still find meaning in trying to make the world a better place and to improve life for our kids and others, whether or not there is a God.

What these students sometimes fail to recognize is that the belief that human progress is possible assumes that history is linear, a view that is distinctly and uniquely Judeo-Christian. The notion that history has a beginning and progresses to a climactic denouement was a product of the Judeo-Christian belief in creation, fall, redemption and eschaton that eventually permeated Europe. It never caught on anywhere else in the ancient world.

Most cultures outside Judeo-Christianity, and perhaps some of its offshoots such as Islam, have held that history is either static - that things are the same as they always were and always will be - or cyclical - that history moves in grand cycles, ultimately returning to an original starting point. The atheistic German philosopher Friedrich Nietzsche, for example, famously held the latter view, which he called eternal recurrence.

So, if someone believes that the world can be improved and that tomorrow can be better than today, they're borrowing an idea whose genesis depends upon belief in the existence of God even if they insist that the existence of God doesn't matter.

Monday, September 7, 2020

A Labor Day Post

Note: This post was originally written before the Covid19 pandemic devastated the restaurant industry:

On Labor Day it might be appropriate to revisit the debate over raising the minimum wage.

On the surface raising the minimum wage to $15 an hour seems like a simple solution to help unskilled, poorly educated workers struggling with poverty, but, like most simple solutions, raising the minimum wage has unintended consequences that hurt the very people it's supposed to help.

An article by Ellie Bufkin at The Federalist explains how raising the minimum wage has actually harmed many workers, especially in the restaurant industry.

New York State, for example, passed a law several years ago requiring businesses to offer mandatory paid family leave and to pay every employee at least $15 an hour, almost twice the previous rate. The results were predictable, and indeed were predicted by many, but the predictions went unheeded by the liberal New York legislature.

Bufkin uses as an illustration a popular Union Square café called The Coffee Shop, which is closing its doors in the wake of the new legislation. The Coffee Shop employs 150 people, pays a high rent, and must provide health insurance under the Affordable Care Act.

Now that the owner must pay his employees twice what he had been paying them he can no longer afford to stay in business:
Seattle and San Francisco led New York only slightly in achieving a $15 per hour minimum pay rate, with predictably bad results for those they were intended to help.

As Erielle Davidson discussed in these pages last year, instead of increasing the livelihood of the lowest-paid employees, the rate increase forced many employers to terminate staff to stay afloat because it dramatically spiked the costs of operating a business.

Understaffed businesses face myriad other problems [in addition to] wage mandates. Training hours for unskilled labor must be limited or eliminated, overtime is out of the question, and the number of staff must be kept under 50 to avoid paying the high cost of a group health-care package. The end result is hurting the very people the public is promised these mandates will help.

Of all affected businesses, restaurants are at the greatest risk of losing their ability to operate under the strain of crushing financial demands. They run at the highest day-to-day operational costs of any business, partly because they must employ more people to run efficiently.

In cities like New York, Washington DC, and San Francisco, even a restaurant that has great visibility and lots of traffic cannot keep up with erratic rent increases and minimum wage doubling.

When the minimum wage for tipped workers was much lower, employees sourced most of their income from guest gratuities, so restaurants were able to staff more people and provided ample training to create a highly skilled team. The skills employees gained through training and experience then increased their value to bargain for future, better-paying jobs.

Some businesses will lay off workers, cut back on training, not hire new workers or shut down altogether. A Harvard study found that a $1 increase in the minimum wage leads to approximately a 4 to 10 percent increase in the likelihood of any given restaurant folding.
How does this help anyone other than those who manage to survive the cuts? When these businesses, be they restaurants or whatever, close down it's often in communities which are "underserved" to start with, and the residents of those communities wind up being more underserved than they were before the minimum wage was raised.

Moreover, raising the minimum wage makes jobs heretofore filled by teenagers and people with weak qualifications more attractive to other applicants who are at least somewhat better qualified.

Workers who would've otherwise shunned a lower wage job will be hired at the expense of the poorly educated and unskilled, the very people who most need the job in the first place and who were supposed to be helped by raising the minimum wage.

Despite all this, our politicians - at least some of those on the left - still think raising the minimum wage is a social justice imperative, even if it hurts the people it's supposed to help.

Or perhaps the politicians know it's a bad idea, but they see advocating a mandatory increase in wages as a way to bamboozle the masses into thinking the politician deserves their vote.

Saturday, September 5, 2020

The Intellectual Life

In an interesting - and rather unusual - piece in First Things, Paul Griffiths gives advice to young people aspiring to the intellectual life. He lists and discusses four requirements of such a life. The first three are these:

1. The aspiring intellectual must choose a topic to which he or she can devote his or her life. Just as one might fall in love with another, so, too, does one often fall in love with an idea or question.

2. An intellectual must have time to think. Three hours a day of uninterrupted time. No phone calls, no texts, no visits. Just thinking and whatever serves as a support for thinking (reading, writing, experimenting, etc).

3. Anyone taking on the life of an intellectual needs proper training. This may involve university study, but it may not.

What Griffiths has to say about each of these is interesting, but the most interesting part of his essay to me is what he says about the fourth requirement. One who aspires to the life of the mind must have interlocutors, i.e., people with whom one can share ideas. He writes:
You can’t develop the needed skills or appropriate the needed body of knowledge without them. You can’t do it by yourself. Solitude and loneliness, yes, very well; but that solitude must grow out of and continually be nourished by conversation with others who’ve thought and are thinking about what you’re thinking about. Those are your interlocutors.

They may be dead, in which case they’ll be available to you in their postmortem traces: written texts, recordings, reports by others, and so on. Or they may be living, in which case you may benefit from face-to-face interactions, whether public or private. But in either case, you need them.

You can neither decide what to think about nor learn to think about it well without getting the right training, and the best training is to be had by apprenticeship: Observe the work—or the traces of the work—of those who’ve done what you’d like to do; try to discriminate good instances of such work from less good; and then be formed by imitation.
Very well, but such people are not easy to find. Most people don't care at all about the things that fascinate and animate an intellectual. Most people are too preoccupied with the exigencies of making a living and raising a family to care overmuch about ideas or the life of the mind.
Where are such interlocutors to be found? The answer these days, as you must already know, is: mostly in the universities of the West and their imitators and progeny elsewhere. That, disproportionately, is where those with an intellectual life are provided the resources to live it, and that, notionally, is the institutional form we’ve developed for nurturing such lives.

I write “notionally” because in fact much about universities (I’ve been in and around them since 1975) is antipathetic to the intellectual life, and most people in universities, faculty and students included, have never had and never wanted an intellectual life. They’re there for other reasons. Nevertheless, on the faculty of every university I’ve worked at, there are real intellectuals: people whose lives are dedicated to thinking in the way I’ve described here....If you want living interlocutors, the university is where you’re most likely to find them.
Griffiths adds this:
You shouldn’t, however, assume that this means you must follow the usual routes into professional academia: undergraduate degree, graduate degrees, a faculty position, tenure. That’s a possibility, but if you follow it, you should take care to keep your eyes on the prize, which in this case is an intellectual life.

The university will, if you let it, distract you from that by professionalizing you, which is to say, by offering you a series of rewards not for being an intellectual, but for being an academic, which is not at all the same thing. What you want is time and space to think, the skills and knowledge to think well, and interlocutors to think with. If the university provides you with these, well and good; if it doesn’t, or doesn’t look as though it will, leave it alone.

The university’s importance as a place of face-to-face interlocution about intellectual matters is diminishing in any case. Universities are moving, increasingly, toward interlocution at a distance, via the Internet. This fact, coupled with the possibility of good conversation with the dead by way of their texts, suggests that for those whose intellectual vocation doesn’t require expensive ancillaries (laboratories, telescopes, hadron colliders, powerful computers, cadres of research subjects, and the like), they should be one place among many to look for interlocutors.

You should, in any case, not assume that what you need in order to have an intellectual life is a graduate degree. You might be better served by assuming that you don’t, and getting one only if it seems the sole route by which you can get the interlocution and other training you need. That is rarely the case....
Here's his conclusion:
And lastly: Don’t do any of the things I’ve recommended unless it seems to you that you must. The world doesn’t need many intellectuals. Most people have neither the talent nor the taste for intellectual work, and most that is admirable and good about human life (love, self-sacrifice, justice, passion, martyrdom, hope) has little or nothing to do with what intellectuals do.

Intellectual skill, and even intellectual greatness, is as likely to be accompanied by moral vice as moral virtue. And the world—certainly the American world—has little interest in and few rewards for intellectuals.

The life of an intellectual is lonely, hard, and usually penurious; don’t undertake it if you hope for better than that. Don’t undertake it if you think the intellectual vocation the most important there is: It isn’t. Don’t undertake it if you have the least tincture in you of contempt or pity for those without intellectual talents: You shouldn’t. Don’t undertake it if you think it will make you a better person: It won’t.

Undertake it if, and only if, nothing else seems possible.
There's a lot of wisdom in all of this.