The St. Croix Review

The St. Croix Review speaks for middle America, and brings you essays from patriotic Americans.

Our Christian Faith

Angus MacDonald -- Editorial

Christianity: Lifeblood of American Free Society (1620-1945), 136 pp., John A. Howard, The Howard Center, 934 Main St., Rockford, IL 61103.

We have forgotten the courage of those who wrote the Declaration of Independence. Nine of the fifty-six signers died of wounds or hardship during the Revolutionary War. Five were captured or imprisoned. Wives, sons and daughters were killed, jailed, mistreated, persecuted -- but they would not abandon the principles of honor they knew. "Except the Lord build the House," said Benjamin Franklin, "they labor in vain that build it."

Our fathers were blessed that Washington, as president and general, was a Christian gentleman. That basis of his life is hardly mentioned in our day. "The General (Washington) hopes and trusts that every officer and man will endeavor to live and act as becomes a Christian soldier defending the dearest Rights and Liberties of his country." A month later he wrote to say he deplored that cursing and swearing was growing into fashion: "We cannot hope for the blessing of Heaven if we insult it by impiety and folly."

While Washington believed in national unity, he deplored the passions of party strife that tear the country into rival sections. Though these passions are eternal and seem to be natural behavior, they cause problems. We must find civilized methods of conversation. A country can be properly governed only when virtue predominates.

During the 19th century, hundreds of colleges and universities were funded by our churches. The curriculum was concerned more with the development of character and piety than with the acquisition of skills and knowledge. The main purpose was to teach Christianity and proper behavior. This continues in our day and guides organizations that are not connected with our churches. Service clubs such as the Lions and the Rotary are examples.

Notwithstanding the continuity of our Christian faith, that faith does not have priority in the teaching of our universities, the print and broadcast media, or the entertainment business. Do they have basic principles? Are they devoted to entertainment only? I do not doubt teachers and entertainers impose standards on themselves, but they should be able to state those standards and live by them. We are a Christian nation, and that should be acknowledged and observed. Intellectual leaders should tell us what defines our culture. To make money is not enough. We don't have to be dull prudes, but we do have to state and preserve who we are. Timothy Dwight, president of Yale University from 1795 to 1817, wrote, "Without religion, we may possibly retain the freedom of savages, bears, and wolves, but not the freedom of New England." *

"Amplification is the vice of modern oratory." --Thomas Jefferson

Some of the quotes following each article have been gathered by The Federalist Patriot at: http://FederalistPatriot.US/services.asp.

Summary for October 2010

The following is a summary of the October 2010 issue of the St. Croix Review:

In the "The Enduring American," Barry MacDonald contrasts the ordinary, honest, hard-working American with the political and cultural elite.

Mark Hendrickson, in "Three Neglected Economic Lessons from American History," writes about the history of sound money, the Constitution's bulwark against excessive government spending, and government's harmful responses to recessions; in "More Lessons from History: How Obamanomics May Play Out," he says that the larger the government's share of GDP, the slower the economic growth; in "Geithner Versus the Bush Tax Cuts," he believes the point of Obama's policies is to grab control of as much economic activity as possible, make more people dependent on government, and redistribute wealth; in "Rethinking the Corporate Income Tax," he explains why there is nothing good to say about the corporate income tax; in "Global Warming -- The Big Picture" he reviews Brian Sussman's Climategate.

Allan Brownfeld, in "An Early Conservative Leader Reflects on How America -- and the Republican Party -- Lost Its Way and How We Can Find Our Way Back," relates Tom Pauken's program of action to take the Republican party back from opportunists, pragmatists, and phony conservatives; in "National Debt Is Described as a Fiscal 'Cancer' and a Threat to National Security," he lays out the details of our challenge.

John Ingraham, in "Timely Words from Davy Crockett," relates the story of how Horatio Bunce taught Davy Crockett the meaning of the Constitution.

Herbert London, in "The Mosque on Sacred Ground," views the placing of a mosque near Ground Zero as an insult; in "The Wild Turkish Card," he notes the growing estrangement between the U.S. and Turkey and considers consequences; in "Oliver Stone and Hugo Chavez," he describes the latest deceitful film produced by Oliver Stone lauding the South American dictator; in "Thought Control at Augusta State University," he relates the brutal enforcement of political correctness on campus; in "The Arts in the Obama Age," he shows how the National Endowment for the Arts is becoming a propaganda tool of the Obama Administration.

In "The Federal Reserve's Historic Announcement," Fred Kingery sees a turning point at which the Fed has determined to compromise its roles to preserve the purchasing power of the dollar and conduct a monetary policy that supports full employment.

In "When They Dropped the Bomb -- Remembering August 1945," Paul Kengor writes about the desperate circumstances leading to the atomic bombings that ended WW II.

In "How Much Risk Is Too Much?" Robert T. Smith explains the "Precautionary Principle" that balances economic activity or human behavior against its potential harm to the environment, so that guidance for regulation is provided. He shows that environment activists are hijacking the principle to advance a political agenda.

In "The Lasting Consequences of World War I," Michael S. Swisher brings to life layers of forgotten history.

Thomas Martin, in "What's Wrong with the World," uses the 100th anniversary of the publication of G. K. Chesterton's book of the same title to comment on our lost grip on the sound virtues by which to guide behavior.

In "Whose 'Good'?" Piers Woodriff believes that the modern "intellectual" has no definition for what is "good" and thus has no method for reliable guidance of behavior.

Gillis J. Harp, in "Are We All Ideologues Now?" describes the Ideologue as a thinker whose views induce blindness.

In "News About America," Jigs Gardner looks at David Hackett Fisher's Albion's Seed, that describes four waves of immigrants who each brought distinctive values and traits that together have made up the original American character: each shared fierce independence.

Joseph Fulda describes a messy and prodding experience with airport officials in "Across the Continent with the TSA."

Libertarian's Corner: Across the Continent with the TSA

Joseph S. Fulda

Joseph Fulda is a freelance writer living in New York City. He is the author of Eight Steps Towards Libertarianism.
Every time the Transportation Security Administration (TSA) fails to protect aviation . . . it punishes [law-abiding] passengers with further restrictions and humiliations.
. . . .
Returning responsibility for protecting its customers and inventory to the airlines also keeps everyone happy. . . . Since profits nosedive after any attempted skyjacking, let alone terrorism, airlines have all the incentive we could ask to institute practical, effective security.

So writes Becky Akers in the cover story of the May 2010 issue of The Freeman. Well, during May, but before reading Ms. Akers' article, I took a trip across the continent by air and learned the old-fashioned way how right she is. Here's a piece of my story.

On the way to my destination, I was told by the ticket agent that a bag the size of mine could easily be carried on board and that there would be no fee whatsoever. Imagine my surprise -- I hardly ever travel by air -- when almost every entirely harmless household item therein was classed "dangerous" and the bag had to be checked both ways, at $25 a pop. What's more, although the TSA is supposed to be concerned only with explosives, bombs, and other security risks, its agents are so overzealous as to presume to do the jobs of other federal agencies, such as the DEA, and state medical boards. Like many males over 40 (and females over 50), I take a baby aspirin each night before retiring. I am also prescribed 150 mcg. tablets of Synthroid for a hypoactive thyroid and was also prescribed 300 mg. capsules of Gabapentin as a sleep aid for the totally quiet, pitch-black rural environs to which I was traveling. As a lifelong city dweller, I actually need a good bit of noise and light to sleep soundly! Because there are no minors in either my household or the one to which I was traveling, I naturally asked that the medication be dispensed in bottles with easy-off, rather than child-resistant, caps. Of course, when packing my bag, I took the reasonable precaution of firmly rubber-banding both bottles.

When I arrived at my destination, I discovered something I had not expected: My very secure, screw-on-topped, 1,000-count baby aspirin bottle had been unscrewed, and its contents mixed into a toxic, gooey paste with those of a smashed and equally secure screw-on-topped, but now very-nearly-empty, aftershave bottle. The disgusting paste was encrusted all over the inside of my bag. While I didn't see how this could possibly be an accident, I also didn't immediately see a motive, so I looked through the rest of my bag carefully, when suddenly a light bulb went on. My Synthroid bottle had no rubber band on it, although the medication was intact, while the Gabapentin bottle had two rubber bands on it, although the medication within it was also intact. There was the evidence that the bag had definitely been opened and a potential motive, as well. When our overzealous TSA agents found merely Synthroid in the Synthroid bottle and merely Gabapentin in the Gabapentin bottle, they no doubt felt bored, frustrated, or chagrined enough to create their own action, or so I would hazard a guess, by smashing what could not possibly have come apart by itself and making a foul mess of my entire bag. (I'd also hazard a guess that interfering with a prescription drug is a felony and that somewhere deep in the bowels of government, there's a videotaped record, so they didn't act against the prescribed medication.) This, my friends, is what we get when we entrust our air security to the feds!

On the way back, I carried my two bottles on board, but because I eat only kosher food and because meals, kosher or otherwise, are no longer available on board most domestic, economy-class flights, I dared bring along a kosher wurst, a.k.a. a sausage, which the TSA agent eyed with some suspicion -- or was it envy? -- even after it passed through the metal detector without incident. She then prodded it with some stick with a strange tip, and still found nothing. Still suspicious, she then decided to have the damned sausage X-ray'd. Well, of course, that also turned up nothing but not-so-good-for-you beef. After all, I'm not about to eat a wurst with explosives within it. But whereas only a paranoid government agent would suspect me of eating a toxic sausage, now it was my turn to worry. I quietly phoned my wife on my cell to ask her whether the X-rays rendered my meal inedible. Although generally up on these things, this time she was unsure. Her uncertainty prompted a second phone call -- this time to my personal physician in New York, at the second-worst possible time, the worst being the moment she arrives in the office, the second-worst being just about five minutes before she is about to leave for the day. Well, she was more concerned about the presumably dirty stick used to prod the sausage than with the X-rays as it turned out. After assuring her that the inner wrapping remained intact, she assured me that my sausage was safe to eat.

As Thomas Paine wrote in Common Sense over 230 years ago, "our calamities are heightened by reflecting that we furnish the means by which we suffer."

Disclaimer: A cursory check of the Internet indicates that a lawsuit was filed by a Joseph Fulda against the TSA. That suit was filed, if indeed it was filed at all, by some other Joseph Fulda. *

"My reading of history convinces me that most bad government results from too much government." --Thomas Jefferson

Are We All Ideologues Now?

Gillis J. Harp

Gillis J. Harp is professor of history at Grove City College and a member of the faith and politics working group with The Center for Vision & Values. This article is republished from V & V, a web site of the Center for Vision & Values at Grove City College, Grove City, Pennsylvania.

New media have shaped our political culture. Some, like talk radio and all-news cable stations, are developments of older, established technologies. Others, like internet blogs, are based on comparatively new technologies. Yet, both venues have provided congenial habitats for that enemy of reasonable, constructive political discourse: the ideologue.

What exactly is an "ideologue"? Merriam-Webster defines it as "an often blindly partisan advocate or adherent of a particular ideology." Everyone, of course, works from a certain set of assumptions and argues for particular policies based upon their presuppositions. Nothing is wrong with that. But the ideologue is blindly loyal to certain partisan positions, regardless of the facts. As political philosopher Robert Nozick explains, "The moment a person refuses to examine his or her beliefs is the moment that person becomes an ideologue." Sociologist Daniel Bell argued in The End of Ideology (1960) that ideology's role is to mobilize mass movements by inflaming popular zeal; therefore, ideologues "simplify ideas, establish a claim to truth, and, in the union of the two, demand a commitment to action." Unfortunately, this zeal and oversimplification often overwhelm rational debate. They produce a lack of civility in political discussions and a loss of focus in seeking the common good.

In the 1960s and 1970s, most of the ideologues I encountered were on the political left. Some were Marxists who refused to accept the existence of political prisoners in China or Cuba. Others were radical feminists who declared that all men were potential rapists. Recently, however, conservatives seem to have become more like the ideologues they criticized 40 years ago. They have long excoriated "knee-jerk liberals," but have many conservatives actually become knee-jerk ideologues on the right? There are a few warning signs that the transformation may be well underway.

For instance, conservatives denounced Clinton for intervening in Bosnia but championed Bush's intervention in Iraq. Or, as another example, conservatives supported cutting taxes when the country was fighting two expensive wars but, soon after, denounced dangerous deficits.

These are merely two examples that point to the triumph of blind partisanship.

One last example: Participating in anti-Vietnam protests during the late 1960s, some protestors carried pictures of Lyndon Johnson decorated with swastikas. Today, a few Tea Party activists carry placards with President Obama portrayed as Hitler. Refusing to consider complicating facts, ideologues assume that their opponents are demonic.

It is sad to see conservatives morphing into rigidly partisan ideologues, enabled by a mass media that generates more heat than light by seeking the lowest common denominator. Some programs on Fox News sound more like Jerry Springer than Bill Buckley's old decorous debate show, "Firing Line." Some of the founders of the post-World War II conservative renaissance would be horrified. Russell Kirk argued that conservatives, with their realistic recognition of human limitations and their preference for prudential, incremental change, were fundamentally anti-ideological. The conservative, commented Kirk,

. . . thinks of political policies as intended to preserve order, justice, and freedom. The ideologue, on the contrary, thinks of politics as a revolutionary instrument for transforming society and even transforming human nature. In his march toward Utopia, the ideologue is merciless.

Facts don't matter, and character assassination is permissible. The shouting, weeping egotists who speak on behalf of the conservative movement today don't strike me as very, well, conservative.

Besides conservatives, I can think of at least two other (overlapping) groups who should scrupulously avoid becoming ideologues:

First are academics. I encountered a few examples of this sort of animal back when I was an undergraduate. A teaching assistant in political science refused to discuss the Soviet Gulag; an historian wouldn't acknowledge that religion ever served any positive role in history. There still aren't many conservatives in American academe today, but the solution to that imbalance isn't to import right-wing ideologues to replace the left-wing ones. Again, Daniel Bell can help us understand how the authentic scholar differs from the ideologue:

The scholar has a bounded field of knowledge, a tradition, and seeks to find his place in it, adding to the accumulated, tested knowledge of the past as to a mosaic. The scholar, qua scholar, is less involved with his "self." The intellectual [i.e., the ideologue] begins with his experience, his individual perceptions of the world, his privileges and deprivations, and judges the world by these sensibilities.

Accordingly, ideologues seek to make the world fit into their tidy personal molds, regardless of untidy facts.

A second group: Christians should also be the least inclined to embrace the approach of the ideologue. Though they are prepared to be dogmatic about the core essentials of the faith, they should wear human-devised systems very lightly. While Christians should be prepared to defend those propositions contained in Holy Writ, they hold no special brief for man-made systems. Though they recognize that some systems have had a more benign influence in human history than have others, they should refrain from absolutizing particular historical arrangements in a fallen world. Although certain social, political, or economic structures may be superior to others, Christians need to remember that they are only relatively better.

In many ways, I am preaching to myself. As a professor at an evangelical college, I have found it salutary to reflect on the pitfalls of ideology. American conservatives these days might do well to ponder the dangers as well. *

"Guard against the impostures of pretended patriotism." --George Washington

Whose "Good"?

Whose "Good"?

Piers Woodriff

Piers Woodriff is a retired Virginian farmer.

To this unorthodox "intellectual," it seems that ever since G. E. Moore's 1903 denial of the adjective "good," modern intellectuals have unwittingly become partisans of evil, enablers of destruction. Julien Benda, in 1928, with The Treason of the Intellectuals, and Alexander Solzhenitsyn, in the 1978 Harvard Commencement Address, attributed the worst political disasters and wars of the 20th century to the intellectuals -- not the economic powers or the military powers. This indictment of intellectuals began before I was born. Flannery O'Connor, the "Hillbilly Thomist," was in the middle of it with her crusade against the "innerleckchuls."

So -- why -- for over a century, have the intellectuals upheld Moore's denial of the adjective "good"? "Good" -- logically -- means creative, the support of creation, of nature, of natural law. Apparently Moore did not consider that definition to be defensible because it implies that there is a creative power -- natural law -- transcending the universe. That puts theology in the equation and many influential intellectuals would have objected. Prevented by intellectual "correctness" from defining "good" as "creative," Moore was forced to conclude that the adjective "good" could not be defined.

Why does Moore's conclusion still hold today? For one thing it fits in perfectly with the reign of "non-judgmentalism." When Christ said "judge not," he meant do not judge people. Judge acts only. People can only be known by their acts, which change if they change. But, that reading has been "righteously" twisted into: "do not judge people or their acts." Moore's 1903 denial of the adjective "good" facilitated that twisting. In 1970, Iris Murdoch wrote in The Sovereignty of Good (p. 74) that "good is non-representable and indefinable." Well, that makes "judgment" almost indefensible. My, how righteous we have all become!

If we cannot define "good" exactly, we have no way of definitively judging acts. It gets worse. If the ban on the equation of good with creative holds, then we have no absolute definition for morality. Morality automatically becomes "relative," relative to some conditional, worldly "good." For the champions of egalitarianism, morality must not be defined as "creativity," because some people are more creative than others. Morality as creativity is an absolute that would destroy relative human equality. That so-called "morality" would be immoral. Consequently, humans must not be defined as being made in the image of the Creator: "able to be creative."

There is a famous controversy, the "Is-Ought Controversy." It has always puzzled me. The fact that a thing exists or is done does not prove that it is natural or that it ought to be done. Correct: no "ought" from this "is." But, there is another "is," which has apparently been left out. The way a thing is designed to work is the way it properly works and is the way it "ought" to be done. Here again, the word being used, "design," gets us into trouble with that monster called "theology." Drop "design"? Personally, I see no way to escape from "design." If you ignore the "design" of the universe, i.e., biology, chemistry and physics -- you are dead.

Liberal or conservative, left or right, progressive or regressive? Liberals, apparently, are superior to conservatives. The key distinction, though, in my opinion, goes back to the Enlightenment. Alasdair MacIntyre has articulated that distinction, but few seem to have noticed. In the Scottish Enlightenment, which inspired the American Revolution, human enlightenment (liberality) springs from submission to Truth, i.e., God. In the French Enlightenment, which inspired the French Revolution, human enlightenment (liberality) springs from submission to Reason, i.e., Man.

The American Revolution led to peace and prosperity and an equable form of government. The French Revolution led to a bloodbath. The French then realized that true democracy puts truth ahead of reason, avoiding tyranny -- theocracy of any kind. But, since 1900, America has adopted the original French bias: replace God with Man, truth with reason, fact with theory. Benda and Solzhenitsyn clearly saw the problem. Worship of the Creator -- theism (theonomy) -- enables human creatures to honor truth and be creative. Worship of the rational self -- anthropocentrism (autonomy) -- too often glorifies the individual so much that self-destruction follows. *

"If you want total security, go to prison. There you're fed, clothed, given medical care, and so on. The only thing lacking . . . is freedom." --5-Star General and U.S. President Dwight D. Eisenhower (1890-1969)

What's Wrong with the World

Thomas Martin

Thomas Martin teaches in the Department of Philosophy at the University of Nebraska at Kearney.

It is an honor to be here this evening to celebrate the 100th anniversary of G. K. Chesterton's book, What's Wrong With the World. I must admit that in sitting down to think about what's wrong with the world to present to you this evening, I was overwhelmed. Where does one begin? The oil spill in the Gulf of Mexico, the high unemployment rate, the stock market, the ever-failing American public educational system, same-sex marriage, divorce, abortion, the breakdown of the family, bunions, arthritic joints, moles, warts, and/or wrinkles. Fortunately, I received a flyer from the American Chesterton Society, which had a caricature, drawn by G. K. Chesterton himself, of two wide-eyed men on a globe looking intently at what is directly in front of them. The larger man is wearing glasses and is kneeling down looking through a magnifying glass ever so keenly at the world. The smaller man, also wearing glasses, is mounted on the larger man's neck, looking ever so keenly through a magnifying glass at the top of the larger man's head. Both men are obviously fixated on studying the phenomena that are right before their eyes.

G. K. C.'s two characters represent the natural scientist and the social scientist; the former studying and ordering his observations of nature and the latter studying and ordering his observations of man. The natural scientist is the larger of the two men because the natural scientist has been successfully at work for centuries in unlocking the secrets, as it were, of physics, chemistry, biology, and geology, the inner workings of nature. In fact, physicists and biologists, using the lens of empirical investigation with microscopic precision, claim to have discovered the origins of man, which began with a big bang that is still expanding forward into space.

But why is the larger man on his knees? He could be on his knees because he is in awe of nature; or, he wants to closely study her in order to understand his relationship as a good steward; or, he wants to put her on Sir Francis Bacon's rack and squeeze all the secrets out of her; or, perhaps, he is simply under the weight of the smaller man, the social scientist who has driven him to his knees and is using his magnifying glass, the microscopic eye of the natural scientist, to study mankind just as an entomologist would study ants to describe the workings of the well-ordered society in the anthill, the model of the global society into which mankind is currently progressing.

With this picture in mind as a clue to what is wrong with the world, I opened the book to the first chapter, "The Medical Mistake," and read:

A book of modern social inquiry has a shape that is somewhat sharply defined. It begins as a rule with an analysis, with statistics, tables of population, decrease of crime among Congregationalists, growth of hysteria among policemen, and similar ascertained facts; it ends with a chapter that is generally called "The Remedy." It is almost wholly due to this careful, solid, and scientific method that "The Remedy" is never found. For this scheme of medical question and answer is a blunder; the first great blunder of sociology. It is always called stating the disease before we find the cure. But it is the whole definition and dignity of man that in social matters we must actually find the cure before we find the disease.

Chesterton's caricature proves helpful. We are being studied by the magnified eye of the social scientist that sees before his eyes a shattered world that lacks organization. In America, Uncle Sam has fallen off the wall of the ideal city on a hill, and, lying broken, beaten, and bruised, he is in need of all the President's horses and all the President's men to fulfill the American rights of life, liberty, and freedom from social inequality, as guaranteed by social welfare equally distributed to every citizen in the new global society.

The medical mistake Chesterton refers to is caused when politicians, social commentators, and social scientists treat the body politic as if it were a human body, which they liken to a social organism, all neatly laid out with graphs, charts, and pies, sliced, diced, and categorized, to aid in diagnosing the social diseases affecting its health.

The modern social experts examine the body politic, moving from the limbs, to torso, to head, and diagnose the bruises to be the current economic downturn, high unemployment, home foreclosures, obesity, drug abuse, sexually transmitted diseases, divorce, unwanted pregnancies, children born out of wedlock, abortion, high school drop-out rates, racial discrimination, gender bias, homophobia, latch-key children, the decay of the family, etc.

To solve these problems, the political and social experts, like physicians treating the sick, immediately prescribe a regimen of social programs -- "The Remedy"-- to cure the social disorders and diseases causing the current broken state of the nation.

The whole idea of a social cure never works, Chesterton notes, because it is a logical mistake committed by politicians and social commentators who begin at the wrong end of the political question or social problem. They have taken a turn to the left by not asking what is the right shape for man.

. . . the sociological method is quite useless: that of first dissecting abject poverty or cataloguing prostitution. We all dislike abject poverty but it might be another business if we began to discuss independent and dignified poverty. We all disapprove of prostitution; but we do not all approve of purity. The only way to discuss the social evil is to get at once to the social ideal. We can all see the national madness; but what is national sanity? . . . What is wrong is that we do not ask what is right.

The only way to discuss the social evil is to get at once to the social ideal. You cannot point to the social evils of poverty and prostitution as vices without the ideal of purity and the dignified poverty of a Mother Teresa who has given her all. Medical science is content with the normal human body, and knows when it has been cured. Physicians have medical standards by which they measure and know what normal is; for example, the normal temperature of the body is 98.6 degrees, etc. But social science, Chesterton notes, is by no means content with the normal human soul; it has all sorts of fancy souls for sale. Man as a social idealist will say "I am tired of being a Puritan; I want to be a Pagan," or "Beyond this dark probation of Individualism I see the shining paradise of Collectivism."

The medical problem results because the social scientist does not have an ideal shape in mind for the human soul. In fact, the social scientist makes the scientific fallacy of denying man a soul because it cannot be seen when using the magnifying method of the natural sciences, though that method works well with entomology. Unlike the natural scientist -- the astronomer, biologist, chemist, and physicist using telescopes, microscopes, petri dishes, calibrators, etc., to study the objects in nature -- the social scientist's principal instrument for dissecting man is the survey, which is simply a piece of paper, a questionnaire, given to a group of "human subjects" to answer. When all the data are collected, the social scientist totals up the numbers -- it must be numbers -- and does a statistical analysis, which gives him the norm, which he uses as the point from which to measure the current patterns of social behavior.

The crux of the matter is, Chesterton notes, that:

There are two things, and two things alone, for the human mind, a dogma and a prejudice. A doctrine is a definite point; a prejudice is a direction. The Middle Ages were a rational epoch, an age of doctrine. Our age is, at its best a poetical epoch, an age of prejudice. That an ox may be eaten, while a man should not be eaten, is a doctrine. That as little as possible of anything should be eaten is a prejudice, which is also sometimes called an ideal. . . . The essential of the difference is this: that prejudices are divergent, whereas creeds are always in collision. Believers bump into each other; whereas bigots keep out of each others' way. A creed is a collective thing, and even sins are sociable. A prejudice is a private thing, and even its tolerance is misanthropic.

Track, for example, the modern social scientist's study of society, conducted by a selective survey pointing out the latest social indicators, in the article "The New Norm" in the Milwaukee Journal Sentinel on June 10:

A national survey indicates that teens are too accepting of out-of-wedlock births, but teen births have flattened and contraceptive use is up. It has long been a staple in the abstinence-only crowd that teaching contraceptive use to teenagers results in more teen sex.

The National Survey of Family Growth found that the number of teen girls and boys who had sex had not changed significantly from 2002 to the 2006-2008 period in this recent survey. However, the number of girls who used a contraceptive (most likely a condom) during premarital sex rose to 84 percent, up from 55 percent in 1985. Among boys, use of the condom was 81 percent up from 71 percent.

A social scientist ends up with a norm that is not normal; he has no standards. The word normal comes from the Latin norma, literally a carpenter's square, the standard by which one measures when building a home. In the medical field, the physician, though not the craftsman of a human being, has the standards by which he knows the normal (temperature, blood pressure, blood consistency, etc.) when diagnosing sickness and injury to restore health.

However, there is no end to the studies of the social scientist, who moves to the tune of "induction never ends," and seems to be waiting for the fact that will bring with it a revelation. But that fact will never arrive: experience does not tell us what we are experiencing. (Weaver, Rhetoric)

C. S. Lewis, in The Abolition of Man, noted that from propositions about fact alone no "practical" conclusion can ever be drawn: "(That) this will preserve (life)" cannot lead to "do this" except by the mediation of "(life) ought to be preserved." In other words, the positivist cannot get a conclusion -- a dogma -- in the imperative mood out of a premise in the indicative mood; he cannot learn what he ought to do from what it is in his power to do. A prejudice can never become a dogma.

There is nothing modern concerning the method of the social scientist as it is practiced by the smaller man mounted on the larger man's back. Listen as Socrates, in Plato's Republic, describes the smaller man's method of observation as that of the sophist:

It is as if a man were acquiring the knowledge of the humors and desires of a great strong beast which he had in his keeping, how it is to be approached and touched, and when and by what things it is made most savage or gentle, yes, and the several sounds it is wont to utter on the occasion of each, and again what sounds uttered by another make it tame or fierce, and after mastering this knowledge by living with the creature and by lapse of time should call it wisdom, and should construct thereof a system and art and turn to the teaching of it, knowing nothing in reality about which of these opinions and desires is honorable or base, good or evil, just or unjust, but should apply all these terms to the judgments of the great beast, calling the things that pleased it good, and the things that vexed it bad . . .

Standing outside and looking down on man, the modern social scientist is a one-eyed Cyclops only interested in knowing how the great beast responds by listening to the sounds it makes which he then methodically records. He has burnt out the interior eye of his soul with his magnifying glass and therefore lacks soulful depth perception and the grammar of virtue to describe a person's actions. The man under his magnifying glass will not exhibit moderation, self-control, courage, covetousness, gluttony, sloth, lust, anger, righteous indignation, prudence, and cowardice. Now he can only classify a person as having a personality that is typed, A or B, and given that man is an animal, he is no longer seen as having a free will. What were previously considered actions, will be labeled "behaviors," which are simply reactions caused by social, biological, and economic conditions. Man is determined by his genetic predispositions and responses to the conditions of his environment. Responsibility is replaced with pleasurable and painful experiences, and thought is replaced with how a person feels. Like the great beast the sophist studies, man can be surveyed and recorded in the sounds it is wont to utter to determine how he feels. The sophist's knowledge puts him in the position of an animal conditioner as he knows how to appease man's desires and satisfy his sensual whims.

When a man becomes an anthropologist, he denies all that is not quantifiable. Science cannot find freedom and, unless it is operable, it has nothing to do with the heart. In trying to be scientific about man -- the most noted failure of the 20th century -- the social scientist negates the soul, in which literature, verse, art, and music are the formative means of developing the heart. In by-passing the language of the heart for information about man, the social scientist ends in eliminating virtue, where thought corresponds to action and the unexamined life is not worth living.

This sophisticated Cyclops detaches himself from mankind and treats him like the animal of which Socrates spoke. He ends in classifying human beings by the categories of race, class, sex, and ethnicity because he has instituted a prejudice which directs him, as it were, to cut man into pieces. With razor-like precision, he further cuts up each and every one of us by height, weight, hair color, body type, age, economic status, nationality, state, ability to own a home, and status -- economic, marital, medically insured, employment, etc. all of which can be seen as subcultures or parts of the larger culture. Vivant Gordon, in her article "Multicultural Education: Some Thoughts from an Afro-centric Perspective," furthers my points when she states regarding a college education:

For a first-year student to depart from that established path [the Eurocentric perspective] will be a bold undertaking requiring the election of a curriculum that promotes an equitable study of the cultures and contemporary issues of the national non-white citizenry. This prickly path will require the pursuit of courses of study such as African-American, Chinese-American, Native-American, Japanese-American, Puerto-Rican American, and Women's Studies as well as other studies of the pluralism of America.

Gordon laments the plight of all the ethnicities of America's non-white citizenry, who are not a part of the dominant culture, when faced with the absence of their presence in the Greco-Roman tradition. She further assumes that the study of contemporary issues of the national non-white citizenry is essential for an equitable study of cultures and contemporary issues.

Vivant Gordon uses the anthropological sense of culture to be defined as a group of people with common mores, values, customs, and race. She further defines a subculture to be an association wherever two or three people have something in common. In order to have "cultural diversity" at universities, Vivant Gordon advocates an assortment of "studies" from the "perspective" of each race to properly inculcate what has hitherto been the "Eurocentric (Greco-Roman) male point of view."

Implicit in her assumption is that each hyphenated American group has a culture which provides a perspective of the world. Given that America is a nation of people from every nation in the world, and each of these people has a culture, and each culture provides a perspective, and all perspectives deserve equal study, it is unfair and unjust in the pluralism of America to say that one of the plurals, one of the perspectives, has a privileged position. When boiled down, this means an Afro-American has his view, a Cuban-American has his view, a Japanese-American has his view, a Greco-Roman has a view, and, as all views are equal, they all deserve to be taught.

Furthermore, notice, that on Vivant Gordon's prickly path of cultural studies, she also included Women's Studies.

With this inclusion, the other cultural perspectives must be doubled and perhaps even tripled. Help me! Is there an Afro-American perspective, or is there a female Afro-American perspective in addition to a male Afro-American perspective? Naturally, if there is a female Afro-American perspective and Aristotle is right that the whole must be composed of its parts, then each part -- each woman -- must have the same perspective. However, the women I know do not think, look, or act alike, any more than the men I know think, look, or act alike, which means each individual has a unique perspective from where he sits. (How about a course in the individual American?)

Furthermore, if each culture offers a perspective and each sex has a perspective and each person has a perspective, and all perspectives are equal, why would anyone want to study another person's perspective when he could study himself on his Facebook page? Why should a Japanese-American study about an Afro-American if each person has a perspective of the world? This prejudice which is not a dogma results in cultural relativism, and when all ways of life are equal, there is no way life ought to be lived; there are no normal moral standards, no dogmas from which to measure, so morality becomes a matter of opinion.

In America, our children are taught to celebrate cultural diversity under the social ideal of tolerance, as if it were the Christian virtue of charity, of love of one's neighbor. The social doctrine of tolerance is based on acceptance of human difference according to race, class, and gender but without any sense of what unites us as American citizens. It celebrates pluralism devoid of a uniting principle. E pluribus unum, "Out of many, one," has been replaced by "out of one, many." Its tolerance is misanthropic. So it is with our existing divisions. They keep out of each other's ways. . . . What kind of parent tolerates his child? What kind of neighbor tolerates his neighbor?

Vivant's form of diversity, a prejudice for choosing authors by the color of their skin and not by the content of their characters, lacks the unifying principle necessary for qualitative judgment. This is the norm in the modern progressive university, as driven home by the textbook Diversity in Families, which I picked up in my campus bookstore and which is used in a course in the Department of Family and Consumer Sciences.

I opened the book and read the first paragraph: (and I quote)

Families are in flux. Far-reaching changes in society are altering family life and bring forth contrasting interpretations of these changes . . . Widespread divorce, the growth of single-parent families, cohabitation, and the rise of out-of-wedlock births suggest that the family is disintegrating. On the other hand, these patterns can be interpreted to show that although some families may be troubled, the family is very much alive as it changes in response to the surrounding world.
For starters, the popular wisdom that we are witnessing the breakdown of the family is based on a stubborn myth that the family of the past was better than the family of the present. The family is pictured as being cohesive, communicative, and highly committed to one another. . . the family of the good old days would be seen as highly romantic, descended from Puritan manners and morals of anti-sexual bias rooted in a divinely ordained sexual monogamy. . .
However, the reality of the myth of family life is quite different . . . there is no golden age. . . . In reality, there was desertion by spouses and illegitimate children . . . that society is decaying because of high divorce rates overlooks that countless marriages in the past were ended simply by desertion. To judge marriage of past times as better than contemporary marriage is to ignore historical changes.
Historical change is revealing that the myth upon which the traditional American family is based is the product of a "false universalization, of assuming a single, uniform family experience. . . . This myth of one legitimate family type has distorted our reasoning about why families are different. . . . When in fact new scholarship on the family has shown that throughout history major social structural forces have created a diversity of family forms.

The diversity of family forms, the new norm for the authors of the textbook, is one of a vast majority of single-parent households, most of which are maintained by mothers living in poverty. And households maintained by persons living alone contribute to the expansion of living arrangements, and this surge of non-family households includes an increase in young adults moving away from home, postponement of marriage, a continued high rate of divorce, and a growing number of old persons living alone.

Textbook conclusion:

People with limited resources have used their families to fashion solutions to the problems of day-to-day living. Family strategies may require unique work patterns, kinship and living arrangements, and distinctive forms of interpersonal relationships . . . As families adapt to the world around them, they may take on characteristics that differ from the monolithic family type.

Notice the opening premise, that families are in flux, which the social scientist deduces from doing a survey of the current living arrangements that have replaced, heaven forbid, the anti-sexual bias of Puritanical marriage rooted in a divinely ordained sexual monogamy. Remember: the norm is not normal. The dogma of marriage is declared a false universalization, supposedly proven because some marriages ended in desertion and divorce. The Christian ideal has not been tried and found wanting. It has been found difficult, and left untried.

This is like a doctor arguing that the ideal of a normal healthy person is a false universalization. Given that the human body is in flux and recent studies show a majority of Americans are obese, this proves that there is a diversity of human forms that take on characteristics that differ from the monolithic body type of the predominately male Greco-Roman perspective.

The new social idealist is a progressive socialist who is opposed to the dark probation of Individualism and sees the future shining paradise of Collectivism. Chesterton, one hundred years ago, was attuned to this rise of the social scientist, who turned the scientific method on man to control and order his life toward the end of the greatest happiness for the greatest number.

The last few decades have been marked by a special cultivation of the romance of the future. We seem to have made up our minds to misunderstand what has happened; and we turn, with a sort of relief, to stating what will happen -- which is (apparently) much easier. . . . The modern mind is forced towards the future by a certain sense of fatigue, not unmixed with terror, with which it regards the past. It is propelled towards the coming time; it is, in the exact words of the popular phrase, knocked into the middle of next week. And the goad which drives it on thus eagerly is not an affectation for futurity. Futurity does not exist, because it is still future. Rather it is a fear of the past; a fear not merely of the evil in the past, but of the good in the past also. The brain breaks down under the unbearable virtue of mankind. . . . The older generation, not the younger, is knocking at our door.

Aristotle long ago noted that the family of father, mother, and children is the natural domestic society that comes before political society. It is the social relationship necessary for the end and perfection of government. So goes the family, so goes civil society. The modern family has not fluctuated into a diversity of family forms, it is broken and in a state of decay.

In fact, the family is the ideal, from which G. K. Chesterton measures what's wrong with the world, and I quote:

Now, for the purpose of this book, I propose to take only one of these old ideals; but one that is perhaps the oldest. I take the principle of domesticity: the ideal house; the happy family, the holy family of history. For the moment it is only necessary to remark that it is like the church and like the republic, now chiefly assailed by those who have never known it, or by those who have failed to fulfill it. Numberless modern women have rebelled against domesticity in theory because they have never known it in practice.

Now, I was a young man when my wife introduced me to the principle of domesticity, so young, in fact, that I did not have the opportunity to take a social science course to survey the state of the fluctuating family I tied myself into on my wedding day. However, I do know a husband should never attempt to study his wife by taking a survey of her behaviors and comparing her with other women. There is no better way to get yourself in hot water, and this is as it should be, because a wife is like a fire. Chesterton states:

In the ideal family there is a wife who is like a fire, or to put things in their proper proportion, the fire is like the wife. Like the fire, the woman is expected to cook, not to excel in cooking, but to cook; to cook better than her husband who is earning the coke by lecturing on botany or breaking stones. . . . Like the fire, the woman is expected to illuminate and ventilate, not by the most startling revelations or the wildest winds of thought, but better than a man can do it after breaking stones or lecturing. . . . Women were not kept at home in order to keep them narrow; on the contrary, they were kept at home in order to keep them broad. The world outside the home was one mass of narrowness, a maze of cramped paths, a madhouse of monomaniacs.

Chesterton's analogy of a wife being like fire sounds primitive. Who but a pre-Socratic philosopher looking for the primary stuff, the Urstoff, of all things, would compare a wife to one of the four basic elements? Everyone knows fire lives by feeding on, by consuming and transforming into itself, heterogeneous matter. A fire springs up, as it were, from a multitude of objects. If a wife were a fire she would consume and seek to transform those around her into herself, and without this supply of material (her husband and children) she would die down and cease to exist. (The very existence of fire depends on this "strife" and "tension.")

Furthermore, it does not seem right to the thoroughly modern-minded sociologist to compare a wife to a fire, since the home has been electrified and wives have given up the wood stove and the cauldron for the microwave and the crock-pot. Nor would the wife of the technological age, who is bent on finding herself outside the home as a professional, be thought the source of illumination to her children.

If Chesterton is going to compare a wife to a fire then certainly he would agree to carry the analogy further and compare a husband to the wind, and conclude by weaving the two together in one flesh, so that fire lives the death of air, and air the death of fire. Without the wind the fire cannot feed and without the fire the earth would be a windless, cold, and barren planet. Thus as the Heavens are to the Earth, so is a husband to a wife.

However, today there is much talk about a woman's right to be self-sufficient by landing a job outside the home, as though leaving home were a form of emancipation granted previously only to men. The idea that the wife has been discouraged from leaving home and the work of housewifery presupposes the husband had been liberated by leaving home for a job alongside other men. This, however, is a lie.

Before the electrification of America, most people were farmers and their ancestors were serfs, peasants and commoners. My forefathers left, or lost, their farms for a variety of reasons: war, the Depression, drought, tuberculosis, or the lure of the city.

It was not a glorious day when the first husband, in a long line of husbandmen, replaced the crow of the cock with the ring of a clock, and, showering before work, put on clean clothes to impress strangers. Nor was it a great moment in history -- fluxing before our eyes -- when the electrified husbandman started his day by listening to the news (which is forever the same thing being made to seem quite new) while eating cereal out of a box and then packing a lunch, all before punching a clock which calculates "hours worked" to be multiplied by X number of cents to be received at the end of the week. Conversely, a husbandman never had a job; he labored, but he did not have a job. No husbandman ever woke and said, "I have to go to my job." Nor did he dream about retirement, the day when his labor would be finished. There is no end to the "work" on a farm, as there is no beginning point in the cyclical movement of the seasons.

Today there are few husbandmen who "dress and keep" Mother Earth with their wives and their eight children (the average on my mother's side of the family from the Revolutionary War till the beginning of the 20th century.) Now there are husbands who work at one job. Whereas before a man had been a jack of many trades (laying brick, logging, blacksmithing, cutting ice, practicing animal husbandry, and mastering an assortment of tools: from spades to picks, to hammers, saws, planes, files, braces, chisels, belt punch, and tongs), now he has become a monomaniac who does one job the entire day (driving a bread truck, laying bricks, operating a lathe, stocking shelves, working in a mill, climbing telephone poles, or doing sociological surveys). But though the husband became narrower, his wife remained broad in the keeping of the kingdom at home.

The home is meant to be a place of schooling that begins on the first day with the baptism of a name and rises to the comfort of a mother's lap. It is in the home that the children learn their mother tongue and noble ways, through tales that warn against the witches, wizards, dragons, and demons who feed on the innocent. A good mother raises her children in work, study, and play: from feeding animals to weeding the garden; from peeling potatoes to baking bread; from churning butter to plucking chickens; from drawing out letters to reading Scripture; from pressing leaves to decorating cakes; from nature walks to sack races; from stitching clothes to sewing shut wounds; from bathing children to washing the bodies of her dead to be placed before the community in the front room of her home. Such a wife is ruler of the home, fueling her children with a love that understands the work of housewifery and motherhood as all-consuming, for it involves the transformation of the children.

No doubt such thoughts are labeled romantic, a glorification of a past that never existed and could not possibly exist now. However, it cannot be denied that wives at home, who as Aristotelians taught morals, manners, theology, and hygiene to their children, were broader, and their work more important, than that of their husbands, narrowed in the shop.

And there will be those who argue that although there was such a time, there is no going back. However, there are mothers at home today burning brightly with their children. It can only be achieved one home at a time. Furthermore, do not be mistaken in thinking that we have gone forward. We have traded our cyclical life for a linear view of progress. We are being pushed down corridors into one mass of narrowness, a maze of cramped paths, a madhouse of monomaniacs all at the expense of being broad. *

"We must not let our rulers load us with perpetual debt." --Thomas Jefferson

The Lasting Consequences of World War I

Michael S. Swisher

Michael S. Swisher is the owner of Bayport Printing House in Bayport, Minnesota, and Chairman of the Board of the St. Croix Review. He is the happy combination of a businessman and scholar.

Most Americans, if asked which was the more significant, World War I or World War II, would probably answer the Second. They would have some good reasons for so doing -- American involvement in that war was longer, American casualties were greater, and the war was fought in a vast Pacific theatre as well as on the Old World battlefields of the First World War. Above all, the cause of America's involvement in the Second World War, namely the Japanese attack on Pearl Harbor, is clear and still well remembered. By contrast, probably not one in one hundred citizens could identify the reasons for the United States' declaration of war on the Central Powers in April of 1917. Indeed, the validity of those reasons was not widely agreed upon by Americans at the time, and the sense that this country's prosecution of the war had been a waste of blood and treasure had a great deal to do with the development of isolationist sentiment in the following decades.

Nonetheless, as I hope to illustrate this evening, it was the First and not the Second World War that exerted more lasting influence on the character of the modern world. Although it seems a truism to note that there could not have been a Second World War had there not been a First, this is more than a mere question of ordinal numbering. The only sensible way to conceive of World War II, at least in its European aspects, is as arising from the results of World War I. The same may be said of the Cold War, which ensued after 1945, and (most importantly from our point of view today) the circumstances we now experience in the Middle East.

In order to understand how World War I changed the world -- perhaps more significantly than any European event since Pope Leo III crowned Charlemagne emperor of the Romans on Christmas day in 800 A.D. -- it is necessary to look briefly at the political order of Europe as it existed on the eve of the war. The last major re-drawing of the European map took place after the Napoleonic wars, at the Congress of Vienna in 1814-15. The effort of European diplomacy at this time was to return, as closely as possible, to the arrangements that had existed before the French revolution. France and Spain were restored to their Bourbon monarchies; the Netherlands, Denmark, and Russia to their respective royal houses. Italy remained divided, its south under the Neapolitan Bourbons, its center under the Papal States, and its northern provinces under their Savoy or Habsburg rulers. The Holy Roman Empire, after a thousand years, tottered to its end in 1806, and was not revived; but the German principalities it had comprised remained under their respective kings and princes, while the considerable territories under the personal rule of the Habsburgs became a multi-ethnic state whose monarch retained the title of emperor. As it had been since its emergence from the Dark Ages, Europe was ruled by crowned heads and governed by an aristocracy of blood, to which prosperous members of the professional and mercantile classes occasionally were able to elevate themselves, given the necessary intelligence and drive.

Two important changes to the European map during the 19th century involved countries that were to play important parts during the 20th. The ambitions of Prussia to become a significant military power had arisen in the early 18th century, and that Hohenzollern kingdom moved from strength to strength until, at the Battle of Waterloo, its Field Marshal Gebhard von Blücher shared victory with the duke of Wellington. In the post-Napoleonic European political atmosphere, Prussia asserted itself repeatedly -- here against Austria, there against Denmark or its neighboring German principalities, and finally against France in the Franco-Prussian war of 1870. That war deposed Napoleon III, ushering in the French Third Republic, and, most importantly for our narrative, established the German empire -- the Second Reich (replacing the first, the defunct Holy Roman Empire) -- under the Hohenzollern Kaiser Wilhelm I. Similarly, following the Risorgimento led by Garibaldi, the fragmented states of the Italian peninsula were united under the house of Savoy.

The 19th century was an age of imperial expansion. The British and the French had been colonial powers since the 17th century, and continued to expand their possessions. Britain held India and parts of sub-Saharan Africa, while its colonies in Canada, Australia, and New Zealand, settled by British immigrants, achieved self-governing status while retaining their ties to the Crown. France had African, Caribbean, and Southeast Asian colonies. Even smaller powers, like the Netherlands in the East Indies or Portugal in Africa, had significant overseas possessions.

To this competition for colonial dominion, the relatively new nation-states of Germany and Italy were latecomers. The rule of Wilhelm I, the first German emperor, had been sober and calculating; his great minister Otto von Bismarck knew when to be ruthless and when to be cautious. Wilhelm's son Frederick III reigned for only 99 days, and was succeeded by his son, Kaiser Wilhelm II, a man of very different character from his grandfather. Wilhelm II had been a sickly boy, the product of a breech birth, and had a withered left arm, the result of Erb's palsy. His parents, Frederick III and the princess Victoria (daughter of the British queen), had admired the way in which Queen Victoria and her prince-consort Albert of Saxe-Coburg-Gotha had reigned over Britain. They had hoped to emulate them, and to establish a British-style cabinet to replace the executive ministry that Bismarck had created for himself. Bismarck attempted to alienate their son from them, intending to use him as a political weapon against his parents; but this plan backfired when the headstrong prince succeeded as Wilhelm II and his imperial ambitions conflicted with Bismarck's conservative advice. Bismarck was soon dismissed, and Wilhelm undertook personal rule as Louis XIV had done upon coming of age. Unfortunately, Bismarck was not as dispensable as Mazarin had been, and Wilhelm was not as astute as Louis XIV.

Germany acquired some African colonies, Tanganyika and Southwest Africa; but the grand stratagem of Wilhelm's colonial policy was to compete with the jewel in Britain's crown, India. To this end, Wilhelm cultivated the friendship of the Turkish sultan, Abdul-Hamid II, execrated as a despot by all the other European powers. The Ottoman Empire was a decaying enterprise; heir to the Byzantines by conquest, it still exercised at least nominal sovereignty over vast parts of the Middle East, including all of what we now identify as Syria, Lebanon, Palestine, and the modern state of Israel, Jordan, Mesopotamia, and the coasts of the Arabian peninsula along the Red Sea and Persian Gulf. Wilhelm proclaimed himself a protector of Islam as early as 1898. One of the German projects of this period was to build a railroad from Berlin to Baghdad.

German militarism and expansionism created a volatility in Europe that was primed to explode at the slightest incident, and it found its release on June 28, 1914, with the assassination of the Austrian heir-apparent, Archduke Franz Ferdinand, at Sarajevo. Austria-Hungary saw this as an act of war by Serbia. Serbia, in turn, brought its Russian ally into the war, and through the activation of a series of alliances most of Europe, including England and France, was at war within a month. Imperial Germany, like Napoleonic France a century earlier, now found itself involved in a war on two European fronts. It is in this desperate circumstance that so many of its disastrous political and diplomatic initiatives in pursuit of the war originated, and their consequences continue to haunt us today.

Turkey joined the Central Powers in October 1914. The historian Peter Hopkirk, in his fascinating book Like Hidden Fire, points out that the first use in the 20th century of a word with which we have become depressingly familiar -- jihad -- occurred in November 1914, when a fatwa was read out in every mosque in the Ottoman Empire proclaiming it. The "infidels" at whom it was directed were carefully defined so as to exclude subjects of Germany and its allies -- the Kaiser's hope was that the large number of Muslims in British India, heeding the call of the Ottoman sultan in his role as Caliph, would rise against their colonial rulers, and at least would divert British resources from the war in Europe.

A similar German machination was the dispatch of V. I. Lenin on the so-called "sealed train" to Russia. The Russian war effort had caused enough distress in that country that a non-communist revolution had taken place there in February of 1917, bringing about the abdication of Czar Nicholas II. The German government hoped that the Bolsheviks under Lenin would cause such unrest that war on the eastern front would cease, allowing Germany to concentrate on fighting in the west. The use of political strategies of this kind, rather than purely military ones, was characteristic of the German approach. A comparable effort to light nearby fires to distract a potential enemy lay behind the so-called Zimmermann telegram, intercepted by British cryptanalysts, in which the German government proposed to induce Mexico to declare war against the United States, and to bring Japan into the fight, should America enter the war on the Allied side.

Unfortunately for Germany, while her efforts to incite turmoil around the world succeeded, they were not sufficient to help the German cause. The Bolshevik revolution prevailed and the Russian war effort collapsed, but this came too late to save the Central Powers. The Ottoman entry into the war on the side of the Central Powers did not bring about the hoped-for rising of Muslims in India. Instead it brought British and French armed forces into the Middle East in hopes of keeping the Turks out of Europe. It was in this connection that the Allies engaged in some dubious political strategies of their own, which backfired just as badly as those of the Germans. Three sets of political promises made at this time created an imbroglio with which we are still contending.

Arab subjects of the Ottoman Empire had long chafed under their Turkish masters. Their common Islamic faith was not enough to salve the ethnic antipathies that have long characterized the Middle East. The Allies hoped to enlist Arab support against the Turks, promising them self-government should the Allies win. To this end they enlisted the Hashemite family, hereditary sharifs of Mecca, on their side. It was in this campaign, under the command of the British General Sir Edmund Allenby, that the exploits of T. E. Lawrence became so celebrated and prominent.

Another promise of this period was the Balfour declaration, which pledged to the international Zionist movement a homeland in Palestine, which is today the state of Israel. In looking at the circumstances surrounding this well-known event, it must be admitted that its motives were not at all altruistic. Though it had been secured in Britain, through the agency of British subjects of the Jewish faith, it was not primarily aimed at them. Its purpose was to arouse support for the Allied war effort amongst Jews in non-belligerent countries (as the United States was at the time); in Russia, where because of historical circumstances Jews understandably viewed their own government, one of the Allies, with little loyalty; and within the Central Powers themselves.

The final promise of this group was one the British and French made to each other, the Sykes-Picot agreement, by which their respective governments proposed to divide the Middle East between themselves into colonial protectorates, with France to receive Lebanon and Syria, Britain Mesopotamia, Palestine, and parts of Arabia.

The entry of the United States into the war was decisive in turning the tide for the Allies. The German plan to foment a backyard war in Mexico was thwarted by the disorder that country had fallen into during its long revolution. The European war ended in the autumn of 1918. First to collapse was the smallest of the Central Powers, Bulgaria, which signed an armistice on September 29 at Saloniki. The Ottoman Empire gave up on October 30. The battle of Vittorio Veneto ended in defeat for Austria-Hungary, which led to an armistice signed at Padua on November 3. The final armistice with Germany, which we commemorate as Veterans' Day, was signed on November 11 at Compiègne.

In surveying the wreckage we note the fall of four historic monarchical empires: the Russian, the German, the Austro-Hungarian, and the Ottoman. What was to replace them?

The peace of Versailles was punitive to the Germans, requiring the payment of vast indemnities. The British economist John Maynard Keynes correctly foresaw the hyperinflation of the German currency to which this led. The destruction of economic value that followed completed the impoverishment of the German middle class that had begun with the privations of war. Popular discontent with the ineffectual Weimar republic set the stage for Hitler's rise. The fall of the Habsburg monarchy created a handful of small central European states organized primarily upon ethnic or linguistic identity, most of them incapable of sustained self-defense. The Bolsheviks consolidated power in Russia, as "Reds" mopped up the residue of "White" opposition. Out of the Italian victory over Austria-Hungary emerged an ex-soldier named Benito Mussolini, whose ambition it was to reclaim for Italy the Mediterranean littoral once ruled by the Romans.

Within a generation, Europe would again be at war. In the short term, it must be acknowledged that the Soviet Union, and not any of the Western Allies, was the real victor of World War II. It added to its territory the defenseless central European states that had previously been taken by Hitler, and held them for nearly fifty years, while manipulating politics in Western Europe through influential Communist parties, like those of Italy and France, which were in truth wholly owned Soviet subsidiaries. The United States was left with the responsibility of defending Western Europe against the Soviet threat, at a net cost to the American taxpayer for which Europeans have rendered scant gratitude. Britain, which sacrificed two generations of the flower of her manhood in two world wars, was so weakened by its loss of blood and treasure that the British Empire was no longer sustainable.

In short, the after-effects of World War I changed the map of Europe, ended a thousand years of monarchical and aristocratic rule, ushered in the present age of ideology, and led to ongoing turmoil there for seventy-five years. But this is far from being the end!

Let's turn now to the Middle East. I'm often reminded by that situation of Mel Brooks's movie (and now Broadway play) The Producers. Its plot, as you may recall, is that two dubious entrepreneurs sell half-shares in their musical to far more than two people. Their intention is that the show will flop, the investors will lose their money, and they will be left with the windfall. Instead the show succeeds and they face the dilemma of paying off their backers.

The denouement of World War I in the Middle East put France and Britain in the position of Bialystock and Bloom. How could the promises made to the Arabs by Lawrence and Allenby -- to the Zionists by the Balfour Declaration -- and by Sykes and Picot to their respective countries -- all be honored? It is hard to conceive that all these agreements could have been made in the realistic expectation that the Ottoman Empire, which had since the fall of the Abbasids maintained some sort of peace and order in the region, would collapse completely and leave the Allies in a position where they were expected to deliver. While Islamic militants are quite conscious of history, and point to all sorts of grievances including the mediaeval Crusades, or the loss of "al-Andalus" (Spain -- in 1492!), it is clearly to the aftermath of World War I that they owe their most recent and bitterest ones.

The United States had no direct involvement in the Middle East during World War I, but just as was the case with the war's consequences in Europe, it has been left in the unenviable position of dealing with them in this part of the world. The war's great significance as an historic turning point can be seen in that the world remains unsettled by its effects. It has not been made "safe for democracy," as Woodrow Wilson so naively hoped, and shows no sign of returning to its pre-1914 stability any time soon. *

"[T]here is a degree of depravity in mankind which requires a certain degree of circumspection and distrust." --James Madison

Sunday, 29 November 2015 03:47

How Much Risk Is Too Much?

Robert T. Smith

Robert T. Smith is a Senior Environmental Scientist and part owner of an environmental management and site development engineering firm, headquartered in Duquesne, Pennsylvania. He has over 28 years of experience as an Environmental Scientist.

An ancient common law principle has been hijacked by radical environmentalists. Life is full of trade-offs between the risks associated with our daily activities and the benefits. For environmental matters, the Precautionary Principle is a widely accepted idea that has now become a tool for environmental activists to effect social and economic change.

The Precautionary Principle is a human health and environmental protection philosophy that has been internationally accepted. It is most famously set forth as Principle 15 of the United Nations' 1992 Rio Declaration. As defined, the Precautionary Principle states:

. . . where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.

The Precautionary Principle in itself is not necessarily a radical environmentalist concept. The Principle has its basis in the same concepts of common law as "duty of care" or "reasonable care." In general, we are not to act in a manner that could be foreseen to harm others. The key word "foreseen" indicates that there is no guarantee, simply a probability or likelihood that no harm will come to others. These concepts pose an obligation of responsibility to others in society to act prudently and with due diligence, but they recognize that there are inherent risks to any activity.

The Precautionary Principle could also be seen as a part of individual life choices. It recognizes and weighs the relative certainty of harm, or risk, over a wide range of individual health threats: a high-fat diet, smoking, and just about any activity that starts with the phrase, "hold my beer and watch this." The Precautionary Principle is what allows us to drive 65 miles per hour on many of our highways instead of 25 miles per hour. In this example, the level of risk, or probability of harm, from traveling faster is judged acceptable. That risk could be eliminated only by banning car travel altogether.

The Precautionary Principle is implemented daily by the federal Environmental Protection Agency (EPA) in order to establish acceptable amounts of exposure to chemicals or compounds present in our environment. The EPA has established the Integrated Risk Information System (IRIS), which provides consistent information on chemicals or compounds for use in risk assessments, decision-making, and regulatory activities. This information can be used in many ways, ranging from how much cleanup is needed at a chemically contaminated property, to the safe amount of chemicals that can be in food additives or food packaging, to the safe amount of chemicals that can be present in toys, or even to the safe amount of chemicals that should be present in drinking water or air. When conducting its risk assessments in support of the IRIS database, the EPA prefers conservative input values, calculations, and assumptions, so that it overestimates the risk rather than underestimates it -- i.e., the Precautionary Principle.
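
To make concrete what conservative input values mean in practice, here is a minimal, purely illustrative sketch of a hazard-quotient style exposure calculation of the kind used in chemical risk assessment. The function name, the reference dose, and every numeric input below are hypothetical placeholders chosen only to make the arithmetic visible; they are not values taken from IRIS or from any EPA rulemaking.

    # Illustrative sketch only: a simplified drinking-water exposure calculation.
    # Every number here is a hypothetical placeholder, not an EPA or IRIS value.

    def average_daily_dose(conc_mg_per_l, intake_l_per_day, exposure_days_per_year,
                           exposure_years, body_weight_kg, averaging_days):
        """Chronic dose in mg per kg of body weight per day."""
        return (conc_mg_per_l * intake_l_per_day * exposure_days_per_year *
                exposure_years) / (body_weight_kg * averaging_days)

    REFERENCE_DOSE = 0.0003  # hypothetical "safe" dose, mg/kg-day

    # Typical (central-estimate) assumptions.
    typical = average_daily_dose(0.005, 1.0, 350, 30, 80, 30 * 365)

    # Conservative (precautionary) assumptions: upper-bound concentration,
    # higher water intake, lighter body weight; each choice inflates the estimate.
    conservative = average_daily_dose(0.010, 2.0, 350, 30, 70, 30 * 365)

    for label, dose in (("typical", typical), ("conservative", conservative)):
        hazard_quotient = dose / REFERENCE_DOSE  # values near or above 1 flag concern
        print(f"{label:>12}: dose = {dose:.6f} mg/kg-day, hazard quotient = {hazard_quotient:.2f}")

Stacking several individually defensible worst-case choices is what pushes the final number toward overstating, rather than understating, the risk, which is exactly the cautious bias described above.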

We cannot eliminate risk to our health from some of the chemicals and compounds we are exposed to daily; many occur naturally in the environment. As an example, surface water and groundwater contain toxic metals such as arsenic and chromium, dissolved naturally from the rocks that make up the earth's crust. These toxic metals can be present within our drinking water sources at concentrations related to their prevalence within the local geologic rock structures, and at times they can exceed the drinking water concentrations the EPA has set as an allowable probability of harm.

Man-made additions of chemicals to the environment typically occur not on a whim, but rather to enhance our daily lives, make them easier, provide products that extend our lives, or mitigate other issues that arise as a result of modern life. An example is the production of chlorine. An unavoidable byproduct of the chlorine-manufacturing process is a group of carcinogenic compounds called dioxins. While dioxins do pose a human health risk if released from the chlorine production process, the benefits of manufacturing chlorine far outweigh the potential risks associated with dioxins. Chlorine is used as a water purifier throughout our country and the world, and sustains millions of lives by providing potable water free of microbial contamination.

There needs to be a balance between the benefits and conveniences to our lives and the degree to which we can reasonably protect our health. Mortality is well-established at 100 percent. There cannot be a no-exposure scenario; some amount of health risk from exposure to naturally occurring and manufactured chemicals is inherent in being alive.

Environmental activists have twisted the Precautionary Principle into something subjective and eliminated the consideration of cost-effectiveness. The activists are willing to freely spend others' money to whatever extent they determine is needed. The need to respond to an unreasonable, quantifiable risk has been replaced by the activists with a new threshold of demonstrating a reasonable certainty of no harm -- the terms "reasonable certainty" and "no harm" being quite subjective. The Precautionary Principle is now used as an environmental activist's tool for social and economic manipulation.

Consider the use of the Precautionary Principle as it applies to the mandate to phase out incandescent light bulbs in our homes and offices and replace them with compact fluorescent lights (CFLs) containing mercury. This change of lighting was mandated in order to reduce electrical use, thereby reducing electric utility power plant greenhouse gas emissions. Even if we were to accept the allegation of man-induced global warming, the risk associated with a slightly warmer climate is debatable, while the risk from exposure to mercury when a CFL is inevitably broken in our homes is not.

According to the Maine Compact Fluorescent Lamp Study, when a CFL broke, mercury concentrations in the study room's air often exceeded Maine's allowable exposure level, which is based on human exposure and mercury toxicological studies for the protection of human health. Exposure to mercury above acceptable levels causes clear and permanent toxic health effects, according to the federal Agency for Toxic Substances and Disease Registry: http://www.atsdr.cdc.gov/toxfaqs/tf.asp?id=113&tid=24.

With the addition of mercury-containing CFLs to our homes and offices, the cumulative risk, or probability of harm to human health, has increased as a result of actions taken to address alleged man-induced global warming. In this example it is clear that the Precautionary Principle is being used by environmental activists to manipulate social and economic activity. On one hand, CFL bulbs are mandated for in-home use despite posing a clearly unacceptable risk; on the other, the vast uncertainty over man-induced global warming is accepted and used as the basis for creating that unacceptable potential risk of exposure to toxic mercury.

Recent examples of environmental activists misusing the Precautionary Principle are numerous, and the number is growing. We have come to a time in our country when precaution has become subjective prohibition, when the humanistic vanity to control others is prevalent, and when a sense of proportion in addressing the pressing problems facing our world has been lost. We should be spending our efforts wisely to maintain our own economic abundance so that we may continue to help ourselves and others. Our abundance allows us to maintain a relatively high standard of living that benefits everyone in our country, in addition to providing huge contributions throughout the world to help poor nations grow crops, gain access to clean water, overcome diseases that are only history to us, and respond to natural and man-made disasters.

Instead, the environmental activists would rather have us deal with a subjectively applied Precautionary Principle, cripple ourselves economically, and diminish our ability to help others and ourselves. We should bear in mind that economic and social manipulation, not reasonable precautions, is the environmental activist's true principle. *

"If Congress can do whatever in their discretion can be done by money, and will promote the General Welfare, the Government is no longer a limited one, possessing enumerated powers, but an indefinite one." --James Madison

When They Dropped the Bomb -- Remembering August 1945

Paul Kengor

Paul Kengor is professor of political science and executive director of the Center for Vision & Values at Grove City College. This article is republished from V & V, a web site of the Center for Vision & Values. Paul Kengor is author of God and Ronald Reagan: A Spiritual Life (2004) and The Crusader: Ronald Reagan and the Fall of Communism (2007). His latest book is The Judge: William P. Clark, Ronald Reagan's Top Hand (Ignatius Press, 2007).

This week marks 65 years since the United States dropped the atomic bomb. On August 6, 1945, President Harry Truman delivered a "rain of ruin" upon Hiroshima, Japan, with Nagasaki hit three days later; the two bombs together killed 100,000 to 200,000 people.

Truman's objective was to compel surrender from an intransigent enemy that refused to halt its naked aggression. The barbarous mentality of 1940s Japan was beyond belief. An entire nation lost its mind, consumed by a ferocious militarism and hell-bent on suicide. Facing such fanaticism, Truman felt no alternative but to use the bomb. As George C. Marshall put it, the Allies needed something extraordinary "to shock [the Japanese] into action." Nothing else was working. Japan was committed to a downward death spiral, with no end in sight.

"We had to end the war," said a desperate Marshall later. "We had to save American lives."

Evidence shows the bomb achieved precisely that, saving millions of lives, not merely Americans but Japanese. The Japanese themselves acknowledged this, from the likes of Toshikazu Kase to Emperor Hirohito himself. Kase was among the high-level officials representing Japan at its formal surrender aboard the USS Missouri. "The capitulation of Japan," Kase said definitively, "saved the lives of several million men."

As we mark the anniversary of this period, we should first and foremost think about those boys -- our fathers, grandfathers, great-grandfathers, uncles, brothers, some now in their 80s and 90s -- who lived lives of faith and freedom and family because of Truman's decision. I've met many of them. Anytime I find myself in conversation with a World War II vet, I ask where he was when the first bomb hit.

"I'll tell you where I was!" snapped George Oakes of Churchill, Pennsylvania.

I was a 22-year-old kid on a troop transport preparing to invade the Japanese mainland. . . . We were sitting there as targets for kamikazes when they dropped the first one. All they told us was that there was a new weapon brought into the war that landed on Japan proper, and everything we were planning was on hold. A couple of days later, they dropped the other one.

Oakes, who served with the Army combat engineers, didn't want to die. "I was engaged to an absolutely beautiful girl named Virginia. All I knew was that I wanted to go home."

George remembered the U.S. military's frustration in striking Japan mercilessly in conventional bombing raids. In one case, Allied bombs killed 100,000 people in Tokyo in one night. As George Marshall noted, "It had seemingly no effect whatsoever . . . . [Japanese] morale was affected, so far as we could tell, not at all."

George Oakes saw that firsthand. "We were bombing the hell out of Japan with B-29s. Every Japanese soldier and person was ready to die for the Emperor. And they weren't surrendering."

No, they weren't. In fact, even after both atomic bombs, the Japanese War Cabinet remained deadlocked on whether to give up. The Emperor broke the stalemate.

"Boy, were we thrilled," recalled George when they got the news on their boat. They were spared an apocalyptic invasion that would have made Normandy look like a picnic at the beach.

When I asked George if he felt gratitude toward President Truman, he responded with some colorful imagery: "Am I thankful? If Harry Truman walked down my street right now, I'd kiss his bare rear-end."

Instead of storming Japan with guns and grenades and flamethrowers, dodging kamikazes, shooting and stabbing and slicing and dicing not only Japanese men but screaming women-and-children-turned-combatants, George went home -- to peace. He became a charter member of East Pittsburgh VFW Post 5008, and worked for Westinghouse for 44 years. He served as scoutmaster for Boy Scout Troop 98 and was a founding member, Eucharistic minister, and greeter at St. John Fisher Church. He was a frequent caller to Pittsburgh radio talk shows and a contributor to "Letters to the Editor" sections, which is where he caught my attention when I tracked him down in August 1995.

Oh -- and he married Virginia.

George Oakes of Churchill died on Dec. 12, 2001, at age 78, an extra half-century after Harry Truman dropped the bomb, and arguably because Harry Truman dropped the bomb. For George and Virginia, married 55 years, that meant the added gift of life to three sons. He was buried with honors amid loved ones -- not ripped to bloody, smoldering chunks of flesh on the death-strewn soil of Imperial Japan.

George Oakes was far from alone. There were countless American boys-turned-men, husbands and dads and granddads, in the same boat. *

"Were the pictures which have been drawn by the political jealousy of some among us faithful likenesses of the human character, the inference would be, that there is not sufficient virtue among men for self-government; and that nothing less than the chains of despotism can restrain them from destroying and devouring one another." --Federalist No. 55

Sunday, 29 November 2015 03:47

The Federal Reserve's Historic Announcement

Fred A. Kingery

Fred A. Kingery is a self-employed, private-equity investor in domestic and international financial markets from New Wilmington, Pennsylvania, and a guest commentator for The Center for Vision & Values. This article is republished from V & V, a web site of the Center for Vision & Values at Grove City College, Grove City, Pennsylvania.

Mark it down. At 2:15 p.m. on Tuesday, August 10, 2010, the U.S. Federal Reserve made a historic announcement. It signaled that the central bank was going to "preserve the size of its balance sheet." The announcement didn't sound all that dramatic, but don't be fooled. In the two subsequent days, the stock market fell over 300 points, and the price of gold rose $20.

The Fed's balance sheet, which historically consisted of nearly 100 percent U.S. Treasury securities, has grown in size from about $850 billion to a towering $2.3 trillion (or $2,300 billion) currently. In the middle of the financial crisis two years ago, the Fed expanded its holdings of securities by purchasing lower-quality, mortgage-backed debt securities, primarily from our nation's domestic banking system. The purpose of this balance sheet expansion was to provide massive liquidity to our entire financial system.

The cash used to purchase the debt securities was literally created out of thin air, or in other words, the Fed simply printed the money. Two years ago, the financial emergency was deemed severe enough to require this dramatic money-printing exercise by the Fed. There was never any intent to make the vast expansion of money injected into the banking system anything other than "temporary" due to the financial emergency. There was always discussion in the financial press and among Fed policy makers of an "exit strategy." The "exit strategy" discussion implied that the inflationary (or even "hyper-inflationary") potential of this massive expansion of the banking system's base reserves was being monitored closely. The financial markets took comfort that the Fed was standing at the ready, to withdraw the cash, should there be any sign that the central bank's monetization exercise was having a negative effect on investor inflation psychology. That feeling of comfort has been dealt a blow with the Fed's announcement on August 10. There is now no "exit strategy" being considered, and the size of the central bank's balance sheet may very well become permanent.

Historically, the U.S. Federal Reserve has been given two primary objectives: one is the preservation of the purchasing power of the U.S. Dollar, and the other is to conduct a monetary policy that supports full employment. It is not an easy task to serve two masters. Additionally, in its role as a central bank, the Fed is to remain an independent institution that resists political influences. This, too, is not an easy task. The Fed's track record as an independent institution that has preserved the purchasing power of our currency and maintained full employment is fully open to challenge. The central bank has not always demonstrated a firm independence from political influence, and the purchasing power of the U.S. Dollar has significantly diminished over the past 40 years.

An independent central bank, free of political influence, has always been a critical cornerstone supporting confidence in whatever currency the bank is charged with managing. Confidence is the one and only real currency of a central bank. What has just transpired with the Fed's announcement is that, in no uncertain terms, the central bank has explicitly stated it is prepared to "preserve the size of its balance sheet." And I would add that what it didn't say explicitly, but did signal to the political class in Washington, is a willingness to "further expand the balance sheet dramatically if need be." The fancy term being used to describe this intent is "quantitative easing," or "QE."

The real inflationary (or hyperinflationary) risk that the financial markets will calculate very carefully going forward is that the central bank, with this announcement, has now opened itself to being fully co-opted by the political process in Washington. Consider: why make hard political decisions on taxes and spending when the central bank has, in effect, just announced that it stands ready to print the money to finance any deficit of any size, in order to underwrite any amount of future debt accumulation?

The political class in Washington will see the Fed's announcement as a potential gold mine. They will no doubt attempt to mine it for all it's worth. The significantly rising risk is that the accumulation of future government debt attendant on this process will result in hyperinflation rather than a garden-variety, modest inflation.

Hyperinflation occurs when there is a total collapse of confidence in a currency. A central bank that is willing to simply print money out of thin air to finance unlimited amounts of debt will eventually undermine the confidence in the currency being managed as lenders eventually seize on the realization that they will never be paid back in anything other than worthless paper.

The road to hell is indeed paved with good intentions. *

"What a glorious morning this is!" Samuel Adams
