
Friday, 07 July 2017 10:20

Ramblings

Allan C. Brownfeld

Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. vice president, members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.

The Attack on Robert E. Lee Is an Assault on American History Itself

Early in February, the City Council of Charlottesville, Virginia, voted 3-2 to remove a bronze equestrian monument to Robert E. Lee that stands in a downtown park named in his honor. Vice Mayor Wes Bellamy, the council’s only African American member, led the effort to remove the statue. In the end, this vote may be largely symbolic. Those opposed to the statue’s removal intend to file a lawsuit and point to a state statute that says Virginia cities have no authority over the war memorials they inherited from past generations. “If such are erected,” the law reads, “it shall be unlawful for the authorities of the locality, or any other person or persons, to disturb or interfere with any monuments or memorials so erected.”

The attack on the Robert E. Lee statue is, in reality, an attack on American history itself. It has been suggested that the Washington Monument and Jefferson Memorial are inappropriate, since they celebrate men who owned slaves. Those who seek to erase our history seem a bit like the Taliban and ISIS, who are busy destroying historic structures all over the Middle East that predate the rise of Islam. History is what it is, a mixed bag of mankind’s strengths and weaknesses, of extraordinary achievements and the most horrible depredations. To judge the men and women of past eras by today’s standards is to be guilty of what the Quaker theologian Elton Trueblood called the “sin of contemporaneity.”

Those who refer to slavery as America’s “original sin” should review history. Sadly, from the beginning of recorded history until the 19th century, slavery was the way of the world. When the U.S. Constitution was written in 1787, slavery was legal everywhere in the world. What was unique was that in the American colonies there was strenuous objection to slavery, and that the most prominent framers of the Constitution wanted to eliminate it at the very start of the nation.

Our Judeo-Christian tradition, many now forget, accepted the legitimacy of slavery. The Old Testament regulates the relationship between master and slave in great detail. In Leviticus (XXV: 39-55), God instructs the Children of Israel to enslave the heathen and their progeny forever. In the New Testament, St. Paul urges slaves to obey their masters with full hearts and without equivocation. St. Peter urges slaves to obey even unjust orders from their masters.

At the time of its cultural peak, ancient Athens may have had 115,000 slaves to 43,000 citizens. The same was true of ancient Rome: Plutarch notes that on a single day in 167 B.C., 150,000 slaves were sold in one market. The British historian of classical slavery, Moses I. Finley, writes: “The cities in which individual freedom reached its highest expression — most obviously Athens — were cities in which chattel slavery flourished.”

American history is flawed, as is any human enterprise. Yet those who now call for the removal of statues and monuments commemorating our past are measuring our history against perfection, not against other real places. What other societies in 1787 — or any date in history prior to that time — would these critics find freer and more equitable than ours? Where else was religious freedom to be found in 1787? Compared to perfection, our ancestors are found wanting. Compared to other real places in the world, they were clearly ahead of their time, advancing the frontiers of freedom. 

In the case of Robert E. Lee himself, there is more to his story than the Charlottesville City Council may understand. Everyone knows that Lee’s surrender to Ulysses S. Grant at Appomattox effectively ended the Civil War. What few remember today is the real heroism of Robert E. Lee. By surrendering, he was violating the orders given by Jefferson Davis, the elected leader of the Confederacy. The story of April 1865 is not just one of decisions made, but also of decisions rejected. The importance of Lee’s refusal to continue the war as a guerrilla struggle, which Jefferson Davis preferred, and of Grant’s choice to be magnanimous, cannot be overstated.

With the fall of Richmond, Davis and the Confederate government were often on the run. Davis, writes Prof. Jay Winik in his important book April 1865: The Month That Saved America,

“. . . was thinking about such things as a war of extermination . . . a national war that ruins the enemy. In short, guerrilla resistance. . . . The day after Richmond fell, Davis had called on the Confederacy to shift from a conventional war to a dynamic guerrilla war of attrition, designed to wear down the North and force it to conclude that keeping the South in the Union would not be worth the interminable pain and ongoing sacrifice.”

But Robert E. Lee knew the war was over. Grant was magnanimous in victory and, Winik points out,

“. . . was acutely aware that on this day, what had occurred was the surrender of one army to another — not of one government to another. The war was very much on. There were a number of potentially troubling rebel commanders in the field. And there were still some 175,000 other Confederates under arms elsewhere; one-half in scattered garrisons and the rest in three remaining rebel armies. What mattered now was laying the groundwork for persuading Lee’s fellow armies to join in his surrender — and also for reunion, the urgent matter of making the nation whole again.”

Appomattox was not preordained. “If anything,” notes Winik,

“. . . retribution had been the larger and longer precedent. So, if these moments teemed with hope — and they did — it was largely due to two men who rose to the occasion, to Grant’s and Lee’s respective actions: one general, magnanimous in victory, the other gracious and equally dignified in defeat, the two of them, for their own reasons and in their own ways, fervently interested in beginning the process to bind up the wounds of the last four years. . . . Above all, this surrender defied millenniums of tradition in which rebellions typically ended in yet greater shedding of blood. . . . One need only recall the harsh suppression of the peasants’ revolt in Germany in the 16th century, or the ravages of Alva during the Dutch rebellion, or the terrible punishments inflicted on the Irish by Cromwell, and then on the Scots after Culloden, or the bloodstained vengeance executed during the Napoleonic restoration, or the horrible retaliation imposed during the futile Chinese rebellion in the mid-19th century.”

Had Robert E. Lee blindly followed irrational instructions to keep fighting a guerrilla war indefinitely, the surrender at Appomattox never would have taken place, and our nation’s history would have been far different. Fortunately, our American tradition has never embraced the notion of blindly following orders, particularly if they involved illegal or immoral acts. No American could ever escape responsibility for such acts by saying, “I was simply following orders.”

The effort to erase our past, as the Charlottesville City Council proposes, comes about, in large part, because we know so little about our own history. Pulitzer Prize-winning historian David McCullough declares that:

“We are raising a generation of people who are historically illiterate. We can’t function in a society if we don’t know who we are and where we came from.”

More than two-thirds of college students and administrators who participated in a national survey were unable to remember that freedom of religion and the press are guaranteed by the Bill of Rights. In surveys conducted at 339 colleges and universities, more than one-fourth of students and administrators did not list freedom of speech as an essential right protected by the First Amendment. 

If we judge the past by the standards of today, must we stop reading Plato and Aristotle, Sophocles and Aristophanes, Dante and Chaucer? Will we soon hear calls to demolish the Acropolis and the Coliseum, as we do to remove memorials to Washington and Jefferson, and statues of Robert E. Lee? Must we abandon the Bible because it lacks modern sensibility? Where will it end? As theologian Elton Trueblood declared, “contemporaneity” is indeed a sin. We would all do well to avoid its embrace.

Free Speech Is Not Only Under Attack at Our Universities, But “Objective Truth” Itself Is Referred to as a “Racist Construct”

Free speech is not faring well on the nation’s college and university campuses. In mid-April, the University of California at Berkeley canceled a scheduled talk by conservative author Ann Coulter in what The New York Times called “the latest blow to the institution’s legacy and reputation as a promoter and bastion of free speech.” In a letter to the Berkeley College Republicans, which was sponsoring the talk, two vice chancellors said the university had been “unable to find a safe and suitable venue for your April 27 event . . .”

In February, a speech by controversial right-wing writer Milo Yiannopoulos, also sponsored by the College Republicans, was canceled after masked protestors smashed windows, set fires, and pelted the police with rocks. The Washington Post notes that:

“The decisions by U.C.-Berkeley to cancel both events are especially notable given the campus’s role during the 1960s and 1970s as the birthplace of the Free Speech Movement and its long tradition of social protest.”

Throughout the country, assaults on free speech are widespread at our colleges and universities. In March, author Charles Murray of the American Enterprise Institute was forced to abandon a lecture at Middlebury College in Vermont. The professor who was hosting him, a liberal Democrat, was assaulted. Recently, Notre Dame students said that they felt “unsafe” at the prospect of Vice President Mike Pence speaking at their commencement. In April, the Student Senate at the University of California at Davis voted to remove the American flag from its meetings. One student declared that the flag “represents a history of genocide, slavery and imperialism.”

Things at our universities are becoming increasingly difficult to understand. The Wall Street Journal reports that:

“Every year, Stanford asks its applicants an excellent question: ‘What matters to you, and why?’ Ziad Ahmed of Princeton, N.J. summed up his answer in three words. His essay consisted of the hashtag ‘#BlackLivesMatter’ repeated 100 times. He got in.”

Carrying things to an extreme even unusual for the advocates of political correctness, a group of students at California’s five-college Claremont Consortium says that objective truth is itself a “myth” espoused by “white supremacists.” This came after Pomona College President David Oxtoby released a statement in defense of free speech after conservative author Heather MacDonald of the Manhattan Institute had an event disrupted at nearby Claremont McKenna College.

President Oxtoby’s letter was met with a list of demands by minority activist students who called MacDonald “a white supremacist fascist supporter of the police state,” and objective truths, such as those cited in the Declaration of Independence, “a means of silencing oppressed peoples.” The authors, Dray Denson, Avery Jonas, and Shanaya Stephenson, received 22 co-signers. They said that silencing conservative speakers, like MacDonald, whose work has been published widely in The Wall Street Journal, The Washington Post, and other newspapers and journals, is a valid option for activists since such speaking engagements constitute “a form of violence.”

During her lecture, MacDonald was attempting to discuss the rise of anti-police attitudes when she was derailed by protestors banging on windows and shouting “F--k The Police” and “Black Lives Matter.” Campus security ultimately forced MacDonald to live-stream her lecture from a near-empty room across campus.

In his letter, President Oxtoby wrote that:

“Protest has a legitimate and celebrated place on college campuses. What we cannot support is the act of preventing others from engaging with an invited speaker. Our mission is founded upon the discovery of truth, the collaborative development of knowledge and the betterment of society.”

This call for free speech was rejected by the student protestors. They wrote:

“Free speech, a right that many freedom movements have fought for, has recently become a tool appropriated by hegemonic institutions. It has not just empowered students from marginalized backgrounds to voice their qualms and criticize aspects of the institution, but it has given those who seek to perpetuate systems of domination a platform to project their bigotry. Thus, if our mission is founded upon the discovery of truth, how does free speech uphold that value?”

The students said that the very idea of objective truth is a concept devised by “white supremacists” in an “attempt to silence oppressed peoples.” They declare that:

“Historically, white supremacy has venerated the idea of objectivity and wielded a dichotomy of ‘subjectivity vs. objectivity’ as a means of silencing oppressed peoples. The idea that there is a single truth —‘the Truth’ — is a construct of the Euro-West that is deeply rooted in the Enlightenment, which was a movement that also described Black and Brown people as both subhuman and impervious to pain. This construction is a myth and white supremacy, imperialism, colonization, capitalism, and the United States of America are all of its progeny. The idea that truth is an entity for which we must search, in matters that endanger our abilities to exist in open spaces, is an attempt to silence oppressed peoples.”

The assault on Heather MacDonald, viewed as a mainstream commentator, not an extremist, was particularly harsh. The students write:

“If engaged, Heather MacDonald would not be debating on mere difference of opinion, but the right of Black People to exist. Heather MacDonald is a fascist, a white supremacist, a War-hawk, a transphobic, a queerphobe, a classist, and ignorant of interlocking systems of domination that produce the lethal conditions under which oppressed peoples are forced to live. . . . Engaging with her, a white supremacist fascist supporter of the police state, is a form of violence.”

The assault on Western Civilization at our universities is hardly new. In the 1980s, Jesse Jackson led a group of militant students through the Stanford campus chanting, “Hey Hey, Ho Ho, Western Civilization Has Got To Go.” The opposition to transmitting our culture and civilization is based on the unusual idea that only books, music, and art created by men and women who share our own racial or ethnic background can be meaningful to us and should be transmitted. Under such a notion, only Jews could read the Bible, only Greeks could contemplate Plato or Aristotle, only those of English descent could read Shakespeare, and only Italians could appreciate Dante or Leonardo da Vinci.

Western culture is relevant to men and women of all races and backgrounds, particularly those living in the midst of our Western society — such as the students at Pomona College. The distinguished black scholar W. E. B. Du Bois recognized this reality when he wrote more than a hundred years ago:

“I sit with Shakespeare and he winces not. Across the color line, I walk arm in arm with Balzac and Dumas, where smiling men and welcoming women glide in gilded halls. From out of the caves of evening that swing between the strong-limbed earth and the tracery of the stars, I summon Aristotle and Aurelius and what soul I will, and they come all graciously, with no scorn or condescension. So, wed with truth, I dwell above the veil.”

When the attacks upon transmitting Western civilization began at our universities, Donald Kagan, Professor of History and Classics and Dean of Yale College, declared in his September 1990 address to the freshman class:

“The assault on the character of Western civilization badly distorts history. The West’s flaws are real enough, but they are common to almost all the civilizations known in any continent at any time in human history. What is remarkable about the Western heritage, and what makes it essential, are the important ways in which it has departed from the common experience. More than any other it has asserted the claims of the individual against those of the state, limiting the state’s power and creating a realm of privacy into which it cannot penetrate. . . . Western Civilization is the champion of representative democracy as the normal way for human beings to govern themselves, in place of the different varieties of monarchy, oligarchy, and tyranny that have ruled most of the human race throughout history and rule most of the world today. It has produced the theory and practice of separation of church and state, thereby creating a safe place for individual conscience. At its core is a tolerance and respect for diversity unknown in most cultures. One of its most telling characteristics is its encouragement of criticism of itself and its ways. Only in the West can one imagine a movement to neglect the culture’s own heritage in favor of some other.”

Our civilization is now under attack on many of our university campuses, as is the idea of objective truth itself, as the students at Pomona College have shown us. When will universities finally decide to remove from their campuses students who silence the speech of those with whom they disagree? When will alumni cut back their contributions to institutions that embrace identity politics and limit the speech of those who dare to differ? This is a serious challenge to our institutions of higher learning. Some of them are resisting. Others, such as Berkeley, seem to be acquiescing. It is hard to imagine student protestors who deny that there is such a thing as “truth” being taken seriously. That many take such irrationality as legitimate discourse tells us as much about today’s academic world as it does about those who would destroy free speech.

The Russian Revolution at 100: Remembering the Naïve Westerners Who Embraced It

One hundred years ago, Russia’s czar was overthrown and Communism began its reign. Sunday, March 12, is the date generally recognized as the start of the uprising. In Moscow, no celebrations are planned. Evidently the country remains too divided over Communism’s legacy. Mikhail Zygar, a Russian journalist and author of the book All The Kremlin’s Men, points out that:

“Vladimir Putin cannot compare himself to Nicholas II, nor to Lenin, nor to Kerensky because that is not Russian history to be proud of. In terms of 1917, nothing can be used as a propaganda tool.” 

Communism’s toll was a heavy one. The Black Book of Communism, an 846-page academic study, holds Communism responsible for the deaths of between 85 million and 100 million people worldwide. It estimates that the ideology claimed 45 million to 72 million in China, 20 million in the Soviet Union, between 1.3 million and 2.3 million in Cambodia, 2 million in North Korea, 1.7 million in Africa, 1.5 million in Afghanistan, 1 million in Vietnam, 1 million in Eastern Europe, and 150,000 in Latin America.

Through all those years, many intellectuals in the West insisted on disassociating Communism from the crimes committed in its name. Incredibly, in retrospect, we see many Western academics, clergymen, journalists, and literary figures not resisting Communist tyranny, but embracing it, defending it, and apologizing for it.

Consider the German playwright Bertolt Brecht, who created the modern propaganda play. When he visited the Manhattan apartment of American philosopher Sidney Hook in 1935, Stalin’s purges were just beginning. Hook, raising the cases of Zinoviev and Kamenev, asked Brecht how he could bear to work with the American Communists who were trumpeting their guilt. Brecht replied that the U.S. Communists were no good — nor were the Germans either — and that the only body that mattered was the Soviet party. Hook pointed out that they were all part of the same movement, responsible for the arrest and imprisonment of innocent former comrades. 

Brecht replied: “As for them, the more innocent they are, the more they deserve to be shot.” Hook asked, “Why, why?” Brecht did not answer. Hook got up, went into the next room and brought Brecht his hat and coat. During the entire course of Stalin’s purges, Brecht never uttered a word of protest. When Stalin died, Brecht’s comment was: “The oppressed of all five continents . . . must have felt their heartbeats stop when they heard that Stalin was dead. He was the embodiment of all their hopes.” 

Another case in point is French philosopher Jean-Paul Sartre. In a July 1954 interview with Liberation, Sartre, who had just returned from a visit to Russia, said that Soviet citizens did not travel, not because they were prevented from doing so, but because they had no desire to leave their wonderful country. “The Soviet citizens,” he declared, “criticize their government much more and more effectively than we do.” He maintained that, “There is total freedom of criticism in the Soviet Union.”

Another intellectual defender of tyranny was Lillian Hellman, the American playwright. She visited Russia in October 1937, when Stalin’s purge trials were at their height. On her return, she said she knew nothing about them. In 1938 she was among the signatories of an ad in the Communist publication New Masses that approved the trials. She supported the 1939 Soviet invasion of Finland, saying: “I don’t believe in that fine, lovable little Republic of Finland that everyone is weepy about. I’ve been there and it looks like a pro-Nazi little republic to me.” There is no evidence that Hellman ever visited Finland — and her biographer states that this is “highly improbable.”

The American Quaker H. T. Hodgkin provided this assessment: “As we look at Russia’s great experiment in brotherhood, it may seem to us some dim perception of Jesus’ way, all unbeknown, inspiring it.”

The case of New York Times correspondent Walter Duranty, who covered the Soviet Union in the 1930s, is also instructive. In the midst of the enforced famine in the Ukraine, Duranty visited the region and denied that starvation and death were rampant. In November 1932, Duranty reported that “. . . there is no famine or actual starvation nor is there likely to be.” In The Times of August 23, 1933, Duranty wrote: “Any report of a famine in Russia is today an exaggeration or malignant propaganda. . . . The food shortage which has affected almost the whole population last year . . . has, however, caused heavy loss of life.”

He estimated the deaths at nearly four times the usual rate, but did not blame Soviet policy. What Americans got was not the truth — but false reporting. And its influence was widespread. What Walter Duranty got was the highest honor in journalism — the Pulitzer Prize for 1933, which commended him for “dispassionate, interpretive reporting of the news from Russia.” The citation declared that Duranty’s dispatches — which the world now knows to be false — were “marked by scholarship, profundity, impartiality, sound judgment, and exceptional clarity.”

Walter Duranty was only one of many correspondents and writers in the 1920s and 1930s who fed their readers in the West a steady diet of disinformation about the Soviet Union. Louis Fischer, who wrote for The Nation magazine, was also reluctant to tell his readers about the flaws in Soviet society. He, too, glossed over the searing famine of 1932-33. He once referred to what we now know as the “Gulags” as “a vast industrial organization and a big educational institution.” In 1936, he informed his readers that the dictatorship was “voluntarily abdicating” in favor of “democracy.”

Somehow, liberal intellectuals, who were harsh in their judgment of the American society, eagerly embraced the ruthless dictatorship of Joseph Stalin. Concerning the forced collectivization of Soviet agriculture, author Upton Sinclair wrote: “They drove rich peasants off the land — and sent them wholesale to work in lumber camps and on railroads. Maybe it cost a million lives — maybe it cost five million — but you cannot think intelligently about it unless you ask yourself how many millions it might have cost if the changes had not been made.”

W. E. B. Du Bois, the black intellectual, thought that “He (Stalin) asked for neither adulation nor vengeance. He was reasonable and conciliatory.” It was not only Stalin who was embraced by many in the West, but Mao as well. Visiting Communist China, New York Times columnist James Reston said that he thought Chinese Communist doctrines and the Protestant ethic had much in common, and he was generally impressed by “the atmosphere of intelligent and purposeful work” (New York Times, July 30, 1971). He wrote:

“China’s most visible characteristics are the characteristics of youth . . . a kind of lean, muscular grace, relentless hard work, and an opportunistic and even amiable outlook on the future. . . . The people seem not only young but enthusiastic about their changing lives.”

Reston also believed that young people from the city who were forced to work as manual laborers in rural areas “were treating it like an escape from the city and an outing in the countryside.” When Mao died in 1976, the Times devoted three pages to his obituary, but only a few lines alluded to his enormous crimes against the Chinese people. It has been estimated that Mao was responsible for the deaths of 30 million to 60 million people. The Times referred to the execution of “a million to three million people, including landlords, nationalist agents, and others suspected of being class enemies.” The Washington Post also devoted three pages to Mao, concluding, “Mao the warrior, philosopher, and ruler was the closest the modern world has been to the God-heroes of antiquity.” The Post acknowledged that some three million persons had lost their lives in the 1950 “reign of terror,” but the only victims mentioned were “counter-revolutionaries.”

In his landmark study of intellectual support for Communism, Political Pilgrims, Professor Paul Hollander writes that an important myth to be laid to rest “is the belief in the unflinching commitment of intellectuals to freedom, and particularly to freedom of expression.” In the case of the Soviet Union and other Communist societies, he notes:

“It is very clear that the absence of freedoms . . . hardly concerned the visitors or interfered with the attractions of these societies. To the extent that the lack of free expression was observed — and it is by itself noteworthy how frequently it was overlooked — it was excused or rationalized on the familiar grounds of temporary necessity, amply compensated for by the various achievements of the regimes concerned.”

In addition, states Hollander:

“Attributions of idealism and disinterestedness also call for re-examination when intellectuals move with lightning speed from vehement moral indignation and moral absolutism (generally reserved for their own society) to a strangely pragmatic moral relativism brought to the assessment of policies of countries they are committed to support. . . . Scott Nearing, who often left his home in Maine in November rather than watch hunters kill deer, defended Soviet tanks in Budapest (in 1956). . . . Such misjudgments and moral double ‘bookkeeping’ (or double standards) are in part due to the readiness to believe ‘the other side.’”

The anniversary of the Russian Revolution is particularly meaningful for those of us who are old enough to remember the reality of what Communism was really like. This writer spent time in Eastern Europe during the darkest days of Communist rule, visiting both the Berlin Wall and Czechoslovakia in 1969, shortly after the Soviet Union brutally marched into Prague and put down the attempts at liberalization. Wherever one went in Czechoslovakia, the contempt for the occupying Soviet Army was clear. At a student club I visited, when word got around that an American was on the premises, many young people came by to extend greetings. I was invited to the homes of a number of Czechs who openly declared their hostility to Communism and their desire for their country to once again join the Western world. It is fair to say that I did not encounter a single Czech who spoke well of either Communism or the Soviet Union.

As we commemorate the 100th anniversary of the Russian Revolution, we should remember how eagerly naïve Westerners embraced it. Vladimir Putin, who served Communism as a KGB agent, was an eager participant in the Communist enterprise. It is interesting to observe his current reluctance to celebrate the totalitarian and imperialistic system to which he devoted much of his life. Sadly, he now seems intent upon restoring as much of the Soviet empire as he can and destabilizing NATO, the EU, and our own country. Let us hope that we do not engage in the same wishful thinking about Putin’s goals and objectives that so many in the West did about Communism. Remembering those who naïvely embraced tyranny should immunize us against following such a path in the future — that is, if we are willing to learn from history, something that is all too rare.


Monday, 27 March 2017 14:39

Ramblings

Allan C. Brownfeld


America Is Exceptional — But Now There Is an Effort to Make It Ordinary

Our society is unique in history — in other words, “exceptional.” Ronald Reagan described it as a “City on a Hill.” Now we confront an effort to make it ordinary: to build walls, stoke fear of strangers, and promote a narrow nationalism. Perhaps those who would make America small and narrow do not understand what generations of Americans, Republicans and Democrats, liberals and conservatives, have meant by the term “American exceptionalism.”

America has never simply been another country. From the very beginning, its vision of Liberty attracted people of every ethnic background and religion. At the time of the American Revolution, Thomas Paine wrote:

“If there is a country in the world where concord, according to common calculation, would be least expected, it is America. Made up, as it is, of people from different nations, accustomed to different forms and habits of government, speaking different languages, and more different in their modes of worship, it would appear that the Union of such a people was impracticable. But by the simple operation of constructing government on the principles of society and the rights of man, every difficulty retires and the parts are brought into cordial unison.”

In Redburn (1849), Herman Melville spelled out a vision of America that is as true today as it was then:

“There is something in the contemplation of the mode in which America has been settled that, in a noble breast, should forever extinguish the prejudices of national dislikes. Settled by the people of all nations, all nations may claim her for their own. You cannot spill a drop of American blood without spilling the blood of the whole world. Be he Englishman, German, Dane or Scot: the European who scoffs at an American . . . stands in danger of judgment. We are not a narrow tribe of men. . . . No: our blood is as the flood of the Amazon, made up of a thousand noble currents all pouring into one. We are not a nation, so much as a world.”

To make America simply another country, concerned only with its narrow self-interest, is to reverse our history. F. Scott Fitzgerald wrote:

“France was a land. England was a people, but America, having about it still the quality of an idea, was harder to utter — it was the graves at Shiloh, and the tired, drawn, nervous faces of its great men, and the country boys dying in the Argonne for a phrase that was empty before their bodies withered. It was a willingness of the heart.”

Today, our country is the most powerful and most prosperous in the world. We defeated Communism, Fascism, and Nazism. Of course, there are always challenges to be confronted. ISIS threatens the West with terrorism, and it is important that it be defeated. But the promotion of fear by some in Washington is irrational. In his first Inaugural Address, in the midst of the Great Depression, as democracy was collapsing in Europe, Franklin D. Roosevelt told the country that:

“. . . the only thing we have to fear is fear itself — nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.”

Now, our new administration stirs fear with no basis for doing so. Scholars of the subject say they can think of no previous president so enamored as Mr. Trump of scare tactics. Historian Robert Dallek says, “If he frightens people, it puts him in the driver’s seat. These are what I think can be described as demagogic tendencies.”

There is nothing conservative about what we are hearing from the White House in recent days. The catchphrase “America First” has antecedents, which many who now use it do not understand, in the movement to keep the U.S. out of the war against Nazism. Of it, conservative commentator Charles Krauthammer writes:

“Some claim that putting America first is a reassertion of American exceptionalism. On the contrary, it is the antithesis. It makes America no different from all the other countries that define themselves by a particularist blood-and-soil nationalism. What made America exceptional, unique in the world, was defining its own national interest beyond its narrow economic and security needs to encompass the safety and prosperity of a vast array of allies. A free world, marked by open trade and mutual defense, was President Truman’s vision, shared by every president since. Until now. . . . For seventy years, we sustained an international system of open commerce and democratic alliances that has enabled America and the West to grow and thrive. Global leadership is what made America great. We abandon it at our peril.” 

The people around presidential adviser Stephen Bannon, and the “alt-right” philosophy he promoted on the Breitbart news website, have more in common with the right-wing racial nationalism found in European parties such as Marine Le Pen’s National Front in France than with anything in our own history. To what degree President Trump has embraced such views remains less than clear. But many traditional conservatives see all of this as a dramatic departure from American exceptionalism. New York Times columnist David Brooks notes that:

“We are in the midst of a Great War of national identity. We thought we were in an ideological battle against radical Islam, but we are really fighting the national myths spread by Trump, Bannon, Putin, Le Pen and Farage. We can argue about immigration and trade and foreign policy, but nothing will be right until we restore and revive the meaning of America. Are we still the purpose-driven experiment Lincoln described and Emma Lazarus wrote about: assigned by providence to spread democracy and prosperity; to welcome the stranger. . . . Or are we just another nation, hunkered down in a fearful world?”

In 1866, Lord Acton, the British liberal leader, said that America was becoming the “distant magnet”: apart from “the millions who have crossed the ocean, who shall reckon with the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West.”

America has been a nation much loved. Germans have loved Germany. Frenchmen have loved France. Swedes have loved Sweden. This, of course, is only natural. But America has been beloved not only by native-born Americans, but by men and women throughout the world who have yearned for freedom. America dreamed a bigger dream than any nation in the history of man. Now, in Washington, that dream is being replaced with something far different. The Republican Party, which has always embraced the idea of American exceptionalism, one to which Ronald Reagan and conservatives were particularly committed, now has a choice. Will it abandon its vision of America as exceptional and adopt the very ordinary nationalism now manifesting itself in the White House, or will it maintain its belief in an America that is, indeed, something new and positive in history? All of us will be losers if the vision of America embraced by the Founding Fathers and generations of Americans is abandoned by those whose notion of America is narrow and completely ahistorical.

The Strange Assault on Thomas Jefferson at the University He Founded

At the University of Virginia, its founding father, Thomas Jefferson, is under attack by some students and faculty.

After the November presidential election, university president Teresa Sullivan wrote a letter in which she quoted Jefferson in expressing the hope that students from the University would help our republic. Sullivan wrote:

“By coincidence, on this exact day 191 years ago — November 9, 1825, in the first year of classes at the University of Virginia — Thomas Jefferson wrote to a friend that U.Va. students ‘. . . are not of ordinary significance only; they are exactly the persons who are to succeed to the government of our country, and to rule its future enmities, its friendships and fortunes.’ I encourage today’s U.Va. students to embrace that responsibility.”

Almost immediately, a response was drafted by Noelle Brand, an assistant professor of psychology, who declared that Thomas Jefferson “was deeply involved in the racist history of this university” and noted that:

“We would like for our administration to understand that although some members of this community may have come to this university because of Thomas Jefferson’s legacy, others of us came here in spite of it. For many of us, the inclusion of Jefferson quotations in these e-mails undermines the message of unity, equality, and civility that you are attempting to convey.”

Approximately 500 students and faculty signed the letter, with more adding their names later. President Sullivan responded that:

“Quoting Jefferson (or any historical figure) does not imply an endorsement of all the social structures and beliefs of his time, such as slavery and the exclusion of women and people of color from the university.”

Sullivan acknowledged “the university’s complicated Jeffersonian legacy.” She pointed out that:

“Today’s leaders are women and men, members of all racial and ethnic groups, members of the LGBTQ community and adherents of all religious traditions. All of them belong to today’s University of Virginia, whose founder’s most influential and quoted words were ‘all men are created equal.’ Those words were inherently contradictory in an era of slavery, but because of their power they became the fundamental expression of a more genuine equality today.”

What President Sullivan’s critics are doing is applying the standards of 2016 to 1787, when the Constitution was written, and finding our ancestors seriously deficient. They are guilty of what the Quaker theologian Elton Trueblood called “the sin of contemporaneity,” of applying the standards of our own time to those who have come before. It is possible to look at the colonial period both from the vantage point of the period that preceded it and from that of the period that has followed. This is instructive when considering the question of slavery.

Slavery played an important part in many ancient civilizations. Indeed, most people of the ancient world regarded slavery as a natural condition of life, one that could befall anyone at any time, having nothing to do with race. It has existed almost universally through history among peoples of every level of material culture — among nomadic pastoralists in Asia, hunting societies of North American Indians, and seafaring peoples such as the Norsemen. The legal codes of Sumer provide documentary evidence that slavery existed there as early as the 4th millennium B.C. The Sumerian symbol for slave in cuneiform writing suggests “foreign.”

When the Constitutional Convention met in Philadelphia in 1787, not a single nation had made slavery illegal. As they looked back through history, the framers saw slavery as an accepted and acceptable institution. It was not until 1792 that Denmark became the first Western nation to abolish the slave trade. In 1807, the British Parliament passed a bill outlawing the slave trade — and slavery was abolished in British colonies between 1834 and 1840. France freed the slaves in its colonies in 1848. Spain ended slavery in Puerto Rico in 1873, and in Cuba in 1886. Brazil abolished slavery in 1888.

The respected British historian of classical slavery, Moses I. Finley, writes:

“The cities in which individual freedom reached its highest expression — most obviously Athens — were cities in which chattel slavery flourished.”

The same was true of ancient Rome. Plutarch notes that on a single day in 167 B.C., 150,000 slaves were sold in one market.

Our Judeo-Christian tradition was also one that accepted the legitimacy of slavery. The Old Testament regulates the relationship between master and slave in great detail. In Leviticus (XXV: 39-55), God instructs the Children of Israel to enslave the heathen and their progeny forever, but to employ poor Jews as servants only, and to free them and their children on the year of Jubilee. There is no departure from this approach to slavery in the New Testament. St. Paul urges slaves to obey their masters with full hearts and without equivocation.

What is historically unique is not that slavery was the accepted way of the world in 1787, but that so many of the leading men of the American colonies of that day wanted to eliminate it — and pressed vigorously to do so.

Benjamin Franklin and Alexander Hamilton were ardent abolitionists. John Jay, who would become the first Chief Justice, was president of the New York Anti-Slavery Society. Rufus King and Gouverneur Morris were in the forefront of opposition to slavery.

One of the great debates at the Constitutional Convention related to the African slave trade. George Mason of Virginia made an eloquent plea for making it illegal. He said:

“Every master of slaves is born a petty tyrant. They bring the judgment of heaven on a country.”

In his original draft of the Declaration of Independence, one of the principal charges made by Thomas Jefferson against King George III and his predecessors was that they would not allow the American colonies to outlaw the importation of slaves. When Jefferson was first elected to the Virginia legislature at the age of twenty-five, his first political act was an attempt to begin the elimination of slavery. Though unsuccessful, he tried to further encourage the emancipation process by writing into the Declaration of Independence that “all men are created equal.” In his draft of a constitution for Virginia, he provided that all slaves would be emancipated in that state by 1800, and that any child born in Virginia after 1801 would be born free. This, however, was not adopted.

In his draft instructions to the Virginia delegation to the Continental Congress of 1774, published as “A Summary View of the Rights of British America,” Jefferson charged the British crown with having prevented the colonies from abolishing slavery in the interest of avarice and greed:

“The abolition of domestic slavery is the great object of desire of these colonies, where it was, unhappily, introduced in their infant state. But previous to the enfranchisement of the slaves we have, it is necessary to exclude all further importations from Africa. Yet our repeated efforts to effect this by prohibition, and by imposing duties which might amount to a prohibition, have been hitherto defeated by his Majesty’s negative.” 

Thomas Jefferson and the framers of the Constitution were imperfect men, and it is not difficult to discover their personal flaws. But these imperfect men did an extraordinary thing in creating a new nation, which now has the world’s oldest continuous form of government. Professor Forrest McDonald points out that:

“The framers were guided by principles but not by formulas. They understood that no form or system of government is universally desirable or workable; instead, if government is to be viable, it must be made to conform to human nature and to the genius of the people — to their customs, morals, habits, institutions, aspirations. The framers did just that, and thereby used old materials to create a new order for the ages.” 

While the majority of the framers of the Constitution were opposed to slavery, a small minority supported it, and had it been outlawed, the union never would have come into being. Thus, they compromised. What they did do was outlaw the slave trade as of 1808; and in 1787 Congress outlawed slavery in the new territories by passing the Northwest Ordinance. It was, we must remember, the framers of the Constitution who were the first duly constituted authority in the Western world to act decisively against slavery.

One wonders how much of this history is known by those who wrote and signed the letter calling upon University of Virginia President Teresa Sullivan to stop quoting Thomas Jefferson. To her credit, President Sullivan understands the distinction between intrinsic principle and historical personality. To hold leaders of the past to the standards of the present time is to miss the larger message of our history. Jefferson and our other Founding Fathers set in place a system of government that permitted growth and change. While they may not have shared the views of today, neither did Socrates, Plato, Dante, or Shakespeare. Shall we only be able to quote those figures of the 20th and 21st centuries who share standards that we ourselves came to accept only a very short time ago? This would be “contemporaneity” gone mad.

Thomas Sowell Ends His Column, But His Intellectual Legacy Will Only Grow

Thomas Sowell, one of America’s foremost public intellectuals and most outspoken black conservatives, submitted his final column in December after 25 years in syndication. At 86, he said, he thought the time had come to retire from this enterprise. Hopefully, his other literary pursuits will continue.

For more than 50 years, Sowell has published books and articles on race, economics, and government policy. He grew up in Harlem and was the first member of his family to go beyond 6th grade — eventually graduating from Harvard. A self-proclaimed Marxist in his 20s, he received his Ph.D. in economics from the University of Chicago, where he studied under Milton Friedman, the Nobel Prize-winning free-market economist. Sowell slowly lost faith in the ability of government to effectuate positive change in our economic life. He taught economics at Cornell and UCLA and has been a senior fellow at the Hoover Institution at Stanford University since 1980. (Shortly after he moved into his office at Stanford, I visited him there. I remember having dinner at a Mexican restaurant in Palo Alto, putting a tape recorder on the table, and engaging in a lengthy interview, which was subsequently published in Human Events.)

Thomas Sowell examined the history of race relations in America and throughout the world. He questioned much of the orthodoxy to be found in intellectual circles and asked — and tried to answer — the most difficult questions. Do certain groups advance in society at varying rates because of the attitude of society toward them? Does discrimination against a given group cause it to do less well economically and educationally than those groups that do not face such external barriers?

In a landmark study, “The Economics and Politics of Race: An International Perspective” (1983), followed by an impressive succession of important books, Sowell uses an international framework to analyze group differences. Examining the experience of different groups in more than a dozen countries, he seeks to determine how much of each group’s economic fate has been due to the surrounding society and how much to internal patterns that follow the same group around the world.

The Italians in Australia and Argentina, for example, show social and economic patterns similar in many respects to those of Italians in Italy or in the United States. Chinese college students in Malaysia specialize in very much the same fields that they specialize in at American colleges — a far different set of specializations from those of other groups in both countries. Germans, similarly, have concentrated in very similar industries and occupations, whether in South America, North America, or Australia.

Analyzing the successes of each group, Sowell points to the group’s culture, which rewards some behaviors over others, as the determinant of skills, orientations, and therefore economic performance. “Race may have no intrinsic significance,” he writes, “and yet be associated historically with vast cultural differences that are very consequential for economic performance.”

In Southeast Asia, for example, the overseas Chinese have been subjected to widespread discrimination. Quota systems were established in government employment and in admission to universities in Malaysia, and a “target” of 30 percent Malay ownership in business and industry was established. In Indonesia, a 1959 law forbade the Chinese to engage in retailing in the villages. Chinese-owned rice mills were confiscated. In the Philippines, it was decreed that no new Chinese import business could be established, and Chinese establishments were closed by law.

Despite all of this, Sowell points out, the Chinese thrived. As of 1972, they owned between 50 and 95 percent of the capital in Thailand’s banking and finance industries, transportation, wholesale and retail trade, restaurants, and the import and export business. In Malaysia, the Chinese earned double the income of Malays in 1976, despite a massive government program imposing preferential treatment of Malays in the private economy. In the U.S., as in Southeast Asia, writes Sowell, “The Chinese became hated for their virtues.” Despite discrimination, the Chinese advanced rapidly in the U.S., as did the Japanese, who met similar forms of racial bigotry, including special taxes and job restrictions.

In Europe, Sowell points out, precisely the same story can be told with regard to Jews. Anti-Semitism was a powerful force in many countries, yet Jews continued to advance. Although Jews were only one percent of the German population, they became 10 percent of the doctors and dentists, 17 percent of the lawyers, and won 27 percent of the Nobel Prizes awarded to Germans from 1901 to 1975. In the U.S., notes Sowell:

“Although the Jewish immigrants arrived with less money than most other immigrants, their rise to prosperity was unparalleled. Working long hours at low pay, they nevertheless saved money to start their own small businesses . . . or to send a child to college. While the Jews were initially destitute in financial terms, they brought with them not only specific skills but a tradition of success and entrepreneurship which could not be confiscated or eliminated, as the Russian and Polish governments had confiscated their wealth and eliminated most of their opportunities.”

In the case of blacks in the U.S., Sowell shows that West Indians have advanced much more rapidly than native-born American blacks because of major cultural differences. In the West Indies, slaves had to grow the bulk of their own food — and were able to sell what they did not need from their individual plots of land. They were given economic incentives to exercise initiative, as well as experience in buying, selling, and managing their own affairs — experiences denied to slaves in the U.S.

The two black groups — native-born Americans and West Indians — suffered the same racial discrimination, but advanced at dramatically different rates. By 1969, black West Indians earned 94 percent of the average income of Americans in general, while native blacks earned only 62 percent. Second generation West Indians in the U.S. earned 15 percent more than the average American. More than half of all black-owned businesses in New York State were owned by West Indians. The highest-ranking blacks in the New York City Police Department in 1970 were all West Indians, as were all the black judges in the city.

It is a serious mistake, Sowell believes, to ignore the fact that economic performance differences between whole races and cultures “are quite real and quite large.” Attitudes and work habits, he argues, are key ingredients of success or failure. The market rewards certain kinds of behavior, and penalizes other behavior patterns — in a color-blind manner. Blaming discrimination by others for a group’s status, he states, ignores the lessons of history.

Political efforts to address the “problems” of minorities, such as race-based affirmative action programs, usually fail, Sowell reports, because they refuse to deal with the real causes of such difficulties:

“. . . political ‘solutions’ tend to misconceive the basic issues. . . . Black civil rights leaders . . . often earn annual incomes running into hundreds of thousands of dollars, even if their programs and approaches prove futile for the larger purpose of lifting other blacks out of poverty.”

Crucial to a group’s ability to advance is the stability of its family life and the willingness to sacrifice:

“. . . more than four-fifths of all white children live with both their parents. But among black children, less than half live with both parents. . . . What is relevant is the willingness to pay a price to achieve goals. Large behavioral differences suggest that the trade-off of competing desires vary enormously among ethnic groups. . . . The complex personal and social prerequisites for a prosperous level of output are often simply glided over, and material wealth treated as having been produced somehow, with the only real question being how to distribute it justly.”

If we seek to understand group differences, it is to “human capital” that we must turn our attention, Sowell declares. The crucial question is not the fairness of its distribution but “whether society as a whole — or mankind as a whole — gains when the output of both the fortunate and the unfortunate is discouraged by disincentives.”

It is Sowell’s view that many black leaders have not served their constituencies but themselves. Instead of expressing concern over the decline of the black family, the increasing out-of-wedlock birth rate, the rise of inner-city crime — they speak only of “discrimination.” Instead of calling for an end to such government licensing laws as those that limit the number of taxicabs in cities such as New York and Philadelphia, they call for more government “make-work” jobs. 

While many blame all problems within the black community on the legacy of slavery, Sowell points to the fact that more black children lived in two-parent families during slavery, Reconstruction, and the years of segregation than at the present time. He writes that:

“In reality, most black children were raised in two-parent homes even during the era of slavery, and for generations after, blacks had higher rates of marriage than whites in the early 20th century, and higher rates of labor force participation in every census from 1890 to 1950. The real causes of the very different patterns among blacks in the world of today must be sought in the 20th century, not in the era before emancipation.”

Tom Sowell has been telling the hard truth for many years, and has received much abuse for doing so. He has been a strong advocate for a genuinely color-blind society, in which men and women would be judged on their individual merit, not on the basis of race. All Americans who believe in such a society, and believe that one’s views about economic, political, and other matters should be based on the facts as one sees them — not on race, religion, or ethnicity, as the promoters of today’s “identity politics” would have it — should recognize what a champion of freedom Sowell has been. We will miss his regular column, but hope he will continue to share his wisdom with us. It is certain that his intellectual legacy will grow, for it is based upon scholarship and a search for truth, not upon the changing needs of our political class for convenient and popular responses to the complex challenges we face. Sadly, there are too few such people among us. For a free society to thrive, we need more Thomas Sowells. We have been lucky indeed to have him with us.

Washington Once Again Shows Us That “Congressional Ethics” Is an Oxymoron

On the very first day of the new Congress, House Republicans met in secret. Their very first order of business was to vote to eliminate the quasi-independent office that investigates House ethics. Rep. Bob Goodlatte (R-VA) was the architect of the attack on the Office of Congressional Ethics, known as O.C.E. The rules change would have prevented the office from investigating potentially criminal allegations, allowed members of the House Ethics Committee to shut down any O.C.E. investigation, and silenced staff members in their dealings with the news media. 

The O.C.E. was created in 2008, after a series of bribery and corruption scandals involving members of both parties. Three House members were sent to jail. Among those joining Rep. Goodlatte in calling for the end of the O.C.E. were Rep. Blake Farenthold (R-TX), who had been investigated by the O.C.E. for sexual harassment; Rep. Peter Roskam (R-IL), who was investigated after he and his wife took a $24,000 trip to Taiwan that appeared to have been improperly paid for by the Taiwanese government; and Rep. Sam Graves (R-MO), who was ranking member of the House Committee on Small Business in 2009 when he invited expert testimony on the renewable fuel industry from a representative of a renewable fuels business in which his wife had a financial stake, a potential conflict of interest. Another advocate of ending the O.C.E. was Rep. Steve Pearce (R-NM), who last year tried to eliminate the entire O.C.E. budget after it investigated one of his staff members. Or consider Rep. Duncan Hunter (R-CA), another supporter of eliminating the O.C.E., who has used campaign funds for personal expenses, which is illegal. Among his reported expenditures: $1,400 for a dentist, $2,000 for a Thanksgiving trip to Italy, and $600 to take his children’s pet bunny on a commercial airplane. (After these expenses were exposed, he reimbursed his campaign $62,000.) The list of those supporting the elimination of the O.C.E. who have been the targets of investigation is not a short one.

President-elect Donald Trump quickly weighed in, questioning the priorities of Republican members of Congress. Shortly after, lawmakers were summoned to the basement of the Capitol for a meeting with Republican leaders. Rep. Kevin McCarthy (R-CA), the majority leader, asked his fellow Republicans whether they had campaigned to repeal the Affordable Care Act or to eliminate the ethics office. Soon afterward, the idea of eliminating the O.C.E. was scrapped.

This was not the first time that House lawmakers — Democrats and Republicans — had tried to curtail the powers or budget of the O.C.E. In 2011, Rep. Melvin Watt (D-NC), who later left Congress to join the Obama administration, tried to cut the agency’s budget by 40 percent, a proposal that failed on a 302-102 vote. The Republican effort, just after the election of Donald Trump, who promised to “drain the swamp” of Washington, was viewed as tone-deaf in the extreme. The vote to eliminate the O.C.E., noted The Economist, “showed those lawmakers to lack self-awareness to an amazing degree.” Rep. Walter Jones (R-NC) said:

“Mr. Trump campaigned that he was going to drain the swamp, and here we are on Day One trying to fill the swamp. . . . I just could not believe that the Congress does not understand that, if anything, we need to bring sunshine in.”

Many years ago, Mark Twain pointed out that Congress was our only “native-born criminal class.” The evidence in recent years would fill many pages. In 2009, Rep. William Jefferson (D-LA) was convicted of corruption charges in a case made famous by the $90,000 in bribe money stuffed into his freezer. Federal jurors found Jefferson guilty of using his congressional office as a criminal enterprise to enrich himself, soliciting and accepting hundreds of thousands of dollars in bribes to support his business ventures in Africa. While the Jefferson case is an extreme example of congressional corruption, his attorney’s defense that, in effect, “everyone does it,” is not as far-fetched as it may appear. Other members of Congress may not have $90,000 in their freezers, but too many are guilty of questionable activities.

Just as Jefferson’s trial began, we learned of Sen. John Ensign’s (R-NV) affair with an aide and the subsequent payments to her family by his parents. Also at that time, Rep. Charles Rangel (D-NY), then chairman of the House Ways and Means Committee, was the subject of several ethics investigations over matters ranging from his occupying four apartments at below-market rents in a Harlem building owned by a prominent real estate developer to his admission that he had neglected to pay some taxes by failing to report $75,000 in rental income earned from a beachfront villa in the Dominican Republic. The Wall Street Journal commented: “Ever notice that those who endorse high taxes and those who actually pay them aren’t the same people?”

There is, of course, the larger question of the ethical standards of the Congress, beyond activities that are clearly illegal. Members of Congress subsidize, in one form or another, a host of special interests — farmers, businessmen, Wall Street, universities, welfare recipients, labor unions — and each group has a special Political Action Committee (PAC) that contributes to members’ campaigns. Cuts in subsidies to these groups will provoke cuts in contributions. The result: every group gets what it wants, and the budget deficits skyrocket. Added to this business-as-usual subsidization are the bailouts of failed banks, Wall Street firms, and auto companies — turning traditional ideas of free enterprise on their head. This is the “crony capitalism” now embraced by both political parties. 

We have created in America a permanent political class that has an interest in ever-expanding government. The party out of power always says government is too big — but once it comes to power, it makes it even bigger. Republicans accuse Democrats of being supporters of “big government,” which is true enough, but government power has also grown dramatically under Richard Nixon, Ronald Reagan, George H. W. Bush and George W. Bush. When will voters finally understand that both of our political parties are co-conspirators in the growth of both government power and our huge deficits? This is something the Founding Fathers sought to prevent — and would have been sorry to see. But they wouldn’t have been surprised.

Thomas Jefferson, in a letter to Edward Carrington, observed that:

“The natural progress of things is for liberty to yield and government to gain ground. . . . One of the profoundest preferences in human nature is for satisfying one’s needs and desires with the least possible exertion; for appropriating wealth produced by the labor of others, rather than producing it by one’s own labor. . . . In other words, the stronger the government, the weaker the producer, the less consideration need be given him and the more might be taken away from him. A deep instinct of human nature being for these reasons in favor of strong government, nothing could be a more natural progress of things than for Liberty to yield and government to gain ground.”

It was because of their fear of governmental power that the Framers of the Constitution limited government through the Bill of Rights and divided its authority through our federal system. By establishing the executive, legislative, and judicial branches — and by dividing authority between the state and national governments — the Framers hoped to ensure that no branch of government would ever obtain so much power that it would be a threat to freedom. 

The kind of activist government we have now — involved in every aspect of people’s lives, even running an automobile company — is the opposite of what the Founding Fathers had in mind. From the beginning of history, the great philosophers predicted that democratic government would not long preserve freedom. Plato, Aristotle and, more recently, De Tocqueville, Lord Bryce and Macaulay predicted that men would give away their freedom voluntarily for what they perceived as greater security. French political philosopher Bertrand De Jouvenel noted that:

“The state, when once it is made the giver of protection and security, has but to urge the necessities of its protectorate and over-lordship to justify its encroachments.”

Voters say that they are against big government and oppose inflation and deficit spending, but when it comes to their own particular share, they act in a different way entirely. Walter Judd, who represented Minnesota in Congress for many years, once recalled that a Republican businessman from his district:

“. . . who normally decried deficit spending, berated me for voting against a bill which would have brought several million federal dollars into our city. My answer was, ‘Where do you think federal funds for Minneapolis come from? People in St. Paul?’. . . My years in public life have taught me that politicians and citizens alike invariably claim that government spending should be restrained, except where the restraints cut off federal dollars flowing into their cities, or their pocketbooks.”

If each group curbed its demands upon government, it would not be difficult to balance the budget and restore health to the economy. But as long as we allow politicians to solicit virtually unlimited amounts of money from those special interests with business before Congress, this is unlikely — and both parties are in it together. Human nature leads to the unfortunate situation in which, under representative government, people have learned that they can secure funds for themselves that have, in fact, been produced by the hard work of others.

This point was made more than 200 years ago by the British historian Alexander Tytler:

“A democracy cannot exist as a permanent form of government. It can only exist until the voters discover they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidate promising the most benefits from the public treasury — with the result that democracy collapses over a loose fiscal policy, always to be followed by dictatorship.”

The Founding Fathers never envisioned the creation of a permanent political class such as the one we have now. They believed that men would be farmers, businessmen, doctors, lawyers, teachers — and would devote several years of their lives to public service and then go home to their careers. Today, however, we have professional politicians — men and women who support their families by holding public office and intend to do so for many years. When they do leave public office, most do not go home; many remain in Washington as high-priced lobbyists. Their motivation, it seems, is whatever will keep them in office, not the long-run best interests of the country. Incumbents running for re-election in one-party districts raise millions from special interests that they do not need for their campaigns, and can keep the money when they leave Congress.

The incoming Trump administration promises to “drain the swamp.” It will be interesting to see how — and if — this proceeds. Still, we must keep in mind that Members of Congress respond to our demands. As long as we — whether individuals, farmers, Wall Street banks, or any other special interest — seek to be subsidized by government, and make subsidy the price politicians must pay for our support, the politicians of both parties will comply. In this sense, our own selfishness, as well as theirs, is the culprit. The term “congressional ethics” may indeed be an oxymoron. But the ethics of the rest of us may not be far behind. In a sense, then, we have the kind of government we deserve — one that indeed represents our values. A brazen effort to eliminate the independent ethics office by a secret vote of House Republicans shows us how far we have gone down this path.     *

Sunday, 22 January 2017 14:13

Ramblings

Allan C. Brownfeld


Why Did Fidel Castro, a Brutal Dictator, Attract So Much Western Support?

The death of Fidel Castro at the age of 90 marks the end of a long life spent inflicting a brutal, tyrannical regime upon the people of Cuba. People with AIDS were confined to sanitariums. Artists and writers were forced to join an official union and told that their work must support the Castro regime. In 1965, Castro admitted to holding 20,000 political prisoners. Foreign observers said the number was twice as high. The Castro regime carried out thousands of political executions.

Fidel Castro eliminated the celebration of Christmas. There were no elections and only a state-controlled press. Hundreds of thousands of Cubans simply left, most of them for the United States. Soon, Castro imposed restrictions, making it almost impossible to leave the country. In April 1980, he opened the port of Mariel to any Cuban wishing to leave. More than 125,000 people — branded as “worms” and “scum” by Castro — took advantage of the “boatlift” before it ended in October of that year. By 1994, economic conditions were so bad that riots in Havana were followed by another exodus. Thousands fled from the country’s beaches on makeshift rafts. 

Under Castro, Cubans lived mostly on black beans and rice. Once one of the richest countries in Latin America, Cuba sank into decay and poverty under his rule. Castro himself lived in luxury. His former bodyguard, Juan Sanchez, reports that Castro lived on a private island, Cayo Piedra, and liked to travel aboard a large yacht with Soviet-built engines, the Aquarama II.

After taking power, he turned on those of his former comrades who naively thought his “revolution” would bring democracy, not tyranny. One of them, Huber Matos, a long-time democratic opponent of the dictator Fulgencio Batista, protested against Castro’s increasing closeness with Moscow. After a show trial, including a 7-hour tirade of denunciation by Castro, Matos was jailed for 20 years — 16 in solitary confinement, during which he was repeatedly tortured.

Another victim was the poet Armando Valladares, originally a supporter of the revolution who, as Castro’s anti-democratic policies emerged, refused to put an “I’m With Fidel” sign on his office desk. He was charged with “terrorism” and sentenced to 30 years. He served 8,000 days — 22 years — often confined to cells so small he could not lie down.

First, Valladares was sent to the huge complex on Isla de Pinos, where 100 lbs. of foodstuffs each day were allotted to feed 6,000 prisoners. Ironically, during the Batista regime, Fidel Castro had been held in the same prison. But Valladares points out:

“. . . he had been allowed visitors, national and international news, uncensored books, unlimited correspondence, a conjugal pavilion, and any food he wanted. He had never been mistreated.”

In his widely read book Against All Hope, Valladares quotes from a letter written by Castro on April 4, 1955:

“I get sun several hours every afternoon. . . . I’m taking two baths a day now. . . . I’m going to have dinner. . . . spaghetti and squid, Italian chocolates for dessert, then fresh-brewed coffee. . . . Don’t you envy me? . . . What would Karl Marx say about such revolutionaries?”

Under Castro, prison was quite different. Valladares and his fellow inmates suffered repeated beatings at the hands of the guards and were isolated for long stretches of time. Often, they were taken to punishment cells where they were held naked, unwashed and unable to escape the stench and disease produced by their own accumulating wastes. The food was the equivalent of a near-starvation diet. Less than a pound was allotted for every fifty prisoners each day — and this included almost no protein or vitamins.

The level of medical care in the prisons was reminiscent of the Nazi death camps. After repeated beatings, Valladares was suffering excruciating pain in his leg. He writes that:

“The military doctor was a Communist who tried to look like Lenin, wearing the same kind of goatee. . . . He wore the uniform of a doctor but was a sadist. When I asked for medical care, he looked through the peephole, stared at my leg, and told me he hoped it turned into a good case of gangrene, ‘so I can come in myself and cut it off.’”

While Fidel Castro imposed a totalitarian regime upon the people of Cuba, he was, somehow, viewed in heroic terms by many Americans and others in the West, particularly intellectuals.

Author Norman Mailer, the pillar of many radical causes, declared:

“So Fidel Castro, I announce to the city of New York that you gave all of us who are alone in this country. . . . some sense that there were heroes in the world. . . . It was as if the ghost of Cortez had appeared in our century riding Zapata’s white horse. You were the first and greatest hero to appear in the world since the Second World War.”

Elizabeth Sutherland, book and arts editor of The Nation, wrote, “He (Castro) seems, first of all, utterly devoted to the welfare of his people — and his people are the poor, not the rich.” Author Jonathan Kozol declared, “Each of my two visits to Cuba was a pilgrimage and an adventure.” The writer Susan Sontag wrote that, “. . . it seems sometimes as if the whole country (Cuba) is high on some beneficial kind of speed, and has been for years.” Frank Mankiewicz, once an aide to Sen. George McGovern and later head of National Public Radio, visited Cuba with Kirby Jones and wrote a book lauding the revolution. He and Jones found Castro “one of the most charming and entertaining men either of us had ever met.”

Author Frances FitzGerald, originally a sympathizer with the Cuban revolution, observed that:

“Many North American radicals who visited Cuba or lived there have performed a kind of surgery on their critical faculties and reduced their conversation to a form of baby talk, in which everything is wonderful, including the elevator that does not work and the rows of Soviet tanks on military parade that are in ‘the hands of the people.’”

When Castro visited New York in 1995 to address the U.N., Mort Zuckerman, owner of The New York Daily News, hosted a reception for him at his penthouse on Fifth Avenue. Time magazine declared, “Fidel Takes Manhattan!” Newsweek called Castro “The Hottest Ticket in Manhattan.”

The adoration of Castro by Western intellectuals was hardly unique. They also embraced Stalin. In 1954, the French philosopher Jean-Paul Sartre returned from a visit to the Soviet Union and declared that Soviets did not travel, not because they were prevented from doing so, but because they had no desire to leave their wonderful country. “The Soviet citizens,” he declared, “criticize their government much more, and more effectively, than we do. There is total freedom of criticism in the Soviet Union.”

Even during Stalin’s purge trials, many Western intellectuals warmly embraced the brutal dictator. Playwright Lillian Hellman, for example, visited Moscow in October 1937 — at the height of the trials — and returned to sign an ad in the Communist publication New Masses that approved of them. She even supported the 1939 Soviet invasion of Finland. Discussing Stalin’s powers, the British writers Beatrice and Sidney Webb wrote:

“He (Stalin) has not even the extensive power which the Congress of the U.S. has temporarily conferred on President Roosevelt or that which the American Constitution entrusts for four years to every successive President. . . . Stalin is not a dictator. . . . he is the duly elected representative of one of the Moscow constituencies of the Supreme Soviet. . . ” (The Truth About Soviet Russia, 1942). 

The world’s reaction to Fidel Castro’s death gives little indication that a brutal dictator has died. Vladimir Putin called Castro “a wise and strong leader . . . an inspiring example for all the world’s peoples.” Narendra Modi, Prime Minister of India, called Castro “one of the most iconic personalities of the 20th century.” Bashar Al-Assad of Syria, himself a brutal dictator, called Castro “a great leader.” Even Canadian Prime Minister Justin Trudeau referred to him as “a remarkable leader.”

It is important that the world recognize Fidel Castro’s real legacy. Yale historian Carlos Eire portrays that legacy in these terms:

“He turned Cuba into a colony of the Soviet Union and nearly caused a nuclear holocaust. He sponsored terrorism wherever he could and allied himself with many of the worst dictators on Earth. He was responsible for so many thousands of executions and disappearances in Cuba that a precise number is hard to reckon. He brooked no dissent and built concentration camps and prisons at an unprecedented rate, filling them to capacity, incarcerating a higher percentage of his own people than most other modern dictators, including Stalin. . . . He persecuted gay people and tried to eradicate religion. He censored all means of expression and communication. . . . He created a two-tier health-care system, with inferior medical care for the majority of Cubans and superior care for himself and his oligarchy. . . .”

Why Fidel Castro attracted admirers in our own society is part of the larger question of why Stalin and Communism itself appealed to men and women who seemed indifferent to Communism’s rejection of free speech, free elections, a free press, freedom of religion, and freedom of movement, and to its contempt for the rights of racial, religious, ethnic, and sexual minorities. Now that Fidel Castro is dead, perhaps those who admired him will take a closer look at his legacy, one which the suffering Cuban people will, hopefully, overcome and move beyond to a better, freer, and more prosperous future.

By Opposing Charter Schools, the NAACP Would Harm the Black Students Whose Interest It Claims to Support

At its national convention in July, the NAACP approved a resolution calling for a moratorium on the expansion of privately managed charter schools. The Movement For Black Lives, a network of Black Lives Matter organizers, also passed resolutions criticizing charter schools and calling for a moratorium on their growth. The NAACP went so far as to liken the expansion of charters to “predatory lending practices” that put low-income communities at risk.

Charter schools provide parents with an opportunity for school choice and give inner city parents an opportunity to remove their children from poorly performing schools. Several studies by Stanford University’s Center for Research on Education Outcomes found that students enrolled in charter schools in 41 of the nation’s urban regions learned significantly more than their traditional public school counterparts. According to one study, charter school students received the equivalent of 40 days of additional learning a year in reading. Educational gains for charter school students turned out to be significantly larger for black, Hispanic, low-income, and special education students in both math and reading.

Although some charter schools have been poorly run, a performance advantage has been found to be particularly significant in the San Francisco Bay Area, Boston, Washington, D.C., Memphis, and Newark. There is heavy demand for more charter schools among low-income black and Latino families who are often trapped in failing school districts.

Where charter schools are doing well, demand for admission is high. In New York City, charter schools enroll about 107,000 students, roughly 10 percent of the city’s total enrollment. But more than 44,000 students who sought admission for the current school year were turned away. In Harlem and the South Bronx there are now four applicants for every charter school seat.

The Black Alliance for Educational Options (BAEO) and the National Alliance for Public Charter Schools have launched a campaign to tell the story of why more than 700,000 African American families have chosen charter schools. More than 160 African American advocates and community leaders have urged the NAACP to reconsider and learn more about how charter schools are helping black families.

In a letter, they declare:

“A blanket moratorium on charter schools would limit black students’ access to some of the best schools in America and deny black parents the opportunity to make decisions about what’s best for their children. Instead of enforcing a moratorium, let’s work together to improve low-achieving public schools and expand those that are performing well.”

One of the letter’s signers was Cheryl Brown Henderson, daughter of Oliver Brown, plaintiff in Brown v. Board of Education, and founding president and CEO of the Brown Foundation for Educational Equity, Excellence and Research. Brown Henderson said:

“Over 60 years ago my father joined with numerous parents to stand with the NAACP and fight for African American students stuck in a separate, broken education system. Brown v. Board of Education created better public education options for African American students and made it the law of the land that neither skin color, socioeconomic status, nor geography should determine the quality of education a child receives.”

In opposing charter schools and real school choice, the NAACP is flying in the face of the views of black parents. There are now more than 6,600 charter schools across the nation, educating nearly three million children. Black students account for 17 percent of charter school enrollment nationally. The American Federation for Children national school choice survey, conducted in January 2016, found that 76 percent of African Americans support school choice. Polling by BAEO also found that the majority of black voters surveyed supported charter schools.

In September, more than 25,000 parents, students, and educators attended a rally in Brooklyn’s Prospect Park calling for an expansion of the city’s charter schools. Among the speakers was actor and hip-hop artist Common, who said, “I’m here to tell you that you participating and being a part of charter school success stories is your path to possibility.” In New York City, black charter school students were 60 percent more likely than their public school counterparts to earn a seat in one of the city’s specialized high schools.

The idea of choice in education, which includes a voucher system and charter schools, is attracting both liberal and conservative support. Opposition comes primarily from teacher unions. Last summer, speaking to the National Education Association, Hillary Clinton declared: “When schools get it right, whether they’re traditional public schools or public charter schools, let’s figure out what’s working and share it with schools across America.” For this statement, Clinton was booed by NEA members.

Editorially, The Washington Post declared:

“The reaction speaks volumes about labor’s uninformed and self-interested opposition to charter schools and contempt for what’s best for children. . . . Since the first charter school opened 25 years ago in Minnesota, support for the non-traditional schools has grown with nearly 3 million students in more than 6,700 charters in 42 states and the District. Demand is high with parents of school-age children — particularly those who have low incomes — overwhelmingly saying they favor the opening of more charter schools. . . . We urge the NAACP leadership to put the interests of African American children ahead of the interests of political allies who help finance the group’s activities. . . .”

Wall Street Journal columnist Jason Riley, who is black, says of the NAACP:

“The organization would rather deny black children good schools than risk losing money from teacher unions. The organization’s primary concern today is self-preservation and maintaining its own relevance, not meeting the 21st century needs of the black underclass.”

While some charter schools have had problems, as in Detroit, where charters are not outperforming the traditional school alternative, the real reason for opposition by teacher unions, the NEA, and the NAACP is that charter schools threaten the union monopoly on education. Most charters are non-union, and their growth comes at the expense of poorly performing union-run public schools.

The NAACP’s opposition to charter schools seems to have nothing to do with the well-being of black students. According to the Labor Department, unions have given the NAACP and its affiliates at least $3 million since 2010. The two major national teacher unions, the American Federation of Teachers (AFT) and the National Education Association (NEA), gave the NAACP $265,000 last year, after significantly increasing their contributions between 2010 and 2014.

Teacher unions have also been giving financial aid to the Congressional Black Caucus (CBC), which influences groups such as the NAACP. According to the Labor Department, the AFT and NEA have given the CBC Foundation and CBC Institute $911,000 since 2010. Open Secrets campaign donation data shows that the AFT and NEA have given CBC members $253,000 and $206,000 respectively, this cycle.

It is ironic to see civil rights groups oppose charter schools even though black Americans are learning more at charters than at traditional public schools. In Boston, for example, students in charter middle schools outperform those in traditional public schools by two to three years worth of learning in math and about half that in reading. The Black Alliance for Educational Options, a pro-charter civil rights group, calls the NAACP resolution “inexplicable,” and urges the NAACP board to reject it.

Competition in education, which gives parents a right to choose where their children should be educated, is something all Americans should support. It is particularly important for low-income families in minority communities, who are often consigned to poorly performing schools. Conservatives have long embraced the idea of free choice in the form of a voucher system and of charter schools. Observing the success of such schools, many liberal voices, such as The New York Times and The Washington Post, have joined them. The NAACP should rethink its position if it hopes to remain relevant to the needs of the constituency it seeks to represent.

The Latest Target of Political Correctness on Campus: America’s “Melting Pot” Tradition

Strange things are happening in the name of political correctness at colleges and universities across the country.

California State University at Los Angeles (CSULA) has debuted segregated housing available to students who “identify as Black/African-Americans.” The Halisi Scholars Black Living Learning Community has opened approximately nine months after the CSULA Black Student Union issued a list of demands including “black student only” living space with a “full time resident director who can cater to the needs of black students.” Racially segregated housing can also be found at other universities, including University of California branches at Davis and Berkeley and the University of Connecticut.

A student at the University of Houston was punished for tweeting “All Lives Matter” after the shooting of five policemen in Dallas. The university’s student government sentenced the offending student to undergo mandatory diversity training. At Princeton, the word “man” is considered sexist; employees were told to use gender-neutral terms such as “human beings.” At the University of Iowa, a clinical professor of pediatrics wrote to the athletic director expressing dismay over the ferocious facial expressions of Herky the Hawk, the mascot of the Hawkeyes, criticizing him for conveying an “invitation to act aggressively and even violence” and for lacking “emotional diversity.” We could fill pages with similar examples.

Last year, University of California administrators released a document warning professors not to describe America as a “melting pot” because this unduly pressured minorities to “assimilate to the dominant culture.” This is an assault on the very important history of our country embracing men and women of every race, religion, and ethnic background, and making them into Americans.

When the melting pot philosophy was alive and well, our society succeeded dramatically. Immigrants from around the world entered an America that had self-confidence and believed in its own culture, history, and values and was determined to transmit them to the newcomers. And the immigrants wanted to become Americans. That, after all, is why they came.

Remembering the way American public schools served to bring children of immigrants into the mainstream, Fotine Z. Nicholas, who taught for 30 years in New York City Schools and wrote an education column for a Greek-American weekly, noted:

“I recall with nostalgia the way things used to be. At P.S. 82 in Manhattan, 90 percent of the students had European-born parents. Our teachers were mostly of Irish origin, and they tried hard to homogenize us. We might refer to ourselves as Czech, or Hungarian, or Greek, but we developed a sense of pride in being American. . . . There were two unifying factors, the attitude of our teachers and the English language. . . . After we started school, we spoke only English to our siblings, our classmates, and our friends. We studied and wrote in English, we played in English, we thought in English.”

America is indeed a nation of immigrants. Speaking in Philadelphia in 1776, Samuel Adams declared:

“Driven from every other corner of the earth, freedom of thought and the right of private judgment in matters of conscience direct their course in this happy country as their last resort.”

Those who think that the idea of the “melting pot” is, somehow, demeaning to those who come to our country as immigrants fail to understand the reality of what has happened in America during the past centuries. In his now famous letter to the Jewish congregation in Newport, Rhode Island in 1790, George Washington wrote:

“The Citizens of the United States of America have a right to applaud themselves for having given to mankind examples of an enlarged and liberal policy: a policy worthy of imitation. All possess alike liberty of conscience and immunities of citizenship. . . . For happily the Government of the United States, which gives to bigotry no sanction, to persecution no assistance, requires only that they who live under its protection should demean themselves as good citizens giving it on all occasions their effectual support.”

The man who coined the term “melting pot” was the British author Israel Zangwill. In a now famous passage, written in 1908, he wrote:

“America is God’s Crucible, the Great Melting Pot, where all the races of Europe are melting and reforming. Here you stand, good folk, think I, when I see them at Ellis Island, here you stand in your fifty groups and your fifty languages and histories and your fifty blood-hatreds and rivalries. But you won’t long be like that, brothers, for these are the fires of God you’ve come to — these are the fires of God. A fig for your feuds and vendettas, Germans and Frenchmen, Irishmen and English, Jews and Russians, into the crucible with you all. God is making the American.”

America has been a nation much loved. Germans have loved Germany. Frenchmen have loved France. Swedes have loved Sweden. This, of course, is only natural. Yet, America is not simply another country. To think that it is — is to miss the point of our history. America has been beloved not only by Americans, but by men and women throughout the world who have yearned for freedom. By the millions they have come and found here the opportunities that existed in no other place.

America dreamed a bigger dream than any other nation in history. It was a dream of a free society in which a person’s race, religion, or ethnic origin would be completely beside the point. It was a dream of a common nationality in which the only price to be paid was a commitment to fulfill the responsibilities of citizenship. In the 1840s, Herman Melville wrote:

“We are the heirs of all time and with all nations we divide our inheritance. On this Western Hemisphere all tribes and peoples are forming into one federated whole, and there is a future which shall see the estranged children of Adam restored as to the old hearthstone in Eden. The seed is sown and the harvest must come.”

America has been a new thing in the world, not without problems and challenges, which afflict any human enterprise, and which persist today. Yet, it remains a beacon for men and women in search of freedom in every corner of the world. When the enforcers of political correctness seek to proscribe the “melting pot” from our history, we can only lament that those in charge of some of our colleges and universities understand so little of the American story. How will the new generation learn that story at universities like these? That should be a question that concerns us all.

“Cultural Appropriation”: A Growing Political Correctness Tactic to Silence Free Expression

In the name of something called “cultural appropriation,” a growing assault upon free expression is now under way as “political correctness” expands its horizons.

This attack takes many forms. After the 2013 American Music Awards, Katy Perry was criticized for dressing like a geisha while performing her hit single, “Unconditionally.” Arab-American writer Randa Jarrar accused a Caucasian woman who practiced belly-dancing of “white appropriation of Eastern dance.” Daily Beast entertainment writer Amy Zimmerman wrote that pop star Iggy Azalea perpetuated “cultural crimes” by imitating African-American rap styles. At Oberlin College, students protested a “piratization” of Japanese culture when sushi was served in the school dining hall.

In 2015, the Museum of Fine Arts in Boston was charged with cultural insensitivity and racism for its “Kimono Wednesdays.” At the event, visitors were invited to try on a replica of the kimono worn by Claude Monet’s wife Camille in the painting “La Japonaise.” The historically accurate kimonos were made in Japan for that very purpose. Still, Asian-American activists and their supporters surrounded the exhibit with signs like, “Try on the kimono: learn what it’s like to be a racist imperialist today.” Others attacked “Yellow-face@the MFA” on Facebook. The museum eventually apologized and changed the program so that the kimonos were available for viewing only. Still, activists complained that the display invited a “creepy Orientalist gaze.”

At an Australian writers festival in Brisbane in September, American author Lionel Shriver stirred much attention by criticizing, as runaway political correctness, efforts to ban references to ethnicity, gender, or sexual orientation from Halloween celebrations, or to prevent artists from drawing on ethnic sources for their work. Ms. Shriver, the author of 13 books, was especially critical of efforts to stop novelists from “cultural appropriation.” She deplored the criticism of authors like Chris Cleave, an Englishman, for presuming to write from the point of view of a Nigerian girl in his best-selling book Little Bee.

Ms. Shriver noted that she had been criticized for using in The Mandibles the character of a black woman with Alzheimer’s disease, who is kept on a leash by her homeless white husband. And she defended her right to depict members of minority groups in any situation if it served her artistic purposes. “Otherwise, all I could write about would be smart-alecky 59-year-old 5-foot-2-inch white women from North Carolina,” she said.

Writing in The New York Times after the meeting in Australia, Ms. Shriver criticized her fellow liberals for embracing cultural conformity:

“Do we really want every intellectual conversation to be scrupulously cleansed of any whiff of controversy? Will people, so worried about inadvertently giving offense, avoid those with different backgrounds altogether? Is that the kind of fiction we want — in which the novels of white writers all depict John Cheever’s homogeneous Connecticut suburbs of the 1950s, while the real world outside their covers becomes ever more diverse? . . . Protecting freedom of speech involves protecting the voices of people with whom you may violently disagree. In my youth, liberals would defend the right of neo-Nazis to march down Main Street. I cannot imagine anyone on the left making that case today.”

Professor Susan Scafidi of the Fordham University Law School notes that:

“Taking intellectual property, traditional knowledge, cultural expressions, or artifacts from someone else’s culture without permission is the definition [of] cultural appropriation.”

Writing in The New York Review of Books, novelist Francine Prose asks:

“Should Harriet Beecher Stowe have been discouraged from including black characters in Uncle Tom’s Cabin — a book that helped persuade the audience of the evils of slavery? Should Mark Twain have left Jim out of Huckleberry Finn, a novel that, more fully than any historical account, allows modern readers to begin to understand what it was like to live in a slave-owning society? Should someone have talked Kazuo Ishiguro out of writing The Remains of the Day, the beautiful novel whose protagonist — a white butler in England before World War II — presumably shares few surface similarities with his creator? Should immigrant writers and writers of color be restricted to portraying their own communities?”

Francine Prose asks questions which today’s cultural police seem never to have considered:

“What would modern art be like if the impressionists and later Van Gogh had not been so profoundly affected by Japanese woodblock prints, or if Picasso and Braque had not been drawn to the beauty and sophistication of African art? Should Roberto Bolaño, a Chilean who lived mostly in Mexico, not have focused, in the third section of his novel 2666, on an African-American journalist, or set the novel’s final chapters in Europe during World War II? Don’t we want different cultures to enrich one another? Reading Chekhov, we are amazed by his range, by his ability to see the world through the eyes of the rich and the poor, men and women, the old and the young, city dwellers and peasants. But had he caved to the pressures of identity politics and only described characters of his own gender and class, few of his six hundred or so stories would have been written.”

Another author, Cathy Young, provides this assessment:

“Welcome to the new war on culture. At one time such critiques were leveled against truly offensive art — work that trafficked in demeaning caricatures, such as blackface, 19th century minstrel shows, or ethnological expositions which literally put indigenous people on display, often in cages. But these accusations have become a common attack against any artist or artwork, no matter how thoughtfully or artfully presented. A work can reinvent the material or even serve as a tribute, but no matter. If artists dabble outside their own cultural experience, they’ve committed a creative sin.”

The protests being launched by the militant advocates of political correctness, in Young’s view, have the potential not only to chill creativity and artistic expression but also to harm diversity:

“This raises the troubling specter of cultural cleansing: when we attack people for stepping outside their own cultural experiences, we hinder our ability to develop empathy and cross-cultural understanding. What will be declared ‘problematic’ next? Picasso’s and Matisse’s works inspired by African art? Puccini’s ‘Orientalist’ operas, ‘Madame Butterfly’ and ‘Turandot’? Should we rid our homes of Japanese prints? . . . Can Catholics claim appropriation when religious paintings of Jesus or the Virgin Mary are exhibited in a secular context, or when movies from ‘The Sound of Music’ to ‘Sister Act’ use nuns for entertainment? . . . Appropriation is not a crime. It’s a way to breathe new life into culture. People have borrowed, adopted, taken, infiltrated or reinvented from time immemorial. . . . Russian culture with its Slavic roots is also the product of Greek, Nordic, Tatar and Mongol influences. America is the ultimate blended culture.”

Actor and playwright J. B. Alexander points out that:

“William Shakespeare never personally felt the sting of racism, yet he wrote the character of Othello. He was never subjected to anti-Semitism, yet he wrote the character of Shylock. Nor was he ever a female adolescent, yet he wrote the character of Juliet. And we are all the richer for it. Artists must be free to create characters that lie within the scope of their imaginations, not merely to replicate their own identities, because great art allows us to transcend those identities and recognize our common humanity.”

If the crusade against “cultural appropriation” continues, we may reach a point where only Jews can read the Bible, only Greeks can read Plato or Aristotle, and only Italians can read Dante or Machiavelli. Where will it end? Can only those of British descent appreciate Shakespeare, or only those of Russian descent read Tolstoy or Dostoevsky?

More than 100 years ago, the distinguished black intellectual W. E. B. Du Bois understood that art and culture, whatever the source, are relevant to men and women of all backgrounds. He declared:

“I sit with Shakespeare and he winces not. Across the color line I walk arm in arm with Balzac and Dumas, where smiling men and welcoming women glide in gilded halls. From out of the caves of evening that swing between the strong-limbed earth and the tracery of the stars, I summon Aristotle and Aurelius and what soul I will, and they come all graciously, with no scorn or condescension. So, wed with truth, I dwell above the veil.”     *

Friday, 04 November 2016 14:21

Ramblings

Allan C. Brownfeld


The 2016 Election Campaign Shows the Dramatic Decline in American Politics

The 2016 presidential campaign looms large, and there is nothing positive to be said about it. The two candidates, Donald Trump and Hillary Clinton, are viewed in negative terms by the majority of voters. Both are widely regarded as untrustworthy, a strange distinction for presidential candidates.

In preparation for a trip to visit my four grandchildren, I went through old copies of Cobblestone, a children’s history magazine. I asked my oldest grandson, who will soon be ten, which issues I should bring with me. These were magazines his father read, together with his aunt and uncle. He selected those that featured Thomas Jefferson, James Madison, Abraham Lincoln and Benjamin Franklin.

This immediately brought to mind the contrast between the men who involved themselves in public life in the colonial era and those who are offering themselves as leaders at the present time. Washington, Jefferson, Madison and others, in revolting against the most powerful empire in the world, risked everything they had, including their lives. If they did not succeed, which seemed likely, their families would have been left destitute. Contrast that with Hillary Clinton, who has spent her career in public life, and made millions of dollars as a result. The product she sold was influence. The Clinton Foundation, now known as the Bill, Hillary and Chelsea Clinton Foundation, accepted millions of dollars from at least seven foreign governments while Mrs. Clinton served as Secretary of State. The Foundation has admitted that a $500,000 donation it received from Algeria violated a 2008 ethics agreement between the foundation and the Obama administration. Or Donald Trump, whose business career is littered with bankruptcies, lawsuits, and charges of fraud, and whose campaign has insulted his opponents, made fun of those with disabilities, and flirted with racism. His experience in government is non-existent and his understanding of international affairs seems limited, at best. He has suggested that NATO is irrelevant, seems prepared to have Japan and South Korea pursue nuclear weapons, and seems sympathetic to Vladimir Putin. He has endorsed torture and the murder of innocent relatives of those involved in terrorism.

The authors of the Constitution carefully studied the history of Athens and the Roman Republic and created a system of limited government and checks and balances that they hoped would prevent a descent into the tyranny that ended those early democracies of the ancient world. The Constitution they wrote is now the oldest written national constitution still in force, which tells us a great deal about the difficulty men and women have had in establishing governmental systems that provide for free speech, free elections, and individual rights. In the lifetime of many of us, countries that are now functioning democracies — Germany, Italy, Japan, Poland, Hungary, and many others — were totalitarian states. Democracy is a difficult and easily threatened way to organize a society. When economies fail and times become difficult, demagogues are waiting in the wings. Hitler and Mussolini are the best-known examples, but there are many others. In history, tyranny is, sadly, not the exception. It has been the rule. Our own constitutional system is a rare exception. But, like any human enterprise, it is fragile. Many predicted that it would not survive in the long run.

In On Power, the French political philosopher Bertrand De Jouvenel points out that we frequently say, “Liberty is the most precious of all goods” without noticing what this concept implies. He writes:

“A good thing which is of great price is not one of the primary necessities. Water costs nothing at all, and bread very little. What costs much is something like a Rembrandt, which though its price is above rubies, is wanted by very few people and by none who have not, as it happens, a sufficiency of bread and water. Precious things, therefore, are really desired by but few human beings and not even by them until their primary needs have been amply provided. It is from this point of view that Liberty needs to be looked at — the will to be free is in time of danger extinguished and revives again when once the need of security has received satisfaction. Liberty is in fact only a secondary need, the primary need is security.”

From the beginning of history, the great philosophers predicted that democratic government would produce this result. Plato, Aristotle, and more recently, De Tocqueville, Lord Bryce and Macaulay, predicted that people would give away their freedom voluntarily for what they perceived as greater security. De Jouvenel concludes:

“The state when once it is made the giver of protection and security, has but to urge the necessities of its protectorate and overlordship to justify its encroachments.”

In a similar vein, Thomas Babington Macaulay, the British historian, lamented in 1857 in a letter to Henry Randall, an American, that:

“I have long been convinced that institutions purely democratic must, sooner or later, destroy liberty or civilization or both. In Europe, where the population is dense, the effect of such institutions would be almost instantaneous. . . . Either the poor would plunder the rich, and civilization would perish; or order and prosperity would be saved by a strong military government and liberty would perish.”

Macaulay, looking to America, declared that:

“Either some Caesar or Napoleon will seize the reins of government with a strong hand; or your republic will be as fearfully plundered and laid waste by barbarians in the 20th century as the Roman Empire was in the Fifth — with this difference — that your Huns and Vandals will have been engendered within your own country by your institutions.”

Nearly 200 years ago, the British historian Alexander Fraser Tytler declared that:

“A democracy cannot exist as a permanent form of government. It can only exist until the voters discover that they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidates promising the most benefits from the public treasury — with the result that democracy collapses over a loose fiscal policy, always to be followed by a dictatorship.”

In the colonial era, the best men in the American society were engaged in public life. They had little to gain personally for their efforts, and much to lose. It has been said that the American society is rare in history, for its Golden Age was at the beginning. Our system has evolved to become one in which to engage in political life means to be on an endless quest for funds from special interest groups which, in turn, determine policy. Politicians argue that taking millions of dollars from Wall Street has nothing whatever to do with bailing out failing banks with taxpayer funds. Does anyone really believe this? Would Jefferson or Washington or Adams have entered public life if it involved endless fund-raising and subservience to those who contribute?

John Adams observed that: “Democracy never lasts long. It soon wastes, exhausts, and murders itself. There never was a democracy that did not commit suicide.” As he left the Constitutional Convention, Benjamin Franklin was asked what form of government had been created. He replied, “A republic, if you can keep it.”

The Founding Fathers would be disappointed in the dramatic decline in American politics, but they would not be surprised. They feared it would happen. It is now time for America’s elder statesmen — of both parties — to speak up and decry any politics that divides the American people on the basis of race, ethnicity, religion, or gender. As the late Rep. Shirley Chisholm (D-NY) used to say, “We came over on different ships, but we’re in the same boat now.” And that boat is now in increasingly troubled waters.

Growth of Executive Power Has Exploded Under President Obama — Altering Our System of Checks and Balances

Most Americans learned in government and civics classes that we live under a constitutional system of checks and balances. The elected representatives of the people in Congress pass the laws and the executive carries them out. This has not been our reality for some time, and both parties are responsible for the growth of executive power and the decline of the Congress. President Obama, The New York Times notes, “has been one of the most prolific authors of major regulations in presidential history.”

In its first seven years, the Obama administration finalized 500 major regulations — rules that were never passed by Congress. These were classified by the Congressional Budget Office as having particularly significant economic or social impacts. That was nearly 50 percent more than the George W. Bush administration issued during the comparable period, according to data kept by the Regulatory Studies Center at George Washington University.

In recent years, whichever party has been in power, the power of the president and his willingness to issue executive orders rather than going to Congress for legislation has grown. Under President George W. Bush, what some called a new “Imperial Presidency” was said to have emerged. In The Cult of the Presidency, the Cato Institute’s Gene Healy noted that the administration’s broad assertion of executive power included:

“. . . the power to launch wars at will, to tap phones and read e-mail without a warrant, and to seize American citizens, and hold them for the duration of the war on terror — in other words perhaps forever.”

Healy points out that:

“Neither Left nor Right sees the president as the Framers saw him: a constitutionally constrained chief executive with an important, but limited job: to defend the country when attacked, check Congress when it violates the Constitution, enforce the law — and little else. Today, for conservatives as well as liberals, it is the president’s job to protect us from harm, to “grow the economy,” to spread democracy and American ideals abroad, and even to heal spiritual malaise.”

In 2014, President Obama vowed,

“I’ve got a pen and I’ve got a phone — and I can use that pen to sign executive orders and take executive actions and administrative action that move the ball forward.”

This notion of executive power has little to do with the Constitution. It is a formula for the rule of one individual, and has had appeal to those in the White House, of both parties.

In one celebrated case, President Obama issued an executive order that would have allowed millions of undocumented immigrants to remain in the country and work legally. This was challenged in the courts by 26 states, which argued that the president did not have the authority to issue an order determining how the immigration laws should be enforced in the case of millions of people, and that the issue should be left to Congress. The lower courts agreed. The Supreme Court, in the case of United States v. Texas, split 4-4 on the question. As a result, the lower court ruling blocking the president’s executive order was upheld. Such a judicial check, of course, is very rare. Most executive orders are quietly implemented with little discussion or debate.

The decline of the Congress and the growth of executive power are clearest when it comes to the war-making power. The war in Iraq was not declared by Congress, nor were the wars in Korea, Vietnam, Panama, Haiti, Grenada, or Somalia. In recent years, Congress has relinquished more authority than ever before over the nation’s foreign policy.

The Constitution reserves to Congress alone the power to declare war, despite its naming the president as commander-in-chief of the armed forces. In Federalist No. 69, Alexander Hamilton notes that the president’s authority:

“. . . would be nominally the same with that of the King of Great Britain, but in substance much inferior to it. . . . While that of the British King extends to the declaring of war and to the raising and regulating of fleets and armies, all of which, by the Constitution under consideration appertain to the legislature.”

In its decision in the case of Perkins v. Rogers, the court declared:

“The war making power is, by the Constitution, vested in Congress and . . . the President has no power to declare war or conclude peace except as he may be empowered by Congress.”

In Presidential War Power, Louis Fisher, a senior specialist in separation of powers at the Library of Congress, states:

“From 1789 to 1950, Congress either declared or authorized all major wars. Members of Congress understood that the Constitution vests in Congress, not the president, the decision to take the country from a state of peace to a state of war. The last half-century has witnessed presidential wars, including President Truman going to war against North Korea and President Clinton using military force against Yugoslavia, with neither president seeking authority from Congress.”

Indeed, in more than 200 years and more than 100 U.S. military engagements, Congress has formally declared only five wars — the War of 1812, the Mexican-American War (1846), the Spanish-American War (1898), World War I (1917), and World War II (1941).

The office of president that we now observe is far different from what the Framers of the Constitution had in mind. According to the Cato Institute:

“The constitutional presidency, as the Framers conceived it, was designed to stand against the popular will as often as not, with the president wielding the veto power to restrain Congress when it transgressed its constitutional bounds. In contrast the modern president considers himself a tribune of the people, promising transformative action and demanding the power to carry it out.”

The result is what political scientist Theodore J. Lowi has called:

“. . . the plebiscitary presidency . . . an office of tremendous personal power drawn from the people. . . and based on the New Democratic theory that the presidency with all powers is the necessary condition for governing a large democratic nation.”

The growth of executive power is a real threat to the system of government established by the Constitution. Both parties are co-conspirators in expanding the power of the president. Even early in the Republic’s history, perceptive observers such as John C. Calhoun, in his Disquisition on Government, predicted that the powers of government would inevitably grow: those in power would always advocate a “broad” use of power, those out of power would always argue for a “narrow” use, and no one would ever turn back governmental authority once it had been assumed.

The scope of federal regulation has continued to grow and there is little reason to believe that this trend will not continue. Presidents, both Democrats and Republicans, have asserted greater power in recent decades to dictate the shape of regulations while Congress has become less specific in its instructions, in effect, abdicating its own authority. When she was a Harvard law professor, Elena Kagan, now a Supreme Court justice, said, “We live in an era of presidential administration.” Professor Robert Hahn of Oxford says:

“The big issue that I grapple with is that the regulatory state keeps growing. And as it keeps growing, when does it become too much?”

Whether our new president is Donald Trump or Hillary Clinton, major opposition to his or her campaign promises will be found in Congress. To bypass Congress, the new president will have the legacy of presidents such as George W. Bush and Barack Obama, who treated executive orders as an easy way to avoid the legislative process. This is not the system our Constitution established, but it is the one we seem to have now, and that is not good news for those who believe in the checks and balances and division of powers that the Constitution created.

Looking at Race Relations Beyond the Overheated Rhetoric in the Political Arena

In the wake of police shootings in Baton Rouge and St. Paul and the murder of five police officers in Dallas by a shooter who said his goal was killing white policemen, there has been increasing focus upon and discussion of race relations. The overheated rhetoric in the political arena, on all sides, obscures a more complex reality.

According to a New York Times/CBS News poll conducted in July, 69 percent of Americans say race relations are generally bad. Six in ten Americans say race relations are growing worse, up from 38 percent a year ago. Relations between black Americans and the police have become so brittle that more than half of black people say they were not surprised by the attack that killed five police officers and wounded nine others in Dallas. Nearly half of white Americans say that they, too, were unsurprised by the episode.

While the particulars of recent killings of black men by white police officers are subject to differing analyses — in a number of cases police officers have been cleared of any wrongdoing by federal and state authorities — there can be no doubt that a real problem exists. Recent events caused Sen. Tim Scott (R-SC), a conservative black Republican, to tell his own story.

The first black senator elected from the South since Reconstruction, Scott reports many run-ins with police officers over the course of his life. He recalls drawing the suspicion of a Capitol Police officer last year who insisted on seeing identification even though Scott was wearing the distinctive lapel pin worn by senators. “The officer looked at me, full of attitude, and said, ‘The pin I know, you I don’t. Show me your ID,’” Scott recalled. “I was thinking to myself, either he thinks I’m committing a crime — impersonating a member of Congress — or what?”

Sen. Scott states:

“While I thank God I have not endured bodily harm, I have, however, felt the pressure applied by the scales of justice when they are slanted. I have felt the anger, the frustration, the sadness and the humiliation that comes with feeling like you are being targeted for nothing more than being yourself. . . . The vast majority of time, I was pulled over for nothing more than driving a new car in the wrong neighborhood, or some reason just as trivial. Imagine the frustration, the irritation, the sense of a loss of dignity that accompanies each of those stops.”

When it comes to the use of lethal force by police, it has been widely claimed that blacks are more often the victims. Here, the rhetoric and reality seem at odds. A new study confirms that black men and women are treated differently at the hands of law enforcement: they are more likely to be touched, handcuffed, pushed to the ground, or pepper-sprayed by a police officer, even after accounting for how, where, and when they encounter the police. But when it comes to the most lethal force — police shootings — the study finds no racial bias.

“It is the most surprising result of my career,” said Roland G. Fryer, Jr., the author of the study and a professor of economics at Harvard. The study examined more than 1,000 shootings in 10 major police departments in Texas, Florida, and California.

The results of this study by Dr. Fryer, who is black, contradict the image of police shootings that many Americans hold. He said that anger after the deaths of Michael Brown, Freddie Gray, and others drove him to study the issue. “You know, protesting is not my thing, but data is my thing,” he said.

“So I decided that I was going to collect a bunch of data and try to understand what really is going on when it comes to racial differences in police use of force.”

The idea that race relations are approaching the divisiveness of the 1960s is hard to justify. President Obama states that:

“When we start suggesting that somehow there is this enormous polarization and we’re back to the situation in the 1960s — that’s just not true. You’re not seeing riots, and you’re not seeing police going after people who are protesting peacefully.”

The progress made by black Americans since the years of segregation is impressive. In 1950, only 13.7 percent of adult black Americans (25 and older) had completed high school or more; by 2014, this was 66.7 percent, according to the Department of Education. Over the same period, the number of African Americans with a bachelor’s degree or higher went from 2.2 percent to 22.8 percent. The black upper-middle class — defined as households with incomes of at least $100,000 — has grown from 2.8 percent of households in 1967 to 13 percent in 2014.

In every area of American life, blacks have been advancing dramatically. Black elected officials have made huge gains, reports the Joint Center for Political and Economic Studies. When the Voting Rights Act was passed in 1965, five African Americans served in the House and Senate; now there are 44 House members and two senators. Over a similar period, the number of black state legislators grew from about 200 to 700. We have, of course, elected our first black president — and then re-elected him.

Attitudes toward racial intermarriage have changed dramatically. NORC, an academic polling organization at the University of Chicago, periodically explores intermarriage in its surveys. One question asks whites and blacks whether they would favor or oppose a marriage of “a close relative” to a person of the other race. In 1990, only 5 percent of whites favored interracial marriage; 30 percent were neutral, and 65 percent opposed. By 2014, only 16 percent opposed. Blacks have been even more open to interracial marriage; since 2000, roughly 90 percent have either approved or not objected.

According to a recent study of U.S. Census data, residential segregation has been dramatically curtailed. The study, conducted by the Manhattan Institute and covering census results from thousands of neighborhoods, found that the nation’s cities are more racially integrated than at any time since 1910, and that all-white enclaves “are effectively extinct.” Prof. Reynolds Farley of the University of Michigan’s Population Studies Center says that:

“There is now much more black-white neighborhood integration than 40 years ago. Those of us who worked on segregation in the 1960s never anticipated such decline.”

Major disparities do exist between black and white Americans. Yet to argue that “white racism” is the cause of all such disparities is to overlook a larger reality. The fact that 70 percent of black births involve unmarried mothers has serious consequences. As Child Trends, a research group, puts it, “These children tend to face unstable living arrangements, life in poverty and . . . have low educational achievement.” When it comes to the large number of young black men killed in shootings, Wall Street Journal columnist Jason Riley, who is black, notes that:

“More than 95 percent of black shooting deaths don’t involve the police. . . . Sadly, rates of murder, rape, robbery, assault and other violent crimes are 7 to 10 times higher among blacks than among whites.”

It is such high rates of crime, in Riley’s view, “that obviously underlie tensions between poor minority communities and cops.”

Those of us old enough to have lived through the years of segregation remember an era of segregated schools, segregated bus and train stations, and “white” and “colored” restrooms and water fountains. (Visit the Pentagon and see the proliferation of restrooms constructed in the years when it was illegal in Virginia for men and women of different races to use the same facilities.) In many parts of the country blacks could not vote or sit on juries. Black travelers never knew when they would be able to stop for a meal. There was no pretense that racial equality of any kind existed.

Today, we live in an imperfect society, but one in which all citizens, regardless of race, have equal rights. It is against the law to discriminate on the basis of race. Men and women can go as far as their individual abilities can take them. Black Americans hold every conceivable position in our society — from CEO of a major corporation, to chief of police in a major city (such as Dallas), to university president, to governor, to attorney general, to President of the United States.

No one should pretend that problems of race do not exist. Compared to the distance we have come, however, these problems should be put in perspective. “The sky is falling” is not an appropriate posture for those on the left or the right, although in this political season, speaking before thinking is increasingly becoming the norm. Our reality is far more positive and hopeful than the political debate we are forced to endure would indicate.

Kaepernick’s Protest: A Look Back at the Patriotism of Black Americans in Difficult Times

Controversy is now swirling around the decision by San Francisco 49ers quarterback Colin Kaepernick not to stand with his teammates during the national anthem. He has been condemned by some and hailed by others for this action, which he said was to protest racism in the American society.

Whatever one thinks of Kaepernick, the controversy his protest has provoked provides us with an opportunity to review the long history of patriotism on the part of black Americans, even in the years when they faced severe discrimination. Many Americans, of all backgrounds, are largely unaware of this history.

Few understand the complex role blacks have played in the history of the United States. The first blood shed in the struggle for American independence was that of Crispus Attucks, the black man who led the group that precipitated the “Boston Massacre.” The first electric streetlights in a metropolitan area of New York City were installed under the supervision of a black man, Lewis H. Latimer, assistant and associate of Thomas A. Edison. The U.S. flag was first placed at the North Pole by a black explorer, Matthew A. Henson. The list goes on and on.

Black Americans, although they suffered the indignity of slavery and, after slavery, the legal barriers of segregation, have been committed patriots. In his important book, The Negro in the Making of America, Professor Benjamin Quarles, a distinguished black historian, points out that from the beginning, black Americans made one important decision: they would remain in America. From the time of the Revolutionary War, blacks had been advised — by many, black as well as white — to return to Africa. Instead, the decision to remain in America and be free was pervasive. (The Negro in the Making of America was published in 1964, when the term “Negro” was in common usage.)

At a black church meeting in Rochester, New York, in 1853, whose chairman was the noted orator Frederick Douglass, a statement was adopted which declared: “We ask that in our native land we shall not be treated as strangers, and worse than strangers.” The delegates officially rejected any move to abandon the United States and supported, instead, a proposal to establish a manual labor school that would teach the skilled trades.

Many efforts have been made by the enemies of the United States to enlist the support of black Americans, a group they viewed as likely to endorse calls for revolution because of its legitimate grievances.

To the Communist Party in the 1920s and 1930s, the black American was viewed as the prototype of the oppressed, exploited worker. During a 1925 meeting in Moscow, Joseph Stalin asked why blacks were not better represented in the U.S. Communist Party. To improve their standing with blacks, the Communists adopted a policy calling for self-determination for those areas of the American South where blacks lived in large numbers. Blacks were called an “oppressed nation” that had the right to separate from the United States.

This effort to attract black membership was a dismal failure. Blacks wanted to be free and equal within America, not separate from it. Dr. Quarles writes:

“Negroes simply did not seem to be attuned to the Communist message, for reasons that are not hard to fathom. Typically American, the Negro was individualistic, not likely to submerge his personality in conformity to a party line from which there could be no deviation. . . . The Negro, again like other Americans of his day, was not class-conscious — the vocabulary of the Communists struck him as foreign. Basically, too, the Negro was a man of conservative mold.”

Because black Americans protested against segregation, some thought blacks were radical. Instead, they sought only the opportunity to enter the American society as free and equal citizens, to be able to go as far as their individual abilities would take them, to share in a common color-blind citizenship.

Some black Americans who briefly entered the Communist Party were repelled by it, discovering that the very freedom they sought was rejected within the party itself. Thus, the respected author Richard Wright recalled his experience as a young party member in Chicago in the 1930s:

“I found myself arguing alone against the majority opinion, and then I made still another amazing discovery. I saw that even those who agreed with me would not support me. At that meeting I learned that when a man was informed of the wish of the party he submitted, even though he knew with all the strength of his brain that the wish was not a wise one, was one that would ultimately hurt the party interests. . . . It was not courage that made me oppose the party. I simply did not know any better. It was inconceivable to me, tough-bred in the lap of Southern hate, that a man could not have his say. I had spent a third of my life traveling from the place of my birth to the North just to talk freely, to escape the pressure of fear. And now I was facing fear again.”

Discussing the meaning of black history that is often overlooked, J. A. Parker, one of the early black conservatives and president of the Lincoln Institute, noted that:

“In reviewing the history of black Americans, we should focus upon those who vigorously opposed the efforts of extremists to turn them against America, to isolate them from others in society and to cause them to abandon their goal of a free society in which men and women were to be judged as individuals, not as members of one racial or ethnic or religious group or another. We should focus upon individuals such as Gen. Daniel “Chappie” James, authors Max Yergan and George Schuyler, and composer William Grant Still, to name only several whose proper place in black history often seems to be overlooked. These men were outstanding in their individual careers and never ceased to fight for the day when race would be incidental in determining the place of any man or woman in the American society. They understood that America was the last, best hope of the world to achieve a truly free and just society.”

Benjamin Quarles was correct when he wrote that:

“To most Negroes . . . the vision of the founders of this republic was still a vital force. Americans to the core, they believe that freedom and equality for all could be achieved in their native land. . . . The belief has been one of their significant contributions in the making of America. In enlarging the meaning of freedom and in giving it new expressions, the Negro has played a major role. He has been a watchman on the wall. More fully than other Americans, he knew that freedom was hard-won and could be preserved only by continuous effort. The faith and works of the Negro over the years has made it possible for the American creed to retain so much of its appeal, so much of its moving power.”

America has been a unique and ethnically diverse society from the beginning. By the time of the first census in 1790, people of English origin were already a slight minority. Enslaved Africans and their American-born descendants made up 20 percent of the population, and there were large clusters of Scotch-Irish, German, Scottish, and Dutch settlers, and smaller numbers of Swedes, Finns, Huguenots, and Sephardic Jews.

America has always been something unique, not simply another country. In the 1840s, Herman Melville wrote: “We are the heirs of all time and with all the nations we divide our inheritance. If you kill an American, you shed the blood of the whole world.” Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that 18 languages were spoken in this town of 8,000 people. In his Letters from an American Farmer, J. Hector St. John Crevecoeur wrote in 1782: “Here individuals of all nations are melted into a new race of men, whose labor and posterity will one day cause great changes in the world.”

Mario Puzo, the author of The Godfather, wrote that:

“What has happened here has never happened in any other country in any other time. The poor who had been poor for centuries, whose children had inherited their poverty, their illiteracy, their hopelessness, achieved some economic dignity and freedom. You didn’t get it for nothing, you had to pay a price in tears, in suffering, why not? And some even became artists.”

As a young man, growing up in Manhattan’s Lower East Side, Puzo was asked by his mother, an Italian immigrant, what he wanted to be when he grew up. When he said he wanted to be a writer, she responded, “For a thousand years in Italy, no one in our family was even able to read.” But in America everything was possible — in a single generation.

Ours is a complex society of more than 300 million people of every race, religion, and ethnic background. Inevitably, such a society will have problems and difficulties. These we must confront and resolve. To see only the problems and overlook the larger American story is to misunderstand reality. It would be good for Colin Kaepernick to review this history. In our free society, he has a right not to stand for the national anthem. Perhaps upon further consideration and a review of the dramatic progress our society has made, and continues to make, he will make a different decision in the future.     *

Sunday, 20 December 2015 08:12

Ramblings


Allan C. Brownfeld


"White Privilege": Not a Term Generations of Hardworking Immigrants Would Understand

A new term has emerged as an energized, and very youthful, civil rights movement has sought to focus attention upon what it perceives as widespread racism in today's American society. That term is "white privilege."

There is a very aggressive policing of language now under way. Former Maryland Governor Martin O'Malley was interrupted by protestors when the 2016 Democratic presidential candidate said "all lives matter" at the Netroots Nation conference in Phoenix in mid-July. He later apologized. "That was a mistake on my part, and I meant no disrespect," he said on "This Week In Blackness," a digital show. Several dozen demonstrators interrupted O'Malley's talk by shouting "Black Lives Matter," which has become a rallying cry in the wake of recent shootings of black men by police officers.

At the Phoenix meeting, O'Malley responded, "Black lives matter. White lives matter. All lives matter." This was unacceptable to the protestors, who also shouted down Sen. Bernie Sanders of Vermont, one of O'Malley's Democratic rivals. "Black lives, of course, matter. I spent 50 years of my life fighting for civil rights and for dignity," Sanders declared. "But if you don't want me to be here, that's OK. I don't want to outscream people."

A great deal of attention is being paid to the new book, Between the World and Me, by Ta-Nehisi Coates of The Atlantic. The 176-page book is addressed to his 14-year-old son and the subject is what it is like to be black in America today.

Coates writes that in America it is "traditional to destroy the black body - it is heritage," and that "white America" is "a syndicate arrayed to protect its exclusive power to dominate and control our bodies."

Coates said that if he were king, he would let criminals out of prison: "And, by the way, I include violent criminals in that." He writes in his book that he watched the smoldering towers of the World Trade Center on 9/11 with a cold heart. He felt that the police and firefighters who died "were menaces of nature, they were the fire, the comet, the storm, which could - with no justification - shatter my body."

Racism is a blemish on our society. Older black observers, who lived through the years of segregation, recognize how far we have come. Ellis Cose wrote a book, The Rage of a Privileged Class, in 1993, in which he argued that many successful black Americans "were seething about what they saw as the nation's broken promise of equal opportunity." More recently, Cose wrote in Newsweek:

Now, Barack Obama sits in the highest office in the land and a series of high-powered African-Americans have soared to the uppermost realms of their professions. The idea of a glass ceiling is almost laughable. Serious thinkers are searching for a new vocabulary to explain an America where skin color is an unreliable marker of status . . .

The history of the world, sadly, shows us people at war with one another over questions of race, religion, and ethnicity. Today, radical Islamists are killing people because they are of another religion. In Israel there are efforts to define the state as legally "Jewish," thereby making its 20 percent non-Jewish population less than full citizens. Russia has invaded Ukraine and annexed Crimea, with its large ethnic Russian population. When Britain left India, millions of Muslims were forced to leave Hindu-majority India to form Pakistan - at the cost of an untold number of lives. We have seen Armenians slaughtered by Turks and have witnessed genocide in Nazi Germany, Rwanda, and Burundi.

Those who glibly call America a "racist" society are not comparing it to anyplace in the real world, either historically or at the present time. They are comparing it to perfection and here, of course, we fail, as would any collection of human beings. But our collection of human beings includes men and women of every race and nation. And the notion of "white privilege" seems not to understand the reality of the immigrant experience. Most of today's "white" Americans are descendants of those immigrants, who often suffered great prejudice and many indignities which they overcame through perseverance and hard work. They hardly considered themselves beneficiaries of "white privilege."

Consider the experience of Irish and Italian immigrants who arrived in the U.S. in the 19th and early 20th centuries.

Between 1840 and 1924, 35 million immigrants arrived, many of them illiterate, and most unable to speak English. People who are now described as "white Europeans" were viewed quite differently in the past. A century ago, the Irish were considered by many to be a separate and inferior race. As Mike Wallace and Edwin Burrows write:

Just as the English had long characterized their neighboring islanders more harshly than they had Africans, plenty of Anglo New Yorkers routinely used adjectives like "low-browed," "savage," "bestial," "wild" and "simian" to describe the Irish Catholic "race."

Thomas Nast, the leading political cartoonist from the 1870s to the 1890s, portrayed Irishmen almost as monkeys and drew Catholic bishops' hats as sharks' jaws. Andrew Greeley described the Irishman of American cartoons:

By the mid-19th century, he was a gorilla, stovepipe hat on his head, a shamrock in his lapel, a vast jug of liquor in one hand and a large club in the other. His face was a mask of simian brutality and stupidity.

Italian immigrants, largely illiterate peasants from southern Italy and Sicily who had no experience of urban life, were a visually distinctive group, viewed by many as belonging to another race. "Swarthy" was a term often used to describe them and, as Richard Alba notes, "To the eyes of Americans they bore other physical signs of degradation such as low foreheads." Leonard Dinnerstein and David Reimers write that in addition to using epithets such as "wop," "dago," and "guinea," Americans referred to Italians as "the Chinese of Europe." Many Americans doubted that Italians were "white." In the American South, Italians were often segregated like blacks and were classified as yet another race - "between." Eleven Italians were lynched in New Orleans in 1891, and five Italians were lynched in Tallulah, Louisiana, in 1899.

The notion that all immigrants from Europe were regarded as white Europeans and accepted without prejudice - the notion upon which claims of "white privilege" are based - is an artifact of 1990s "multiculturalism" with no historical basis. Life was difficult and challenging. By 1910, there were 540,000 Eastern European Jews living in 1.5 square miles on the Lower East Side of Manhattan. There were 730 people per acre, possibly the highest density on earth. They lived in five- or six-story tenement houses, sleeping three or more to a room, with most rooms opening only to narrow airshafts. These grim conditions were highlighted in Jacob Riis's book, How the Other Half Lives.

But despite all of this, America was different and unique, a society in which, no matter your origin, you could go as far as your ability would take you. As a young man growing up in Manhattan, author Mario Puzo was asked by his mother, an Italian immigrant, what he wanted to be when he grew up. When he said he wanted to be a writer, she responded that, "For a thousand years in Italy, no one in our family was even able to read." But in America, everything was possible - in a single generation.

Puzo writes:

It was hard for my mother to believe that her son could become an artist. After all, her own dream in coming to America had been to earn her daily bread, a wild dream in itself, and looking back she was dead right. Her son an artist? To this day she shakes her head. I shake mine with her. . . . What has happened here has never happened in any other country in any other time. The poor who had been poor for centuries . . . whose children had inherited their poverty, their illiteracy, their hopelessness, achieved some economic dignity and freedom. You didn't get it for nothing, you had to pay a price in tears, in suffering, why not? And some even became artists.

America has been a nation much loved. Germans have loved Germany, Frenchmen have loved France, Swedes have loved Sweden. This, of course, is only natural. America has been beloved not only by native-born Americans but by men and women of every race and nation who have yearned for freedom. For all its failings, America dreamed a bigger dream than any nation in history. Those who think "white privilege" explains reality know little of America and the world.

America is more than simply another country. Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that 18 languages were spoken in this town of 8,000 people. In his Letters From An American Farmer, J. Hector St. John Crevecoeur wrote in 1782: "Here individuals of all nations are melted into a new race of men, whose labors and posterity will one day cause great changes in the world."

Looking at our complex history and recognizing only its shortcomings - comparing America only to perfection, not to other real places in the world - may lead Ta-Nehisi Coates and other young people in the "Black Lives Matter" movement to believe that "white privilege" somehow explains a reality that is multi-faceted. The millions of immigrants who suffered the travails of displacement and discrimination would not recognize the term as representing the experience they and their descendants have had.

The Sin of Contemporaneity: Cleansing History by Applying Today's Standards to Our Ancestors

It is good that the Confederate battle flag has been removed from the South Carolina statehouse grounds. It properly belongs in a museum. Robert E. Lee himself would agree. After surrendering in 1865, he sought to bring the country together. He urged his fellow Confederates to furl their flags. He left instructions that the Confederate flag not be displayed at his funeral. In fact, when Lee surrendered at Appomattox, he was going against Jefferson Davis' order to fight on. "It's over," Lee declared.

What we are witnessing now, however, is a wholesale assault upon our history. The Founding Fathers have been targeted. It has been suggested that the Washington Monument and Jefferson Memorial are inappropriate, since they celebrate men who owned slaves. CNN commentator Don Lemon suggested that we "rethink" any homage to Jefferson. Even in states where slavery was outlawed at an early date, state flags are under attack because of their depiction of Native Americans. Boston Globe columnist Yvonne Abraham said the Massachusetts flag "is no Confederate flag, but . . . still pretty awful." The Memphis City Council voted to dig up the bodies of Confederate Gen. Nathan Bedford Forrest and his wife from their public grave. The rebel flag-clad General Lee automobile from "The Dukes of Hazzard" has been removed from memorabilia shops and the show itself removed from re-runs. The Washington National Cathedral is considering removing its own stained-glass windows because they contain Confederate flag imagery that was meant to be conciliatory. Louis Farrakhan has demanded that the American flag itself be hauled down. Speaking at a Washington church, he declared:

I don't know what the fight is about over the Confederate flag. We've caught as much hell under the American flag as under the Confederate flag.

It's time for all of us to take a deep breath. Those who seek to erase our history sound a bit like the Taliban and ISIS, who are busy destroying historic structures all over the Middle East if they predate the rise of Islam. History is what it is, a mixed bag of mankind's strengths and weaknesses, of extraordinary achievements and the most horrible depredations. To judge the men and women of past eras by today's standards is to be guilty of what the respected Quaker theologian Elton Trueblood called the "sin of contemporaneity." In the case of those who refer to slavery as our "original sin," a look at history is instructive.

Sadly, from the beginning of recorded history until the 19th century, slavery was the way of the world. Rather than some American uniqueness in practicing slavery, the fact is that when the Constitution was written in 1787, slavery was legal every place in the world. What was unique was that in the American colonies there was a strenuous objection to slavery and that the most prominent framers of the Constitution wanted to eliminate it at the very start of the nation.

Slavery played an important part in many ancient civilizations. Indeed, most people in the ancient world regarded slavery as a natural condition of life, which could befall anyone at any time. It has existed almost universally through history among peoples of every level of material culture - among nomadic pastoralists of Asia, hunting societies of North American Indians, and sea peoples such as the Norsemen. The legal codes of Sumer provide documentary evidence that slavery existed there as early as the 4th millennium B.C. The Sumerian symbol for slave in cuneiform writing suggests "foreign."

The British historian of classical slavery, Moses I. Finley, writes:

The cities in which individual freedom reached its highest expression - most obviously Athens - were cities in which chattel slavery flourished.

At the time of its cultural peak, Athens may have had 115,000 slaves to 43,000 citizens. The same is true of ancient Rome. Plutarch notes that on a single day in the year 167 B.C., 150,000 slaves were sold in a single market.

Our Judeo-Christian tradition was also one which accepted the legitimacy of slavery. The Old Testament regulates the relationship between master and slave in great detail. In Leviticus (XXV: 39-55), God instructs the Children of Israel to enslave the heathen and their progeny forever. By classical standards, the treatment of slaves called for in the Bible was humane. Exodus (XXI: 26-27) states that if a master blinded his slave or knocked out one of his teeth, the slave was to go free. There is no departure from this approach to slavery in the New Testament. In a number of places, St. Paul urges slaves to obey their masters with full hearts and without equivocation. St. Peter urges slaves to obey even unjust orders of their masters.

Slavery was a continuous reality throughout the entire history which preceded the American Revolution. In England, ten percent of the persons enumerated in the Domesday Book (A.D. 1086) were slaves, and they could be put to death with impunity by their owners. During the Viking age, Norse merchant sailors sold Russian slaves in Constantinople. Venice grew to prosperity and power as a slave-trading republic, which took its human cargo from the Byzantine Empire. Portugal imported large numbers of African slaves from 1444 on. By the middle of the 16th century, Lisbon had more black residents than white.

Slavery was not a European invention, but was universal. Throughout the Middle Ages, black Africans sold slaves to other Africans and to Moslem traders who also brought slaves to Asia. Among the Aztecs, a man who could not pay his debts sold himself into slavery to his creditor. In China, poor families who could not feed all of their children often sold some as slaves. As the Founding Fathers looked through history, they saw slavery as an accepted institution.

What is historically unique is not that slavery was the accepted way of the world in 1787, but that so many of the leading men in the American colonies of that day wanted to eliminate it, and pressed vigorously to do so. Benjamin Franklin and Alexander Hamilton were ardent abolitionists. John Jay, who would become the first Chief Justice, was president of the New York Anti-Slavery Society. Rufus King and Gouverneur Morris were in the forefront of opposition to slavery.

One of the great debates at the Constitutional Convention related to the African slave trade. George Mason of Virginia made an eloquent plea for making it illegal:

The infernal traffic originated in the avarice of British merchants. The British government constantly checked the attempt of Virginia to put a stop to it. . . . Slavery discourages arts and manufactures. The poor despise labor when performed by slaves. . . . Every master of slaves is born a petty tyrant. They bring the judgment of heaven on a country.

The provision finally adopted read:

The Migration or Importation of such Persons as any of the States now existing shall think proper to admit, shall not be prohibited by the Congress prior to the year one thousand eight hundred and eight.

This clause was widely viewed by opponents of slavery as an important first step toward abolition. The delay of 20 years was considered the price ten of the states were willing to pay in order to assure that the original union would include the three states of Georgia, South Carolina and North Carolina. Even in those states there was sympathy for an end to slavery, but they wanted additional time to phase out their economic dependence on it.

In his original draft of the Declaration of Independence, one of the principal charges made by Thomas Jefferson against King George III and his predecessors was that they would not allow the American colonies to outlaw the importation of slaves. When Jefferson was first elected to the Virginia legislature, at the age of 25, his first political act was an attempt to begin the elimination of slavery. Though unsuccessful, he sought to further encourage emancipation by writing in the Declaration of Independence that "all men are created equal." In his draft of a constitution for Virginia he provided that all slaves would be emancipated in that state by 1800, and that any child born in Virginia after 1801 would be born free. This, however, was not adopted.

In his autobiography, Jefferson declared, "Nothing is more certainly written in the book of fate than that these people are to be free." In 1784, when an unsuccessful effort was made to exclude slavery from the Northwest Territory, Jefferson was one of its leading supporters. Finally, with the passage of the Northwest Ordinance of 1787, slavery was indeed excluded from these territories - a further step along the path to the final elimination of slavery, and a clear indication of the view of slavery which predominated among the framers of the Constitution.

American history is flawed, as is any human enterprise. Yet those who now call for the removal of statues and monuments commemorating our past are measuring our history against perfection, not against other real places. What other society in 1787 - or at any date in history before it - would these critics find more free and equitable than the one established by the Constitution? Where else in 1787 was there religious freedom, with no religious test for public office? Compared to perfection, our ancestors are found wanting. Compared to other real places in the world, they were clearly ahead of their time, advancing the frontiers of freedom.

If we judge the past by the standards of today, must we stop reading Plato and Aristotle, Sophocles and Aristophanes, Dante and Chaucer? Will we soon hear calls to demolish the Acropolis and the Coliseum, as we now hear calls to remove memorials to Jefferson and statues of Robert E. Lee? Must we abandon the Bible because it lacks modern sensibility? Where will it end? As theologian Elton Trueblood said, "contemporaneity" is indeed a sin. We would do well to avoid its embrace.

Remembering a Time When Our Leaders Risked Their Lives and Fortunes for What They Believed

When Patrick Henry made his famous declaration, "Give me liberty or give me death," at a church in Richmond, Virginia, on the eve of the American Revolution, his words had real meaning. Indeed, by advocating revolution against England, then the most powerful nation in the world, with the world's largest army and navy, the Founding Fathers were risking everything. If the revolution failed, which seemed likely to many, they would have lost their property - and their lives.

George Washington's home at Mt. Vernon, Thomas Jefferson's at Monticello, James Madison's at Montpelier - all would have been confiscated by the victorious British, had the war been lost. At the time the Declaration of Independence was signed in 1776, only one third of the population of the thirteen colonies supported breaking away from the British Empire. Those who supported independence put their lives on the line.

In his book American Creation, historian Joseph Ellis writes:

The revolutionary generation won the first successful war for colonial independence in the modern era, against all odds, defeating the most powerful army and navy in the world. . . . The British philosopher and essayist Alfred North Whitehead observed that there have been only two instances in the history of Western civilization when the political leaders of an emerging nation behaved as well as anyone could reasonably expect. The first was Rome under Caesar Augustus and the second was America's revolutionary generation. . . . The late eighteenth century was the most politically creative era in American history. They were, in effect, always on their best behavior because they knew we would be watching, an idea we should find endearing because it makes us complicitous in their greatness.

The Founding Fathers did not consult the colonial equivalent of pollsters to find out what people would like to hear. Instead, they developed ideas about how a government should be run and how freedom could be established in an environment of order and law. Alexander Hamilton and James Madison were prime movers behind the summoning of the Constitutional Convention and the chief authors of The Federalist Papers, an undertaking to convince Americans to support the Constitution of 1787.

In his biography, Alexander Hamilton, Ron Chernow notes that:

He had a pragmatic mind that minted comprehensive programs. In contriving the smoothly running machinery of a modern nation-state - including a budget system, a funded debt, a tax system, a central bank, a customs service, and a coast guard - and justifying them in some of America's most influential state papers, he set a high-water mark for administrative competence that has never been equaled. If Jefferson provided the essential poetry of American political discourse, Hamilton established the prose of American statecraft. No other founder articulated such a clear and prescient vision of America's future political, military, and economic strength or crafted such ingenious mechanisms to bind the nation together.

Hamilton, a careful reader of the skeptical Scottish philosopher David Hume, quoted his view that in framing a government "every man ought to be supposed a knave and to have no other end in all his actions but private interests." The task of government, he believed, was not to stop selfish striving - a hopeless task - but to harness it to public good. In starting to outline the contours of his own vision of government, Hamilton was spurred by Hume's dark vision of human nature, which corresponded to his own. From the "First Philippic" of Demosthenes, he plucked a passage that summed up his conception of a leader as someone who would not pander to popular whims. "As a general marches at the head of his troops," so should wise political leaders

. . . march at the head of affairs, insomuch that they ought not to wait the event to know what measures to take, but the measures which they have taken ought to produce the event.

The Founding Fathers - Washington, Adams, Jefferson, Madison, Monroe, Hamilton, Franklin and the others - were an extraordinary group of men, truly representing a golden age in our history. The creation of the new American government clearly required both Republicans and Federalists, both a Jefferson and a Hamilton, both those jealous for individual freedom and those concerned that such freedom could only exist and be maintained within an orderly society ruled by law. In a society of only a few million people, we produced leaders who have stood the test of time. Such a generation has never again been seen, on these shores or elsewhere.

These men did not hire ghostwriters for The Federalist Papers. Their words and their thoughts were their own. They did not hire consultants and pollsters to tell them what their views should be on the issues of the day. They often took highly unpopular positions and did their best to convince their colleagues and the public at large of their merits. They risked their lives and everything they owned to declare independence, and knew very well that the possibility of losing everything was very real.

The contrast between the Founding Fathers and those engaged in public life at the present time could not be greater. In the colonial period, our leaders risked their fortunes for the principle of independence. Today, men and women make their fortunes through their participation in politics.

Hillary Clinton, for example, reported that she earned $10.2 million from 45 speeches in 2014, her first full year out of office. Of that, almost $4.6 million came from clients who did lobbying to shape policies on issues as varied as taxes, trade, financial regulation and health care. Later, we learned that the Clinton Foundation had received as much as $26.4 million in previously undisclosed payments from major corporations, universities, foreign sources, and other groups. The money was paid as "fees" for speeches for Bill, Hillary, and Chelsea Clinton.

There can be little doubt that this money was given to the Clintons because of Hillary Clinton's candidacy for president and her ability to provide assistance to those contributing. Sheila Krumholz, executive director of the money-tracking Center for Responsive Politics, states:

It's big money. They're spending it because they have far greater sums riding on those decisions that they're trying to shape. Every man or woman in the street thought Hillary Clinton would run again.

Even those who are sympathetic to Mrs. Clinton's candidacy, such as liberal Washington Post columnist Ruth Marcus, have expressed dismay: "Again with the speeches. The gross excessiveness of it all, vacuuming up six-figure checks well past the point of rational need or political seemliness. . . ."

But Hillary Clinton is hardly alone. Norman R. Braman, a Florida billionaire who has long bolstered the career and personal finances of Sen. Marco Rubio (R-FL), is reported ready to invest $10 million or more in the Senator's presidential candidacy. Las Vegas casino billionaire Sheldon Adelson has auditioned possible Republican candidates who seek his support. Endorsement of the policies of Israeli Prime Minister Netanyahu, including his rejection of a two-state solution, is a requirement. When New Jersey Governor Chris Christie appeared, he made the mistake of referring to the West Bank as "occupied territory" (which, of course, it is under international law, as well as under U.S. policy in both Republican and Democratic administrations). Christie quickly apologized for his "mistake." Jeb Bush also sought Adelson's support and turned his back on long-time Bush family friend and former Secretary of State James Baker to get it. Baker, in a recent talk to J Street, was critical of Israel's rejection of the two-state solution, which was unacceptable to Adelson.

Beyond this, many candidates don't seem to know where they stand on the issues - except when their pollsters tell them what is necessary to win in Iowa or South Carolina or New Hampshire. Hillary Clinton once was a supporter of the trade pact being considered in the Congress. Now, she refuses to take a position - or even take questions from the press. Scott Walker was first hot and then cold on a path to citizenship for undocumented immigrants. Marco Rubio was in and then out on offsetting increased military spending with other cuts. And what exactly is his current position on immigration? Sen. Lindsey Graham (R-SC) is sure of one thing: his opposition to gambling on the Internet. This, of course, is a crusade of Sheldon Adelson, who wants no competition for his gambling casinos. Adelson seems to favor competition and free enterprise in every commercial undertaking but his own.

The contrast between the political leaders of America's golden age and those we observe today could not be starker. No one today is risking his life, property, or honor for anything. The state of our government reflects this fact all too well. *

Sunday, 20 December 2015 08:08

Ramblings


Allan C. Brownfeld


Every Tragic Incident - Such as That in Missouri - Produces Cries That America Is a "Racist" Society, Overlooking a More Complex Reality

The killing of 18-year-old Michael Brown, who is black, by a white police officer in Ferguson, Missouri led to days of demonstrations, rioting, and looting. There has been criticism of the overwhelming police response, as well as charges that racism was involved in the death of this teenager. Beyond this, many have proclaimed that this incident shows us that America is a "racist" society, and that talk of racial progress and a movement toward a genuinely "color-blind" society is false.

Exactly what happened in Ferguson will be determined by a thorough investigation, including participation by the FBI and the Department of Justice. If there was wrongdoing by the police officer involved, this will be documented and appropriate action will be taken. In the meantime, we can only withhold judgment on what actually occurred.

What we can properly lament, however, is the manner in which a chorus of voices is immediately heard after every negative event telling us that racism is alive and well in almost every sector of our society. The reality is far more complex.

Typical of this phenomenon is a column in The New York Times by Charles Blow, who is black. He declares that:

The criminalization of black and brown bodies, particularly male ones, [starts] from the moment they are first introduced to the institutions and power structures with which they must interact. . . . Black male dropout rates are more than one and a half times those of white males, the bias of the educational system bleeds easily into the bias of the criminal justice system, from cops to courts to correctional facilities. The school-to-prison pipeline is complete.

Earlier this year, the Department of Education's Office for Civil Rights released "the first comprehensive look at civil rights from every public school in the country in nearly 15 years." Attorney General Eric Holder said:

The critical report shows that racial disparities in school discipline policies are not only well documented among older students but actually begin during pre-school.

The fact that more young black men drop out of school, that they are over-represented in our criminal justice system, and that they are more often subjected to school discipline, is not necessarily an indication of "institutional racism" in our society, as Mr. Blow and so many others rush to proclaim. There are other, much more plausible explanations.

By 2004, federal data showed that black Americans, 13 percent of the population, accounted for 37 percent of violent crimes, 54 percent of arrests for robbery, and 51 percent for murder. Most of the victims of these violent criminals were also black. If black men are over-represented in our prison population, the reason appears to be that they commit a disproportionate share of crime. Commentator Juan Williams, who is black, laments that:

Any mention of black America's responsibility for committing the crimes, big and small, that lead so many people to prison is barely mumbled, if mentioned at all.

In a column titled "Our Selective Outrage," The Washington Post's Eugene Robinson, who is black, notes that:

The killing of 18-year-old Michael Brown has rightly provoked widespread outrage, drawing international media attention and prompting a comment from President Obama. The same should be true, but tragically is not, of the killing of 3-year-old Knijah Amore Bibb. Brown was killed in Ferguson, Missouri; Knijah died the following day in Landover, Maryland. Both victims were African-American. Both had their whole lives before them. The salient difference is that Brown was shot to death by a white police officer, according to witnesses, while the fugitive suspect in Bibb's killing is a 25-year-old black man with a long criminal record.

Robinson points to statistics showing the dimensions of the problem. According to the FBI, in 2012, the last year for which figures are available, 2,614 whites were killed by white offenders, and 2,412 blacks were killed by black offenders, similar numbers. "But," writes Robinson,

. . . the non-Hispanic white population is almost five times as large as the African-American population, meaning the homicide rate in black communities is staggeringly higher. . . . We need to get angry before we have to mourn the next Knijah Bibb.

It is not "white racism" which causes black-on-black crime, and it may be something other than racism that causes disciplinary disparities and the number of school dropouts. The breakdown of the black family is a more likely cause for such disparities.

In 1940, the black rate of out-of-wedlock birth was around 14 percent. Now, it's 75 percent. In 1870, right after slavery, 70 to 80 percent of black families were intact. Today, after the end of segregation, the enactment of legislation making racial discrimination illegal, and myriad affirmative action programs, 70 percent of black children have single mothers, and estimates are that an even larger percentage will grow up without a father in the home.

Blaming the problems we confront on "racism" misses the point of the real dilemmas we face. Attorney General Holder does black Americans no favor by ignoring the disintegration of the black family in explaining disparities in school dropouts and disciplinary problems. White racism is not, somehow, compelling out-of-wedlock birth in the black community; that breakdown of the family is a far more plausible cause of these statistical disparities than an amorphous "institutional racism."

What was missing in the response to developments in Missouri, which included rioting, arson, and cries of "No Justice, No Peace," was "the calming voice of a national civil rights leader of the kind that was so impressive during the 1950s and 1960s," writes author Joseph Epstein:

In those days, there were Martin Luther King, Jr. of the Southern Christian Leadership Conference, Roy Wilkins of the NAACP, Whitney Young of the National Urban League, Bayard Rustin of the A. Philip Randolph Institute - all solid, serious men, each impressive in different ways, who through dignified forbearance and strategic action, brought down a body of unequivocally immoral laws aimed at America's black population.

The NAACP, the Urban League, and the SCLC still exist, notes Epstein,

. . . yet few people are likely to know the names of their leaders. That is because no black leader has come forth to set out a program for progress for the substantial part of the black population that has remained for generations in the slough of poverty, crime, and despair. . . . In Chicago, where I live, much of the murder and crime that has captured the interest of the media is black-on-black and cannot be chalked up to racism. Bill Cosby, Thomas Sowell, Shelby Steele, and a few others have dared to speak about the pathologies at work, and for doing so these black figures are castigated.

Soon enough, exactly what happened in Ferguson, Missouri will become clear and the matter will be resolved through our legal system. It will take a much longer time before our society begins to confront the real causes of the racial disparities and pathologies which are all too easily, and falsely, attributed to "white racism." Until we do, the sad story of Ferguson is likely to happen again and again.

Family Breakdown: One Important Cause of Many of Society's Ills

In 1965, Daniel Patrick Moynihan, then an assistant secretary of labor, who went on to serve as a Democratic U.S. senator from New York for nearly a quarter century, issued a report warning of a crisis growing for America's black families. It reported a dramatic increase in out-of-wedlock births and one-parent families and warned of the "tangle of pathologies" which resulted. Among these were poor performance in school, increased drug use, and a growing rate of incarceration for crime.

"The Moynihan argument . . . assumed that the troubles impending for black America were unique," writes Nicholas Eberstadt of the American Enterprise Institute:

. . . a consequence of the singular historical burdens that black Americans had endured in our country. That argument was not only plausible at the time, but also persuasive. Yet today that same "tangle of pathology" can no longer be described as characteristic of just one group within our country. Quite the contrary . . . these pathologies are evident throughout all of America today, regardless of race or ethnicity.

Single motherhood has become so common in America that demographers believe that half of all children will live with a single mother at some point before age 18. Research from Princeton University's Sara McLanahan and Harvard University's Christopher Jencks shows that more than 70 percent of all black children are born to an unmarried mother, a threefold increase since the 1960s.

In a new paper, McLanahan and Jencks assess the state of children born to single mothers, nearly fifty years after the Moynihan Report warned that the growing number of fatherless black children would struggle to avoid poverty. The report looks prescient. Black children today are about twice as likely as the national average to live with an unmarried mother. Research is confirming Moynihan's fears that children of unmarried mothers face more obstacles in life.

In the studies reviewed by McLanahan and Jencks, it was found that these children experience more family instability, with new partners moving in and out, and more half-siblings fathered by different men. The growing body of research in this field also suggests that these children have more problem behaviors and more trouble finishing school.

The growing debate about income inequality ignores the evidence that shows that unwed parents raise poorer children. Isabel Sawhill of the Brookings Institution calculates that returning marriage rates to their 1970 level would lower the child poverty rate by a fifth. There may be a partisan political reason why this point is not made more often. The Economist suggests that, "This omission may be deliberate. Democrats are reluctant to offend unmarried women, 60 percent of whom voted for the party's candidates in 2014."

There may be, some observers point out, a connection between government welfare programs and the breakdown of the family, as well as the declining number of men in the workforce. As late as 1963, on the eve of the War on Poverty, more than 93 percent of American babies were coming into the world with two married parents. According to the 1960 census, nearly 88 percent of children under 18 were then living with two parents. For the quarter century from 1940 to 1965, official data recorded a rise in the fraction of births to unmarried women from 3.8 percent to 7.7 percent. Over the following quarter century, 1965-1990, out-of-wedlock births jumped from 7.7 percent of the nationwide total to 28 percent. The most recent available data, for 2012, show that America's overall out-of-wedlock ratio had moved beyond 40 percent.

The trends discussed in the 1965 Moynihan Report for black families have now extended to American families of all racial backgrounds. Among Hispanic Americans, more than 30 percent of children were in single-parent homes by 2013, and well over half were born out-of-wedlock by 2012. Among non-Hispanic white Americans, there were few signs of family breakdown before the massive government entitlement programs began with the War on Poverty in the 1960s. Between 1940 and 1963, the out-of-wedlock birth ratio increased, but only from 2 percent to 3 percent. In 1960, just 6 percent of white children lived with single mothers. As of 2012, the proportion of out-of-wedlock births was 29 percent, nearly 10 times as high as it was just before the War on Poverty.

In his study, The Great Society at Fifty: The Triumph and the Tragedy, Nicholas Eberstadt argues that:

What is indisputable . . . is that the new American welfare state facilitated these new American trends by helping to finance them: by providing support for working-age men who are no longer seeking employment and for single women with children who would not be able to maintain independent households without government aid. Regardless of the origins of the flight from work and family breakdown, the War on Poverty and successive welfare policies have made each of these modern tendencies more feasible as mass phenomena in our country today.

The War on Poverty, of course, did not envision such a result. These were unintended consequences of the kind that, as we have seen, so often accompany well-intentioned government programs. President Lyndon Johnson wanted to bring dependence on government handouts to an eventual end, and did not intend to perpetuate them into the future. Three months after his Great Society speech, Johnson declared:

We are not content to accept the endless growth of relief rolls, of welfare rolls. . . . Our American answer to poverty is not to make the poor more secure in their poverty but to reach down and to help them lift themselves out of the ruts of poverty and move with the large majority along the high road of hope and prosperity.

In Eberstadt's view:

Held against this ideal, the actual unfolding of America's antipoverty policies can be seen only as a tragic failure. Dependence on government relief, in its many modern versions, is more widespread today, and possibly also more habitual, than at any time in our history. To make matters much worse, such aid has become integral to financing lifestyles and behavioral patterns plainly destructive to our commonwealth - and on a scale far more vast than could have been imagined in an era before such antipoverty aid was all but unconditionally available.

Any serious discussion of poverty and the growing gaps in income must confront the reasons why, for example, the fraction of civilian men ages 25 to 34 who are neither working nor looking for work has quadrupled in the past 50 years, and why, for many women, children, and even working-age men, the entitlement state has become the breadwinner. Daniel Patrick Moynihan once said, "the issue of welfare is not what it costs those who provide it, but what it costs those who receive it."

At the heart of the social and economic decline we face at the present time is the breakdown of the family. Few in the political arena, in either party, are addressing this question. Unless they do, their proposals to move our economy forward and lessen the gaps in income and wealth are unlikely to succeed.

There Is a Growing Danger That Police Are Being Made Scapegoats for Larger Racial Problems That Society Ignores

The attacks upon police for "racism" have been mounting as a result of the killings of black men in Ferguson, Staten Island, and elsewhere. Many with a history of demagoguery when it comes to questions of race relations, Jesse Jackson and Al Sharpton among them, have done their best to keep this issue alive. Sadly, they have generated more heat than light on a question that is far more complex than their self-serving analysis would lead Americans to believe.

Recently, FBI director James Comey addressed this question. At the outset, he declared certain "hard truths," including the fact that the history of law enforcement has been tied to enforcing slavery, segregation, and other forms of discrimination. "One reason we cannot forget our law enforcement legacy," he said, "is that the people we serve and protect cannot forget it, either."

Mr. Comey also acknowledged the existence of unconscious racial bias "in our white-majority culture," and how that influences policing. He conceded that people in law enforcement can develop "different flavors of cynicism" that can be "lazy mental shortcuts," resulting in more pronounced racial profiling.

But he then warned against using police as scapegoats to avoid coming to grips with much more complex problems affecting minority communities, including a lack of "role models, adequate education, and decent employment," as well as "all sorts of opportunities that most of us take for granted." In his address at Georgetown University, Comey declared:

I worry that this incredibly important and difficult conversation about policing has become focused entirely on the nature and character of law enforcement officers when it should also be about something much harder to discuss.

Citing the song "Everyone's a Little Bit Racist" from the Broadway show "Avenue Q," Comey said that police officers of all races viewed black and white men differently using a mental shortcut that "becomes almost irresistible and maybe even rational by some lights" because black men commit crime at a much higher rate than white men.

Comey said that nearly all police officers had joined the force because they wanted to help others. Speaking in personal terms, he described how most Americans had initially viewed Irish immigrants like his ancestors "as drunks, ruffians, and criminals." He noted that, "Law enforcement's biased view of the Irish lives on in the nickname we still use for the vehicle that transports groups of prisoners. It is, after all, the 'Paddy Wagon.'"

If black men are committing crime out of proportion to their numbers, it is important to consider the reason. According to a report just released by the Marriage and Religion Research Institute (MARRI), by age 17 only 17 percent of black teenagers live with two married parents. Professor Orlando Patterson, a Harvard sociologist who is black, published an article in December in the Chronicle of Higher Education, lamenting that "fearful" sociologists had abandoned "studies of the cultural dimensions of poverty, particularly black poverty," and declared that the discipline had become "largely irrelevant."

Now, Patterson and Ethan Fosse, a Harvard doctoral student, are publishing a new anthology called The Cultural Matrix: Understanding Black Youth. In Patterson's view, fifty years after Daniel Moynihan issued his report about the decline of the black family, "History has been kind to Moynihan." Moynihan was concerned about an out-of-wedlock birth rate in the black community of 25 percent. According to the Centers for Disease Control and Prevention, the equivalent rate for 2013 was 71.5 percent. (The rate for non-Hispanic whites was 29.3 percent.)

The inner-city culture that promotes the social dissolution that results in crime has been written about for many years by respected black observers. In 1899, the scholar W. E. B. Du Bois drew on interviews and census data to produce The Philadelphia Negro: A Social Study. He spent a year living in the neighborhood he wrote about, in the midst of what he described as "an atmosphere of dirt, drunkenness, poverty and crime." He observed in language much harsher than Moynihan's, the large number of unmarried mothers, many of whom he referred to as "ignorant and loose." He called upon whites to stop employment discrimination, which he called "morally wrong, politically dangerous, industrially wasteful, and socially silly." He told black readers they had a duty to work harder, to behave better, and to stem the tide of "Negro crime," which he called "a menace to civilized people."

In 1999, on the hundredth anniversary of Du Bois's study, Elijah Anderson published a new sociological study of poor black neighborhoods in Philadelphia, Code of the Street, and recorded its informants' characterization of themselves and their neighbors as either "decent" or "street" or, in some cases, a bit of both. In The Cultural Matrix, Orlando Patterson lists "three main social groups" - the middle class, the working class, and "disconnected street people" - that are common in "disadvantaged" African-American neighborhoods. He also lists "four focal cultural configurations" (adapted mainstream, proletarian, street, and hip-hop).

Patterson views the "hip-hop" culture of the inner city as a destructive phenomenon, compares MC Hammer to Nietzsche, contends that hip-hop routinely celebrates "forced abortions," and calls Lil Wayne "irredeemably vulgar" and "all too typical" of the genre. Tommie Shelby, a professor of African and African-American Studies at Harvard, writes in The Cultural Matrix that "suboptimal cultural traits" are the major impediment for many African-Americans seeking to escape poverty. "Some in ghetto communities," he writes, "are believed to devalue traditional co-parenting and to eschew mainstream styles of childbearing."

In his speech on race in 2008, President Obama said that African-Americans needed to take more responsibility for their own communities by "demanding more from our fathers." Fifty years ago, Daniel Moynihan worried that "the Negro community" was in a state of decline with an increasingly matriarchal family structure that led to increasing crime. In the fifteen years after he published his report, the homicide rate doubled, with blacks overrepresented among both perpetrators and victims.

Orlando Patterson, in a recent interview with Slate, said: "I am not in favor of a national conversation on race," and noted that most white people in America had come to accept racial equality. But whether or not such a "national conversation" is useful, we are now in the midst of such an enterprise. FBI director Comey is contributing to that exchange. He asks:

Why are so many black men in jail? Is it because cops, prosecutors, judges, and juries are racist? Because they are turning a blind eye to white robbers and drug dealers? . . . I don't think so. If it were so, that would be easier to address. . . . The percentage of young men not working or not enrolled in school is nearly twice as high for blacks as it is for whites. . . . Young people in those neighborhoods too often inherit a legacy of crime and prison, and with that inheritance they become part of the police officer's life and shape the way that officer, whether white or black, sees the world. Changing that legacy is a challenge so enormous and so complicated that it is, unfortunately, easier to talk only about the cops. And that's not fair. *

Wednesday, 16 December 2015 12:07

Ramblings

Allan C. Brownfeld

What Dennis Hastert's Case Tells Us about Washington's Institutional Corruption

We now know that J. Dennis Hastert, who served for eight years as speaker of the House of Representatives, was paying a former student hundreds of thousands of dollars not to say publicly that Hastert had sexually abused him decades ago. According to The New York Times, this information became public from "two people briefed on the evidence in an F.B.I. investigation."

Federal prosecutors announced the indictment of Hastert late in May on allegations that he structured cash withdrawals totaling $1.7 million to evade bank reporting requirements. The federal authorities also charged him with lying to them about the purpose of the withdrawals.

This is a personal tragedy for Hastert and his family. But it also paints a vivid picture of institutional corruption in Washington, how men and women of modest means are elected to public office and, before long, become very wealthy, using their public positions to do so.

How, we may ask, did a former high school teacher who held elective office from 1981 to 2007 leave Congress with a fortune estimated at $4 million to $17 million? When Hastert entered Congress in 1987 his net worth was reported to be at most $270,000. The record shows that he was the beneficiary of lucrative land deals while in Congress and since leaving office he has earned more than $2 million a year as a lobbyist - influencing his former colleagues.

Writing in National Review, John Fund recalls that:

Denny Hastert used to visit The Wall Street Journal, where I worked, when he was speaker. He was a bland, utterly conventional supporter of the status quo; his idea of reform was to squelch anyone who disturbed Congress' usual way of doing business. I saw him become passionate only once, when he defended earmarks - the special projects such as Alaska's "Bridge to Nowhere" - that members dropped at the last minute into conference reports, deliberately giving no time to debate or amend them. Earmarks reached the staggering level of 15,000 in 2005, and their stench helped cost the GOP control of Congress the next year. But Hastert was unbowed. "Who knows best where to put a bridge or a highway or a red light in his district?" I recall him bellowing. I responded that the Illinois Department of Transportation came to mind, and then we agreed to disagree.

The Sunlight Foundation found that Hastert had used a secret trust to join others and invest in farmland near the proposed route of a new road called the Prairie Parkway. He then helped secure a $207 million earmark for the road. The land, approximately 138 acres, was bought for about $2.1 million in 2004 and later sold for almost $5 million, a profit of 140 percent. Local land records and Congressional disclosure forms never identified Hastert as the co-owner of any of the land in the trust. Hastert turned a $1.3 million investment (his portion of the land holdings) into a $1.8 million profit in less than two years.

Once he left Congress, Hastert joined the professional services firm of Dickstein Shapiro, working all sides of various issues and glad-handing his former colleagues in Congress, often on behalf of controversial clients. From 2011 to 2014, Lorillard Tobacco paid Dickstein Shapiro nearly $8 million to lobby for the benefit of candy-flavored tobacco and electronic cigarettes, and Hastert was the most prominent member of the lobbying team.

Hastert also pressed lawmakers on climate change for Peabody Energy, the largest private-sector coal company in the world, in 2013 and 2014 - then switched sides this year and pushed for requiring renewable fuel production for Fuels America. Lawmakers and fellow lobbyists compared Hastert's qualities as a lobbyist to those he displayed as speaker: affable and low-key, but attractive to clients. In the post-earmark world, notes Rep. Tom Cole (R-OK), a senior member of the House Appropriations Committee, Hastert pressed for policy "riders" in appropriations bills and programmatic changes that helped his clients' interests.

"As you'd expect, he was very effective," said Cole, "Number one, he knew the process extremely well, and he knew all the players. When the former speaker calls no member rejects it." Former Rep. Jack Kingston, a Republican who led the Appropriations Committee, says, "Yeah, it's possible, he could amass in a 10-year period a nest egg of $5 to $10 million. I'm not saying it's easy, but it's not that hard."

Sadly, Dennis Hastert's use of public office as a path to wealth is hardly unique. National Review reports that:

In 2004, Sen. Harry Reid (D-NV) made $700,000 off a land deal that was, to say the least, unorthodox. It started in 1998 when he bought a parcel of land with attorney Jay Brown, a close friend whose name has surfaced multiple times in organized crime investigations. Reid transferred his portion of the property to Patrick Lane Industrial Center L.L.C., a holding company Brown controlled. But Reid kept putting the property on his financial disclosures and when the company sold it in 2009, he profited from the deal - a deal on land he didn't technically own and that nearly tripled in value in three years.

In addition, according to Judicial Watch, Reid, the Democratic leader of the Senate, sponsored at least $47 million in earmarks that directly benefited organizations with close ties to his son Key Reid.

In March 2008, then-House Speaker Nancy Pelosi and her husband, Paul, made the first of three purchases of Visa stock, totaling between $1 million and $5 million - Visa was holding an initial public offering, among the most lucrative ever. The Pelosis were granted early access to the IPO as "special customers" who received their shares at the opening price of $44, and they turned a 50 percent profit in just two days. Peter Schweizer, a scholar at the Hoover Institution, notes that, "Mere mortals would have to wait until March 19, when the stock would be publicly traded, to get their shares." He points out that the Pelosis got their stock just two weeks after legislation was introduced in the House that would have allowed merchants to negotiate lower interchange fees with credit card companies. Visa's general counsel described it as a "bad bill." The speaker squelched it and kept further action bottled up for more than two years. During that period, the value of her Visa stock jumped more than 200 percent while the stock market as a whole dropped 15 percent.

Another former House speaker, Newt Gingrich, served as a paid consultant for the drug industry's lobby group, and, according to conservative columnist Timothy Carney:

Gingrich worked hard to persuade Republican congressmen to vote for the Medicare drug subsidy that the industry favored. . . . Newt Gingrich spent the last decade being paid by big business to convince conservatives to support big government policies that would profit his clients.

The role of former members of Congress reaping financial gain by lobbying their former colleagues, as Dennis Hastert did, is increasingly widespread. Former Rep. Billy Tauzin of Louisiana, who was originally elected as a Democrat but later switched to the Republican Party, left his post as chairman of the powerful House Energy and Commerce Committee to become a lobbyist for the drug industry. In 2009, he reportedly earned $4.48 million as the head of PhRMA, the drug industry's lobbying organization - a huge increase from his congressional salary.

The idea of politicians enriching themselves as a result of holding political office is a relatively new one. Bill and Hillary Clinton have amassed millions from various special interest groups - and foreign interests - hoping to influence a future president. Bill Clinton reported being paid more than $104 million from 2001 through 2012 just for speeches. As recently as the presidencies of Harry Truman and Dwight Eisenhower it was considered unthinkable to trade upon having held high office to enrich oneself. Until 1958, former presidents did not even receive a pension. Congress finally awarded Harry Truman and Herbert Hoover pensions and funds for staff. Washington, Jefferson, Adams, Madison, Monroe, and other early leaders lost a great deal of money while serving in office. George Washington, it is reported, had to borrow money to make the trip from Mt. Vernon to New York for his own inauguration. Now, public office has, for many, become a path to riches.

A recent Rasmussen Reports poll finds that 82 percent of the American people now believe that we have a professional political class that is more focused on preserving its power and privilege than on conducting the people's business. Dennis Hastert has become the new face of this phenomenon, along with the Clintons, Nancy Pelosi, Newt Gingrich and a host of others. Whether public dismay with our current politics can be transformed into an effective effort to alter this behavior remains to be seen. Too many in Washington have a vested interest in today's corrupt system as it exists. How to change the incentive structure for those in political life is our real challenge.

Remembering the Real Heroism of Robert E. Lee at Appomattox

The surrender of Confederate Gen. Robert E. Lee to Union Lt. Gen. Ulysses S. Grant 150 years ago effectively ended the Civil War. What few remember today is the real heroism of Robert E. Lee. By surrendering, he was violating the orders given by Jefferson Davis, the elected leader of the Confederacy.

The story of April 1865 is not just one of decisions made, but also of decisions rejected. The importance of Lee's rejection of continuing the war as a guerrilla struggle, which Jefferson Davis preferred, and of Grant's choice to be magnanimous at Appomattox, cannot be overestimated.

With the fall of Richmond, Jefferson Davis and the Confederate government were on the run. Davis, writes Professor Jay Winik of the University of Maryland in his important book April 1865: The Month That Saved America:

. . . was thinking about such things as a war of extermination . . . a national war that ruins the enemy. In short, guerrilla resistance. . . . The day after Richmond fell, Davis had called on the Confederacy to shift from a conventional war to a dynamic guerrilla war of attrition, designed to wear down the North and force it to conclude that keeping the South in the Union would not be worth the interminable pain and ongoing sacrifice.

Davis declared that:

We have now entered upon a new phase of a struggle the memory of which is to endure for the ages. Relieved from the necessity of guarding cities and particular points, important but not vital to our defense, with an army free to move from point to point and strike in detail detachments and garrisons of the enemy, operating on the interior of our own country, where supplies are more accessible, and where the foe will be far removed from his own base and cut off from all succor in case of reverse, nothing is now needed to render our triumph certain but the exhibition of our own unquenchable resolve. Let us but will it, and we are free.

But Robert E. Lee knew the war was over. Grant was magnanimous in victory and, Winik points out:

. . . was acutely aware that on this day, what had occurred was the surrender of one army to another - not of one government to another. The war was very much on. There were a number of potentially troubling rebel commanders in the field. And there were still some 175,000 other Confederates under arms elsewhere; one-half in scattered garrisons and the rest in three remaining rebel armies. What mattered now was laying the groundwork for persuading Lee's fellow armies to join in his surrender - and also for reunion, the urgent matter of making the nation whole again. Thus, it should be no great surprise that there was a curious restraint in Grant's tepid victory message passed on to Washington.

Appomattox was not preordained. "If anything," notes Winik:

. . . retribution had been the larger and longer precedent. So, if these moments teemed with hope - and they did - it was largely due to two men who rose to the occasion, to Grant's and Lee's respective actions: one general, magnanimous in victory, the other gracious and equally dignified in defeat, the two of them, for their own reasons and in their own ways, fervently interested in beginning the process to bind up the wounds of the last four years. . . . Above all, this surrender defied millenniums of tradition in which rebellions typically ended in yet greater shedding of blood. . . . One need only recall the harsh suppression of the peasants' revolt in Germany in the 16th century, or the ravages of Alva during the Dutch rebellion, or the terrible punishments inflicted on the Irish by Cromwell and then on the Scots after Culloden, or the bloodstained vengeance executed during the Napoleonic restoration, or the horrible retaliation imposed during the futile Chinese rebellion in the mid-19th century.

Lee was not alone in rejecting the idea of guerrilla war. General Joe Johnston, offered generous terms of surrender by Union General William Tecumseh Sherman, cabled the Confederate government for instructions. He was ordered to fight on. Johnston was told to take as many of his men as possible and fall back to Georgia. Johnston refused and decided to surrender. Later, he acknowledged that he directly "disobeyed" his instructions. But Johnston, who wired back to Davis that such a plan of retreat was "impracticable," saw no other way. In his view, it would be "the greatest of crimes for us to attempt to continue the war." To fight further, he declared, would only "spread ruin all over the south." By brazenly violating the chain of command, he helped to save many lives and to heal the country.

In early May, when the Mississippi governor and the former governor of Tennessee rode out and urged General Nathan Bedford Forrest to retreat with his cavalry to continue a guerrilla war, Forrest responded: "Any man who is in favor of further prosecution of this war is a fit subject for a lunatic asylum." The attempt to establish a "separate and independent confederacy had failed," Forrest noted, and they should meet their responsibilities "like men." He added: "Reason dictates and humanity demands that no more blood be shed."

In words that echo the sentiments of Robert E. Lee before him, in places almost word for word, Forrest added:

I have never on the field of battle sent you where I was unwilling to go myself, nor would I advise you to a course which I felt myself unwilling to pursue. You have been good soldiers, you can be good citizens. Obey the laws, preserve your honor, and the government to which you have surrendered can afford to be and will be magnanimous.

"April 1865," writes Professor Winik:

. . . was incontestably one of America's finest hours: for it was not the deranged spirit of an assassin that defined the country at war's end, but the conciliatory spirit of leaders who led as much in peace as in war, warriors and politicians who, by their example, their exhortation, their deeds, overcame their personal rancor, their heartache, and spoke as citizens of not two lands, but one, thereby bringing the country together. True, much hard work remained. But much, too, had already been accomplished.

Had Robert E. Lee blindly followed irrational instructions to keep fighting a guerrilla war indefinitely, the surrender at Appomattox never would have taken place, and our nation's history might have been far different. Fortunately, our American tradition has never embraced the notion of blindly following orders, particularly if they involve illegal or immoral acts. No American could ever escape responsibility for such acts by saying, "I was simply following orders."

The Civil War era poet James Russell Lowell makes this point:

"Taint your eppyletts an' feathers,
Make the thing a grain more right;
"Taint afollerin' your bell-wethers
Will excuse ye in His sight;
Ef you take a sword an' dror it,
An' go stick a feller thru,
Guv'ment aint to answer for it,
God'll send the bill to you.

Without Robert E. Lee's decision to surrender - against his instructions - we would not be celebrating the 150th anniversary of Appomattox at the present time. This heroic act has not been widely recognized. It deserves to be.

The Proposal to Remove Alexander Hamilton from the $10 Bill Is an Assault on American History

Secretary of the Treasury Jack Lew announced in mid-June that the Treasury Department's 2020 redesign of the $10 bill will feature a female portrait. While including women on our currency is long overdue, and a welcome step, removing Alexander Hamilton makes no sense. In fact, it is an assault upon American history itself.

Alexander Hamilton is a towering figure, and his story is an inspiring one. Born out of wedlock in the West Indies, Hamilton emigrated to the American colonies and rose by sheer force of intellect to help shape our entire nation. He died in 1804, before reaching the age of 50, and in that short life his achievements were extraordinary.

Three years before he died he founded The New York Evening Post, which is still being published. George Washington promoted him from an artillery captain to a colonel on his staff during the Revolutionary War. In 1785, Hamilton helped found the New York Manumission Society to work for an end to slavery in that state. Emancipation was not achieved in New York until 1827, long after Hamilton's death.

In 1787, Hamilton, along with collaborators John Jay and James Madison, produced a series of 85 opinion pieces to support the ratification of the Constitution, what we now know as The Federalist Papers. Hamilton wrote 51 of the 85 essays himself, using the pseudonym Publius. When we speak of an American political philosophy, the starting point is The Federalist Papers.

That government should be clearly limited and that power is a corrupting force was the essential perception of the framers of the Constitution. The Founding Fathers were not utopians. They understood man's nature. They attempted to form a government that was consistent with, not contrary to, that nature. Hamilton pointed out that:

Have we not already seen enough of the fallacy and extravagance of those idle theories which have amused us with promises of an exemption from the imperfections, weaknesses, and evils incident to society in every shape? Is it not time to awake from the deceitful dream of a golden age, and to adopt as a practical maxim for the direction of our political conduct that we, as well as the other inhabitants of the globe, are yet remote from the happy empire of perfect wisdom and perfect virtue?

As our first Treasury Secretary under George Washington, Hamilton set the nation on a path of financial stability. He had the new federal government assume the debts of the states along with its own. He also declared that all holders of U.S. debt would be paid in a non-discriminating manner. Many were opposed to Hamilton's plans, arguing the assumption of state debts rewarded those states which had been lax in meeting their responsibilities and that a policy of non-discrimination in paying holders of U.S. debt rewarded speculators. In the end, Hamilton prevailed.

Hamilton also established a central bank - the Bank of the United States. This, again, was highly controversial. It would also be a private bank, selling stock and making loans. His collaborator on The Federalist Papers, James Madison, thought the bank was illegal, since there was no mention of a bank in the Constitution. Hamilton responded that a bank was necessary to fulfill a function the Constitution did mention, borrowing money on the credit of the United States. This was, Hamilton argued, an implied power. George Washington and the Congress agreed.

The response to the idea of removing Hamilton from the $10 bill has been uniformly negative from observers of all points of view. Former Federal Reserve Chairman Ben Bernanke said he was "appalled" by the idea of Hamilton's removal. He said:

Replace Andrew Jackson, a man of many unattractive qualities, and a poor president, on the $20 bill. Given his views on central banking, Jackson would probably be fine with having his image dropped from a Federal Reserve note. Another, less attractive possibility, is to circulate two versions of the $10 bill, one of which continues to feature Hamilton. . . . The importance of his achievement can be judged by the problems that the combination of uncoordinated national fiscal policies and a single currency has caused the Eurozone in recent years.

Ron Chernow, the author of a highly regarded biography, Alexander Hamilton (2004), laments that:

There is something sad and shockingly misguided in the spectacle of Treasury Secretary Jack Lew acting to belittle the significance of the foremost Treasury Secretary in U.S. history. . . . Hamilton was undeniably the most influential person in our history who never attained the presidency. . . . Drawing on a blank slate, Hamilton arose as the visionary architect of the executive branch, forming from scratch the first fiscal, monetary, tax and accounting systems. He assembled the Coast Guard, the customs service, and the Bank of the United States. . . . He took a country bankrupted by Revolutionary War debt and restored American credit.

Chernow declares:

Yes, by all means let us have a debate about the political figures on our currency, and, yes, let us now praise famous women. But why on earth should we start the debate by singling out and punishing Alexander Hamilton, who did so much to invent our country?

Alexander Hamilton believed that genuine freedom could only be found in a society which guaranteed economic freedom. In his Second Treatise, John Locke, the philosopher who most significantly influenced the thinking of Hamilton and the other Founding Fathers, stated that:

The great and chief end . . . of man's uniting into commonwealths and putting themselves under government is the preservation of their property. . . . Every man has a property in his own person. This nobody has any right to but himself. The labor of his body and the work of his hands, we may say, are properly his. Whatsoever, then, he removes out of the state that nature hath provided and left it in, he hath mixed his labor with it, and joined to it something that is his own, and thereby makes it his property.

Those who advocate an "equal" distribution of property claim that, in doing so, they are simply applying the philosophy of the Founding Fathers to matters of economic concern. Nothing could be further from the truth.

In The Federalist Papers, it is written that:

The diversity in the faculties of men, from which the rights of property originate, is not less an insuperable obstacle to a uniformity of interests. The protection of these faculties is the first object of government. From the protection of different and unequal faculties of acquiring property, the possession of different degrees and kinds of property immediately results.

Writing in The New York Times, Steven Rattner notes that:

The various women who've been put forward for this pioneering role - including Susan B. Anthony (a second try after her dollar coin flopped twice), Harriet Tubman and Eleanor Roosevelt - are all outstanding individuals worthy of recognition. Just don't push Alexander Hamilton aside to make room.

Writing in The Washington Post, Steven Mufson charges that, "By pushing aside Alexander Hamilton . . . the Obama administration has committed a grave historical injustice."

Jack Lew's assault on U.S. history has almost no defenders or supporters. Removing Alexander Hamilton from the $10 bill is clearly a bad idea. What, we must wonder, was Mr. Lew thinking when he came up with this proposal? *

Wednesday, 16 December 2015 12:04

Ramblings

Allan C. Brownfeld

Anarchy and Conservatism: Two Contradictory Philosophies in Danger of Collision

The original post-World War II conservative movement asked the basic question which the 19th century British Conservative Benjamin Disraeli said was essential. The first thing a conservative must ask, he declared, was what it was he meant to conserve.

What these original modern conservatives sought to conserve was the American political tradition that embraced principles upholding constitutional government, division of powers, freedom of speech, press, and religion, and a respect for individual rights that, they believed, came from the Creator.

Today, some who call themselves conservatives have developed elements of an ideological cult, embracing a series of apparently non-negotiable "principles" which take it far from the sensibilities of those in whose name they speak. A contemporary "conservative," it seems, must reject evolution, must deny climate change, must oppose any restriction on gun ownership, even for the mentally ill, and must reject almost any role for government in American society. Sen. Rand Paul (R-KY) recently said of government, "We need to shut the damn thing down." To reject any element of this virtually religious creed is to be a "RINO" (Republican in name only). What would Abraham Lincoln, Theodore Roosevelt, or Ronald Reagan think of such an enterprise?

Peter Wehner, a senior fellow at the Ethics and Public Policy Center, who served in the last three Republican administrations, notes that:

Conservatives are rightly proud of our Constitution, yet many of them are disdainful of our government. But the Constitution created our system of government, and our goal in political life should be to reform that government back into one we can be proud of again. Understanding government in this way, and taking the steps necessary to enable it to work better and therefore regain the trust of the American people is a worthy calling. And a deeply conservative one, too.

What early conservatives rejected was ideology - Nazism, Communism, fascism, and socialism - that made a wasteland of the 20th century. The American political tradition, from the beginning, was not against government, but was against its abuses, and wanted it to be limited so that freedom would be preserved. In the Federalist Papers, James Madison makes this clear:

It may be a reflection on human nature that such devices should be necessary to control the abuses of government. But what is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed, and in the next place oblige it to control itself.

Russell Kirk, whose book The Conservative Mind really launched modern conservatism, believed that the major problem of the 20th century was its commitment to "ideology."

In his book The Politics of Prudence (1993), he commends political prudence, one of the four "classical virtues," as opposed to "ideology," a word that signifies political fanaticism. In the initial chapters, some of which were delivered at the Heritage Foundation, he outlines the principles of conservative thought, summarizes important conservative books, and offers brief accounts of eminent conservatives, among them Cicero, Marcus Aurelius, Samuel Johnson, Edmund Burke, Sir Walter Scott, T. S. Eliot and Nathaniel Hawthorne.

The book, he tells us, is meant to be a "defense of prudential politics as opposed to ideological politics." He hoped to persuade the rising generation to set their faces against political extremism and utopian schemes by which the world has been afflicted since 1914:

"Politics is the art of the possible," the conservative says; he thinks of political policies as intended to preserve order, justice, and freedom. The ideologue, on the contrary, thinks of politics as a revolutionary instrument for transforming society and even transforming human nature. In the march toward Utopia, the ideologue is merciless.

The ideologies which have been so costly in our time - Communism, fascism and Nazism - are, Kirk points out, really "inverted religions." But, he notes:

. . . the prudential politician knows that "Utopia" means "Nowhere"; that we cannot march to an earthly Zion; that human nature and human institutions are imperfectible; that aggressive "righteousness" in politics ends in slaughter. True religion is a discipline for the soul, not for the state. . . . It is the conservative leader who, setting his face against all ideologies, is guided by what Patrick Henry called "the lamp of experience." In this 20th century, it has been the body of opinion generally called "conservative" that has defended the Permanent Things from ideological assaults.

Conservatism, Kirk writes:

. . . is not a bundle of theories got up by some closet philosopher. On the contrary . . . the conservative conviction grows out of experience: the experience of the species, of the nation, of the person. . . . It is the practical statesman, rather than the visionary recluse, who has maintained a healthy tension between the claims of authority and the claims of freedom. . . . The Constitution of the United States, two centuries old, is a sufficient example of the origin of conservative institutions in a people's experience . . . . (T)he Constitution . . . was rooted in direct personal experience of the political and social institutions which had developed in the Thirteen Colonies since the middle of the 17th century, and in thorough knowledge of the British growth, over seven centuries, of parliamentary government, ordered freedom and the rule of law.

The triumph of ideology would, Kirk notes, be the triumph of what Edmund Burke called the "antagonist world." This, in Kirk's view, is:

. . . the world of disorder. What the conservative seeks to conserve is the world of order that we have inherited, if in a damaged condition, from our ancestors. The conservative mind and the ideological mind stand at opposite poles. And the contest between these two mentalities may be no less strenuous in the 21st century than it has been during the 20th.

The basic difference between conservatives and the advocates of the many ideologies which clutter the intellectual landscape - including extreme forms of "libertarianism," which often border on anarchy, and "neo-conservatism," both of which are often confused with conservatism - relates to the nature of man himself:

Man, being imperfect, no perfect social order can ever be created. Because of human restlessness, mankind would grow rebellious under any utopian domination, and would break out once more in violent discontent - or else expire in boredom. . . . To seek for utopia is to end in disaster, the conservative says, "We are not made for perfect things." All that we can reasonably expect is a tolerably ordered, just, and free society, in which some evils, maladjustments, and suffering will continue to lurk. By proper attention to prudent reform, we may preserve and improve this tolerable order. But if the old institutional and moral safeguards of a nation are neglected, then the anarchic impulse in mankind breaks loose. . . . The ideologues who promise the perfection of man and society have converted a great part of the 20th century world into a terrestrial hell.

Russell Kirk advised the new generation to explore the past, discover the roots of our civilization, and work to restore its sensibility. He concludes:

Time is not a devourer only. With proper use of the life-span allotted to us, we may do much to redeem modernity from vices, terrors, and catastrophic errors.

Many who now claim to speak for conservatism, among them glib radio talk show hosts and partisan politicians, have forgotten Disraeli's question about what it is that conservatives really seek to conserve. If it is the American political tradition, embodied in our Constitution and in the thinking of the Founding Fathers, contempt for government and belief in virtual anarchy is nowhere to be found in it. Neither is adherence to a form of political orthodoxy enforced by inquisition-like tribunals. Alexander Hamilton, Thomas Jefferson, John Adams, and James Madison would find little they would recognize as an American political tradition in such phenomena. Many of those who proclaim themselves most loudly to be "conservative" are, in reality, something quite different.

"White Racism" Is the Scapegoat in Baltimore, Not the Culprit

Unrest in Baltimore, and legitimate questions about the death in police custody of Freddie Gray, a young African-American, have produced the usual charges of "white racism" and comparisons with the death of another young black man in Ferguson, Missouri, several months ago, as well as similar incidents in North Charleston, South Carolina, and Staten Island, New York. Each of these cases is different, and characterizing them as part of a single pattern of police behavior may be missing the reality of what is, in fact, taking place.

In Ferguson, for example, it was pointed out that the community was majority black and the police force was largely white. In Baltimore, however, the mayor, City Council president, police commissioner and nearly half of its 3,000-member police force are black. It is unlikely that young black men are being unfairly targeted by black city officials on the basis of race.

What we see in Baltimore's inner city - a breakdown of family life, massive unemployment, drug use, and school drop-outs - has not been created by "white racism." There are many, far more complex causes.

Baltimore was once a city where tens of thousands of blue-collar employees earned a good living in industries that built cars and airplanes and made steel. Thomas J. Vicino, the author of Transforming Race and Class in Suburbia: Decline in Metropolitan Baltimore, points to major manufacturing facilities operated by Bethlehem Steel, General Motors, and Martin Marietta. In 1970, about a third of the labor force in Baltimore was employed in manufacturing. By 2000, only 7 percent of city residents had manufacturing jobs, and the losses have continued.

Dr. Vicino, a professor at Northeastern University and a Maryland native, argues that:

We need to reframe the problem more broadly than racial profiling and police brutality. . . . The bigger context is the globalization of the economy, technological change and de-industrialization. This is a double whammy for poor black people left in the city. They are not in a position to share in the development downtown and, with the loss of manufacturing jobs, they are left, at best, with access to relatively low-paying service jobs. This, in turn, creates a spiral for those left behind, damaging families and devastating neighborhoods.

Professor William Julius Wilson of Harvard, who teaches a course based on "The Wire," the HBO show set in Baltimore, says:

Regular employment provides the anchor for the spatial and temporal aspects of daily life; it determines where you are going to be and when you are going to be there. In the absence of regular employment, life, including family life, becomes less coherent.

Globalization, as embodied at the present time by the Trans-Pacific Partnership, promotes what it calls "free" trade. Yet some critics, on both the right and the left, argue that trade which is not also "fair" puts Americans at a great disadvantage. American corporations must pay a minimum wage, obey OSHA rules about worker safety, follow environmental regulations, and deal with labor unions. None of this is true for companies in China, Bangladesh, Indonesia, or India, among others. Is such "free" trade really "free"?

Beyond all of this is the very real breakdown of the black family in our inner cities, where 72 percent of babies are born to single mothers. We know that children with absent fathers commit more crimes and are more likely to drop out of school. In his book The Best Parent Is Both Parents, David Levy, who served as president of the Children's Rights Council, reported that neither poverty nor race, but the fragile structure of the family, is the primary cause of crime.

Douglas A. Smith and G. Roger Jarjoura published findings in the Journal of Research in Crime and Delinquency analyzing victimization data on over 11,000 individuals from three urban areas. They discovered that the proportion of single-parent households in a community predicts its rate of violent crime and burglary, while poverty level does not. Furthermore, the percentage of non-whites in an area has "no significant influence on rates of violent crime."

Because so many Americans have abandoned the responsibility of child rearing, many young people are particularly vulnerable to the inducements of the drug culture. Dr. Lorenzo Merritt of Project Heavy West, a nonprofit counseling center in Los Angeles that tries to help children stay out of jail, said that such young people join gangs and the drug culture:

. . . fundamentally because of a need for acceptance and identity. It generally means an absence of a cohesive . . . family life where there is a sense of belonging and respect.

If black men are committing crime out of proportion to their numbers, it is important to consider the reason. According to a recent report issued by the Marriage and Religion Research Institute, by age 17, only 17 percent of black teenagers live with two married parents. Professor Orlando Patterson, a Harvard sociologist who is black, has published an article in the Chronicle of Higher Education lamenting that "fearful" sociologists had abandoned "studies of the cultural dimensions of poverty, particularly black poverty," and declared that the discipline had become "largely irrelevant."

FBI Director James Comey, in a speech at Georgetown University, asked:

Why are so many black men in jail? Is it because cops, prosecutors, judges, and juries are racist - because they are turning a blind eye to white robbers and drug dealers? . . . I don't think so. If it were so, that would be easier to address. . . . The percentage of young men not working or not enrolled in school is nearly twice as high for blacks as it is for whites. . . . Young people in those neighborhoods too often inherit a legacy of crime and prison, and with that inheritance they become part of the police officer's life and shape the way that officer, whether white or black, sees the world. Changing that legacy is a challenge so enormous and so complicated that it is, unfortunately, easier to talk only about the cops. And that's not fair.

In Baltimore, 25-year-old Freddie Gray, whose death in police custody has led to the growing unrest in the city, grew up in Sandtown-Winchester, one of Baltimore's most impoverished and crime-ridden communities. It has the highest incarceration rate in the state, an unemployment rate of over 50 percent for males ages 16 to 64, and a median household income of under $25,000, according to research from the Justice Policy Institute and the Prison Policy Initiative.

Gray's rap sheet was a long one. His criminal history started in July 2007 with an arrest on charges of "possession of a controlled dangerous substance with intent to distribute." Overall, he had more than a dozen arrests, mostly drug-related. The latest was in March on a charge of "possession of a controlled dangerous substance." Drugs are a major reason why police patrolled Gray's neighborhood. David Simon, creator of "The Wire" and a former Baltimore Sun journalist, says that:

. . . the drug war . . . was transforming in terms of police/community relations, in terms of trust . . . the drug war was as much a function of class and social control as it was of racism.

The real issues in Baltimore go beyond questions of racism and police behavior. Jim Pasco, executive director of the National Legislative Office of the Fraternal Order of Police, says that:

The real issue is poverty and lack of quality education, lack of economic opportunity, a decaying city infrastructure, lack of sound parenting and mentorship. These kids, the odds are against them from the time of their conception, and it's a very, very convenient political outlet to blame the police for things that go on in the inner cities. But the fact of the matter is that these are problems that generations of bad elected leadership has resulted in.

The Baltimore Police Department has numerous outreach programs that connect police with underprivileged families and give communities a chance to communicate directly with officers. It holds monthly council meetings throughout the city where community leaders can express concerns over issues in their neighborhoods. Some community leaders have even been placed on boards that determine executive promotions within the police department.

There have been some hopeful signs amidst the chaos in Baltimore. After the rioting, residents by the hundreds cleaned debris from streets and stores that had been looted. Churches organized food drives for neighborhoods hit by rioting, and teachers, who knew that closed schools meant children would go without meals, set up food stations in churches. Police and firefighters, the targets of rage when rioting began, were inundated with cakes, pies, and thanks for their service. Church, community, and political leaders took to the streets to urge calm and help enforce the curfew.

Events in Baltimore highlight the crisis facing our inner cities, a crisis with a variety of causes, from the de-industrialization of our urban areas due to globalization to the breakdown of the black family and the absence of fathers in the home. No problems can be confronted or resolved if they are misdiagnosed. The charge of "white racism" is, in reality, a scapegoat for the problems of Baltimore and other urban areas, not their real cause.

What Hillary Clinton's Attempt to Re-create Herself Tells Us About American Politics

Hillary Clinton, in announcing her presidential candidacy, is now engaged in an effort to re-create herself. The unusual commercial she used to introduce her campaign has received critical reviews, from liberals and conservatives, Republicans and Democrats. Liberal columnist Richard Cohen wrote that:

It looked like one of those Vaseline-lensed dog-food commercials, so lacking substance that I wondered if I had summoned the wrong video from the Internet. . . . All I can remember is a bunch of happy people and Clinton saying something about being on the side of the middle class. . . . I think it is no mere coincidence that the Clinton campaign now has the services of Wendy Clark, a senior marketing specialist from Coca-Cola. Maybe Clinton will "teach the world to sing."

This announcement video was followed by Clinton's strange van ride to Iowa, complete with video of her ordering a burrito bowl at Chipotle. She did this while wearing dark glasses, as did her aide Huma Abedin, which produced security-camera pictures making it appear that they were traveling incognito. She said that she would be an advocate for people like "the truckers that I saw on I-80 as I was driving here."

Perhaps an appeal to authenticity would fail in Mrs. Clinton's case in any event. After losing to Richard Nixon in 1968, the Democratic candidate, Hubert Humphrey, conceded that his effort to be authentic, to be his real self, might have done him in:

It's an abomination for a man to place himself completely in the hands of the technicians, the ghost writers, the experts, the pollsters and come out only as an attractive package.

After all of her years in public life, no one really knows where Hillary Clinton stands on any issue. The one constant is her desire to be president. In a cover article "What Does Hillary Stand For?" The Economist declares:

For someone who has been on the national stage for a quarter-century, her beliefs are hard to pin down. On foreign policy, she says she is neither a realist nor an idealist but an "idealistic realist." In a recent memoir, she celebrates "the American model of free markets for free people." Yet to a left-wing crowd, she says, "Don't let anybody tell you that, you know, it's corporations and businesses that create jobs." . . . Some candidates' views can be inferred from the advisers they retain, but Mrs. Clinton has hundreds, including luminaries from every Democratic faction. Charles Schumer, her former Senate colleague from New York, called her "the most opaque person you'll ever meet in your life."

In The Economist's view:

Skeptics raise two further worries about Mrs. Clinton. Some say she is untrustworthy - a notion reinforced by the revelation that she used a private server for her e-mails as Secretary of State, released only the ones she deemed relevant and then deleted the rest. The other worry, which she cannot really allay, is that dynasties are unhealthy, and that this outweighs any benefit America might gain from electing its first female president.

The Clinton candidacy tells us a great deal about the current state of American politics. We often forget that public opinion is usually carefully manipulated, in the present era by an army of public relations consultants. Discussing the start of a campaign some years ago, David R. Altman, chairman of the Altman, Stoller and Weiss advertising agency, assessed the influence of advertising upon American politics this way:

The annual exercise in political irrelevance has begun. Once again, the American viewing public is being subjected to a barrage of flashy thirty- and sixty-second spot announcements urging votes for this candidate or that. TV has become the most destructive political force we have known. It is an open invitation to the demagogue, a path to elective office for the incompetent but glib candidate, and it is a definite deterrent for the brilliant but dull office seeker. It has changed the rules of the game of politics from "let the better candidate have a chance to win" to "let the most appealing candidate win."

Mr. Altman charged that:

For the most part, political ads on TV perform what I consider to be a massive confidence game on the American people. Why? Because political commercials do not as a rule inform the electorate. They stimulate the emotions. They arouse passions. They polarize people on different sides of the political street. They use trickery - trick lighting, trick makeup, a full gamut of Hollywood special effects - and occasionally candidates have been known to tell lies on television. What has been the result? We consistently elect candidates who later "surprise" us - who turn out to be different from the image perceived during the campaign.

The well-known political consultant David Garth once said, "You've seen one of my campaigns, you've seen them all." His technique was to put together cinéma vérité-style clips of his candidate and show them on television. Illustrative of this technique was the 1969 reelection campaign of New York City Mayor John Lindsay, then a deeply unpopular mayor. Lindsay won a second term in large part because of Mr. Garth's advertising campaign, in which Lindsay repeatedly told voters, "I made mistakes." This approach, Garth recalled, was highly successful.

Now, Hillary Clinton is trying to become a person different from the one all of us have come to know, realizing that victory in the presidential race requires such a radical make-over. Is this really going to be a successful enterprise? Former Sen. Jim Webb of Virginia, who is considering a presidential run of his own, says that "people are looking for leadership they can trust" and that Americans would like to go back to the party of Harry S. Truman and Franklin D. Roosevelt.

Webb states:

Think about Harry Truman, what he would be saying to someone who told him he needed a consultant to show him how to dress or a lifestyle consultant to tell him that he needed to go to Wal-Mart. You know, we need people who will, in politics, lead the same way that they live.

Enthusiasm for Hillary Clinton's candidacy is difficult to find among the liberal commentators who might be expected to be a bit more supportive. Washington Post columnist Eugene Robinson wrote that:

The choreographed launch was over-thought, over-produced and, in the scheme of things, not terribly important in details. Everyone already knew she was running.

Hillary Clinton's candidacy tells us a great deal about contemporary American politics, none of it good. *

Wednesday, 16 December 2015 12:01

Ramblings

Allan C. Brownfeld

As Extremism Grows Among Europe's Muslim Immigrants, There Are Lessons to Be Learned from America's Melting Pot

Young Muslims from Europe are traveling to the Middle East in growing numbers to join ISIS. They have been involved in the beheadings of Western hostages and are urging others in Europe and the United States to leave and join their ranks.

There is, it seems, at least some level of support for this extremism within immigrant communities. A poll of British Muslims found that 27 percent had some sympathy for the motives behind January's Islamist attacks in Paris against the satirical magazine Charlie Hebdo and a kosher supermarket. Eleven percent agreed that those who publish images of the Prophet Mohammed deserve to be attacked. The poll was conducted for the BBC between January 26 and February 20. There are about 2.8 million Muslims in Britain, about 4.4 percent of the population. "These are, as far as I'm concerned, worrying statistics," said Sayeeda Warsi, who was Britain's first female Muslim minister before resigning last year over the government's policy on the war in Gaza.

In France, as the nation reeled from the terrorist attacks in Paris, reports filled the newspapers and TV newscasts of young Muslim students refusing to honor the dead, highlighting the sharp divisions in French society. Young Muslims in France live largely in a separate, segregated world. A 2012 study by the Organization for Economic Cooperation and Development found that France leads Europe in educational inequalities stemming from social and ethnic origins. France's National Council for the Evaluation of the School System has spoken of "school ghettos," referring to districts where dropout rates are high and performance is exceptionally weak.

Starting in the 1950s, immigrants from Algeria, Morocco and Tunisia began arriving in France. They were often sent to isolated housing projects that bred alienation. The expected smooth integration never took place. Despite a high unemployment rate, approximately 200,000 immigrants have arrived in France every year since 2004. Muslims now make up about 8 percent of the country's population, constituting the largest Muslim population in Western Europe. Four out of ten French people recently surveyed said they considered Muslims "to be a threat to our national identity." Marine Le Pen, the leader of France's increasingly popular National Front, referred to Muslims praying in the street as an "occupation" of France.

The failure of France and other European countries to assimilate their growing Muslim immigrant populations into the larger society is sowing seeds of future turmoil. The exodus of young recruits to ISIS from the immigrant neighborhoods of Paris, London, Brussels, Copenhagen, and other European cities is an indication of further turmoil to come. These young people with British, Danish, and French passports are likely to return and emulate the terrorist attacks we have recently witnessed.

Americans are not immune from this phenomenon. The attack upon the Boston Marathon is one example. The number of young American Muslims who have joined ISIS is, thus far, small, but efforts to recruit in immigrant communities, as among Somalis in Minnesota, are growing. But our own society has long experience with assimilating immigrants from around the world, integrating them into our society, and making them Americans. As Herman Melville wrote in the 19th century, "You cannot spill a drop of American blood without spilling the blood of the whole world." For its own survival, Europe would do well to study our melting pot experience.

Some time ago, Professor Seymour Martin Lipset of the Hoover Institution at Stanford University criticized those who were promoting bilingualism and multiculturalism in American public schools:

The histories of bilingual and bicultural societies that do not assimilate are histories of turmoil, tension, and tragedy. Canada, Belgium, Malaysia, and Lebanon - all face crises of national existence in which minorities press for autonomy, if not independence. Pakistan and Cyprus have divided. Nigeria suppressed an ethnic rebellion. France faces difficulties with its Basques, Bretons, and Corsicans.

Remembering the way American public schools served to bring children of immigrants into the mainstream, Fotine Z. Nicholas, who taught for 30 years in New York City schools and wrote an education column for a Greek-American weekly, notes:

I recall with nostalgia the way things used to be. At P.S. 82 in Manhattan, 90 percent of the students had European-born parents. Our teachers were mostly of Irish origin, and they tried hard to homogenize us. We might refer to ourselves as Czech or Hungarian or Greek but we developed a pride in being American. . . . There were two unifying factors: the attitude of our teachers and the English language. . . . After we started school, we spoke only English to our siblings, our classmates and our friends. We studied and wrote in English, we played in English, we thought in English.

Successive waves of immigrants have assimilated into American society. They entered a United States that had self-confidence, believed in its own culture, history, and values, and was determined to transmit them to the newcomers. And the immigrants themselves wanted to become Americans. Our traditional response to the problem of assimilation, The Economist points out:

. . . was to treat each immigrant as an individual. . . . The essential American promise is that individuals will rise or fall on their own merits. . . . Waving the banner of diversity, opponents of the melting pot are in danger of promoting ethnic division as a matter of public policy. . . . The government should not only oppose legal distinctions between ethnic groups; it should also do more to build a common American culture through education. . . . If children are taught to see themselves as members of an ethnic group, rather than as Americans, the U.S. will rapidly become disunited.

If some in the U.S. have retreated from our melting pot philosophy, the countries of Western Europe have never properly embraced it. They seem not to know how to make their Muslim immigrants French, British, or Belgian. Perhaps they should have considered this dilemma more carefully before they opened their doors to these immigrants. They now have a lot of catching up to do.

It is important to remember that by coming to the U.S. and Western Europe, immigrants are voting with their feet for our system and our way of life. They should be helped to assimilate into our societies, not to recreate here and in Europe the very systems they have escaped at such high cost.

In his Wriston lecture on "Universal Civilization," V. S. Naipaul, the son of immigrant Indian laborers who grew up in post-colonial Trinidad and was educated in England, contrasts some of the static, inward-looking, insular, backsliding "non-Western" cultures with that spreading "universal civilization" that he finds to be based on Jefferson's idea of the pursuit of happiness. Discussing the essence of Western civilization, which sets it apart from others, Naipaul characterizes it in these terms:

The ideal of the individual, responsibility, choice, the life of the intellect, the idea of vocation and perfectibility and achievement. It is an immense human idea. It cannot be reduced to a fixed system nor generate fanaticism. But it is known to exist and because of that, other more rigid systems in the end blow away.

American society traces the rights we take for granted back to the Magna Carta. The ideas of trial by jury, due process of law, and limits upon government power come from this ancient English charter. The fact that the majority of present-day Americans cannot trace their individual ancestry to England bears little relationship to the British nature of American culture. In America's British Culture, Russell Kirk argues that:

Two centuries after the first U.S. census was taken, nearly every race and nationality in the world had contributed to the American population, but the culture of America remains British. . . . The many millions of newcomers to the U.S. have accepted integration into the British-descended American culture with little protest, and often with great willingness.

The challenge for Western Europe is to assimilate its growing Muslim immigrant population into the French, British and other cultures and societies in which they now live. The American experience provides a model of how this might be achieved. If these immigrants remain isolated and alienated, Europe will face increasingly stormy days ahead.

Family Breakdown: One Important Cause of Many of Society's Ills

In 1965, Daniel Patrick Moynihan, then an assistant secretary of labor, who went on to serve as a Democratic U.S. senator from New York for nearly a quarter-century, issued a report warning of a crisis growing for America's black families. It reported a dramatic increase in out-of-wedlock births and one-parent families and warned of the "tangle of pathologies" that resulted. Among these were poor performance in school, increased drug use, and a growing rate of incarceration for crime.

"The Moynihan argument . . . assumed that the troubles impending for black America were unique," writes Nicholas Eberstadt of the American Enterprise Institute.

. . . a consequence of the singular historical burdens that black Americans had endured in our country. That argument was not only plausible at the time, but also persuasive. Yet today that same "tangle of pathology" can no longer be described as characteristic of just one group within our country. Quite the contrary . . . these pathologies are evident throughout all of America today, regardless of race or ethnicity.

Single motherhood has become so common in America that demographers believe that half of all children will live with a single mother at some point before age 18. Research from Princeton University's Sara McLanahan and Harvard University's Christopher Jencks shows that more than 70 percent of all black children are born to an unmarried mother, a threefold increase since the 1960s.

In a new paper, McLanahan and Jencks assess the state of children born to single mothers, nearly fifty years after the Moynihan Report warned that the growing number of fatherless black children would struggle to avoid poverty. The report looks prescient. Black children today are about twice as likely as the national average to live with an unmarried mother. Research is confirming Moynihan's fears that children of unmarried mothers face more obstacles in life.

In the studies reviewed by McLanahan and Jencks, it was found that these children experience more family instability, with new partners moving in and out, and more half-siblings fathered by different men. The growing number of studies in this field also suggests that these children have more problem behaviors and more trouble finishing school.

The growing debate about income inequality ignores the evidence that shows that unwed parents raise poorer children. Isabel Sawhill of the Brookings Institution calculates that returning marriage rates to their 1970 level would lower the child poverty rate by a fifth. There may be a partisan political reason why this point is not made more often. The Economist suggests that, "This omission may be deliberate. Democrats are reluctant to offend unmarried women, 60 percent of whom voted for the party's candidates in 2014."

There may be, some observers point out, a connection between government welfare programs and the breakdown of the family, as well as the declining number of men in the workforce. As late as 1963, on the eve of the War on Poverty, more than 93 percent of American babies were coming into the world with two married parents. According to the 1960 census, nearly 88 percent of children under 18 were then living with two parents. For the quarter century from 1940 to 1965, official data recorded a rise in the fraction of births to unmarried women from 3.8 percent to 7.7 percent. Over the following quarter century, 1965-1990, out-of-wedlock births jumped from 7.7 percent of the nationwide total to 28 percent. The most recent available data, for 2012, show that America's overall out-of-wedlock ratio had moved beyond 40 percent.

The trends discussed in the 1965 Moynihan Report for black families have now extended to American families of all racial backgrounds. Among Hispanic Americans, more than 30 percent of children were in single-parent homes by 2013, and well over half were born out of wedlock by 2012. Among non-Hispanic white Americans, there were few signs of family breakdown before the massive government entitlement programs began with the War on Poverty in the 1960s. Between 1940 and 1963, the out-of-wedlock birth ratio increased, but only from 2 percent to 3 percent. In 1960, just 6 percent of white children lived with single mothers. As of 2012, the proportion of out-of-wedlock births was 29 percent, nearly 10 times as high as it was just before the War on Poverty.

In his study, The Great Society at Fifty: The Triumph and the Tragedy, Nicholas Eberstadt argues that:

What is indisputable . . . is that the new American welfare state facilitated these new American trends by helping to finance them: by providing support for working-age men who are no longer seeking employment and for single women with children who would not be able to maintain independent households without government aid. Regardless of the origins of the flight from work and family breakdown, the War on Poverty and successive welfare policies have made each of these modern tendencies more feasible as mass phenomena in our country today.

The War on Poverty, of course, did not envision such a result. These were unintended consequences of the sort that, as we have seen, often accompany well-intentioned government programs. President Lyndon Johnson wanted to bring dependence on government handouts to an eventual end, not to perpetuate it into the future. Three months after his Great Society speech, Johnson declared:

We are not content to accept the endless growth of relief rolls or welfare rolls. . . . Our American answer to poverty is not to make the poor more secure in their poverty but to reach down and to help them lift themselves out of the ruts of poverty and move with the large majority along the high road of hope and prosperity.

In Eberstadt's view:

Held against this ideal, the actual unfolding of America's antipoverty policies can be seen only as a tragic failure. Dependence on government relief, in its many modern versions, is more widespread today, and possibly also more habitual, than at any time in our history. To make matters much worse, such aid has become integral to financing lifestyles and behavioral patterns plainly destructive to our commonwealth - and on a scale far more vast than could have been imagined in an era before such antipoverty aid was all but unconditionally available.

Any serious discussion of poverty and the growing gaps in income must confront the reasons why, for example, in the past 50 years, the fraction of civilian men ages 25 to 34 who were neither working nor looking for work has quadrupled and that for many women, children, and even working-age men, the entitlement state has become the breadwinner. Daniel Patrick Moynihan once said "the issue of welfare is not what it costs those who provide it, but what it costs those who receive it."

At the heart of the social and economic decline we face at the present time is the breakdown of the family. Few in the political arena, in either party, are addressing this question. Unless they do, their proposals to move our economy forward and lessen the gaps in income and wealth are unlikely to succeed.

There Is a Growing Danger that Police Are Being Made Scapegoats for Larger Racial Problems that Society Ignores

The attacks upon police for "racism" have been mounting as a result of the killings of black men in Ferguson, Missouri, Staten Island, and elsewhere. Many with a history of demagoguery when it comes to questions of race relations, Jesse Jackson and Al Sharpton among them, have done their best to keep this issue alive. Sadly, they have cast more heat than light on a question that is far more complex than their self-serving analysis would lead Americans to believe.

Recently, FBI director James Comey addressed this question. At the outset, he declared certain "hard truths," including the fact that the history of law enforcement has been tied to enforcing slavery, segregation, and other forms of discrimination. "One reason we cannot forget our law enforcement legacy," he said, "is that the people we serve and protect cannot forget it, either."

Mr. Comey also acknowledged the existence of unconscious racial bias "in our white-majority culture," and how that influences policing. He conceded that people in law enforcement can develop "different flavors of cynicism" that can be "lazy mental shortcuts," resulting in more pronounced racial profiling.

But he then warned against using police as scapegoats to avoid coming to grips with much more complex problems affecting minority communities, including a lack of "role models, adequate education, and decent employment," as well as "all sorts of opportunities that most of us take for granted." In his address at Georgetown University, Comey declared:

I worry that this incredibly important and difficult conversation about policing has become focused entirely on the nature and character of law enforcement officers when it should also be about something much harder to discuss.

Citing the song "Everyone's a Little Bit Racist" from the Broadway show "Avenue Q," Comey said that police officers of all races viewed black and white men differently using a mental shortcut that "becomes almost irresistible and maybe even rational by some lights" because black men commit crime at a much higher rate than white men.

Comey said that nearly all police officers had joined the force because they wanted to help others. Speaking in personal terms, he described how most Americans had initially viewed Irish immigrants like his ancestors "as drunks, ruffians, and criminals." He noted that, "Law enforcement's biased view of the Irish lives on in the nickname we still use for the vehicle that transports groups of prisoners. It is, after all, the 'Paddy Wagon.'"

If black men are committing crime out of proportion to their numbers, it is important to consider the reason. According to a report just released by the Marriage and Religion Research Institute (MARRI), by age 17, only 17 percent of black teenagers live with two married parents. Professor Orlando Patterson, a Harvard sociologist who is black, published an article in December in the Chronicle of Higher Education, lamenting that "fearful" sociologists had abandoned "studies of the cultural dimensions of poverty, particularly black poverty," and declared that the discipline had become "largely irrelevant."

Now, Patterson and Ethan Fosse, a Harvard doctoral student, are publishing a new anthology called The Cultural Matrix: Understanding Black Youth. In Patterson's view, fifty years after Daniel Moynihan issued his report about the decline of the black family, "History has been kind to Moynihan." Moynihan was concerned about an out-of-wedlock birth rate in the black community of 25 percent. According to the Centers for Disease Control and Prevention, the equivalent rate for 2013 was 71.5 percent. (The rate for non-Hispanic whites was 29.3 percent.)

The inner-city culture that promotes the social dissolution that results in crime has been written about for many years by respected black observers. In 1899, the scholar W. E. B. Du Bois drew on interviews and census data to produce The Philadelphia Negro: A Social Study. He spent a year living in the neighborhood he wrote about, in the midst of what he described as "an atmosphere of dirt, drunkenness, poverty and crime." He observed, in language much harsher than Moynihan's, the large number of unmarried mothers, many of whom he referred to as "ignorant and loose." He called upon whites to stop employment discrimination, which he called "morally wrong, politically dangerous, industrially wasteful, and socially silly." He told black readers they had a duty to work harder, to behave better, and to stem the tide of "Negro crime," which he called "a menace to civilized people."

In 1999, on the hundredth anniversary of Du Bois's study, Elijah Anderson published a new sociological study of poor black neighborhoods in Philadelphia, Code of the Street, and recorded its informants' characterization of themselves and their neighbors as either "decent" or "street" or, in some cases, a bit of both. In The Cultural Matrix, Orlando Patterson lists "three main social groups" - the middle class, the working class, and "disconnected street people" - that are common in "disadvantaged" African-American neighborhoods. He also lists "four focal cultural configurations" (adapted mainstream, proletarian, street, and hip-hop).

Patterson views the "hip-hop" culture of the inner city as a destructive phenomenon: he compares MC Hammer to Nietzsche, contends that hip-hop routinely celebrates "forced abortions," and calls Lil Wayne "irredeemably vulgar" and "all too typical" of the genre. Tommie Shelby, a professor of African and African-American Studies at Harvard, writes in The Cultural Matrix that "suboptimal cultural traits" are the major impediment for many African-Americans seeking to escape poverty. "Some in ghetto communities," he writes, "are believed to devalue traditional co-parenting and to eschew mainstream styles of childbearing."

In his speech on race in 2008, President Obama said that African-Americans needed to take more responsibility for their own communities by "demanding more from our fathers." Fifty years ago, Daniel Moynihan worried that "the Negro community" was in a state of decline with an increasingly matriarchal family structure that led to increasing crime. In the fifteen years after he published his report, the homicide rate doubled, with blacks overrepresented among both perpetrators and victims.

Orlando Patterson, in a recent interview with Slate, said: "I am not in favor of a national conversation on race," and noted that most white people in America had come to accept racial equality. But whether or not such a "national conversation" is useful, we are now in the midst of such an enterprise. FBI director Comey is contributing to that exchange. He asks:

Why are so many black men in jail? Is it because cops, prosecutors, judges and juries are racist - because they are turning a blind eye to white robbers and drug dealers? . . . I don't think so. If it were so, that would be easier to address. . . . The percentage of young men not working or not enrolled in school is nearly twice as high for blacks as it is for whites. . . . Young people in those neighborhoods too often inherit a legacy of crime and prison, and with that inheritance they become part of the police officer's life and shape the way that officer, whether white or black, sees the world. Changing that legacy is a challenge so enormous and so complicated that it is, unfortunately, easier to talk only about the cops. And that's not fair.

A New Look at the Declaration of Independence

American students are less proficient in their nation's history than in any other subject, according to results of a recent nationwide test, with most fourth graders unable to say why Abraham Lincoln was an important figure, and few high school seniors able to identify China as the North Korean ally that fought American troops during the Korean War.

Diane Ravitch, an education historian, was invited by the National Assessment of Educational Progress, which conducted the test, to review the results. She said she was particularly concerned by the fact that only 2 percent of 12th graders correctly answered a question concerning Brown v. Board of Education. Students were given an excerpt including the following passage and were asked what social problem the 1954 ruling was supposed to correct:

We conclude that in the field of public education the doctrine of "separate but equal" has no place. Separate educational facilities are inherently unequal.

Ms. Ravitch declared: "The answer was right in front of them. This is alarming." Secretary of Education Arne Duncan said, "The results tell us that, as a country, we are failing to provide children with a high-quality, well-rounded education."

Yet if our schools are not teaching history, there is nonetheless widespread interest in it in the larger society. Biographies of figures such as Abraham Lincoln, John Adams, Thomas Jefferson, and Andrew Jackson have all been best sellers in recent years. Television series about major historical events such as the Civil War have attracted large audiences.

Now, much attention is being given to a new study of the Declaration of Independence, Our Declaration, by Danielle Allen, a professor at the Institute for Advanced Study in Princeton, New Jersey. Allen's father, Bill Allen, is also a scholar. He was one of the early black conservatives and has long opposed race-based programs that judge men and women on the basis of their race and ethnic background rather than their individual merit.

While many Americans think of Thomas Jefferson as the sole author of the Declaration, Danielle Allen points out that, "The monumental achievement of Thomas Jefferson is, ultimately, to have produced a first draft."

She notes that Jefferson shared his draft with John Adams and Benjamin Franklin, who offered suggestions. It then went through the Committee of Five and on to Congress, which, from July 2 to July 4, edited the Declaration extensively, "much to Jefferson's chagrin." Congress cut about a quarter of Jefferson's words, added some references to divinity, and took out a section attacking slavery.

In Allen's view:

With changes such as these, with God edited in and a condemnation of slavery elided, Congress achieved a text that the men of that day and age could live with, including Jefferson, grumpily.

She views this process in positive terms. Jefferson, she writes, produced a work of such "philosophical integrity and unquestionable brilliance" that it could survive the "intense committee work." And, she argues, the committee work reflected the Founders' belief in equality.

This process of "democratic writing," coming after many months of discussion and argument not only in Philadelphia but throughout the colonies, is, Allen believes, worthy of celebration:

There is no other way for a free and equal people to chart its course. Our only chance to achieve collective happiness comes through extensive conversation punctuated here and there with votes, which will themselves over time, in their imperfection, simply demand of us more talk.

The Declaration declares the right of a people to create a government "most likely to effect their Safety and Happiness." This, in turn, writes Allen, rests on a radical notion - "As judges of our own happiness, we are equals" - and on "the unrelenting work for which each of us, in face of this equality, must take responsibility."

The book Our Declaration is a line-by-line, often word-by-word commentary on our Founding document. It was inspired by Danielle Allen's experience teaching the Declaration to night school students in Chicago. She writes about growing up in a mixed-race African-American family whose dinner conversations often turned to the Declaration and its pronouncement that "all men" are created equal.

The book has stirred some controversy. Reviewing Our Declaration in The New York Times Book Review, Professor Steven B. Smith of Yale writes:

This book makes three large claims about the Declaration of Independence, one that is profoundly true, another that is debatable, and a third, I would say, that is false. Its principal truth is that when Jefferson wrote "all men are created equal," this was genuinely meant to apply to all, black as well as white. There is moral cosmopolitanism in the Declaration's language.

Second, Smith notes, is the considerable attention Allen devotes to the famous "Laws of Nature and of Nature's God" clause. He asks:

How much does the Declaration depend on a theistic orientation? Jefferson and his colleagues speak of rights as being endowed by our Creator. An endowment suggests that these rights are not self-created but a gift. Yet as Allen correctly notes, the God invoked by the Declaration is certainly not the God of the Bible. . . . Allen seems to argue . . . that the language of divinity is entirely marginal to the text . . . she says the Declaration's language of "self-evident" truths is drawn not from Scripture but from logic. . . . She confidently affirms that one does not need to be a theist to accept the arguments of the Declaration. It is not at all clear that this confidence was shared by the authors of the text.

What Smith finds to be Allen's "least plausible" assertion is her claim about the "group writing" that went into the composition of the Declaration. She expresses the view that group writing shows how something called the "collective mind" contributes to the production of our shared moral vocabulary. Smith disagrees. He notes that Jefferson's original draft contained a strong denunciation of the slave trade as a "war against human nature," and writes that:

The passage was deleted by the Continental Congress as too inflammatory. . . . Jefferson's relationship to slavery was, as Allen observes, "maddeningly complex," but had his words not been compromised by the group, they would have rendered impossible later misrepresentations of the Declaration as expressing the economic self-interest of the slave owners.

Another area of debate has been Allen's belief that "equality" is central in the Declaration.

Traditionally, American society has seen the claims of liberty and equality as pulling in opposite directions. And in the battle between liberty and equality, the claims of individual liberty have held the dominant position. There is, of course, some historical evidence to back up Allen's claim. In Democracy in America, written in 1835, Alexis de Tocqueville wrote that "Americans are so enamored of equality that they would rather be equal in slavery than unequal in freedom."

More often, however, we have understood "equality" to mean equal opportunity to go as far as our individual talents and hard work will take us, as well as equality before the law, and in the eyes of God, not equality of condition. In Capitalism and Freedom, economist Milton Friedman states that, "The 19th century liberal regarded an extension of freedom as the most effective way to promote welfare and equality; the 20th-century liberal regards welfare and equality as either prerequisites of or alternatives to freedom."

Whether or not one agrees with all of Danielle Allen's conclusions, the fact is that she has produced an important book. Her passion for each of the Declaration's 1,337 words is extraordinary. And to see Americans focusing on their history, exploring what Jefferson, Adams, Franklin and their colleagues really meant, is hardly a minor achievement in our era of popular culture and political correctness.

Urging Jews to Flee France Is Calling for a Posthumous Victory for Hitler

In the wake of the terrorist attack in Paris, which included an assault on a kosher grocery store, Israeli Prime Minister Benjamin Netanyahu traveled to France and urged French Jews to flee their country and emigrate - make "aliyah" - to Israel.

He declared: "I wish to tell all French and European Jews - Israel is your home." He said that he would convene a special committee to promote emigration from France and other European countries.

Yair Lapid, Netanyahu's former finance minister, said: "European Jewry must understand that there is just one place for Jews, and that is the state of Israel." This, of course, is what Zionism believes, that Israel is the "homeland" of all Jews and that those Jews living in France, England, the United States and elsewhere are really in "exile."

This is an ideological construct that has no relationship to reality. The overwhelming majority of American Jews, for example, have always believed that Judaism is a religion of universal values, not a nationality, and that rather than being in "exile" in America, they are fully at home. This view has been expressed repeatedly in our history. In 1841, at the dedication ceremony of Temple Beth Elohim in Charleston, South Carolina, Rabbi Gustav Poznanski declared: "This country is our Palestine, this city our Jerusalem, this house of God our temple."

There is widespread dismay in France at the Israeli notion that French Jews are not really French and their real "home" is Israel. The horrors of terrorism that have been inflicted upon Paris and elsewhere are being confronted by the governments involved. French Prime Minister Manuel Valls said, "If 100,000 Jews leave, France will no longer be France. The French Republic will be judged a failure."

Rabbi Menachem Margolin, director of the European Jewish Association, said that far better than emigration to Israel would be the preservation and protection of Jewish life in the many countries Jews call home. He regretted that

. . . after every anti-Semitic act in Europe, the Israeli government issues the same statement about the importance of aliyah rather than employ every diplomatic and international means at its disposal to strengthen the safety of Jewish life in Europe.

He said: "The Israeli government must stop this Pavlovian response every time there is an attack against Jews in Europe."

Yonathan Arli, Vice President of CRIF, an umbrella group of Jewish institutions in France, says that he believes Jews should remain in France, which is their home. "We have had a Jewish community living here for more than a thousand years," he said.

We went through bombing attacks, the Holocaust, acts of terrorism, and we are not about to leave now. We just want to be safe.

Writing from Paris in The Forward, Laurent-David Samama notes that while some French Jews might be considering emigration:

. . . others - including young Jews like me - feel that making aliyah is a too-easy escape; it's simply not the answer. Those of us who remain in Paris, Marseille or Lyon are determined not to let the terrorists win. Throughout French history, Jews have experienced many periods of crisis. We've always overcome them, and we will overcome them again. Now more than ever . . . there is another communal faction that believes France needs us to stay here, to play the role of social whistleblower.

Smadar Bar-Akiva, executive director of JCC Global, a network of Jewish community centers, declared:

Jews in France clearly feel that last week's events were a turning point in their lives. Yet the calls for French Jews to pack their bags and make aliyah are disturbing and self-serving. . . . It will be more constructive to help French Jewry continue the educational and social work they are already doing.

Uri Avnery, the leader of Israel's peace movement, Gush Shalom, noted that:

The blood of the four Jews murdered in the kosher supermarket was not yet dry when Israeli leaders called upon the Jews of France to pack up and come to Israel. Israel, as everybody knows, is the safest place on earth. This was an almost automatic Zionist gut reaction. . . . The basic belief of Zionism is that Jews cannot live anywhere except in the Jewish state, because the victory of anti-Semitism is inevitable everywhere. Let the Jews of America rejoice in their freedom and prosperity - sooner or later they will come to an end. They are doomed like Jews everywhere outside of Israel. The new outrage in Paris confirms this basic belief. There was very little real commiseration in Israel. Rather a secret sense of triumph. The gut reaction of ordinary Israelis is: "We told you so!" and "Come quickly, before it's too late."

Israel is doing its best to make Jews feel unsafe in their native countries. In mid-January, the Israeli embassy in Dublin posted an image on Facebook showing the Mona Lisa wearing a hijab and carrying a large rocket. The line underneath read, "Israel is the last frontier of the free world." In a similar vein, the Arab Affairs correspondent of Israel's Channel 10 broadcast a fear-mongering "investigation" from London supposedly proving that the city was overrun with Islamic extremists.

Writing in Mondoweiss, Jonathan Cook points to the similar worldview of Zionists and traditional anti-Semites:

Israeli politicians of both right and left have parroted his (Netanyahu's) message that European Jews know "in their hearts that they have only one country." The logical corollary is that Jews cannot be loyal to other states they live in, such as France. . . . In this regard, Netanyahu and the far right share much common ground. He wants a Europe free of Jews. The far right wants the same. . . . One Israeli commentator noted pointedly that Israeli politicians like Netanyahu "were helping to finish the job started by the Nazis and their Vichy collaborators: making France Judenrein."

Sadly, the Israeli government has never recognized that Jewish citizens of France, the United States, the United Kingdom, and other countries are not "Israelis in exile." Mr. Netanyahu has repeatedly called upon American Jews to make a "mass aliyah" to Israel. No other foreign government argues that millions of Americans, because of their religion, are in "exile" in the United States and that their real "homeland" is that foreign country.

Such claims distort the meaning of Judaism almost completely. In 1929, the Orthodox Rabbi Aaron Samuel Tamaret wrote that the very notion of a sovereign Jewish state as a spiritual center was "a contradiction to Judaism's ultimate purpose":

Judaism at root is not some religious concentration which can be localized in a single territory. Neither is Judaism a "nationality" in the sense of modern nationalism. . . . No, Judaism is Torah, ethics and exaltation of spirit. . . . It cannot be reduced to the confines of any particular territory. For as Scripture said of Torah, "Its measure is greater than the earth."

Israel should be content to be the "homeland" of its own citizens, Jewish, Christian and Muslim, and stop attempting to speak in the name of millions of men and women who are citizens of other countries. No other country does this. And its call for French Jews to abandon their country at a time of crisis is unseemly in the extreme. Claude Lanzmann, the widely respected French Jewish filmmaker, best known for his Holocaust documentary film Shoah, said, quite wisely, that following Benjamin Netanyahu's advice would have only one result, giving Hitler, who did his best to rid France and all of Europe of Jews, "a posthumous victory." *

Wednesday, 16 December 2015 11:58

Ramblings

Allan C. Brownfeld

When American Society Is Called "Racist" - to What Is It Being Compared?

In recent days, in the wake of a number of questionable interactions between the police and black men, there have been demonstrations, sometimes violent, proclaiming that "racism" is embedded in the American society. Addressing these protests, President Obama said that racism is "deeply rooted" in our country.

In an interview with BET early in December, Mr. Obama declared:

This is something that is deeply rooted in our history. When you're dealing with something as deeply rooted as racism or bias . . . you've got to have vigilance. . . .

Those who proclaim that America is a "racist" society do not tell us which other countries - either contemporary or historical - they are comparing it to. If they did look around the world - or through history - they would find that our society, although flawed as is every human endeavor, is unique - not for drawing lines between people but for embracing something quite different. American nationality is not based on common race, religion, or ethnic background, but on a commitment to live in a free and open society and to fulfill the responsibilities of citizenship. Americans come in all colors, religions, and backgrounds. Japan, Sweden, Nigeria, and most other nations cannot say the same.

Those who proclaim the "racist society" thesis often go back to the question of slavery, as if this inhumane practice were an American invention. From the beginning of recorded history until the 19th century, slavery was the way of the world. It was not an American peculiarity: in 1787, when the Constitution was being written, slavery was legal everyplace in the world. What stands out is that in the American colonies there was strenuous objection to slavery and that the most prominent framers of the Constitution wanted to eliminate it at the very start of the nation.

Slavery is as old as recorded history. Most people in the ancient world regarded slavery as a natural condition of life, one that could befall anyone at any time. It existed among nomadic pastoralists in Asia, hunting societies of North American Indians, and sea people such as the Norsemen. The legal codes of Sumeria provide documentary evidence that slavery existed there as early as the 4th millennium B.C. The Sumerian symbol for slaves in cuneiform writing suggests "foreign."

The poems of Homer supply evidence that slavery was an integral part of ancient Greek society, possibly as early as 1200 B.C. Plato opposed enslavement of Greeks by Greeks, regarding bondservants as essentially inferior beings. His pupil Aristotle considered slaves as mere tools, lucky to have the guidance of their masters. At the time of Pericles, Athens had 43,000 citizens, who alone were entitled to vote and discharge political functions, 28,500 metics, or resident aliens, and 115,000 slaves. A century and a half later, Demetrius of Phalerum took a census of the city and counted only 21,000 citizens, 10,000 metics and 400,000 slaves.

The Bible ratifies slavery, although it called for humane treatment of slaves. In England, 10 percent of the persons enumerated in the Domesday Book (A.D. 1086) were slaves, and these could be put to death with impunity. Portugal imported large numbers of African slaves to work her estates in the southern provinces and to do menial labor in the cities from 1444 on. By the middle of the 16th century, Lisbon had more black than white residents. In 1515, the Portuguese king ordered that dead slaves be denied Christian burial and thrown into a "common ditch" called "Poco dos Negros."

When the Constitutional Convention met in Philadelphia in 1787, not a single nation had made slavery illegal. As they looked back through history, the framers saw slavery as an accepted and acceptable institution. It was not until 1792 that Denmark became the first Western nation to abolish the slave trade. What stands out historically is that so many of the leading men of the American colonies of that day wanted to eliminate it - and pressed vigorously to do so.

Benjamin Franklin and Alexander Hamilton were ardent abolitionists. John Jay, who would become the first Chief Justice of the Supreme Court, was president of the New York Anti-Slavery Society. Rufus King and Gouverneur Morris were in the forefront of the opposition to slavery and the slave trade. One of the great debates at the Constitutional Convention related to the African slave trade. George Mason of Virginia made an eloquent plea for making it illegal:

This infernal traffic originated in the avarice of British merchants. The British government constantly checked the attempt of Virginia to put a stop to it. . . . Slavery discourages arts and manufactures. The poor despise labor when performed by slaves. . . . Every master of slaves is born a petty tyrant. They bring the judgment of heaven on a country.

While many criticized the framers for not eliminating the slave trade immediately, others understood that they had set in motion an opposition to slavery that would bear fruit in the future. Oliver Ellsworth of Connecticut stated:

Slavery, in time, will not be a speck in our country. Provision is already made in Connecticut for abolishing it. And the abolition has already taken place in Massachusetts.

Professor Samuel Huntington points to the truly historic meaning of the Constitutional Convention and its product:

This is a new event in the history of mankind. Heretofore most governments have been formed by tyrants, and imposed on mankind by force. Never before did a people, in time of peace and tranquility, meet together by their representatives and, with calm deliberation, frame for themselves a system of government.

It took a Civil War, the Emancipation Proclamation, the civil rights movement, and legislation ending racial discrimination to move our society toward the goal of color-blindness, toward judging each citizen, as Martin Luther King urged, on the content of his or her character, not the color of his or her skin. Today, black Americans face no glass ceilings. We have a black president and attorney general and have had two black secretaries of state. Things are not perfect, and they never will be. But we should sometimes pause and remember how far we have come.

In a recent interview with New York Magazine, comedian Chris Rock, who is black, discussed his daughters:

I drop my kids off and watch them in the school with all these mostly white kids, and I got to tell you. I drill them every day: Did anything happen today? Did anybody say anything? They look at me like I am crazy. . . . My kids grew up not only with a black president but with a black Secretary of State, a black Chairman of the Joint Chiefs of Staff, a black Attorney General. My children are going to be the first black children in the history of America to actually have the benefit of the doubt of just being moral, intelligent people. . . . The advantage that my children have is that my children are encountering the nicest white people that America has ever produced.

Ellis Cose, who is black, wrote a book, The Rage of a Privileged Class, in 1993, in which he argued that many successful black Americans "were seething about what they saw as the nation's broken promise of equal opportunity." More recently, Cose, a Newsweek columnist, wrote:

Now, Barack Obama sits in the highest office in the land and a series of high-powered African-Americans have soared to the uppermost realms of their professions. The idea of a glass ceiling is almost laughable. Serious thinkers are searching for a new vocabulary to explain an America where skin color is an unreliable marker of status. . . .

The history of the world, sadly, shows us people at war with one another over questions of race, religion, and ethnicity. Today, radical Islamists are killing people because they are of another religion. In Israel there are efforts to define the state as legally "Jewish," thereby making its 20 percent non-Jewish population less than full citizens. Russia has invaded and annexed Crimea to absorb its ethnic Russian population. Anti-immigrant parties are gaining strength in Sweden, Denmark, England, Germany, and other European countries. When Britain left India, millions of Muslims were forced to leave Hindu-majority India and form Pakistan - at the cost of an untold number of lives. We have seen more than a million Armenians slaughtered by Ottoman Turks. We have witnessed genocide carried out by Nazi Germany, in Rwanda, and by the Khmer Rouge. We could fill pages with a record of such horrors.

Those who glibly call America a "racist" society are not comparing it to anyplace in the real world, either historically or at the present time. They are comparing it to perfection, and here, of course, we fail, as would any collection of human beings. But our collection of human beings includes men and women of every race and nation. There are problems and difficulties, but the real story is our great success in molding a nation from people who have journeyed to our shores from every place on earth. As Herman Melville said many years ago, "If you shed a drop of American blood, you shed the blood of the whole world." This, not "racism" - which, after all, is prevalent in one form or another everywhere - is America's genuine achievement. Occasional eruptions of intolerance should not obscure this greater reality.

Anti-Police Rhetoric Misunderstands the Reality of Inner-City Life

The killings of two police officers in New York City have focused attention upon the anti-police rhetoric which, in the view of many, helped to create an atmosphere in which such an action could take place.

The New York gunman, Ismaaiyl Brinsley, wrote on social media that he intended to kill cops, and was angry about the deaths of Michael Brown and Eric Garner, who were killed by police officers in Ferguson, Missouri, and New York.

Police in New York believe that Mayor Bill de Blasio has helped create an anti-police atmosphere in the city. After two police lieutenants were attacked by protesters on the Brooklyn Bridge, de Blasio described them as having been "allegedly assaulted," terminology which many police officers found offensive.

There have been a number of instances in which the mayor's statements have antagonized the police. Earlier in December, de Blasio spoke to George Stephanopoulos of ABC News about his fears for his biracial son:

It's different for a white child. And with Dante, very early on with my son, we said, "Look, if a police officer stops you, do everything he tells you to do, don't move suddenly, don't reach for your cell phone," because we knew, sadly, there's a greater chance it might be misinterpreted if it was a young man of color.

This echoed previous statements the mayor had made, going back to a campaign ad in which he pointed to his Afro-wearing teenage son to explain his opposition to the New York Police Department's controversial "stop and frisk" tactic, which entailed stopping hundreds of thousands of people a year for what was deemed suspicious activity. The vast majority of those targeted were nonwhite and innocent of any wrongdoing.

The new mayor, the first Democrat to be elected in New York in twenty years, represents a sharp turn from the close alliance between his predecessors, Rudolph Giuliani and Michael Bloomberg, and law enforcement. Recriminations against de Blasio began within hours of the news that officers had been shot at point-blank range as they sat in their patrol car in the Bedford-Stuyvesant area of Brooklyn, and that the gunman had been motivated to kill them as retribution for the deaths of black men at the hands of police.

A video of the arrival of de Blasio and Police Commissioner William Bratton at the hospital, where officers Wenjian Liu and Rafael Ramos had been taken, showed dozens of police officers silently turning their backs. "There's blood on many hands . . ." said Patrick Lynch, president of the largest police union,

. . . those who incited violence on the street in the guise of protest, that tried to tear down what New York City police officers did every day. We tried to warn it must not go on, it shouldn't be tolerated. That blood on the hands starts at the steps of City Hall in the office of the mayor.

Former New York City officials are critical of what they call the "anti-cop" environment created by the White House, activists such as Al Sharpton and Jesse Jackson, as well as Mayor de Blasio and Attorney General Eric Holder. "We've had four months of propaganda starting with the president that everybody should hate the police," said former New York mayor Rudy Giuliani. "They have created an atmosphere of severe, strong, anti-police hatred in certain communities, and for that, they should be ashamed of themselves."

Former New York Governor George Pataki said that Mayor de Blasio and Attorney General Holder have frequently used divisive "anti-cop rhetoric." Relations between Mayor de Blasio and uniformed police officers have become so strained that "he probably needs an intermediary to go between himself and the unions, maybe a religious leader," said former New York police commissioner Ray Kelly. "I don't know how receptive the unions would be."

Steven Cohen, a professor at Columbia University's School of International and Public Affairs, says that:

The mayor needs to understand he's not an advocate anymore. He's an executive, and that means he has to act more as the mayor of the entire city than as the leader of a faction that helped him become mayor.

The fact is that the effort to stir hostility to the police, particularly in minority communities, is both divisive and based on a misunderstanding of reality. This does not mean, of course, that there are not occasional missteps by police officers, some of which have a racial element. These should be investigated and prosecuted, when appropriate. The larger picture, however, is quite different.

Neither of the police officers killed in New York was white. The officers in the patrol cars of New York City come from 50 countries and speak scores of languages. The Police Department, The New York Times reports:

. . . looks more like the city than ever. In two generations, as the city was becoming ever safer, the Police Department utterly changed its makeup.

Minorities make up the majority of the New York Police Department.

Heather MacDonald, a fellow at the Manhattan Institute, writes in City Journal:

. . . a lie has overtaken significant parts of the country, resulting in growing mass hysteria. That lie holds that the police pose a mortal threat to black Americans - indeed that the police are the greatest threat facing black Americans today. . . . President Obama announced that blacks were right to believe that the criminal-justice system was often stacked against them. . . . Eric Holder escalated a long-running theme of his tenure as U.S. Attorney General - that the police routinely engaged in racial profiling and needed federal intervention to police properly. In an editorial justifying the Ferguson riots, The New York Times claimed that "The killing of young black men by police is a common feature of African American life, and a source of dread for black parents from coast to coast."

In fact, MacDonald points out:

Police killings of blacks are an extremely rare feature of black life and are a minute fraction of black homicide deaths. The police could end all killings of civilians tomorrow and it would have no effect on the black homicide risk, which comes overwhelmingly from other blacks. In 2013, there were 6,261 black homicide victims in the U.S. - almost all killed by black civilians - resulting in a death risk in inner cities that is ten times higher for blacks than for whites. None of those killings triggered mass protests; they are deemed normal and beneath notice. The police, by contrast, according to published reports, kill roughly 200 blacks a year, most of them armed and dangerous, out of about 40 million police-civilian contacts a year. Blacks are in fact killed by police at a lower rate than their threat to officers would predict. In 2013, blacks made up 42 percent of all cop-killers whose race was known, even though blacks are only 13 percent of the nation's population. The percentage of black suspects killed by the police nationally is 29 percent lower than the percentage of blacks mortally threatening them.

Prior to leaving New York to attend a White House summit on policing, Mayor de Blasio told the press that a "scourge" of killings by police is "based not just on decades but centuries of racism." After the Staten Island grand jury declined to indict an officer for homicide in Eric Garner's death, de Blasio declared:

People are saying black lives matter. It should be self-evident, but our history requires us to say "black lives matter." It was not years of racism, but centuries of racism.

He said he worries "every night" about the "dangers" his biracial son Dante may face from "officers who are paid to protect him."

The mayor seems to misunderstand the reality of today's New York City. There is no institution more committed to the idea that "black lives matter" than the New York City Police Department. Thousands of black men are alive today who would have been killed years ago had the data-driven policing under Mayors Giuliani and Bloomberg not brought down the homicide levels of the early 1990s. The police in New York fatally shot eight individuals last year, six of them black, all posing a risk to the police, compared with scores of blacks killed by black civilians.

Al Sharpton, who is now pictured standing as a key advisor beside both President Obama and Mayor de Blasio, first rose to fame by promoting the story that a black teenager, Tawana Brawley, was sexually assaulted by white law enforcement officials. There was not a word of truth to this story. Al Sharpton has never stopped his racially divisive agitation. Yet now he is welcome at the White House and at City Hall. At one New York protest, marchers chanted, "What do we want? Dead cops." Two public defenders from the Bronx participated in a rap video extolling cop killings. At a march across the Brooklyn Bridge, a group of people tried to throw trash cans onto the heads of officers on the level below them. Social media is filled with gloating at the deaths of the two New York officers. A student leader and a representative of the Afro-American Studies Department at Brandeis University tweeted that she has "no sympathy for the NYPD officers who were murdered today."

In Heather MacDonald's view:

The only good that can come out of this wrenching attack on civilization would be the delegitimation of the lie-based protest movement. Whether this will happen is uncertain. . . . The elites' investment in black victimology is probably too great to hope for an injection of truth in the dangerously counterfactual discourse about race, crime and policing.

Historically, contempt for those in uniform who protect us - and keep society safe - is nothing new. In his poem "Tommy," about the poor treatment encountered by British soldiers - except when "it comes to fightin'" - Rudyard Kipling wrote, as if he had contemporary police officers in mind:

O makin' mock o' uniforms that guard you while you sleep
Is cheaper than them uniforms, an' they're starvation cheap;
An' hustlin' drunken dodgers when they're goin' large a bit
Is five times better business than paradin' in full kit.

Then it's Tommy this, an' Tommy that, an' "Tommy, ow's yer soul?"
But it's "Thin red lines of 'eroes" when the drums begin to roll,
The drums begin to roll, my boys, the drums begin to roll,
O it's "Thin red lines of 'eroes" when the drums begin to roll.

Terror in Paris Raises the Question: Is the West Prepared for Jihadis Returning from Syria and Iraq?

The terror attacks in Paris raise many questions about how prepared the West, including our own country, is to confront Islamist terrorism, particularly in the face of thousands of young people holding French, British, American, and other Western passports who are now in the Middle East fighting with groups such as ISIS and al Qaeda. Before long, many of them will return, and events such as we have witnessed in Paris - and also at the Boston Marathon, on the Madrid trains and the London Underground, and in Ottawa and Sydney - may proliferate.

In France, the assault on the journalists at Charlie Hebdo, the satirical journal, came at a time when tension was already growing between mainstream French society and its Muslim immigrant community, the largest in Europe. The current best-selling book in France is the novel Submission, by Michel Houellebecq, who imagines France in 2022 with a Muslim president. Another current best-seller is The French Suicide, in which journalist Eric Zemmour argues that the 1968 student uprisings and immigration, among other things, have set France on a path to ruin.

"I think this anxiety is the idea of seeing France give up on itself, of changing to the point of no longer being recognizable," said the philosopher Alain Finkielkraut, whose 2013 book, The Unhappy Identity, discussed the problems immigration poses for French identity and cultural integration. "People are homesick at home," he says.

Both Zemmour and Houellebecq approach the subject differently, but speak to the same anxiety. Christophe Barbier, the editor of L'Express, says, "It's the same book, in that both talk about the same subject: the irreversible rise of Islam in society and politics."

France has, it seems, failed to assimilate its immigrant population and transmit to them the Western values of, among other things, freedom of speech, freedom of religion, and freedom of the press. In many neighborhoods, city officials have virtually ceded control to Islamists. Soeren Kern, an analyst at the Gatestone Institute and author of annual reports on The Islamization of France, declares that:

The situation is out of control, and it is not reversible. Islam is a permanent part of France now. It is not going away. I think the future looks very bleak. The problem is a lot of these younger-generation Muslims are not integrating into French society. Although they are French citizens, they don't really have a future in French society. They feel very alienated from France. This is why radical Islam is so attractive because it gives them a sense of meaning in their lives.

The Muslim population of France has reached 6.5 million, or 10 percent of its 66 million people. Some Muslim activists predict that France will be a Muslim-majority country in the not too distant future. Gatestone reports that an intelligence document leaked to Le Figaro said that Muslims are creating a separate public school society "completely cut off from non-Muslim students." Over one thousand French supermarkets are selling Islamic books that call for jihad and the killing of non-Muslims. Last year, French Prime Minister Manuel Valls said:

We are fighting terrorism outside of France, but we are also fighting an internal enemy since there are those French who fit into the process of radicalization. This enemy must be fought with the greatest determination.

One of the two brothers involved in the Charlie Hebdo killings traveled to Yemen in 2011 and received terrorist training from al Qaeda's affiliate there before returning to France. Said Kouachi, 34, spent several months training in small arms combat, marksmanship, and other skills. Both French and American officials were aware that Kouachi had trained in Yemen, inspired by Anwar al-Awlaki, the American-born cleric who by 2011 had become a senior operational figure for al Qaeda in the Arabian Peninsula. Before he was killed in an American drone strike in September 2011, al-Awlaki repeatedly called for the killing of cartoonists who insulted the Prophet Muhammad.

France has struggled for years to keep track of extremists while avoiding measures that would alienate ordinary Muslims and increase the risk of a violent response. Jonathan Laurence, author of The Emancipation of Europe's Muslims, reports that intelligence services in European countries were tracking so many residents with jihadist sympathies that it was very hard to separate those who merely offered verbal support for groups claiming to fight for Islam from those who were prepared to take up arms. "Mass surveillance of an entire community is not an option because civil liberties also need to be balanced with the potential benefit it will gain," said Laurence.

The Islamic State has attracted a large number of European-born Muslims, and some Americans, to Syria and Iraq in recent months and is seen to be encouraging blowback terrorist attacks in the countries from which these fighters come and whose passports they carry. Jean-Pierre Filiu, a French scholar who studies Islamic extremism, said the target was

. . . deftly chosen: not a religious symbol, but a symbol of what democratic freedoms are, exactly where the Islamic State wants to drive a wedge between European Muslims and their fellow citizens. . . .

In Filiu's view, the "plan backfired because of the unanimous condemnation of this heinous attack in France and throughout the Muslim world." He said that the widespread expressions of disdain for the attack from Islamic leaders across Europe, several of whom publicly called for tolerance, were underscored by the fact that one of the 12 people killed was a French policeman who was Muslim.

"As a Muslim, killing innocent people in the name of Islam is much, much more offensive to me than any cartoon can ever be," wrote pro-democracy activist Iyad El-Baghdadi in a statement that was re-tweeted more than 26,000 times in a single day.

It is important for Europe's long-term well-being that Muslims as a whole not be demonized, but that radical Islamists be isolated and carefully monitored. Muslims will clearly play an important role in Europe's future. In Germany, it is projected that there will be 4.8 million Muslims in 2020, roughly 6 percent of the nation's total population, up from 4.5 percent in 2000. Even bigger surges are underway in Britain, Spain, and France, according to a Pew Research Center study. Muslims are projected to make up 6.5 percent of Britain's population by 2020, up from 2.7 percent in 2000.

The difference between traditional Islam and the radical religion promoted by the Islamic State, al Qaeda, and other extremists is something non-Muslims often do not understand. The Koran, for example, nowhere forbids creating images of Muhammad, although later commentaries and traditions - the Hadith - do, to guard against idol worship. This is hardly unique: the Old Testament forbids "graven images." The word "blasphemy" does not appear in the Koran. Islamic scholar Maulana Wahiduddin Khan points out that:

There are more than 200 verses in the Koran, which reveal that the contemporaries of the prophets repeatedly perpetrated the same act, which is now "blasphemy or abuse of the prophet" . . . but nowhere does the Koran prescribe the punishment of lashes, or death, or any other physical punishment. . . . In Islam, blasphemy is a subject of intellectual discussion rather than a subject of physical punishment.

Historically, Islam has not been an intolerant religion. In 1492, when the Catholic King Ferdinand and Queen Isabella expelled the Jews from Spain, they were welcomed into the Ottoman Empire and other Muslim countries. When the Spanish Inquisition was killing men and women for their religious beliefs, Jews and Christians found much more tolerance and religious freedom under Islam. Now, unfortunately, many Muslim-majority countries look very much like medieval Spain. Pakistan, Bangladesh, Malaysia, Egypt, Turkey, and Sudan have all used blasphemy laws to jail and harass people, according to the U.S. Commission on International Religious Freedom. Saudi Arabia forbids the practice of any religion other than its own Wahhabi version of Islam.

Europe has a choice. It can try to assimilate its Muslim immigrants into Western society, transmitting the values of freedom, democracy, and tolerance of diverse views. It can, in doing so, use America as a model, in which immigrants from every part of the world, of every race, religion, and ethnic background have been transformed into Americans, Muslims included. Or it can isolate immigrants, telling them that they can never be "French," or "German," or "British," and alienate young people so that they are driven into the hands of Al Qaeda and the Islamic State.

If European countries did not intend to assimilate immigrants into their societies, they should not have permitted them to enter. Now that they are there, and appear increasingly alienated, it is essential that positive steps be taken to avoid future chaos. And it is important that Muslims themselves isolate the extremists in their community and become determined to be full citizens of the countries in which they live. As those engaged in jihad in the Middle East return to Europe, a perfect storm awaits unless such steps are taken. If those who rail against immigrants on the one side, and Islamic fundamentalists on the other, come to dominate their respective communities, Europe's future will be bleak. This may be the lesson to take away from the Paris terror attacks.

Confronting Torture: A Violation of American Values

The heated debate over the Senate Intelligence Committee's report on our government's use of torture often asks the wrong questions. The report is criticized because its authors did not interview the CIA personnel involved in the program. This is a legitimate criticism. It is also criticized for categorically stating that no worthwhile information was obtained by such procedures. We have no way of knowing whether this is true. The CIA argues that it is not.

The real argument against the use of torture is not that it is ineffective in gaining worthwhile information, which most experts argue is the case, but that it is illegal, immoral, and in violation of American values. Both liberals and conservatives should be in agreement on this matter. Liberals object to inhumane treatment of prisoners, and conservatives are concerned about out-of-control big government, conducted in secret.

Senator John McCain (R-AZ), who was taken prisoner during the Vietnam War and suffered years of torture at the hands of the North Vietnamese, declared that America never engaged in torture against German and Japanese prisoners of war during World War II, against North Korean prisoners during the Korean War, or against North Vietnamese and Viet Cong prisoners during the Vietnam War. We put on trial those who were guilty of torturing Americans. Torture, McCain declared, is not the American way. Beyond this, he noted that, "I know from personal experience that the abuse of prisoners will produce more bad than good intelligence."

Ironically, one of the reasons repeatedly stated by President George W. Bush for the U.S. invasion of Iraq in 2003 was the maintenance of "torture rooms" by Saddam Hussein. Andrew Napolitano, a former judge of the Superior Court of New Jersey and an analyst for Fox News Channel, says of the Senate report that:

. . . it is damning in the extreme to the Bush administration and to the CIA leadership. It offers proof that the CIA engaged in physical and psychological torture, some of which was authorized - unlawfully, yet authorized - most of which was not. The report also demonstrates that CIA officials repeatedly lied to the White House and to Senate regulators about what they were doing and they lied about the effectiveness of the torture. If the allegations in the report are true, we have war criminals, perjurers, computer hackers and thugs on the government payroll.

The fact is, as Judge Napolitano points out:

All torture is criminal under all circumstances - under treaties to which the U.S. is a party, under the Constitution that governs the government wherever it goes, and under federal law. Torture degrades the victim and the perpetrator. It undermines the moral authority of a country whose government condones it. It destroys the rule of law. It exposes our own folks to the awful retaliatory beheadings we have all seen. . . . It is a recruiting tool for those who have come to cause us harm.

Historically, in wartime, we have done things we later regretted. Civil liberties have been abused. Abraham Lincoln suspended the writ of habeas corpus during the Civil War. During World War II, we imprisoned more than 127,000 Japanese-Americans. Similarly, after the attacks on 9/11, there was uncertainty and fear, which led to the actions recorded in the Senate report. "Still," The Washington Times noted editorially:

. . . it's difficult to argue with John McCain . . . a man who learned something about torture and its limits in the notorious North Vietnamese prison the American prisoners called, with grim irony, "the Hanoi Hilton."

Defending the release of the Senate report, which was opposed by many of his fellow Republicans, Sen. McCain said:

What might come as a surprise, not just to our enemies, but to many Americans is how little these practices did to aid our efforts to bring 9/11 culprits to justice and to find and prevent attacks today and tomorrow. That could be a real surprise since it contradicts the many assurances provided by intelligence officials on the record and in private that enhanced interrogation techniques were indispensable in the war against terrorism. I suspect the objection of those same officials to the release of this report is really focused on that disclosure, torture's ineffectiveness, because we gave up much in the expectation that torture would make us safer. Too much.

We now know that 26 of those detained - and tortured - were held in error. One of these, Mohamed Bashmilah, was held in secret prisons for 19 months. He was kept shackled alone in freezing-cold cells in Afghanistan, subjected to loud music 24 hours a day. He attempted suicide at least three times, once by saving pills and swallowing them all at once; once by slashing his wrists; and once by trying to hang himself. Another time, he cut himself and used his own blood to write "this is unjust" on the wall.

Until 9/11, the U.S. had officially condemned secret imprisonment as a violation of basic international standards of human rights. But like the program on torture, it was set aside in an effort to prevent another attack. In one case, Laid Saidi, an Algerian identified in the Senate report as Abu Hudhaifa, was held in Afghanistan for 16 months. The Senate report says that he "was subjected to ice water baths and 66 hours of sleep deprivation before being released because the CIA discovered he was not the person he was believed to be."

John Sifton of Human Rights Watch notes that:

You have an agency that has been presenting itself to Congress and the public as very professional, on top of everything. The report shows that they were flying by the seat of their pants. They were making it up as they went along.

In a democratic society, unelected government bureaucrats are responsible for carrying out the laws passed by our elected representatives in Congress and executed under the direction of the elected executive branch. In the instance of torture, we see something quite different: secret government, with men and women who have been elected by no one engaging in illegal actions and concealing those actions from elected officials. For four years, according to CIA records, no one from the agency ever came to the White House to give President George W. Bush a full briefing on what was happening in the dungeons of Afghanistan and Eastern Europe. For four years, interrogators stripped, slammed, soaked, and otherwise abused their prisoners without informing the president - or the congressional oversight committees. Finally, in April 2006, the CIA director gave President Bush his first briefing about the interrogation practices that had been in use since 2002.

In that briefing, the president was told about one detainee being chained to the ceiling of his cell, clothed in a diaper and forced to urinate and defecate upon himself. The president is reported to have "expressed discomfort." According to the Senate report, "The CIA repeatedly provided incomplete and inaccurate information" to the White House. But it may be that the White House, and the congressional oversight committees, really didn't want to know exactly what was going on, in which case they are equally culpable.

We must, of course, recognize the fears that were widespread after 9/11 - as was the fear after Pearl Harbor. People fearful of another brutal attack often act in ways that, in a more tranquil time, would never be considered. As Sen. McCain said:

I understand the reasons that governed the decision to resort to these interrogation methods, and I know that those who approved them and those who used them were dedicated to securing justice for the victims of terrorist attacks and to protecting Americans from further harm. . . . But I dispute wholeheartedly that it was right for them to use these methods, which the report makes clear were neither in the best interests of justice nor our security nor the ideals we have sacrificed so much blood and treasure to defend.

Even in the worst of times, concluded McCain, "We are always Americans and different, stronger, and better than those who would destroy us." With the Senate report we are acknowledging before the world that in a moment of great tension and fear, we violated our own deeply held principles. We always used to say, "Americans don't torture." Hopefully, we can say this again and make certain that our government is not conducted in secret but in the light of day, as was intended by the Founding Fathers. *
