Ramblings

Allan C. Brownfeld

Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. Vice President, Members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.

Eighty-four Percent of Americans Disapprove of Congress: Their Contempt Is Justified

A Washington Post-ABC News poll shows a new high - 84 percent of Americans - disapproving of the job Congress is doing, with almost two-thirds saying they "disapprove strongly." Just 13 percent of Americans approve of how things are going. It has been nearly four years since even 30 percent expressed approval of Congress.

Editorially, The Washington Examiner notes that,

Nobody can remember the last time the public approval rating of Congress was so low. That's because it's never been as low as it is now. . . . It's not hard to see why: the American people are fed up with the bipartisan corruption, endless partisan bickering, and lack of concrete action to address the nation's most pressing problems, especially out-of-control spending and the exploding national debt. . . . Both parties have presided over congressional majorities as Congress sank in public esteem during the past decade.

One reason for public dismay is the manner in which members of Congress often support while in office the interests they then go to work for once out of office. Of equal concern is the manner in which members of Congress vote for subsidies to the groups that have contributed to their political campaigns. This is true of members of Congress from both parties.

For example, soon after he retired last year as one of the leading liberals in Congress, former Rep. William D. Delahunt (D-MA) started his own lobbying firm with an office on the 16th floor of a Boston skyscraper. One of his first clients was a small coastal town that agreed to pay him $15,000 a month for help in developing a wind energy project.

The New York Times reports that,

Amid the revolving door of congressmen-turned-lobbyists, there is nothing particularly remarkable about Mr. Delahunt's transition, except for one thing. While in Congress, he personally earmarked $1.7 million for the same energy project. So today, his firm, the Delahunt Group, stands to collect $90,000 or more for six months of work from the town of Hull, on Massachusetts Bay, with 80 percent of it coming from the pot of money he created through a pair of Energy Department grants in his final term in office.

Beyond the town of Hull, Delahunt's clients include at least three others who received millions of dollars in federal aid with his direct assistance. Barney Keller, communications director for the Club for Growth, a conservative group that tracks earmarks, says:

I cannot recall such an obvious example of a member of Congress allocating money that went directly into his own pocket. It speaks to why members of Congress shouldn't be using earmarks.

While this case may be somewhat extreme, it is repeatedly duplicated in one form or another by members of Congress. Consider former Senator Rick Santorum of Pennsylvania. A review of Santorum's many earmarks suggests that the federal money he helped direct to Pennsylvania paid off in the form of campaign cash. In just one piece of legislation, the defense appropriations bill for the 2006 fiscal year, Santorum helped secure $124 million in federal financing for 54 earmarks, according to Taxpayers for Common Sense, a budget watchdog group. In that year's election cycle, Santorum's Senate campaign committee and its "leadership PAC" took in more than $200,000 in contributions from people associated with the companies that benefited or their lobbyists. In all, Taxpayers for Common Sense estimated, Santorum helped secure more than $1 billion in earmarks during his Senate career.

Or consider former House Speaker Newt Gingrich, who speaks about being a "Reagan conservative" who supports "limited government," yet received $1.6 million from Freddie Mac over an eight-year period and gave the government-backed mortgage giant assistance in resisting reformers in Congress. Mr. Gingrich denies that he was a "lobbyist," as do some other former members of Congress. The Lobbying Disclosure Act of 1995 has three tests:

(1) Do you make more than $3,000 over three months from lobbying?
(2) Have you had more than one lobbying contact?
(3) Have you spent more than 20 percent of your time lobbying for a single client over three months?

Only a person who has met all three tests must register as a lobbyist. Thus, a former member of Congress who has many lobbying contacts and makes $1 million a year lobbying, but has no single client who takes up more than 20 percent of his time, would not be considered a lobbyist.

Clearly, it is time to change this rule. A task force of the American Bar Association recommended last year that the 20 percent rule be eliminated, which would require far more people to register as lobbyists and subject them to ethics and disclosure requirements. The Center for Responsive Politics found that more than 3,000 lobbyists simply "de-registered" after Congress imposed new reporting requirements for lobbyists in 2007.

With regard to Gingrich, Washington Times columnist Don Lambro writes:

Mr. Gingrich . . . is the quintessential Washington insider, peddling influence in government. . . . He denied he was lobbying, insisting that he was hired to be a historian, when he was selling his services to one of the richest bidders in government. He was being paid well out of Freddie Mac's coffers while it was sowing the seeds of a housing scandal that resulted in an economic meltdown that has hurt millions of Americans and cost taxpayers billions of dollars. In other words, as a paid insider, he was part of the problem, not part of the solution.

Cutting the size of government, reducing our debt, and balancing the budget are embraced rhetorically by candidates for public office. Once elected, however, many become part of the system they campaigned against. The incentive structure once in office is to raise money to stay in office, and the way to do this is to vote subsidies to those groups being called upon to contribute. Both parties are engaged in this behavior, and candidates of both parties are rewarded so that special interests will have a friend in office no matter who is elected.

Sadly, the actions of Congress - and the lobbying enterprises of former members of Congress - are legal. This, of course, is because it is Congress itself that writes the laws. There was a time when members of Congress, when they retired or were defeated, returned home. Some still do. Many others, however, remain in Washington, getting rich trying to influence their former colleagues.

This enterprise, of course, is only part of why Congress is viewed in such negative terms by 84 percent of Americans. Narrow partisanship - a greater concern for politics than for the country's well-being - is another. All of this is on naked display in today's Washington. The public contempt has been well earned. Whether that public dismay with our current politics can be transformed into an effective effort to alter this behavior remains to be seen. Too many in Washington have a vested interest in today's corrupt system as it exists. How to change the incentive structure for those in political life is our real challenge.

We Must Recognize a New Threat to Freedom in the Name of "National Security"

On December 31, 2011, President Obama signed the National Defense Authorization Act, which was supported by both Republicans and Democrats in the Congress. This legislation allows for the indefinite detention of American citizens within the United States - without charging them with a crime.

Under this law, those suspected of involvement with terrorism are to be held by the military. The president has the authority to detain citizens indefinitely. While Senator Carl Levin (D-MI) said that the bill followed existing law, "whatever the law is," the Senate specifically rejected an amendment that would exempt citizens, and the administration has opposed efforts to challenge such authority in federal court. The administration claims the right to strip citizens of legal protections based on its sole discretion.

This legislation was passed by the Senate 93 to 7. "The only comparable example was Reconstruction in the South," says constitutional law scholar Bruce Fein.

That was 150 years ago. This is the greatest expansion of the militarization of law enforcement in this country since.

The opposition to this legislation assembled an unlikely coalition of liberal Democrats, the American Civil Liberties Union, constitutional conservatives, libertarians, and three Republican senators - Rand Paul (KY), Mark Kirk (IL), and Mike Lee (UT).

The law, argued Senator Paul:

. . . would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. I want to repeat that. We are talking about people who are merely suspected of a crime. And we are talking about American citizens. If these provisions pass, we could see American citizens being sent to Guantanamo Bay.

Senator Mark Udall (D-CO), who proposed a failed amendment to strip the language from the bill, said that these provisions would "authorize the military to exercise unprecedented power on U.S. soil."

Writing in The American Conservative, Kelley Beaucar Vlahos notes that:

Already the federal government has broad authority to decide whether terror suspects are detained and held by federal law enforcement agencies and tried in regular courts or carried off by the military under the Military Commissions Act. This new legislation would allow the military to take control over the detention of suspects first - which means no Miranda rights and potentially no trial even on U.S. soil, putting the front lines of the War on Terror squarely on Main Street.

Bruce Fein argues that the ambiguity of words like "associated groups" or "substantially supports" gives the military wide discretion over who is considered a terrorist. "It's a totally arbitrary weapon that can be used to silence people."

Rep. Justin Amash (R-MI), one of the leading critics of the bill in the House of Representatives, issued a fact-checking memo outlining how the language can be abused:

For example, a person makes a one-time donation to a non-violent humanitarian group. Years later, the group commits hostile acts against an ally of the U.S. Under the Senate's NDAA, if the President determines the group was "associated" with terrorists, the President is authorized to detain the donor indefinitely, and without charge or trial.

James Madison warned that, "The means of defense against foreign danger historically have become instruments of tyranny at home."

Senator Paul states that:

The discussion now to suspend certain rights to due process is especially worrisome, given that we are engaged in a war that appears to have no end. Rights given up now cannot be expected to be returned. So we do well to contemplate the diminishment of due process, knowing that the rights we lose now may never be restored. . . . This legislation would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. . . . There is one thing and one thing only protecting innocent Americans from being detained at will by the hands of a too-powerful state: our Constitution and the checks it puts on government power. Should we err and remove some of the most important checks on state power in the name of fighting terrorism, well, then the terrorists will have won.

In his dissent in Hamdi v. Rumsfeld, Justice Antonin Scalia declared:

Where the government accuses a citizen of waging war against it, our constitutional tradition has been to prosecute him in federal court for treason or some other crime. . . . The very core of liberty secured by our Anglo-Saxon system of separated powers has been freedom from indefinite imprisonment at the will of the executive.

Jonathan Turley, professor of law at George Washington University, points out that:

In a signing statement with the defense authorization bill, Obama said he does not intend to use the latest power to indefinitely imprison citizens. Yet, he still accepted the power as a sort of regretful autocrat. An authoritarian nation is defined not just by the use of authoritarian powers, but by the ability to use them. If a president can take away your freedom or your life on his own authority, all rights become little more than a discretionary grant subject to executive will.

James Madison, Turley recalls,

. . . famously warned that we needed a system that did not depend on the good intentions or motivations of our rulers: "if men were angels, no government would be necessary." Since 9/11, we have created the very government the framers feared: a government with sweeping and largely unchecked powers resting on the hope that they will be used wisely. The indefinite-detention provision in the defense authorization bill seemed to many civil libertarians like a betrayal by Obama. While the president had promised to veto the law over that provision, Senator Levin, a sponsor of the bill, disclosed on the Senate floor that it was in fact the White House that asked for the removal of an exception for citizens from indefinite detention.

Historically, those who seek to expand government power and diminish freedom always have a variety of good reasons to set forth for their purposes. In the case of Olmstead v. United States (1928), Justice Louis Brandeis warned that:

Experience should teach us to be most on our guard to protect liberty when the government's purposes are beneficent. Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in the insidious encroachment of men of zeal, well-meaning but without understanding.

In recent years, in the name of ecology, racial equality, public health, and a variety of other "beneficent" purposes, the power of government has grown and the freedom of the individual has diminished, just as Justice Brandeis feared it would. But it has also diminished in the name of national security, something many conservatives, usually alert to the growth of government power, tend to support - or to acquiesce in. This is a serious mistake, as we now face the new threat of indefinite detention of American citizens. Freedom cannot be preserved by taking it away.

The Arab Spring: Understanding the Promise and Peril of Revolution in the Middle East

Developments in the Middle East remain chaotic. In the wake of the Arab Spring we have seen the overthrow of autocratic regimes in Tunisia and Egypt, a virtual civil war in Syria, and challenges to such governments as those in Bahrain and Yemen. The brutal Libyan dictator Muammar Gaddafi has been overthrown. What comes next in this volatile region is difficult to know.

In an important new book, The Invisible Arab, Marwan Bishara, senior political analyst for Al Jazeera's English-language service and the editor of its flagship show "Empire," and a former lecturer at the American University of Paris, provides a thoughtful analysis of how Arabs broke their own psychological barrier of fear to kindle one of the first significant revolutionary transformations of the 21st century.

Bishara describes how the historic takeover of Tunisia's November 7 Square, Egypt's Tahrir Square, and Bahrain's Pearl Square, among others, was the culmination of a long social and political struggle: countless sit-ins, strikes, and demonstrations by people who risked and suffered intimidation, torture, and imprisonment. It was aided by the dramatic rise of satellite television networks, including Al Jazeera, which bypass attempts by governments to censor news and information.

"Like most revolutions," he writes,

. . . this one was a long time coming. . . . They were the culmination of a long social and political struggle - countless sit-ins, strikes, pickets, and demonstrations. . . . The story begins with the young Arabs whose networking and organizations brought the people out into the streets. The youth, who make up 60 percent of all Arabs, have been looked upon as a "demographic bomb," an "economic burden," or a "reservoir for extremism." However, unlike previous generations, this group heralded change.

For decades, Bishara argues, these Arab citizens and their social and political movements

. . . have been either unfairly demonized or totally ignored by the West . . . who saw the region through the prism of Israel, oil, terrorists, or radical Islamism. But today's Arabs are presenting a stark contrast to the distortion . . . heaped upon them. Characterized as unreceptive to democracy and freedom, they are now giving the world a lesson in both.

The more difficult part of this revolutionary journey, he notes, will come as

. . . the Arabs, sooner rather than later, discover that democracy and freedom come with greater responsibility. Defeating dictators is a prerequisite for progress, but does not guarantee it, especially in the absence of functional state institutions, democratic traditions, and modern infrastructure. The prevalence of poverty, inequality, and rising regional and international competition present huge challenges.

The origins of what he calls "the miserable Arab reality" are not civilizational, economic, or philosophical per se. Instead,

. . . . The origins . . . are political par excellence. Like capital to capitalists, or individualism to liberalism, the use and misuse of political power has been the factor that defines the contemporary Arab state. Arab regimes have subjugated or transformed all facets of Arab society.

By the beginning of the 21st century, Arab autocracies represented some of the oldest dictatorships in the world. Zine el-Abidine Ben Ali's dictatorship in Tunisia, the most recently established in the region, ruled for 25 years, followed by 30 years for Egypt's Mubarak, 33 years for Yemen's Ali Abdullah Saleh, and 43 years for Gaddafi in Libya. In Syria, the al-Assad dynasty has ruled for 43 years, and Saddam Hussein was removed in 2003 after 24 bloody years ruling Iraq. Only the authoritarian Arab monarchies precede these dictatorships in longevity. Bahrain, a repressive Sunni monarchy, has ruled over a Shia majority since its independence from Britain in 1971.

Arab states, writes Bishara,

. . . were, for a lack of better words, turned into the private estates of the ruling families. While these regimes boasted of secular republicanism, they were run similar to the Kingdom of Saudi Arabia and the United Arab Emirates, where no political activism was allowed and where the ruling families dominated all facets of political life. . . . The energy-producing Arab states are sustained rentier-type economies, characterized by a trade-off between economic welfare and political representation. Whereas the modern democratic state was founded on the cry of "no taxation without representation" . . . the modern Arab state has turned that notion on its head. With free-flowing petro-dollars pouring into their countries, Arab leaders have been able to sell off national resources and enrich themselves without having to turn to their citizens for personal taxation. . . . It became a ritual in the wealthy monarchies for the kings, emirs, or princes to provide small sums of money to their "subjects," and the poor in particular, as a makrama or "generous gift" that was generated from the natural resources in their land.

According to the U.N. Development Program's (UNDP) first Arab Human Development Report, written exclusively by Arab experts,

. . . Arab countries have not developed as quickly as comparable nations in other regions. Indeed, more than half of Arab women are illiterate; the region's infant mortality rate is twice as high as in Latin America and the Caribbean. Over the past 20 years, income growth per capita has also been extremely low.

In virtually every Arab country, more than half the population is under 30 - more than 140 million people - while a quarter are between the ages of 15 and 29, making this generation the largest youth cohort in the history of the Middle East. This unemployed and increasingly angry demographic has given traction to the "youth bulge" theory, which posits that when population growth outstrips that of jobs, social unrest is inevitable.

The influence of the information revolution has been crucial to developments in the region. As a result, notes Bishara,

. . . The Arab youth were able to think for themselves, freely exchange ideas, and see clearly beyond their ruler's deception, vengeful jihadist violence, or cynical Western calculations. . . . At the beginning of 2011, there were 27 million Arabs on Facebook, including 6 million Egyptians. Within a few nights, 2 million more Egyptians joined, underlining the centrality of the medium to the changes in the country. More than 60 million people in the Arab world are online.

Yemeni activist and 2011 Nobel Peace Prize co-winner Tawakkol Karman described the use of social media:

The revolution in Yemen began immediately after the fall of Ben Ali in Tunisia. . . . As I always do when arranging a demonstration, I posted a message on Facebook, calling on people to celebrate the Tunisian uprising.

This new media, writes Bishara,

. . . had an important cultural, even sociological role to play in patriarchal Arab societies. It helped young people break free from social constraints. It propelled them into uncharted territory, and it helped them mold a certain type of individualism. They began to enjoy an uninhibited space where they could share information and experiences, join chat rooms, and participate with one another. Theirs is a new found egalitarianism. . . .

Bishara laments the fact that,

Arabs have been valued not for their embrace of freedom or respect for human rights, but rather in terms of their proximity to U.S. interests. A subservient ally and energy providing partner made for a good Arab regime, regardless of its despotic or theocratic rule. . . . Western leaders have talked in slogans . . . about democracy and Islam, but have always been as indifferent to the people of the region as their dictators.

What does the future hold? Bishara recognizes that there are great dangers:

Islamist movements, the likes of the Egyptian Brotherhood, have already opened dialogue with the military and with Western powers on the basis of mutual interest and respect. This might be seen as a positive development, that allows for a new sort of regional order on the basis of a new accommodation among Islamists, the generals, and Western leaders. However, this triangle could eventually be as oppressive and totalitarian as the previous dictatorships . . . the Islamists must make sure that they reconcile with the principles of democracy and modern statehood, not a division of labor with the military. . . . Many of the Islamists I spoke to reckon that if they have a majority they have a democratic right to change the constitution and govern as they see religiously fit. They don't recognize democracy as first and foremost a system of government based on democratic values that go beyond the right of the majority to rule, to ensure that the rights and privileges of the minorities are respected and preserved. . . .

The Invisible Arab is a thoughtful contribution to our understanding of the Middle East from one of its articulate new voices. Bishara shows how the revolutions have evolved - and how they could yet go terribly wrong. He hopes for a free and democratic Middle East - and he has his fingers crossed. *

Ramblings

Allan C. Brownfeld

"Flash Mobs" in the Summer of 2011: An Example of Family Breakdown

The summer of 2011 saw a proliferation of a phenomenon that has come to be known as "flash mobs." Organized largely through text messages and via Facebook and Twitter, gangs of unruly youths, usually members of minority communities, have beaten and robbed citizens in Philadelphia, disrupted a fireworks display outside Cleveland, attacked fairgoers in Wisconsin, and looted a 7-Eleven in Germantown, Maryland.

Riots in London during the summer mirrored some of the worst uprisings in modern U.S. history. Hundreds of stores across London, Manchester, Birmingham, and other British cities were torched or ransacked in four nights of mayhem after the police killing of a north Londoner named Mark Duggan, whose death was quickly overshadowed by the wave of recreational violence. "This is criminality, pure and simple," said Prime Minister David Cameron.

The looting was more than simply a race riot. While Duggan was black, and there are strong correlations between race and class in Britain, some of the worst violence happened in majority-white neighborhoods like Croydon. "This is much broader than race," says Caryl Phillips, a British writer with Afro-Caribbean roots. "This is about a whole group - black, white, and brown - who live just outside the law."

In the U.S., notes Jerry Ratcliffe, chairman of the Department of Criminal Justice at Temple University, and a former London police officer:

This is an old crime being organized with new tools. There's nothing new about groups of people assaulting people and robbing, but what's new is the technology. There's a fascination with the speed by which this can now take place. You can go from nothing happening to something happening in a matter of moments. Flash mobs take advantage of opportunities. Those opportunities are that the victims are outnumbered by the group and that there is an absence of law enforcement.

In Philadelphia, Mayor Michael A. Nutter, who is black, told marauding black youths, "You have damaged your own race." After imposing a strict curfew, Nutter told young people: "Take those God-darn hoodies down, especially in the summer. Pull your pants up and use a belt 'cause no one wants to see your underwear. . . ."

Mayor Nutter moved up the weekend curfew for minors to 9 p.m. and told parents that they would face increased fines each time their child is caught violating the curfew.

The head of the Philadelphia chapter of the NAACP, J. Whyatt Mondesire, said it "took courage" for Mr. Nutter to deliver the message. "These are majority African-American youths and they need to be called on it."

In the past two years, Philadelphia has been the scene of a number of flash mobs in which youths meet at planned locations by texting one another and then commit assorted mayhem. In one episode, teens knocked down passersby on a Center City street and entered an upscale department store where they assaulted shoppers. In another incident, about 20 to 30 youths descended on Center City after dark, then punched, beat, and robbed bystanders. One man was kicked so savagely that he was hospitalized with a fractured skull. Police arrested four people, including an 11-year-old.

Speaking from the pulpit of his Baptist church, Mr. Nutter delivered a 30-minute sermon about black families taking responsibility for such behavior. He said:

The Immaculate Conception of our Lord Jesus Christ took place a long time ago, and it didn't happen in Philadelphia. So every one of these kids has two parents who were around and participating at the time. They need to be around now.

The Mayor told parents:

If you're just hanging out out there, maybe you're sending them a check or bringing some cash by. That's not being a father. You're just a human ATM. . . . And if you're not providing the guidance and you're not sending any money, you're just a sperm donor.

Columnist Gregory Kane, who is black, writes:

What is particularly instructive in this instance is where the 11-year old (arrested in Philadelphia) ended up: in the custody of his grandmother. We don't know what the boy's mother and father are doing right about now, but we know what they aren't doing: parenting their son. . . . Excuses for the flash mobbers, many of whom are black, with some attacking whites at random . . . have been coming fast and furious. They need jobs, the excuse makers tell us. They need recreational facilities. What they need are parents who don't hesitate to put a foot squarely in their derrieres when a foot in that spot is needed.

In Mayor Nutter's view, the collapse of the black family is a key element in the problems we face.

Let me speak plainer: That's part of the problem in the black community. . . . We have too many men making too many babies they don't want to take care of, and then we end up dealing with your children.

In the U.S. at the present time, out-of-wedlock births are now at 41 percent of overall births, but there is a tremendous variation in illegitimate births by race. Such births are the norm in both the black (72 percent) and Hispanic (53 percent) communities, but less than a third of white births (29 percent) are illegitimate.

It is clear that there has been a racial component in the flash mob events this past summer. Columnist Gregory Kane states:

I don't know what else to call it when mobs of blacks single out whites to attack. But there still exists this notion that blacks can't be racists. Racism requires power, the thinking goes. Since blacks have no power, they can't be racists. Such nonsense is bad enough when left-wing loonies and black nationalist types parrot it. But the Rev. Jesse Jackson is a prominent black leader. He, at least, should know better. Alas, he does not. This, he declares, "is nonsense."

The respected economist Thomas Sowell disputes the idea that the violence of flash mobs can be explained by disparities in income. In his view:

Today's politically correct intelligentsia will tell you that the reason for this alienation and lashing out is that there are great disparities and inequities that need to be addressed. But such barbarism was not nearly as widespread two generations ago, in the middle of the 20th century. Were there no disparities or inequities then? Actually there were more. What is different today is that there has been - for decades - a steady drumbeat of media and political hype about differences in income, education and other outcomes, blaming these differences on oppression against those with fewer achievements or lesser prosperity.

The fact that so many black voices are now being heard about the decline of the black family and the manner in which that decline has led to such events as the flash mobs of this past summer is a hopeful sign. No problem can be resolved unless it is properly understood. Hopefully, that understanding will grow and the real problems we face can then be addressed.

Crony Capitalism: A Growing Threat to Economic Freedom

Crony capitalism - the close alliance of big business with government - leads not to free enterprise but to its opposite, in which government, not the market, chooses winners and losers through subsidies and other forms of government largesse. Adam Smith, the great philosopher of capitalism, understood that businessmen want to maximize profits, and how it is done is of secondary interest. Indeed, he once said that when two businessmen get together, the subject of discussion is how to keep the third out of the market. Adam Smith - and more recent philosophers of the free market such as Friedrich Hayek, Ludwig von Mises, and Milton Friedman - believed deeply in capitalism. Many businessmen, and many on Wall Street, do not.

Consider some of the recent manifestations of this phenomenon.

The U.S. government guaranteed a $535 million loan for Solyndra, LLC, the now-bankrupt California company that was the centerpiece of President Obama's "clean energy" future. There are at least 16 more such loan guarantees worth in excess of $10 billion.

From e-mails made public in mid-September by the House Energy and Commerce subcommittee on Oversight and Investigation, it is clear that key Solyndra loan decisions were guided primarily by political considerations.

President Obama was not in the White House when the proposal to back the company initially appeared in Washington, but two weeks before President George W. Bush left office, an Energy Department review panel unanimously recommended against making the loan. Even after Obama decided to support the proposal, career employees at the Office of Management and Budget cautioned against doing so. One predicted that Solyndra would run out of money and file for bankruptcy by September 2011. A Government Accountability Office report said that the Energy Department had circumvented its own rules at least five times to make the loan. The leading investors in Solyndra were two investment funds with ties to George B. Kaiser, a major fundraising "bundler" for Obama.

Both Republicans and Democrats supported the loan-guarantee program, which was approved by the Republican-controlled Congress in 2005. The loan guarantee program for alternative energy companies was created as part of the Energy Policy Act of 2005, sponsored by Rep. Joe Barton (R-TX), who has been a leader in the congressional probe of Solyndra's ties to the Obama administration.

Similarly, Senator Jim DeMint (R-SC) said in the Senate that the Solyndra case exposed the "unintended results when our government tries to pick winners and losers." This, of course, is quite true. Yet DeMint himself had been a supporter of the loan-guarantee legislation in 2005.

The fact is that solar companies are not the only energy companies getting federal loan guarantees. The power giant Southern Co. won a $3.4 billion loan guarantee from the Energy Department last summer. Yet, even some Republican critics of big government have supported this huge expenditure. Rep. Phil Gingrey (R-GA) declared that it was wrong to compare Southern to Solyndra because "Southern Co. owns Mississippi Power, Alabama Power, Georgia Power, among others, and employs literally thousands of people."

Washington Examiner columnist Timothy Carney notes that:

The implication was clear: Federal subsidies to big, established companies are fine. It's the handouts to these upstarts that are objectionable. So Gingrey is embracing the heart of Obamanomics - the proposition that government ought to be an active partner in shaping the economy and helping business. . . . If Republicans were willing to broaden their attack beyond criticizing this one (Solyndra) deal, they could indict the whole practice of government-business collusion.

Or consider the Export-Import Bank, supported by both Republicans and Democrats, which is a government agency that subsidizes U.S. exporters. Recently, it broke its record for the most subsidy dollars provided in a single year, primarily to Boeing.

Members of both parties have voted to bail out failed banks, auto companies, and other enterprises considered "too big to fail." Now, business interests are lining up to influence the work of the new congressional "supercommittee" that will help decide whether to impose massive cuts in spending for defense, health-care, and other areas. Nearly 100 registered lobbyists for big corporations used to work for members of the committee and will be able to lobby their former employers to limit the effect of any reductions. They are representing defense companies, health-care conglomerates, Wall Street banks, and others with a vested interest in the outcome of the panel's work. Three Democrats and three Republicans on the panel also employ former lobbyists on their staff.

The 12-member committee is tasked with identifying $1.5 trillion in spending reductions over a decade. "When the committee sits down to do its work, it's not like they're in an idealized platonic debating committee," said Bill Allison, editorial director of the Sunlight Foundation, which is tracking ties between lobbyists and the panel. "They're going to have in mind the interests of those they are most familiar with, including their big donors and former advisers."

General Electric, for example, has been awarded nearly $32 billion in federal contracts over the past decade, with much of that business going to lucrative defense and health-care subsidiaries. General Electric's chief executive, Jeffrey Immelt, also heads President Obama's Council on Jobs and Competitiveness. At least eight GE lobbyists used to work for members of the supercommittee.

Top donors to the deficit committee members include AT&T, $562,045; Blue Cross/Blue Shield, $460,02; General Electric, $452,999; American Bankers Association, $421,883; Citigroup, $443,006; and the National Association of Realtors, $418,000. Needless to say, they contribute to both parties.

A study last year from the London School of Economics found 1,113 lobbyists who had formerly worked in the personal offices of lawmakers. At least nine members of the 12-member supercommittee have scheduled fundraisers this fall, putting them in a position to take money from industry donors at the same time they are helping to decide what to cut from government spending. The most active fundraiser on the panel appears to be Rep. James Clyburn (D-SC), who has at least five donor events scheduled before the panel's Thanksgiving deadline. According to the Sunlight Foundation, contributions given during the time the supercommittee is meeting will not be disclosed to the Federal Election Commission until January - well after the final decision is made.

Sadly, free markets are genuinely embraced more often by intellectuals than by businessmen. All too often, businessmen seek government subsidy, bailout, and intervention to keep competitors out of the market. When Congress acted to eliminate the Civil Aeronautics Board and the Interstate Commerce Commission and open up the airline and trucking industries to real competition, it was the industries themselves that opposed deregulation, for they had found a way to control the government agencies involved on their own behalf.

The old warning by the economist Friedrich Hayek that socialism in its radical form is not nearly as dangerous as socialism in its conservative form is worthy of serious reconsideration. When the advocates of state power and the advocates of corporate bigness become allies, government involvement in the economy - a form of socialism - is inevitable. The result is the crony capitalism we now face. *

Ramblings

Allan C. Brownfeld

How America Goes to War: Rediscovering the Dangers of an All-Powerful Executive

In recent years, our country has been embroiled in three wars - Iraq, Afghanistan, and Libya.

Article I, Section 8, of the U.S. Constitution clearly gives Congress - not the executive - the power to declare war. Since the Constitution was signed in 1787, Congress has declared war five times: the War of 1812, the Mexican War, the Spanish-American War, and World Wars I and II. Yet, since 1787, the U.S. has been involved in numerous military conflicts without a declaration.

In the case of the Korean War, President Truman sent some 1.8 million soldiers, sailors, and airmen over a period of just three years, and 36,000 lost their lives - but he never sought or received a congressional declaration of war. Congress has not declared war since World War II, despite dozens of conflicts since then.

In 1973, Congress passed the War Powers Resolution, which was meant to counteract what Presidents Nixon and Johnson had done in Vietnam. Congress felt deceived, particularly since it was later discovered that the Gulf of Tonkin incident that precipitated a larger war had never, in fact, taken place.

The law, however, hardly reasserts Congress' very clear constitutional power to declare war. Instead, it simply asks for an authorization letter and then gives the President a three-month deadline. It requires the President to withdraw U.S. forces from armed hostilities if Congress has not given its approval within 60 days.

Even fulfilling the requirements of the War Powers Resolution appears to be too much for the Obama Administration. In fact, the President rejected the views of top lawyers at the Pentagon and the Justice Department when he decided that he had the legal authority to continue American military participation in the air war in Libya without congressional authorization.

Jeh C. Johnson, the Pentagon general counsel, and Caroline D. Krass, the acting head of the Justice Department's Office of Legal Counsel, told the White House that they believed the U.S. military's activities in the NATO-led air war amounted to "hostilities" under the War Powers Resolution, which would require Mr. Obama to terminate or scale back the mission after May 20.

The President, however, adopted the legal analysis of the White House counsel, Robert Bauer, and several others who argued that the military's activities in Libya fell short of "hostilities." Under that view, Obama needed no permission from Congress to continue the mission unchanged.

Late in June, the House rejected a bill to authorize the U.S. military operations in Libya. The resolution to support the mission failed 295 to 123, with 70 Democrats joining Republicans in a rebuff to the President. Still, the House also defeated a measure that would have limited financing to support these efforts.

Rep. Jason Chaffetz (R-UT) said:

It didn't go far enough. Under that resolution, the president is still going to be engaged in the war. We've been inept and irrelevant on the war actions. We have not lived up to our constitutional duty.

In Libya, the goal of our mission appears to have changed from month to month. In March, the President said that U.S. intervention would be confined to implementing a no-fly zone. He declared that, "Broadening our mission to include regime change would be a mistake." By May, the mission was to make Libyans "finally free of 40 years of tyranny." By June, after more than 10,000 sorties, including those by attack helicopters, the strategy seemed to boil down to an effort to eliminate Gaddafi himself.

While some have charged that opponents of the conflict in Libya are "isolationists," conservative columnist George Will notes that:

Disgust with this debacle has been darkly described as a recrudescence of "isolationism" as though people opposing this absurdly disproportionate and patently illegal war are akin to those who, after 1938, opposed resisting Germany and Japan. Such slovenly thinking is a byproduct of shabby behavior.

While men and women of good will may disagree about the merits of the U.S. intervention in Libya - or Afghanistan and Iraq - the larger question is whether one man, the President, can take the country to war without a congressional declaration, as clearly called for in the Constitution.

What we are dealing with is the dangerous growth of executive power. During the years of the New Deal, when the power of the president was dramatically expanded, Republicans, who were in the opposition, objected to the growth of such power as a threat to freedom. Later, when Republicans held the power of the Presidency, they, too, expanded executive power, and Democrats, now in opposition, objected. This has been characterized as argument from circumstance, not principle. If you hold power, you expand it. No one in power has an incentive to cede back the power that has been assumed.

Even at the beginning of the Republic, perceptive men such as John Calhoun predicted that government would inevitably grow, that those in power would advocate a "broad" use of power while those out of power argued for a "narrow" use, and that no one would ever turn back government authority once it had been embraced.

Calhoun was all too prophetic when he wrote the following in "A Disquisition on Government":

. . . . Being the party in possession of government, they will . . . be in favor of the powers granted by the Constitution and opposed to the restrictions intended to limit them. As the major and dominant parties, they will have no need of these restrictions for their protection. . . . The minor or weaker party, on the contrary, would take the opposite direction and regard them as essential to their protection against the dominant party. . . . But where there are no means by which they could compel the major party to observe the restrictions, the only resort left then would be a strict construction of the Constitution. . . . To this the major party would oppose a liberal construction . . . one which would give to the words of the grant the broadest meaning of which they were susceptible.

Calhoun continued:

It would then be construction against construction - the one to contract and the other to enlarge the powers of the government to the utmost. But of what possible avail could the strict construction of the minor party be, against the liberal interpretation of the major party, when the one and the other be deprived of all means of enforcing its construction? In a contest so unequal, the result would not be doubtful. The party in favor of the restrictions would be overpowered. . . . The end of the contest would be the subversion of the Constitution. . . the restrictions would ultimately be annulled and the government be converted into one of unlimited powers.

Our history shows that this is true. Republicans opposed big government when Democrats were in power, but spoke of concepts such as "executive privilege" when their own party held positions of authority. The Democrats have done exactly the same thing. The growth of government power has been a steady process, regardless of who was in office.

Those who want to rein in government power, to return to the federal system set forth in our Constitution, with its clearly defined separation of powers and checks and balances, would do well to turn their attention to the question of who has the power to take America to war. The Constitution did not give one man that power, although events in Afghanistan, Iraq, and Libya show us that this seems no longer to be the case. Concern over developments in Libya is a healthy sign that more and more Americans seem to be paying attention to the question of the war-making power.

Dramatic Decline in Public Education Leads to Renewed Push for Voucher Programs

Mounting evidence of a dramatic decline in American public education is leading to a renewed push for voucher programs across the country.

Details of decline are all around us. Some of the New York City high schools that received the highest grades under the Education Department's school assessment system are graduating students who are not ready for college. Of the 70 high schools that earned an "A" on the most recent city progress report and have at least one third of graduates attending college at City University of New York (CUNY), 46 posted remediation rates above 50 percent, according to reports sent to the city's high schools. Remediation rates - the percentage of students who fail a CUNY entrance exam and require remediation classes - rose to 49 percent in 2010 from 45 percent in 2007.

About three quarters of the 17,500 freshmen at CUNY community colleges this year have needed remedial instruction in reading, writing, or math, and nearly a quarter of the freshmen have required such instruction in all three subjects.

Fewer than half of all New York state students who graduated from high school in 2009 were prepared for college or careers, as measured by state Regents tests in English and math. In New York City, that number was 23 percent.

At LaGuardia Community College in Queens, where 40 percent of the math classes are remedial, faculty member Jerry G. Ianni says:

Most students have serious challenges remembering the basic rules of arithmetic. The course is really a refresher, but they aren't ready for a refresher. They need to learn how to learn.

About 65 percent of all community college students nationwide need some form of remedial education, with students' shortcomings in math outnumbering those in reading two to one, said Thomas R. Bailey, director of the Community College Research Center at Teachers College, Columbia University.

The New York State Department of Education released new data in June showing that only 37 percent of students who entered high school in 2006 left four years later adequately prepared for college, with even smaller percentages of minority graduates and those in the largest cities meeting that standard. In New York City, 21 percent of those who started high school in 2006 graduated last year with high enough scores on state math and English tests to be deemed ready for higher education or well-paying careers. In Rochester, it was 6 percent; in Yonkers, 14.5 percent.

Nearly one-fourth of the students who try to join the U.S. Army fail its entrance exam, painting a grim picture of an educational system that produces graduates who can't answer basic math, science, and reading questions. The report by the Education Trust bolsters a growing worry among military and education leaders that the pool of young people qualified for military service will grow too small.

"Too many of our high school students are not graduating ready to begin college or a career - and many are not eligible to serve in our armed forces," Education Secretary Arne Duncan said. "I am deeply troubled by the national security burden created by America's underperforming education system."

The report found that 23 percent of recent high school graduates don't get the minimum score needed on the enlistment test to join any branch of the military. Questions are often basic, such as: "If 2 plus X equals 4, what is the value of X?"

The military exam results are also of concern because the test is given to a limited pool of people. Pentagon data shows that 75 percent of those aged 17 to 24 don't even qualify to take the test because they are physically unfit, have a criminal record, or don't graduate from high school.

"It's surprising and shocking that we still have students who are walking across the stage who really don't deserve to and haven't earned that right," said Tim Callahan with the Professional Association of Georgia Educators, a group that represents more than 80,000 educators.

The study shows wide disparities in scores among white and minority students, similar to racial gaps on other standardized tests. Nearly 40 percent of black students and 30 percent of Hispanics don't pass, compared with 16 percent of whites. The average score for blacks is 39 and for Hispanics is 44, compared to whites' average score of 55.

The decline in American public education has led to a renewed campaign for a voucher system that would give middle-class and poor parents the same freedom to choose where to send their children to school that only well-to-do parents now have.

Early in May, Indiana Governor Mitch Daniels signed what is probably the broadest voucher law in the country. A few days later, Oklahoma approved tax credits for those who contribute to a privately funded private school "opportunity scholarship" program. In New Jersey, in May, a voucher bill was approved by a Senate committee with bipartisan support. In Washington, D.C., the voucher program, which was killed by the Democratic majorities in the last Congress, is all but certain to be restored. In Wisconsin, Governor Scott Walker is pushing hard to broaden Milwaukee's voucher program to other cities and many more children.

According to the Foundation for Educational Choice, a pro-voucher group that lists Milton Friedman as its patriarch, more than 52 bills have emerged this year, some passed, some still pending, in 36 states - among them Arizona, Florida, Ohio, Oregon, and Pennsylvania - providing funding for vouchers, tax credits, or other tax-funded benefits for private education. "No year in recent memory," said foundation president Robert Enlow, "has provided better opportunities for the cause."

Writing in The Nation, Peter Schrag, a liberal, declares that, "Milton Friedman's vision for school choice is becoming a reality around the country."

Early in April, a divided Supreme Court further heartened the movement by upholding Arizona's law providing tax credits for contributions to "school tuition organizations" - scholarship funds for private and religious schools.

Many forget that vouchers have never been an exclusively conservative issue. In the 1960s, liberal school reformers like Paul Goodman and John Holt, pushing for "free schools," the "open school," and other escapes from what they regarded as "over-bureaucratized, lockstep" school structures, embraced vouchers as a way of getting there.

Later, liberals like Berkeley law professor John Coons, who helped launch lawsuits seeking equity in school spending, became strong voucher advocates as a way to allow poor and minority children some way out of the ghetto schools.

Clearly, the time seems to have come for a voucher system - and genuinely free choice for parents with regard to where to send their children to school.

The Supreme Court's Strange Embrace of Violent Video Games for Children

The U.S. Supreme Court, in a 7-2 ruling late in June written by Justice Antonin Scalia, declared that a California law barring the sale of extremely violent video games to children violated children's First Amendment rights to buy interactive games in which they vicariously steal, rape, torture, and decapitate people to score points.

Justice Scalia said that the state had no compelling interest in limiting the sale of such violent games. He made light of studies showing that violent video games correlate with aggressive behavior in some children, and denied that reading about violence is different from participating in a full-color, sound-filled interactive depiction in which the children themselves commit the violence.

Justices Stephen G. Breyer, a liberal, and Clarence Thomas, a conservative, filed the only dissents, arguing that the law was intended to empower parents, not erode the First Amendment. The law targets adults who sell this material to children. The goal, clearly, was not to disempower children but to curb predators.

In a concurring opinion, Justice Samuel Alito, joined by Chief Justice John G. Roberts, argued that the law should be struck down because of vagueness, but added that:

The Court is far too quick to dismiss the possibility that the experience of playing video games (and the effects on minors of playing violent video games) may be very different from anything that we have seen before. . . . In some of these games, the violence is astounding. Victims by the dozens are killed with every imaginable implement . . . dismembered, decapitated, disemboweled, set on fire and chopped into little pieces. They cry out in agony and beg for mercy. Blood gushes, splatters, and pools. Severed body parts and gobs of human remains are graphically shown. In some games, points are awarded based, not only on the number of victims killed, but on the killing technique employed.

In his dissent, Justice Thomas declared that:

The Framers could not possibly have understood the freedom of speech to include a qualified right to speak to minors. Specifically, I am sure that the founding generation would not have understood "the freedom of speech" to include a right to speak to children without going through their parents.

In his dissent, Justice Breyer quoted from a 1944 case, Prince v. Massachusetts, in which the Court recognized that the "power of the state to control the conduct of children reaches beyond the scope of its authority over adults."

Most adult Americans probably have no idea of the nature of the video games children are playing - and which the Supreme Court has now embraced as free speech. One such graphic game involves the player torturing a girl as she pleads for mercy, urinating on her, dousing her with gasoline and setting her on fire.

Among the most popular games is "Bloody Day," described this way:

Back alley butchering has never been so much fun. It's like having your own barrel with moderately slow moving fish. How many kills can you rack up?

Another is "Boneless Girl," which is presented in these terms:

Poke and pull this scantily clad babe all over bubble-land. You'll be amazed by the small spaces she can fit through, and throwing her across the screen never gets old.

Sadly, notes Joel Bakan, author of the forthcoming book Childhood Under Siege: How Big Business Targets Children, children:

. . . don't need to rent or buy casual games. They are available on computers, tablets, and cellphones - free. (California's law wouldn't have applied to these games, even if it had survived the court's scrutiny, because they are not rented or sold.) Many popular casual games contain as much violence as notorious video games like Postal 2 and Grand Theft Auto, if not more. But they tend to exist under the radar; they're part of an obscure world into which teenagers and children escape and about which parents are often in the dark. (I learned about them only after I asked my 12-year-old son what he liked to do online.)

Bakan reports that,

Nickelodeon's www.addictinggames.com, a premier casual game site, calls itself "the largest source of the best free online games." It attracts 20 million unique monthly users, mostly children and teens. . . . Like other leading casual game sites, www.addictinggames.com makes money by running advertisements. According to Viacom, the site's corporate owner, the aptly named site allows "junkies" to "gorge themselves" and to "fuel their addiction." Viacom's interest in promoting addiction helps explain why Nickelodeon, the award-winning children's network, might want to push brutal, violent entertainment. Violence sells. And it continues to sell to children, teens, and tweens "hooked" at an early age and hungry for more. . . . The games' use of graphic violence to generate profit is strategic and calculated.

In the 1949 case of Terminiello v. Chicago, Justice Robert H. Jackson, in a famous dissent, warned against converting the Constitution into "a suicide pact." He wrote that,

The choice is not between order and liberty. It is between liberty with order and anarchy without either. There is danger that, if the court does not temper its doctrinaire logic with a little practical wisdom, it will convert the constitutional Bill of Rights into a suicide pact.

Discussing the California video game case, Brown v. Entertainment Merchants Association, Robert Knight, senior fellow for the American Civil Rights Union, notes that:

The Constitution is the greatest political document in history and the guarantor of our God-given rights. The First Amendment has proved foundational to maintaining all of our freedoms. Exceptions should be few and necessary. But in the hands of America's ruling lawmakers and jurists, the First Amendment is sometimes misapplied as a free pass for dysfunction and decadence.

It is apparently perfectly legal for movie theaters to refuse to sell minors tickets to R- or X-rated movies. What is the difference when it comes to violent video games? Protecting children from material of this kind has always been viewed as a mark of civilization. The evidence that consuming violent material has a serious impact on young people is widespread. Consider the role such violent games played in the lives of the perpetrators of the massacre at Columbine.

Why would the Supreme Court turn its back on such evidence - and on ordinary common sense - to issue a ruling such as this one? It is difficult to understand, and we are fortunate that Justices Breyer and Thomas dissented. Their dissents, one hopes, will provide the grounds for revisiting this decision in the future. *

Saturday, 05 December 2015 04:39

Ramblings

Allan C. Brownfeld

In Contemporary American Society, Truth Is in Increasingly Short Supply

Truth seems to be an increasingly rare commodity in contemporary American society.

In our political life, the lies are legion. In June, after ten days of adamant denials, Rep. Anthony Weiner (D-NY) finally admitted to having sent sexually explicit photographs to various young women. After telling the nation that he "did not have sexual relations with that woman," former President Clinton finally admitted the truth. Former Senator John Ensign (R-NV) denied the facts of his relationship with a married staff member and the payoff by his parents to the woman's husband. Former Senator John Edwards (D-NC) had a staff member claim to be the father of his mistress's child.

But the lack of truth goes far beyond the personal lives of our politicians. Where were the weapons of mass destruction we were told Saddam Hussein possessed - and which were used as a justification for launching the war in Iraq? We now know that the Gulf of Tonkin incident, the precipitating event behind President Johnson's escalation of the Vietnam War, did not really happen. Sadly, the list is a long one.

And it is not only in our political life that truth is hard to find. In an important new book, Tangled Webs: How False Statements Are Undermining America: From Martha Stewart to Bernie Madoff, James B. Stewart warns of the risks from an epidemic of perjury that has "infected nearly every aspect of society."

Citing prosecutors who speak of a recent surge of deliberate lying by sophisticated individuals, often represented by the best lawyers, he focuses on four cases involving well-known people: business executive and lifestyle guru Martha Stewart, convicted of lying to investigators about the reasons her ImClone stock was sold; former Dick Cheney adviser Lewis "Scooter" Libby, found guilty of perjury in conjunction with the leak of CIA operative Valerie Plame's identity; baseball star Barry Bonds, indicted for perjury related to illegal use of steroid drugs; and Bernard Madoff, who while conducting the greatest Ponzi scheme in history, and lying to investors and investigators, was never actually indicted for perjury.

Stewart is particularly outraged when it comes to the failure to indict Madoff for perjury. It was clear to Securities and Exchange Commission investigators in 2005 that he was lying about his investment business, but their superiors decided not to press the issue:

At the time of his sworn testimony in 2006, Madoff purported to have approximately $20 billion under management. By the time his scheme collapsed, he had $65 billion. Failing to pursue his lies cost innocent victims another $45 billion.

Stewart believes that lying is on the rise, threatening to swamp the legal system and sow cynicism nationwide. In the end, he argues, "it undermines civilization itself."

Consider the case of Greg Mortenson. His best-selling books, Three Cups of Tea and Stones into Schools, are full of lies and evasions. He tells the story of how, in 1993, he stumbled into the tiny Pakistani village of Korphe after a failed attempt at climbing K2. He explains how the kind villagers nursed him back to health with many cups of tea and how, as payment for their generosity, he returned to build a school. That school then became hundreds of schools across Pakistan and Afghanistan. Millions were inspired by the idea that one man could make such a profound difference in a desperate part of the world. Mortenson was nominated three times for the Nobel Peace Prize. He was called a secular saint.

In April, as a result of an investigative report by bestselling author Jon Krakauer and a "60 Minutes" expose, we learned that Mortenson may very well be a charlatan. The most significant passages in the book appear to be fictitious, including the whole story of his recovery in Korphe. The "Taliban abductors" described in Three Cups of Tea were reportedly friendly villagers protecting him as a guest of honor. His charity, it was reported, is badly mismanaged, and many of its schools stand empty, some of them serving as storage sheds for hay.

In 2009, only 41 percent of donations to Mortenson's charity went to its work in Afghanistan and Pakistan. Much of the rest, charge Krakauer and "60 Minutes," went to Mortenson himself - to chartered jets, massive purchases of his books (at retail, so he would get the royalties and keep them on the bestseller list), and advertisements for them in The New Yorker at more than $100,000 each time.

More and more Americans are also claiming military honors they never earned. Joseph Brian Cryer, for example, is a former candidate for City Council in Ocean City, Maryland. He claimed to be an elite U.S. Navy SEAL and bragged online about having "77 confirmed kills" in 102 hours during a Libyan operation in 1986. To prove his bona fides, he displayed a government ID card identifying him as 100 percent disabled and a Navy commander.

But Cryer is a fraud, said Don Shipley, a retired SEAL who makes it his business to expose false ones. Shipley has access to a database of all Navy SEALs since 1947. Since Navy SEAL Team 6 killed Osama bin Laden in May, he said, he has received about 50 requests each day to investigate people who claim to be SEALs.

The list of those criminally charged for falsifying their military service is a long one. In one case, Command Sgt. Maj. Stoney Crump, the senior enlisted man at Walter Reed Army Medical Center, was fired for faking his record and wearing numerous unauthorized awards and decorations. He was sentenced to six months in prison.

In another case, former Marine Corps Sgt. David Budwah was sentenced in 2009 to 18 months' confinement and fined $25,000 for pretending to be an injured war hero in order to get free seats at rock concerts and professional sporting events.

"Every society in history, since the caveman days, has revered its warriors," said B. G. Burkett, author of Stolen Valor. He has uncovered thousands of suspected fakes and says most lie out of lack of self-esteem. "They haven't done anything in their lives," he said. "But the second they say they're a warrior, everybody sees them in a different light."

Congress passed the Stolen Valor Act in 2006. The law makes it illegal for someone to falsely claim to hold military honors or decorations. But some of those who have faced criminal charges claim the law is unconstitutional, arguing that it violates the First Amendment. The law "has every good intention," said Ken Paulson, president of the First Amendment Center. "But courts have been reluctant to outlaw lying in America. It's just too prevalent to legislate."

Thus far, federal courts have split on the law's constitutionality. A federal judge in Virginia ruled this year that the First Amendment does not protect the false claims the act makes illegal. But the California-based 9th Circuit Court of Appeals found the law unconstitutional last year.

In May, Rep. Joe Heck (R-NV) introduced a revised Stolen Valor Act that would make it a crime of fraud to benefit, or intend to benefit, from lying about military awards. "It's not O.K. to misrepresent yourself as a physician and practice medicine," Mr. Heck said.

It's not O.K. to misrepresent yourself as a police officer. Why should you be able to misrepresent yourself as a member of the military, specifically if you're trying to gain something of value?

The widespread telling of untruths - and the claim that people have a legal right to lie about basic credentials - is an indication of our society's current moral standards. In the end, more is involved than simply immoral behavior. Such behavior is, in fact, a threat to democratic self-government.

Edmund Burke, in his letter to a member of the French National Assembly in 1791, made a point we might well ponder today:

Men are qualified for civil liberty in exact proportion to their disposition to put chains upon their own appetites; in proportion as their love of justice is above their rapacity; in proportion as their soundness and honesty of understanding is above their vanity and presumption; in proportion as they are more disposed to listen to the counsels of the wise and good in preference to the flattery of knaves. Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less there is of it within, the more of it there must be without. It is ordained in the eternal constitution of things that men of intemperate minds cannot be free. Their passions forge their fetters.

Can a Free Society Endure if It Does Not Teach Its History and Its Values to the Next Generation?

American students are less proficient in their nation's history than in any other subject, according to results of a nationwide test released in June, with most fourth graders unable to say why Abraham Lincoln was an important figure, and few high school seniors able to identify China as the North Korean ally that fought American troops during the Korean War.

Overall, 20 percent of fourth graders, 17 percent of eighth graders and 12 percent of high school seniors demonstrated proficiency on the exam, the National Assessment of Educational Progress. Fewer than a third of eighth graders could answer what was described as a "seemingly easy question," asking them to identify an important advantage American forces had over the British during the Revolution, the government's statement on the results said.

Diane Ravitch, an education historian who was invited by the national assessment's governing board to review the results, said she was particularly disturbed by the fact that only 2 percent of 12th graders correctly answered a question concerning Brown v. Board of Education, which she called "very likely the most important decision of the U.S. Supreme Court in the past seven decades."

Students were given an excerpt including the following passage, and were asked what social problem the 1954 ruling was supposed to correct:

We conclude that in the field of public education the doctrine of "separate but equal" has no place. Separate educational facilities are inherently unequal.

"The answer was right in front of them," Ms. Ravitch said. "This is alarming."

"The results tell us that, as a country, we are failing to provide children with a high-quality, well-rounded education," said Education Secretary Arne Duncan.

The evidence of our failure to teach our history is abundant. Fewer than half of American eighth graders knew the purpose of the Bill of Rights on the most recent national civics examination, and only one in 10 demonstrated acceptable knowledge of the checks and balances among the legislative, executive, and judicial branches, according to the test results released in April.

"These results confirm that we have a crisis on our hands when it comes to civics education," said Sandra Day O'Connor, the former Supreme Court justice, who last year founded icivics.org, a nonprofit group that teaches students civics through web-based games and other tools.

"The results confirm an alarming and continuing trend that civics in America is in decline," said Charles N. Quigley, executive director of the Center for Civic Education. "During the past decade or so, educational policy and practice appear to have focused more and more upon developing the worker at the expense of developing the citizen."

"We face difficult challenges at home and abroad," said Justice O'Connor.

Meanwhile, divisive rhetoric and a culture of sound bites threaten to drown out rational dialogue and debate. We cannot afford to continue to neglect the preparation of future generations for active and informed citizenship.

Historian David McCullough says that:

We're raising young people who are, by and large, historically illiterate. I know how much these young people - even at the most esteemed institutions of higher learning - don't know. It's shocking.

McCullough, who has lectured on more than 100 college campuses, tells of a young woman who came up to him after a lecture at a renowned university in the Midwest. "Until I heard your talk this morning, I never realized the original 13 colonies were all on the East Coast," she said.

Some years ago, when 111 ninth graders in a Honolulu school were asked to write the Pledge of Allegiance, no one could do it correctly. One response described the United States as a nation "under guard" and dedicated "for richest stand." A teacher, who asked not to be identified so her students would not be embarrassed, called the results frightening. She said all the students had spelling problems and had little grasp of what the pledge words meant. The word "indivisible," for example, came out as "in the visible." The teacher said that 12 students had trouble spelling the word "America." The word appeared in some papers as "Americain," "Americai," "Amereca," "Amicra," and "Amica." The teacher said, "I'm sick. I don't know what to do or where to turn."

These trends were hardly new. More than twenty years ago, writing in Public Opinion magazine, author Ben Stein reported:

Recently a 19-year-old junior at the University of Southern California sat with me while I watched "Guadalcanal Diary" on T.V. It goes without saying that the child had never heard of Guadalcanal. More surprisingly, she did not know whom the U.S. was fighting against in the Pacific. ("The Germans?") She was genuinely shocked to learn that all those people were Japanese and that the U.S. had fought a war against them. ("Who won?") Another student at USC did not have any clear idea when World War II was fought. . . . She also had no clear notion of what had begun the war for the U.S. Even more astounding, she was not sure which side Russia was on and whether Germany was on our side or against us. In fact, I have not yet found one single student in Los Angeles, in either college or in high school, who could tell me the years when World War II was fought. Nor have I found one who knew when the American Civil War was fought.

Stein laments that:

Unless our gilded, innocent children are given some concept of why the society must be protected and defended, I fear that they will learn too soon about a whole variety of ugly ideas they did not want to know about. . . . People who do not value what they have rarely keep it for long, and neither will we.

Things have gotten far worse since Stein wrote those words. One reason for students' poor showing on recent tests is the neglect of history by federal and state policy makers - both Republicans and Democrats - especially since the 2002 No Child Left Behind Act began requiring schools to raise scores in math and reading, but in no other subject. This federal accountability law (surprisingly embraced by Republicans who previously argued that education was a state and local - not a federal - matter) has given schools and teachers an incentive to spend most of their time teaching to the math and reading tests, while ignoring history altogether.

"History is very much being shortchanged," said Linda K. Salvucci, a history professor in San Antonio who is chairwoman-elect of the National Council for History Education.

Historian Paul Johnson points out that:

The study of history is a powerful antidote to contemporary arrogance. It is humbling to discover how many of our glib assumptions, which seem to us novel and plausible, have been tested before, not once but many times and in innumerable guises; and discovered to be, at great human cost, wholly false.

Free societies are rare in history. If their history and values are not transmitted to the next generation, their survival is questionable. As Cicero (106-43 B.C.) understood:

To remain ignorant of things that happened before you were born is to remain a child. What is human life worth unless it is incorporated into the lives of one's ancestors and set in a historical context?

European Leaders Are Turning Against Multi-culturalism - a Dilemma Faced by Our Own Society as Well

As immigration problems - particularly among the large North African and Middle Eastern populations in France, Germany, the Netherlands, Great Britain, and other West European countries - rise to the surface, the idea of "multi-culturalism" is coming under increasing criticism.

Germany's chancellor, Angela Merkel, called it "a total failure," and France's president, Nicolas Sarkozy, told an interviewer that immigrants should "melt into a single community." In a speech in Munich, Britain's prime minister, David Cameron, traced the problem of homegrown Islamist alienation and terrorism to "a question of identity."

"A passively tolerant society," Cameron said, "stands neutral between different values." But "a generally liberal country . . . says to its citizens, this is what defines us as a society: to belong here is to believe in these things."

The things Cameron went on to cite were freedom of speech and worship, democracy, the rule of law, and equal rights. Much of this is not new, as concern over multi-culturalism has been growing for years. A year after the London bombings of July 2005, Ruth Kelly, then the Labour Party minister in charge of community policies, asked whether - in its anxiety to avoid imposing a single British identity on diverse communities - multi-culturalism had encouraged "separateness."

In December 2006, Tony Blair gave a speech on multi-culturalism which included many of Prime Minister Cameron's points. Both prime ministers called for tighter controls on Muslim groups receiving public funds, an entry ban on foreign preachers with extremist views, a tougher position on forced marriages, and an expectation that all British citizens support common values, from the rule of law to a rejection of discrimination.

French president Sarkozy declared that:

If you come to France, you accept to melt into a single community, which is the national community, and if you do not want to accept that, you cannot be welcome in France. Of course, we must respect all differences, but we do not want . . . a society where communities coexist side by side.

Europe's dilemma is real, as is its need for immigrants. Deaths are expected to outnumber births this year in 10 of the European Union's 27 member states. As of 2015 the EU as a whole will experience negative natural population growth, demographers say, and the gap will grow to one million excess deaths a year by 2035. By 2050 the EU will have 52 million fewer people of working age, the European Commission warns. Businesses across Europe are already facing severe shortages of engineers, technicians, craftspeople, and other skilled professionals, with four million unfilled jobs across the continent.

For decades, most European countries have consigned immigrants to the margins. In Germany - which, until recently, continued to proclaim it was "not an immigrant society" - some professions were restricted to German citizens well into the 1990s, while eligibility for citizenship itself was based on bloodlines until a landmark reform in 2001. Millions of refugees were legally barred from working, which forced them into welfare dependency. Muslims, in particular, remain unintegrated and ghettoized in many European countries.

The attention now being focused on the need to integrate immigrants into European society is a hopeful sign. We have had this same debate in the U.S. for some time, but, for a variety of reasons, have done a better job in integrating immigrants into our society. Fortunately, our "melting pot" tradition has served us well.

What Americans have in common is not a common racial, ethnic, or religious background, but, instead, a commitment to the concept of individual freedom in a society established by the U.S. Constitution, which protects and preserves it.

We have, of course, had our advocates of "bi-lingual," "Afro-centric," and other forms of multi-cultural education. According to the multiculturalist worldview, notes Linda Chavez:

African-Americans, Puerto Ricans, and Chinese Americans living in New York City have more in common with persons of their ancestral group living in Lagos or San Juan or Hong Kong than they do with other New Yorkers who are white. Culture becomes a fixed entity, transmitted, as it were, in the genes, rather than through experience.

Historian Arthur M. Schlesinger, Jr. declared that:

Multiculturalists would have our educational system reinforce, promote, and perpetuate separate ethnic communities and do so at the expense of the idea of a common culture and a common national identity.

Afro-centric education and other forms of separate education for separate groups are the opposite of the traditional goal of civil rights leaders, who wanted only to open up American education to all students, regardless of race. The distinguished black leader of the early twentieth century, W. E. B. DuBois, disputed the multiculturalists of his own day. He said:

I sit with Shakespeare and he winces not . . . . Across the color line I move arm in arm with Balzac and Dumas. I summon Aristotle and Aurelius and what soul I will, and they come all graciously with no scorn or condescension. So, wed with Truth, I dwell above the veil.

To him, the timeless wisdom of the classical works of Western civilization spoke to all people and races, not just to whites of European ancestry.

Professor Seymour Martin Lipset of the Hoover Institution at Stanford University declares:

The history of bilingual and bicultural societies that do not assimilate are histories of turmoil, tension, and tragedy. Canada, Belgium, Malaysia, Lebanon - all face crises of national existence in which minorities press for autonomy, if not independence. Pakistan and Cyprus have divided. Nigeria suppressed an ethnic rebellion. France faces difficulties with its Basques, Bretons, and Corsicans.

European societies will resolve their difficulties with today's immigrants when they adopt the American model: recognizing that their societies will no longer be homogeneous, and that their goal should be to assimilate new immigrants into the culture and civilization of France, England, and the other Western European countries.

Some of today's Islamic immigrants may provide a greater challenge, but this should not prove insurmountable if the proper policies are adopted. Finally, Western European leaders seem to have come to the understanding that multiculturalism is not the way.

Immigrants leave their native countries for the West because there is something in the West they want. It is the responsibility of these Western European countries to transmit to their new immigrants the values, culture, and civilization of their societies. This has been going on in our own country, with great success, for more than 300 years. Rep. Shirley Chisholm (D-NY), the first black woman to seek a major party's presidential nomination, once said: "We came over on different ships, but we're in the same boat now." Multiculturalism is a dangerous detour from the assimilation and acculturation of immigrants, which should be the primary goal of those societies now receiving large numbers of new residents. Abandoning the notion that Western European countries are "not immigrant societies" is an important step forward. *

Saturday, 05 December 2015 04:37

Ramblings

Allan C. Brownfeld

One Reason for Our Educational Decline May Be Bad Students, Not Bad Schools

The fact that our schools are not doing very well in educating our students is abundantly clear. A recent international assessment showed that as China's economic power has risen, so has its educational progress. The survey, which rated the aptitude of 15-year-olds in 65 countries, shows teens in Shanghai far outscoring their international peers in reading, math, and science, while the U.S. lags far behind countries such as Finland and South Korea. In math, Chinese students scored 600 while American students scored 497. In science, the Chinese scored 575 and the Americans 501.

A study released in November by McKinsey, the international consulting firm, showed that throwing money at education does not seem to do much good, at least in those countries that already send all their young people to school. The U.S., for example, increased its spending on schools by 21 percent between 2000 and 2007, while Britain pumped in 37 percent more funds. Yet, during this period, standards in both countries slipped.

Many school systems that did not receive extra funds did much better. Schools in the German state of Saxony, and in Latvia, Lithuania, Slovenia, and Poland, have all raised their achievement scores. Even poor countries such as Chile and Ghana have made progress.

In an important new book, Bad Students, Not Bad Schools (Transaction), Robert Weissberg, who taught political science at the University of Illinois for decades, argues that the reason for educational decline "is the students, stupid, not the facilities or the curriculum."

It is his view that this "obvious truth" is one that no one dares speak. He reports that millions of lazy, incurious, disruptive, and nearly illiterate youngsters flood classrooms every day, and that none of the popular and expensive initiatives being promoted by well-meaning foundations and professors of education will change them. In his view, the middling students far outnumber the motivated ones, and the most difficult ones - troublemakers and vandals, immigrants struggling with English, kids who hate homework (an Indiana survey counts 50 percent who complete an hour or less of studying per week) - effectively turn classrooms into chaotic free-for-alls.

Year after year, Weissberg shows, a new initiative is presented - laptops for every eighth grader, bills to equalize school funding, after-school day care for single mothers - founded on the assumption that a better environment will invigorate lagging students, close the racial gap, and prepare every student for college.

Weissberg notes that bright students with a good work ethic excel whether they study with new laptops or well-worn textbooks. His example is the children of the Vietnamese boat people, who come from close-knit families that place a high priority on educational achievement. By contrast, students from families who pay little attention to their children's education - children who have never been read to or taught to appreciate books, whose parents, often a single mother, are usually otherwise occupied - do not come to school prepared to learn. It should be no surprise, Weissberg argues, that they do so poorly.

The kinds of reforms advocated to improve our schools tend, Weissberg believes, to completely miss the point of the real challenge we face:

In sports, this would be as if a basketball franchise with inept, lackadaisical players tried to reverse its fortunes by constructing a spectacular new arena, adding high-tech training facilities, inventing clever new plays, and hiring a Hall of Fame coach.

To resolve these problems, Weissberg advocates a radical approach, one unlikely to find much support. He explicitly advocates a policy "to eliminate the bottom quarter of those past 8th grade," or "altering the mix of smart and not so smart students." In this formulation, unintelligent and idle students consume an inordinate share of time and labor, holding back gifted students. Once the bad-student pool reaches a certain proportion, the teacher, principal, and school board end up devoting all of their attention to it.

One can find this prescription too harsh, and perhaps counterproductive, while still recognizing that Weissberg has identified the flaw in current reform efforts: they do not address the real problems we face. He devotes the bulk of his book to the various elements of the reform effort, documenting the way each one rationalizes away the bored and unruly students. Judges force municipalities to integrate schools, foundations write multi-million-dollar checks to enhance "learning opportunities" in urban schools, conservative leaders push charter-school expansion. No one addresses the fact that we do not turn out the same kind of students we used to because we do not get the same kind in.

Professor Mark Bauerlein of Emory University, writing in Commentary, notes that:

An entire industry has prospered on malfunction. If public schools produced skilled, knowledgeable graduates, if at-risk kids turned into low-risk kids, if students in crumbling buildings achieved as well as others, an army of professors, advocates and activists and lawyers, foundation personnel, contractors, and construction workers, security officers, after-school and summer-school tutors, and school administrations, counselors, and professional developers, would have to find other employment. The more schools fail, the more money pours in. Los Angeles has one of the lowest graduation rates in the country, and in September it opened the $578 million Robert F. Kennedy Community Schools complex, the most expensive school in U.S. history.

The 2008 documentary "Hard Times at Douglass High" features a Baltimore student named Audie. "This is what we do," Audie said, talking about himself and other students who roamed the halls all day, learning nothing:

Just walking the halls all day, baby. (Bleep) class. That (bleep's) for clowns, man. Don't nobody go to class here, man. Man, (bleep) academics.

Discussing the Weissberg book, columnist Gregory Kane, who is black, declares: "Want to really reform American education? Get guys like Audie out of our schools."

No problem can be resolved unless it is properly understood. Whatever one thinks of Professor Weissberg's proposed solutions, his analysis is worthy of serious attention.

The Focus of Attention on the Role of Public Sector Unions in Leading Cities and States to Fiscal Crisis Is Long Overdue

The debate in Wisconsin, Ohio, Indiana, and other states over collective bargaining on the part of public employees has focused long-overdue attention upon the role of public sector unions in moving our cities and states toward fiscal crisis and, in many instances, insolvency.

The idea of collective bargaining for public employees is a rather new concept. Wisconsin was the first state to provide such rights, in 1959. Other states followed, and California became the biggest convert in 1978, under Jerry Brown in his first term as governor. President Kennedy, by executive order in 1962, permitted federal workers to organize for the first time, though with only limited collective bargaining rights.

Few Americans today remember how, until recent years, even our most liberal political leaders resisted collective bargaining for public unions. President Franklin Roosevelt opposed collective bargaining for public unions as did legendary New York City Mayor Fiorello LaGuardia. Even George Meany, the long-time AFL-CIO president, opposed the right to bargain collectively with the government.

The Wall Street Journal explained the basis for such opposition:

. . . unlike in the private economy, a public union has a natural monopoly over government services. An industrial union will fight for a greater share of corporate profits, but it also knows that a business must make profits or it will move or shut down. The union chief for teachers, transit workers, or firemen knows that the city is not going to close the schools, buses, or firehouses. This monopoly power, in turn, gives public unions inordinate sway over elected officials. The money they collect from member dues helps elect politicians who are then supposed to represent the taxpayers during collective bargaining. . . . Public unions depend entirely on tax revenues to fund their pay and benefits. They thus have every incentive to elect politicians who favor higher taxes and more government spending. The great expansion of state and local spending followed the rise of public unions.

Concern about public sector unions is hardly new. When three-fourths of the Boston police department went on strike in 1919, leading to an escalation of crime, then-Massachusetts Governor Calvin Coolidge called out the state militia and broke the strike. Coolidge declared: "There is no right to strike against the public safety by anybody, anywhere, any time."

In August 1981, the Professional Air Traffic Controllers Organization called a strike seeking better working conditions, better pay, and a 32-hour work week. In doing so, the union violated a law that banned strikes by government unions. President Reagan declared the strike "a peril to national safety" and ordered the members back to work under the terms of the Taft-Hartley Act of 1947. Only 1,300 of the nearly 13,000 controllers returned to work. Reagan demanded that those remaining on strike resume work within 48 hours or forfeit their jobs. In the end, he fired 11,345 striking air traffic controllers and banned them from federal service for life.

Collective bargaining rights are only one part of the problem we face. Consider the state of Virginia, which bans collective bargaining. Like pension systems in states friendlier to unions, Virginia's public employee retirement system is underfunded by $17.6 billion. At the same time, teachers in Virginia have slightly higher average salaries than the unionized teachers in Wisconsin, and over the past decade, Virginia teacher pay grew faster than teacher pay in Wisconsin.

The fact is that, regardless of the question of collective bargaining, states and local governments across the country are faced with chronic fiscal problems rooted in unsustainable employee compensation systems. This is an issue beyond traditional liberal and conservative divisions. Editorially, The Washington Post notes that:

Much of the issue is rooted in healthcare costs, especially benefits for public-sector retirees. States face a combined $555 billion in unfunded retiree health coverage liabilities. Yet in 14 states, taxpayers pick up 100 percent of the premium tab for retirees, who often collect benefits for a decade or more before going on Medicare. This is not only unfair to taxpayers, for whom free healthcare is usually a remote dream. It also encourages overconsumption of medical goods and services, thus raising the cost for everyone.

More than a third of the nation's $9.3 trillion in pension assets belongs to state and local government employees, even though they make up only 15 percent of the U.S. work force, according to a study by the Spectrum investment group. Even with $3.4 trillion set aside to pay public pensions, dozens of state and local governments are struggling to make payments. Wisconsin, Ohio, and Florida are calling on state employees for the first time to contribute to their retirement plans the way workers do in the private sector. The $3.4 trillion set aside for public pensions understates the burden on states and taxpayers, since the plans are collectively underfunded by as much as $2.5 trillion, said Milton Ezrati, senior economist at Lord Abbett & Company.

"The undeniable fact is that most states and municipalities offer more generous pensions that they can afford," he said, noting that the plans typically allow employees full retirement benefits after 20 or 30 years of employment and include generous cost-of-living increases, healthcare benefits, and other perks that are not common in the private sector.

The Spectrum study found that the nation's 19.7 million state and local employees constituted 15 percent of the American work force of 128 million in 2009. Yet they laid claim to more than $3 in retirement assets for every $1 set aside for the retirement of the nation's 108 million workers in the private sector.

According to the Bureau of Labor Statistics, in 2010 the total compensation costs of state and local government workers were 44 percent higher than those in private industry; pay was only 33 percent higher, but benefits cost 70 percent more.

"The cost," says Daniel DiSalvo of the City College of New York,

. . . of public-sector pay and benefits (which in many cases far exceed what comparable workers earn in the private sector), combined with hundreds of billions of dollars in unfunded pension liabilities for retired government workers, are weighing down state and city budgets. And staggering as these burdens seem now, they are actually poised to grow exponentially in the years ahead.

At long last, public attention is being focused upon the role of public sector unions. It could not come a moment too soon.

Horrors Continue in Zimbabwe, but the World Largely Looks Away

When it comes to dictators in Africa clinging to power, the list, unfortunately, is a long one. This year, popular uprisings in North Africa have led to the removal of Zine el-Abidine Ben Ali's 23-year regime in Tunisia, and Hosni Mubarak's 30-year control of Egypt. At the present time, Libya's Moammar Gaddafi is fighting popular resistance - as well as Western air strikes - to maintain his 41-year-old grip on power.

Sadly, many other dictators remain in place. Robert Mugabe of Zimbabwe has always been clear about his own ambition. "No matter what force you may have," he declared in 2001, "this is my territory and that which is mine I cling to until death."

In March 2008, voters in Zimbabwe flocked to the polls and, by an overwhelming margin, repudiated Mugabe's rule. Then 84 and in failing health, Mugabe seemed ready to concede defeat to the opposition leader, Morgan Tsvangirai. Instead, Mugabe and his supporters launched a counterattack. The Zimbabwe Electoral Commission, controlled by the ruling party, falsified the vote count, forcing Tsvangirai into a second round. Foreign journalists were detained and removed from the country. Mugabe loyalists hunted down, beat, and killed supporters of Tsvangirai's Movement for Democratic Change (M.D.C.). Mugabe's generals called it "Operation Who Did You Vote For?"

In a new book, The Fear: Robert Mugabe and the Martyrdom of Zimbabwe, Peter Godwin, a native of Zimbabwe born when it was still Rhodesia, recalls that he was then one of the few Western journalists remaining in the country. He traveled from Harare into rural Zimbabwe, documenting the bloodshed, and visited hospitals overflowing with maimed and burned victims. "Think of deep, bone-deep lacerations, of buttocks with no skin left on them, think of being flayed alive."

He writes of a torture method called falanga: "Think of swollen, broken feet, of people unable to stand, unable to sit, unable to lie on their backs because of the blinding pain."

At one point, Godwin joins James McGee, the American ambassador, on a fact-finding trip outside Harare. They are repeatedly confronted by policemen, militia members, and intelligence agents, but McGee manages to press forward as he and his team gather evidence of torture and murder. Godwin wanders into a farmhouse used as a torture center by Mugabe's hit teams and discovers a notebook documenting interrogations and naming people "who are to be beaten." Finally, Godwin is advised to leave the country for his own safety, and he watches from New York as Tsvangirai withdraws from the runoff, saying he cannot participate in a "violent, illegitimate sham."

A few months later, Tsvangirai and Mugabe sign the so-called Global Political Agreement. Negotiated under international pressure by South African president Thabo Mbeki - who remained silent as the murder count rose - the deal kept Mugabe entrenched in power but forced him to install Tsvangirai as prime minister and turn over half the cabinet seats to members of the Movement for Democratic Change.

Peter Godwin returns to Zimbabwe to witness the inauguration of the new government. He quickly realizes that the ruling party has no intention of upholding the agreement. Godwin's friend Roy Bennett, a white, Shona-speaking ex-farmer and M.D.C. leader popular with his black constituents, returns from exile in South Africa to assume a junior cabinet post and is almost immediately jailed, held for weeks in appalling conditions. Tendai Biti, a courageous attorney and M.D.C. secretary general, survives his own incarceration on treason charges and reluctantly signs on as finance minister. "Here is Tendai," Godwin writes, "trying to scrounge the money to pay for the bullets that were used against his own supporters in the last election."

Godwin portrays Mugabe as an "African Robespierre" - highly educated and completely ruthless. He cautions against viewing him as a case of a good leader gone bad. "His reaction to opposition has invariably been a violent one," writes Godwin.

Using violence to win elections has long been Mugabe's method of remaining in power. He first set out his views on electoral democracy in a 1976 radio broadcast, during the guerrilla war against the government of Rhodesia - a period in which he was widely embraced in the West, including in Washington: "Our votes must go together with our guns." He even boasted of having "a degree in violence." Since coming to power in 1980, he has regularly resorted to the gun to deal with whatever challenge his regime has faced.

Peter Godwin details the manner in which, after the 2008 elections, Mugabe unleashed the army, police, security agencies, and party militias to beat the electorate into submission in time for the second round of elections. Among the electorate this campaign was known simply as "chidudu" - the fear. Villagers were beaten and told to "vote Mugabe next time or you will die." Scores of opposition organizers were murdered by death squads. Rape, arson, and false arrests were widespread.

Mugabe was open about his intentions and his contempt for democracy. "We are not going to give up our country because of a mere 'X,'" he told supporters at an election rally. "How can a ballpoint fight with a gun?"

What stands out in Godwin's reporting is not just the scale of the destruction Mugabe has inflicted on his country but the courage of Zimbabweans who defy his tyranny, knowing the consequences of doing so. Godwin describes the "insane bravery" of an opposition candidate who continued to taunt his attackers even while they were beating him and who later, defying doctors' orders, appeared in a plaster cast to take his place at the swearing-in ceremony of a local council.

The African Union, formerly the Organization of African Unity, says that it is determined to be more rigorous than its predecessor, which turned a blind eye to dictatorship and tyranny. According to The Economist:

. . . The AU still exudes a lot of hot air. . . . The AU's instinct is still to wring hands . . . rather than resolve issues. Its credibility was hurt when Moammar Gaddafi was elected chairman for 2009. This year, Equatorial Guinea's Teodoro Obiang, one of Africa's more venal leaders, looks likely to get the job.

And the people of Zimbabwe continue to suffer as the world, including our own country, which bears some responsibility for installing Mugabe in power, looks away. The brave men and women who have shown their willingness to put their lives on the line for freedom deserve better.

American Colleges and Universities Are Failing to Transmit Our History and Culture

There is growing evidence that our colleges and universities are failing to transmit our history and culture.

Recently, the Intercollegiate Studies Institute (ISI) gave a 60-question civic literacy test to more than 28,000 college students:

Less than half knew about federalism, judicial review, the Declaration of Independence, the Gettysburg Address, and NATO. And this was a multiple choice test, with the answers staring them right in the face. . . .

said political scientist Richard Brake, co-chairman of ISI's Civic Literacy Board. He added:

Ten percent thought that "We hold these truths to be self-evident, that all men are created equal . . ." came from the Communist Manifesto.

In another study, a large number of U.S. university students were shown to have failed to develop critical thinking, reasoning, and writing skills because of easy classes and too little time spent studying.

The study of 3,000 students at 29 four-year universities found that 45 percent "did not demonstrate any significant improvement in learning" during their first two years in college as measured by a standardized test. After the full four years, 36 percent had shown no development in critical thinking, reasoning, and writing, according to the study, which forms the basis of the new book Academically Adrift: Limited Learning on College Campuses. The study attributed much of the problem to easy courses and lax study habits.

Real requirements at most colleges and universities have all but disappeared. Johns Hopkins University, for example, is America's premier research institution. Yet a student there could complete a bachelor's degree without ever taking a course in science, math, history, or English. Students at Johns Hopkins - and many other colleges - notes Washington Post writer Daniel DeVise:

. . . choose classes the way a diner patron assembles a meal, selecting items from a vast menu. Broad distribution requirements ensure that students explore the academic universe outside their majors. But no one is required to study any particular field, let alone take a specific course. Shakespeare, Plato, Euclid - all are on the menu: none is required.

The American Council of Trustees and Alumni, a Washington-based advocacy group, recently handed out F grades to Hopkins and many of its peers, inviting debate on a basic question: What, if anything, should America's college students be required to learn?

The group faulted the schools, including Yale, Brown, Cornell, Amherst, and the University of California, Berkeley, for failing to require students to take courses in more than one of seven core academic subjects: math, science, history, economics, foreign language, literature, and composition.

"At Stanford, you can fulfill the American cultures requirement by taking a course on a Japanese drum," said Anne Neil, president of the trustees group.

"We're certainly not saying that Harvard or Hopkins or Yale are not good schools, or that their graduates are not smart kids," said Neal, who attended Harvard and Harvard Law. "What we're saying is that those schools don't do a good job at providing their students with a coherent core."

Richard Ekman, president of the Council of Independent Colleges in Washington, states: "I think the criticism that students may not be learning enough in general education resonates with most colleges."

Neal says that the group's examination of more than 700 college catalogs proves that:

It is quite possible to avoid American history, or Plato, or science. Many colleges don't even require their English majors to take a course on Shakespeare.

The study of history is in the process of dramatic change. In 1975, three quarters of college history departments employed at least one diplomatic historian; in 2005, fewer than half did. The number of departments with an economic historian fell to 32.7 percent from 54.7 percent. By contrast, the biggest gains were in women's history, which now has a representative in four out of five history departments.

The shift in focus began in the late 1960s and early 1970s when a generation of academics began looking into the roles of people generally missing from history books - women, minorities, immigrants, workers. Social and cultural history, then referred to as bottom-up history, offered fresh subjects.

At the University of Wisconsin, Madison, out of the 45 history faculty members listed, one includes diplomatic history as a specialty, one other lists American foreign policy; 13 name gender, race, or ethnicity. Of the 12 American history professors at Brown University, the single specialist in U.S. foreign policy also lists political and cultural history as areas of interest. The professor of international studies focuses on victims of genocide.

"The boomer generation made a decision in the 1960s that history was starting over," said David Kaiser, a history professor at the Naval War College. "It was an overreaction to a terrible mistake that was the Vietnam War." The result is that "history is no longer focused on government, politics, or institutions."

There are no reliable statistics on course offerings, but Mr. Kaiser and others argue that there has been an obvious decline. "European diplomacy is just about completely dead," Kaiser said, "and it's very hard to find a course on the origins of the First World War."

At Ohio University in Athens, when a military historian recently retired, there was a vigorous debate about how to advertise for a replacement. Some faculty members had the view that "military history is evil," said Alonzo L. Hamby, a history professor. The department finally agreed to post a listing for a specialist in "U.S. and the world," the sort of "mushy description that could allow for a lot of possibilities."

Our unity as a nation is threatened, argued Donald Kagan, professor of history and classics and dean of Yale College, in his address to Yale's freshman class in September 1990, by those who would replace the teaching of our history and culture with something else. He declared:

. . . American culture derives chiefly from the experience of Western Civilization, and especially from England, whose language and institutions are the most copious springs from which it draws its life. I say this without embarrassment, as an immigrant who arrived here as an infant from Lithuania. . . . Our students will be handicapped in their lives after college if they do not have a broad and deep knowledge of the culture in which they live and roots from which they come . . . . As our land becomes ever more diverse, the danger of separation and segregation by ethnic group . . . increases and with it the danger to the national unity which, ironically, is essential to the qualities that attracted its many peoples to this country.

In his book The Roots of American Order, Russell Kirk pointed out that these roots go back to the ancient world - to the Jews and their understanding of a purposeful universe under God's dominion; to the Greeks, with their high regard for the use of reason; to the stern virtues of the Romans such as Cicero; and to Christianity, which taught the duties and limitations of Man, and the importance of the Transcendent in our lives. These roots also include the traditions and universities of the medieval world, the Reformation and the response to it, the development of English Common Law, the debates of the 18th century, and the written words of the Declaration of Independence and the Constitution.

American colleges and universities do our students - and our country - a disservice by failing to transmit our history and culture to the next generation. Unless those who understand the very fragile nature of our civilization, and the uniqueness of the tradition upon which free institutions are based, rise in its defense, that culture may well be swept away. If that happens, all of us will be the losers, not least the various groups in whose name this cultural assault has been launched. *

Saturday, 05 December 2015 04:34

Ramblings

Allan C. Brownfeld

The Need to Curb the Role of Public Employee Unions Is Clear as Bankruptcy Looms for Many States and Cities

The state of Illinois is trying to pay billions of dollars in bills that it received from schools and social service providers last year. Arizona recently stopped paying for certain organ transplants for people in its Medicaid program. States are releasing prisoners early, largely to cut expenses. In December, the city of Newark, New Jersey, laid off 13 percent of its police officers.

"It seems to me that crying wolf is probably a good thing to do at this point," said Felix Rohatyn, the financier who helped save New York City from bankruptcy in the 1970s.

One important contributing factor in the current decline in the fortunes of our states and cities is the role being played by public employee unions.

While union membership has collapsed in the private sector over the past 30 years, from 33 percent of the workforce to 15 percent, it has remained buoyant in the public sector. Today, more than 35 percent of public employees are unionized, compared with only 11 percent in 1960.

The role of public employee unions in our political life has been growing. "We just won an election," labor boss Andy Stern declared two years ago, at about the time Barack Obama was taking the oath of office and the union movement was giving itself much of the credit for his victory. After spending some $450 million to elect Obama and other Democrats, labor was indeed riding high. Teachers alone accounted for a tenth of the delegates to the Democratic convention in 2008.

All too often, states The Economist:

Politicians have repeatedly given in, usually sneakily - by swelling pensions, adding yet more holidays or dropping reforms, rather than by increasing pay. Too many state workers can retire in their mid-50s on close to full pay. America's states have as much as $5 trillion in unfunded pension liabilities. . . . Sixty-five should be a minimum age for retirement for people who spend their lives in classrooms and offices; and new civil servants should be switched to defined contribution pensions.

Another battleground, reports The Economist, will be

. . . the unions' legal privileges. It is not that long since politicians of all persuasions were uncomfortable with the idea of government workers joining unions. (Franklin Roosevelt opposed this on the grounds that public servants have "special relations" and "special obligations" to the government and the rest of the public.) It would be perverse to ban public sector unions outright at a time when governments are trying to make public services more like private ones. But their right to strike should be more tightly limited; and the rules governing political donations and even unionization itself should be changed to "opt-in" ones, in which a member decides whether to give or join.

There are now more American workers in unions in the public sector (7.6 million) than in the private sector (7 million), although the private sector employs five times as many people. In fact, union density is now higher in the public sector than it was in the private sector in the 1950s.

Andy Stern, head of the Service Employees International Union (SEIU), was the most frequent guest at the White House in the first six months of the Obama administration. Public-sector unions, as providers of vital monopoly services, can close down entire cities. As powerful political machines, they can help pick the people who sit on the other side of the bargaining table. Daniel DiSalvo, writing in National Affairs, points out that the American Federation of State, County and Municipal Employees (AFSCME) was the biggest contributor to political campaigns between 1989 and 2004. He also notes that such influence is more decisive in local campaigns, where turnout is low, than in national ones.

Evidence from the Bureau of Labor Statistics shows that public-sector unions have used their power to extract a wage premium: public-sector workers earn, on average, a third more than their private-sector counterparts. At the same time, governments give their workers generous pensions, which do not have to come out of current budgets. Many public employees also game the system. Eighty-two percent of senior California Highway Patrol officers, for example, discover a disabling injury about a year before they retire.

Unions have also made it almost impossible to remove incompetent workers. Mary Jo McGrath, a California lawyer, says that "getting rid of a problem teacher can make the O. J. Simpson trial look like a cakewalk." Between 2000 and 2010 the Los Angeles school district spent $3.5 million trying to get rid of seven of its 33,000 teachers, and succeeded with only five.

Newly elected governors - both Republicans and Democrats - are focusing their attention on public-sector unions. In Wisconsin, Governor Scott Walker, a Republican, is promising to use "every legal means" to weaken the bargaining power of state workers - including decertification of the public employees' union. Ohio's new governor, Republican John Kasich, wants to end the rule that requires nonunion contractors to pay union wages, and is targeting the right of public employees to strike.

Even in states where Democrats remain in power, unions are under the gun. New York's Governor Andrew Cuomo intends to freeze the salaries of the state's 190,000 government workers, and has promised to tighten the budget belt when public union contracts are renegotiated this year. In California, Governor Jerry Brown, who gave public employees the right to unionize when he was governor in the 1970s, now speaks of the unsustainable drain that union pensions and health benefits place on the state's budget.

Things in the states are so bad that Arizona has sold several state buildings - including the tower in which the governor has her office - for a $735 million upfront payment. But leasing back the buildings over the next 20 years will ultimately cost taxpayers an extra $400 million in interest. Many states are delaying payments to their pension funds, payments that eventually must be made. In New Jersey, Governor Chris Christie deferred paying the $3.1 billion that was due to the pension funds in 2010.

The role of our public employee unions in leading our states and cities to insolvency has not been properly examined. It is high time that it was.

Up from the Projects: The Life of Walter E. Williams

Walter Williams has had a distinguished career - as an economist, author, columnist, teacher, and sometime radio and television personality. As a black advocate of the free market and a genuinely color-blind society, he has often come under bitter attack. Now 74, he has decided to tell the story of his life.

"What I've done, said, and written, and the positions I have taken challenging conventional wisdom," he writes

. . . have angered and threatened the agendas of many people. I've always given little thought to the possibility of offending critics and contravening political correctness and have called things as I have seen them. With so many "revisionist historians" around, it's worthwhile for me to set the record straight about my past and, in the process, discuss some of the general past, particularly as it relates to ordinary black people.

He recalls an angry reply that former Secretary of Health, Education, and Welfare Patricia Roberts Harris wrote to a two-part series, "Blacker Than Thou," by another black conservative economist and good friend, Thomas Sowell, in The Washington Post in 1981. Assessing his criticism of black civil rights leaders, Harris said, "People like Sowell and Williams are middle class. They don't know what it is to be poor."

This assessment, however, is completely false. Williams notes that:

Both Sowell and I were born into poor families, as were most blacks who are now as old as we are. . . . While starting out poor, my life, like that of so many other Americans, both black and white, illustrates one of the many great things about our country: just because you know where a person ended up in life doesn't mean you know with any certainty where he began. . . . Unlike so many other societies around the world, in this country, one needn't start out at, or anywhere near, the top to eventually reach it. That's the kind of economic mobility that is the envy of the world. It helps explain why the number one destination of people around the world, if they could have their way, would be America.

Williams describes his humble beginnings, growing up in a lower middle-class, predominantly black neighborhood in West Philadelphia in the 1940s, raised by a strong and demanding single mother with high academic aspirations for her children. He recalls the teachers in middle school and high school who influenced him - teachers who gave him an honest assessment of his learning and accepted no excuses. In discussing his army experience, he recounts incidents of racial discrimination but stresses that his time in the army was a valuable part of his maturation process.

Growing up in the Richard Allen housing project, he remembers that:

Grocery, drug, and clothing stores lined Poplar between 10th and 13th streets. . . . Most of the grocery stores had unattended stands set up outside for fruits and vegetables - with little concern about theft. Often customers would select their fruits and vegetables and take them into the store to be weighed and paid for. . . . There was nothing particularly notable about a thriving business community in black neighborhoods, except that it would one day virtually disappear due to high crime and the 1960s riots. Such a disappearance had at least several results: in order to shop, today's poor residents must travel longer distances . . . and bear that expense; high crime costs reduce incentives for either a black or white business to locate in these neighborhoods. . . .

The absence of shops and other businesses, writes Williams:

. . . also reduces work opportunities for residents. One of my after-school and weekend jobs was to work at Sam Raboy's grocery store. . . . I waited on customers, delivered orders, stocked shelves and cleaned up. Other stores hired other young people to do the same kind of work.

After high school Williams joined his father in Los Angeles and enrolled at Los Angeles City College. Following Army service, including time in Korea, and his marriage to Connie - a major force in his life - in February 1962 he enrolled as a full-time student at California State College in Los Angeles, originally majoring in sociology. In the summer of 1965, after shifting to economics, Williams graduated from California State and was encouraged by professors to consider graduate school. He was admitted to UCLA, worked at night for the county probation department, and decided to pursue his Ph.D.

Initially, Williams failed in economic theory. He notes that:

I later realized this did have a benefit. It convinced me that UCLA professors didn't care anything about my race; they'd flunk me just as they'd flunk anyone else who didn't make the grade. The treatment reassured me in terms of my credentials.

After completing his Ph.D. examinations, he was offered a full-time tenure-track assistant professorship at Cal State. "Sometimes," Williams writes:

I sarcastically, perhaps cynically, say that I'm glad that I received virtually all of my education before it became fashionable for white people to like black people. By that I mean that I encountered back then a more honest assessment of my strengths and weaknesses. Professors didn't hesitate to criticize me - sometimes even to the point of saying, "That's nonsense, Williams."

In those years, Williams' political views were liberal. In 1964, he voted for Lyndon Johnson. He believed that higher minimum wages were the way to help poor people. "That political attitude," he writes:

. . . endured until I had a conversation with a UCLA professor (it might have been Armen Alchian) who asked me whether I most cared about the intentions behind a higher minimum wage or its effects. If I was concerned about the effects, he said, I should read studies by University of Chicago Professor Yale Brozen and others about the devastating effects of the minimum wage on employment opportunities for minimally skilled workers. I probably became a libertarian through exposure to tough-minded professors who encouraged me to think with my brain instead of my heart.

It was Williams' desire

. . . to share my conviction that personal liberty, along with free markets, is morally superior to other forms of human organization. The most effective means of getting them to share it is to give them the tools to be rigorous, tough-minded thinkers.

Being a black professor, he reports:

. . . led to calls to become involved with the campus concerns of black students. They invited me to attend meetings of the Black Student Union. I tried to provide guidance with regard to some of the BSU's demands, such as black studies programs and an increase in the number of black faculty. As I was to do later at Temple University, I offered tutorial services for students having trouble in math. One of my efforts that fell mostly on deaf ears was an attempt to persuade black students that the most appropriate use of their time as students was to learn their subject as opposed to pursuing a political agenda.

Walter Williams' academic career took him from teaching one class a week at Los Angeles City College to eventually holding the department chairmanship at George Mason University. He tells of his long friendship with economist Thomas Sowell and with J. A. Parker, president of the Lincoln Institute, with which Williams has been associated for many years. He reports on his time at the Urban Institute and the Hoover Institution at Stanford University, as well as his frequent testimony before Congress on issues ranging from the minimum wage to the negative effects of the Davis-Bacon Act. He was a member of the Reagan administration's transition team at the Department of Labor but, following the advice of economist Milton Friedman, declined a position in the administration so that he could remain in his academic post, where he could speak his mind freely.

The attacks upon black conservatives, which he cites, were often bitter. NAACP General Counsel Thomas Atkins, upon hearing that President Reagan was considering appointing Thomas Sowell as head of the Council of Economic Advisers, declared that Sowell "would play the same kind of role which historically house niggers played for the plantation owners." Syndicated columnist Carl Rowan said, "If you give Thomas (Sowell) a little flour on his face, you'd think you had (former Ku Klux Klan leader) David Duke." NAACP Executive Director Benjamin Hooks called black conservatives "a new breed of Uncle Tom and some of the biggest liars the world ever saw."

Williams has much to say about what he calls "prevailing racial dogma." One element of that dogma asserts that "black role models in teaching are necessary to raise black achievement, instill pride, and offset the effects of our legacy of slavery and subsequent discrimination." But, Williams argues, his own life is a refutation of that notion:

Attending predominantly black junior high and high schools, and graduating from the latter in 1954, I recall having no more than two, possibly three, black teachers. . . . Nonetheless, many of my classmates, who grew up in the Richard Allen housing project and with whom I've kept up over the years, managed to become middle-class adults; and one, Bill Cosby, became a multi-millionaire. Our role models were primarily our parents and family; any teachers who also served in that role were white, not black.

Every few years, former Richard Allen residents hold a reunion. "I've asked some of my friends," Williams writes:

. . . and ex-schoolmates whether they recall any of our peers who couldn't read or write well enough to fill out a job application or who spoke the poor language that's often heard today among black youngsters. The answer is they don't remember anyone doing either. Yet in 2005, at my high school alma mater, Benjamin Franklin, only 4 percent of eleventh-grade students scored "proficient" and above in reading, 12 percent in writing, and 1 percent in math. Today's Philadelphia school system includes a high percentage of black teachers and black administrators, but academic achievement is a shadow of what it was yesteryear. If the dogma about role models had any substance, the opposite should be the case.

Throughout this book, Williams refers to the immeasurable contribution of his wife of 48 years, who shared his vision through hard work and love. He leaves the reader with a bit of advice passed on by his stepfather: "A lot of life is luck and chance, and you never know when the opportunity train is going to come along. Be packed and ready to hop on board."

Walter Williams has lived a singular American life. This book deserves widespread recognition as a record of that life. And he is still teaching economics and, hopefully, will be doing so for many years to come.

A Thoughtful Look at Christianity as the Lifeblood of the American Society

When we look to the earliest days of the American society and seek to discover the beliefs and worldview which animated the Founders, we must carefully consider the role of religion.

In a thoughtful new book, Christianity: Lifeblood of America's Free Society (1620-1945), Dr. John Howard argues that Christianity was the dominant influence in the development of the American nation and the American society.

John Howard has had a distinguished career. After service in the First Infantry Division in World War II, he returned with two Silver Stars, two Purple Hearts, and a commitment to use his career to sustain and strengthen America's religious ideals. In the Eisenhower Administration, he headed the first program using government contracts to open jobs for qualified minority applicants. Dr. Howard served as president of Rockford College for seventeen years and as the national president of the American Association of Presidents of Independent Colleges and Universities for three years. Currently, he is a Senior Fellow at the Howard Center on Family, Religion, and Society. In the 2007 national contest for speechwriters sponsored by Vital Speeches and The Executive Speaker, John Howard's "Liberty Revisited" received the Grand Award as the best speech submitted in any of the 27 categories.

Sadly, at the present time, America's religious history is largely unknown. The review provided by Dr. Howard is instructive.

After the Plymouth colony was established, the number of settlers coming from England increased rapidly. In 1630, five ships with 900 passengers arrived to start the Massachusetts Bay colony, with John Winthrop as governor. During the two-month voyage, Winthrop preached a long sermon entitled "A Model of Christian Charity." He set forth the tenets of Jesus' teaching that should be applied in the daily living of the new colony.

His concluding instructions included the following:

The end is to improve our lives to do more service to the Lord. . . . We are entered into a covenant with Him to do this work. . . . We must entertain each other in brotherly affection. . . . We must delight in each other, make others' condition our own, rejoice together, mourn together, labor together and suffer together . . . so shall we keep the unity of spirit in the bond of peace. The Lord will be our God and delight to dwell among us. . . . We must consider that we shall be as a city upon a hill. The eyes of all people are upon us.

Within ten years, the population of the Massachusetts Bay colony swelled to 16,000. Dr. Howard notes that:

It was recognized that schools as well as churches were essential to the Christian well-being of the society. In 1636, the Massachusetts legislature authorized the establishment of a college, and two years later Harvard College enrolled students. The purpose was "to train a literate clergy." . . . For many years, most of the Congregational Ministers in New England were Harvard graduates. . . . It is not surprising that the first English language book published in North America was religious. The Whole Book of Psalms Faithfully Translated into English Meter appeared in 1640. It was so much in demand that there were twenty editions of it and seventy printings.

During the 18th century, a number of powerful preachers influenced American society - among them Jonathan Edwards, John Wesley, and George Whitefield. Whitefield, an Englishman, is acknowledged as the most powerful influence in spreading the Great Awakening of that time. He had a dramatic flair that brought people from long distances to hear him. The crowds grew until he was preaching to over 30,000 people at once, with no amplification. Benjamin Franklin wrote in his autobiography that Whitefield's voice could be heard nearly a mile away. The two men became close friends, and Franklin built a huge auditorium in Philadelphia to accommodate Whitefield's revival meetings. The edifice later became the first building of the University of Pennsylvania.

"During the three-quarters of a century leading up to the Revolutionary War," writes Howard:

. . . the church services, revival meetings, and religious encampments were the primary social events in the lives of the colonists. . . . Many of the foremost clergy were instrumental in convincing the colonists that the revolution was necessary and just.

The people of a self-governing republic or democracy must, said Montesquieu, be virtuous, or that form of government cannot operate. In his inaugural address on April 30, 1789, George Washington stressed the need for honorable and conscientious citizens. "Like the other Founding Fathers," writes Howard:

Washington knew . . . that the people of a self-governing republic must be virtuous for that form of government to operate successfully and was keenly committed to do everything he could to help Americans measure up to the standards of virtue required for a self-governing republic. Rectitude and patriotism, he declared, keep the acts of government fair and just and free of the damage caused by party politics. Rectitude and patriotism will also assure that "national policy will be laid in the pure and immutable principles of private morality. . . . There is no truth more thoroughly established than that there exists . . . an indissoluble union between virtue and happiness; between duty and advantage; between the genuine maxims of an honest and magnanimous policy and the solid reward of prosperity and felicity. . . ."

When Alexis de Tocqueville visited the U.S. in 1831, he was amazed by the religious character of the people and the impact Christianity had on the systems and institutions of the society. In Democracy in America he writes that:

By their practice Americans show that they feel the urgent necessity to instill morality into democracy by means of religion. What they think of themselves in this respect enshrines a truth that should penetrate into the consciousness of every democratic nation.

De Tocqueville marveled that the different denominations were in comfortable agreement about teaching morality:

There is an innumerable multitude of sects in the United States. They are all different in the worship they offer to the Creator, but all agree concerning the duties of men to one another . . . all preach the same morality in the name of God. . . .

The historian Paul Johnson observes that this denominational unity about teaching morality was even broader. It was

. . . a consensus which even non-Christians, deists, and rationalists could share. Non-Christianity, preeminently including Judaism, could thus be accommodated within the framework of Christianity. Both Catholicism and Judaism became heavily influenced by the moral assumptions of American Protestantism because both accepted its premise that religion (meaning morality) was essential to democratic institutions.

In the post-World War II years, Dr. Howard shows, the influence of religion upon American society has been in decline. In 1988, after a number of highly placed graduates of Harvard and other elite universities engaged in a variety of less-than-honorable behavior, Harvard president Derek Bok published a long essay in Harvard Magazine. He provided a summary of Harvard's transition from an instrument of the Christian church to a modern intellectual and research center, free of any coordinated effort to teach Christian or any other morality to the student body.

He wrote:

Until this century, educational institutions throughout history not only sought to build the character of their students, they made this task their central responsibility. . . . These tendencies continued strongly into the 19th century.

During the 20th century, he notes:

First artists and intellectuals, then broader segments of society, challenged every convention, every prohibition, every regulation that cramped the human spirit and blocked its appetites and ambitions.

In 1954, Robert Hutchins, president of the University of Chicago, noted that:

The pedagogical problem is how to use the educational system to form the kind of man that the country wants to produce. But in the absence of a clear ideal, and one that is attainable through education, the pedagogical problem is unsolvable; it cannot even be stated. The loss of an intelligible ideal lies at the root of the troubles of American education.

John Howard concludes his book with a quote from James Russell Lowell - author, poet, editor of the Atlantic Monthly, and U.S. ambassador to Spain from 1877 to 1880, and then to Great Britain. The French historian François Guizot once asked Lowell, "How long will the American Republic endure?" Lowell replied, "As long as the ideas of the men who founded it remain dominant."

John Howard hopes that we can recapture those ideas. He has performed a notable service with his important review of the role religion has played in the development of our country - and can play once again. *

Saturday, 05 December 2015 04:30

Ramblings

Allan C. Brownfeld

A Growing and Largely Ignored Crisis: Public Pension Funds Are Running Out of Money

As of fiscal 2008, there was a $1 trillion gap between what states had promised public employees in retiree pensions, health care, and other benefits and the money they actually had on hand to pay for it, according to a Pew Center on the States study. Some economists say that Pew is too conservative and that the problem is two or three times as large.

Roger Lowenstein, an outside director of the Sequoia Fund and author of While America Aged, points out that:

For years, localities and states have been skimping on what they owe. Public pension funds are now massively short of the money to pay future claims -- depending on how their liabilities are valued, the deficit ranges from $1 trillion to $3 trillion. Pension funds subsist on three revenue streams: contributions from the employer; contributions from the employees; and investment earnings. But public employers have often contributed less than the actuarially determined share, in effect borrowing against retirement plans to avoid having to cut budgets or raise taxes.

Pension funds, Lowenstein notes, also assumed that they could count on high annual returns, typically 8 percent, on their investments.

In the past, many funds did earn that much, but not lately. Thanks to high assumed returns, governments projected that they could afford to both ratchet up benefits and minimize contributions. Except, of course, returns were not guaranteed. Optimistic benchmarks actually heightened the risk because they forced fund managers to overreach.

Consider the case of Massachusetts. The target of its pension board was 8.25 percent. "That was the starting point for all of our investment decisions," says Michael Travaglini, until recently its executive director. "There was no way a conservative strategy is going to meet that."

Lowenstein notes:

Travaglini put a third of the state's money into hedge funds, private equity, real estate, and timber. In 2008, assets fell 29 percent. New York State's fund, which is run by the comptroller, Thomas DiNapoli, a former state assemblyman with no previous investment experience, lost $40 billion in 2008.

In October, a report issued by the Empire Center for New York State Policy, a research organization that studies fiscal policy, found that the cities, counties, and authorities of New York have promised more than $200 billion worth of health benefits to their retirees while setting aside almost nothing, putting the public work force on a collision course with the taxpayers who are expected to foot the bill.

Lowenstein notes:

The Teachers' Retirement System of Illinois lost 22 percent in the 2009 fiscal year. Alexandra Harris, a graduate journalism student at Northwestern University who investigated the pension fund, reported that it invested in credit-default swaps on A.I.G., the State of California, Germany, Brazil, and "a ton" of subprime mortgage securities.

According to Joshua Rauh of the Kellogg School of Management at Northwestern, assuming states make contributions at recent rates and assuming they do earn 8 percent, 20 state funds will run out of cash by 2025. Illinois, the first, will run dry by 2018.

In a new report issued by Professor Rauh and Robert Novy-Marx, a University of Rochester professor, five major cities -- Boston, Chicago, Cincinnati, Jacksonville, and St. Paul -- are said to have pension assets that can pay for promised benefits only through 2020. Philadelphia, according to the report, has assets on hand that can only pay for promised benefits through 2015.

Professor Rauh declares that:

We need fundamental reform of the ways employees in the public sector are compensated. It is not feasible to make promises of risk-free pensions when in the private sector (nearly) everyone has to share some of the risk.

In Roger Lowenstein's view, states need to cut pension benefits:

About half have made modest trims, but only for future workers. Reforming pensions is painfully slow, because pensions of existing workers are legally protected. But public employees benefit from a unique notion that, once they have worked a single day, their pension arrangement going forward can never be altered. No other Americans enjoy such protections. Private companies often negotiate (or force upon their workers) pension adjustments. But in the world of public employment, even discussion of cuts is taboo.

The market forced private employers like General Motors to restructure retirement plans or suffer bankruptcy. Government's greater ability to borrow enables it to defer hard choices but, as Greece discovered, not even governments can borrow forever. The days when state officials may shield their workers while subjecting all other constituents to hardship are fast coming to an end.

Recently, some states have begun to test the legal boundary. Minnesota and Colorado cut cost-of-living adjustments for existing workers' pensions. Each now faces a lawsuit.

In Colorado, in what many have called an act of political courage, a bipartisan coalition of state legislators passed a pension overhaul bill. Among other things, the bill reduced the raise that people who are already retired get in their pension checks each year. "We have to take this on, if there is any way of bringing fiscal sanity to our children," said former Governor Richard Lamm, a Democrat. "The New Deal is demographically obsolete. You can't fund the dream of the 1960s on the economy of 2010."

In Colorado, the average public retiree stops working at 58 and receives a check of $2,883 each month. Many of them also got a 3.5 percent annual raise, no matter what inflation was, until the rules changed this year.

Discussing the Colorado case in The New York Times, Ron Lieber notes that:

Private sector retirees who want their own monthly $2,883 check for life, complete with inflation adjustments, would need an immediate fixed annuity if they don't have a pension. A 58-year-old male shopping for one from an A-rated insurance company would have to hand over a minimum of $860,000, according to Craig Hemke of Buyapension.com. A woman would need at least $928,000 because of her longer life expectancy. Who among aspiring retirees has a nest egg that size, let alone people with the same moderate earning history as many state employees? And who wants to pay to top off someone else's pile of money via increased income taxes or a radical decline in state services?

"We have to do what unions call givebacks," said Mr. Lamm, the former Colorado governor. "That's the only way to sanity. Any other alternative, therein lie dragons."

Americans were quite properly shocked when it was revealed that the Los Angeles blue-collar suburb of Bell, California, was paying its city manager, Robert Rizzo, $787,637 a year -- with 12 percent annual pay increases. In July, Rizzo, along with Police Chief Randy Adams and Assistant City Manager Angela Spaccia, resigned. The combined annual salary of these employees was $1,620,925 in a city where one of every six residents lives in poverty. The city's debt quadrupled between 2004 and 2009.

The Washington Examiner states that:

The likely reason why Rizzo, Adams, and Spaccia resigned so readily is that they are eligible for public pensions. Under current formulations, Adams will make $411,000 annually in retirement and Spaccia could make as much as $250,000, when she's eligible for retirement in four years at age 55. . . . California is but one of many states on the brink of fiscal ruin largely due to outrageous public employee benefits.

While the Bell, California, example may be extreme, the crisis in public employee pension funds across the country is very real -- and must be confronted to avoid massive bankruptcies in the future.

Racial Achievement Gap Is Alarming and Focuses Renewed Attention on the "Culture of Poverty"

An achievement gap separating black from white students has long been documented. But a new report focusing on black males suggests that the picture is bleaker than generally known.

Only 12 percent of black fourth-grade boys are proficient in reading, compared with 38 percent of white boys, and only 12 percent of black eighth-grade boys are proficient in math, compared with 44 percent of white boys.

Poverty alone does not seem to explain the differences. Poor white boys do just as well as black boys who do not live in poverty, measured by whether they qualify for subsidized school lunches.

This data comes from the national math and reading tests known as the National Assessment of Educational Progress, which are given to students in the fourth and eighth grades, most recently in 2009. The report, "A Call for Change," was released in November by the Council of the Great City Schools, an advocacy group for urban public schools.

"What this clearly shows is that black males who are not eligible for free and reduced-price lunch are doing no better than white males who are poor," said Michael Casserly, executive director of the council.

The report shows that black boys on average fall behind from the earliest years. Black mothers have a higher infant mortality rate and black children are twice as likely as whites to live in a home where no parent has a job. In high school, black boys drop out at nearly twice the rate of white boys, and their SAT scores are on average 104 points lower. In college, black men represented just 5 percent of students in 2008.

The search for explanations is looking at causes besides poverty. "There's accumulating evidence that there are racial differences in what kids experience before the first day of kindergarten," said Ronald Ferguson, director of the Achievement Gap Initiative at Harvard. He said:

They have to do with a lot of sociological and historical forces. In order to address those, we have to be able to have conversations that people are unwilling to have.

Those include "conversations about early childhood parenting practices," Dr. Ferguson said.

The activities that parents conduct with their 2-, 3-, and 4-year-olds. How much we talk to them, the ways we talk to them, the ways we enforce discipline, the ways we encourage them to think and develop a sense of autonomy.

The New York Times reports that:

For more than 40 years, social scientists investigating the causes of poverty have tended to treat cultural explanations like Lord Voldemort: That Which Must Not Be Named. The reticence was a legacy of the ugly battles that erupted after Daniel Patrick Moynihan, then an assistant labor secretary in the Johnson administration, introduced the idea of a "culture of poverty" to the public in a startling 1965 report. . . . His description of the black family as caught in an inescapable "tangle of pathology" of unmarried mothers and welfare dependency was seen as attributing self-perpetuating moral deficiencies to black people, as if blaming them for their own misfortune.

Now, after decades of silence, even liberal scholars who were harshly critical of Moynihan's thesis are conceding that culture and persistent poverty are enmeshed.

"We've finally reached the state where people aren't afraid of being politically incorrect," said Douglas S. Massey, a sociologist at Princeton who has argued that Moynihan was unfairly maligned.

In September, Princeton and the Brookings Institution released a collection of papers on unmarried parents, a subject, it noted, that became off limits after the Moynihan report. At the recent annual meeting of the American Sociological Association, attendees discussed the resurgence of scholarship on culture.

In recent years, prominent black Americans have begun to speak out on the subject. In 2004 the comedian Bill Cosby made headlines when he criticized poor blacks for "not parenting" and dropping out of school. President Obama, who was abandoned by his father, has repeatedly talked about "responsible fatherhood."

None of this is new. Repeated studies have shown a clear link between the decline of the black family and declines in graduation rates, student achievement, and employment. A 2002 report from the Institute for American Values, a non-partisan group that studies families, concluded that "marriage is an issue of paramount importance if we wish to help the most vulnerable members of our society: the poor, minorities, and children."

The statistical evidence for that claim is strong. Research shows that most black children - 68 percent - are born to unwed mothers. Those numbers have real consequences. For example, 35 percent of black women who had a child out of wedlock live in poverty, while only 17 percent of married black women are in poverty. In a 2005 report, the institute concluded:

Economically, marriage for black Americans is a wealth-creating and poverty-reducing institution. The marital status of African American parents is one of the most powerful determinants of the economic status of African-American families.

Over the past fifty years, the percentage of black families headed by married couples declined from 78 percent to 34 percent. In the thirty years from 1950 to 1980, households headed by black women who never married jumped from 3.8 per thousand to 69.7 per thousand. In 1940, 75 percent of black children lived with both parents. By 1990 only 33 percent of black children lived with a mother and father.

"For policymakers who care about black America, marriage matters," wrote the authors of the report, a group of black scholars. They called marriage in black America an important strategy for "improving the well-being of African Americans and for strengthening civil society."

The latest study of the achievement gap separating black and white students should focus attention on the real causes for this problem. Unless a problem is diagnosed properly, it will never be solved. For too long, all discussions of the "culture of poverty" have been silenced with the false charge of "blaming the victim."

In America today, any individual, regardless of race or background, can go as far as his or her abilities will allow. But when doors are opened to all, it still requires hard work and determination to take advantage of these opportunities. Without strong and intact families, these qualities seem to be in short supply. How to rebuild such families should be the focus of increasing attention.

NPR and Juan Williams: The Peril of Speaking Honestly in an Era of Political Correctness

Juan Williams, a respected journalist and ten-year veteran of National Public Radio (NPR), was fired in October as a top news analyst for remarks he made about Muslims during an appearance on Fox News.

In a segment with Fox News talk-show host Bill O'Reilly, Williams acknowledged feeling "nervous" in the wake of the September 11 attacks when he sees Muslims board a plane on which he is traveling.

He said:

Look, Bill, I'm not a bigot. You know the kind of books I've written about the civil rights movement in this country. But when I get on the plane, I got to tell you, if I see people who are in Muslim garb and I think, you know, they are identifying themselves first and foremost as Muslims, I get worried. I get nervous.

Later, in the same segment, Williams challenged O'Reilly's suggestion that "the Muslims attacked us on 9/11," saying it was wrong to generalize about Muslims in this way just as it was to generalize about Christians, such as Oklahoma City bomber Timothy McVeigh, who have committed acts of terrorism. "There are good Muslims," Williams said later, making a distinction from "extremists."

A former Washington Post reporter and columnist, Williams began his tenure with Fox News in 1997, predating his hiring by NPR three years later. While at NPR, he hosted the daily program "Talk of the Nation" and commented on the news programs "Morning Edition" and "All Things Considered."

In an interview shortly after being fired, Williams declared:

As a journalist, it's unsupportable that your employer would fire you for stating your honest feelings in an appropriate setting. . . . I think that I am open to being misinterpreted only if you snip one line out of what I said. But I would never guess that people who are professional journalists would just take one line and make me look bigoted so they can use it as an excuse to get rid of me.

Williams was not only fired by NPR, but his mental stability was also questioned. NPR chief executive Vivian Schiller told an audience at the Atlanta Press Club that Williams should have kept his feelings about Muslims between himself and "his psychiatrist or his publicist."

The outcry against Williams' firing has been widespread. Conservatives, of course, were particularly vocal. Leading Republicans, including former Arkansas Governor Mike Huckabee, former Alaska Governor Sarah Palin, political strategist Karl Rove, and House Minority Leader John Boehner of Ohio, released statements attacking the move and questioning why taxpayers should help fund NPR's budget. This, of course, was to be expected.

Many others have also criticized NPR's action. The Washington Post reported that, "Even NPR's own staff expressed exasperation at the decision during a meeting . . . with NPR president Vivian Schiller." Editorially, The Post declared that, "In firing Juan Williams, NPR discourages honest conversation."

In The Post's view:

In a democracy, the media must foster a free and robust political debate, even if such debate may, at times, offend some people. . . . What was Mr. Williams's sin? He admitted, with apparent chagrin, that he has engaged in a kind of racial profiling in the years since the September 11 attacks. . . . In making this confession, Mr. Williams undoubtedly spoke for many Americans who are wrestling with similar feelings. His words could be offensive to some, if construed as an endorsement of negative stereotyping. But the full broadcast makes clear that Mr. Williams intended the opposite. To be sure, he struggled to get his point across, because host Bill O'Reilly kept interrupting him. But Mr. Williams did manage to observe that "We don't want in America people to have their rights violated, to be attacked on the street because they hear rhetoric from Bill O'Reilly and they act crazy."

The Post concludes:

In short, Mr. Williams was attempting to do exactly what a responsible commentator should do: speak honestly without being inflammatory. His reward was to lose his job, just as Agriculture Department employee Shirley Sherrod lost hers over purportedly racist remarks that turned out to be anything but. NPR management appears to have learned nothing from that rush to judgment. "Political correctness can lead to some kind of paralysis where you don't address reality," Mr. Williams told Mr. O'Reilly. NPR, alas, has proved his point.

TV political satirist Jon Stewart criticized the firing of Juan Williams, as well as the extremes to which "political correctness" frames what is permissible speech. When CNN commentator Rick Sanchez complained of Stewart's mocking him on "The Daily Show," he cited his treatment as a member of a minority group, being a Cuban-American. He was then told by his interviewer that Stewart, being Jewish, was also a member of a minority. Sanchez responded by pointing out that most of those in leadership positions at CNN and other networks were "just like Stewart." He was then fired. Jon Stewart also spoke out against the firing of Sanchez as an example of the extremes to which political correctness has taken us.

Interestingly, examples of speech being edited come from liberals as well. The recipient of this year's Mark Twain Prize for American Humor was Tina Fey. In its broadcast of the Kennedy Center's award ceremony, PBS edited Fey's acceptance speech, in which she mock-praised "conservative women" like Sarah Palin, whom Fey has impersonated on "Saturday Night Live." Fey said that the rise of conservative women in politics is good for all women "unless you don't want to pay for your own rape kit. . . . Unless you're a lesbian who wants to get married to your partner of 20 years. . . . Unless you believe in evolution."

That, however, was not what viewers heard when PBS broadcast an edited version of Fey's speech. The part about the rape kits and evolution was gone, leaving only Fey's more harmonious and blander comments about Palin and politics. Was PBS shielding its viewers from Fey's more pointed remarks?

It is unfortunate that our society has fewer and fewer newspapers. It is unfortunate that cable news provides less news and more overheated opinion, both on the right and left. What a free society desperately needs is free and open discussion -- with a variety of viewpoints being heard. Juan Williams has always provided a thoughtful voice, whether or not one agrees with his views. To fire him as NPR has done sends a chilling message to the nation about free speech. Juan Williams, of course, has a new contract with Fox and is doing very well indeed. It is the rest of us who are the losers. *

"No taxes can be devised which are not more or less inconvenient and unpleasant." --George Washington

Sunday, 29 November 2015 03:51

Ramblings

Allan C. Brownfeld

Remembering James J. Kilpatrick, a Leading Conservative Voice

James J. Kilpatrick, a prominent conservative voice for half a century as a newspaper editor and columnist, author, and television personality, died in August at the age of 89.

He was a prolific writer and a sharp debater and is perhaps best remembered for his intellectual combat with liberal journalist Shana Alexander on "60 Minutes."

He was also a highly controversial figure because of his promotion of racial segregation as editor of The Richmond News Leader in the 1950s.

Kilpatrick gave new life to the idea of interposition championed by Senator John C. Calhoun of South Carolina in the years before the Civil War. It was Calhoun's view that states could protect their citizens from the authority of the federal government. A number of southern states began to pass pro-segregation laws that adopted the interposition language promoted by Kilpatrick.

In their Pulitzer Prize-winning book, The Race Beat (2006), about journalism in the civil rights era, Gene Roberts and Hank Klibanoff wrote that:

Kilpatrick, by propagating a whole vernacular to serve the culture of massive resistance -- interposition, nullification, states' rights, state sovereignty -- provided an intellectual shield for nearly every racist action and reaction in the coming years.

Later on, Kilpatrick regretted the role he had played in defending segregation. He told a Roanoke newspaper in 1993 that he had intended merely to delay court-mandated integration because

. . . violence was right under the city waiting to break loose. Probably, looking back, I should have had better consciousness of the immorality, the absolute evil of segregation.

Virginia in the 1950s is hard for young Americans of the 21st century to imagine. At that time, this writer was a student at the College of William and Mary in Williamsburg. These were the years of segregation. Blacks could not eat in public places frequented by whites. Each public place had four restrooms: "Black Men," "Black Women," "White Men," and "White Women." Williamsburg's small movie theater did not have a balcony, so a segregated section was created by roping off several aisles for the use of black patrons. Not only did blacks not attend the college as students, they were not welcome as speakers.

In my junior year, as an officer of the Political Science Club, I and several of my fellow members decided that it was time for a change. We invited the distinguished president of Hampton Institute (now Hampton University) - the black college down the road - to address our group. Dr. Alonzo Moron came to Williamsburg, and we managed to take him to dinner at the Williamsburg Inn (his light complexion helped). His address was academic and non-controversial. The result: our group was thrown off campus and I was called to the president's office.

At this time, I had already been writing a weekly column in The Flat Hat, the college newspaper, and was recognized as something of a conservative. The president said: "I am surprised at you. I thought you were a conservative." I replied that racism was not one of the things I wanted to conserve. We had our next meeting at the Methodist Church. I told the president that our next speaker would be a defender of segregation.

That speaker was Richmond News Leader editor James J. Kilpatrick. We had dinner with him and accompanied him to the Methodist Church. He presented a legal defense of segregation. During the question period, he was asked, "Isn't it immoral in a free society for men and women, because they are black, not to be able to have dinner or go to a movie or use a rest room?" His reply was something to the effect that it was simply a legal question and nothing more.

But Jack Kilpatrick could not be so easily categorized. An energetic newsman, he became a favorite of Douglas Southall Freeman, the News Leader's long-time editor, and was named its chief editorial writer in 1949. When Freeman retired, Kilpatrick was appointed the paper's editor in 1951.

One of Kilpatrick's first actions at the News Leader was to champion the case of Silas Rogers, a young black shoeshine man who had been convicted of fatally shooting a Virginia police officer in 1943. Poring over the court transcripts, Kilpatrick found inconsistencies in the testimony. He retraced the steps of the accused killer and tracked down witnesses the police had never contacted. His reporting over two years led the governor to pardon Rogers. A black newspaper in Richmond inducted Kilpatrick into its honor roll for courage and justice in 1953.

Jack Kilpatrick showed us that there are indeed second acts in American life. He ultimately acknowledged that segregation was wrong and re-examined his earlier defense of it.

"I was brought up a white boy in Oklahoma City in the 1920s and 1930s," he told Time Magazine in 1970.

I accepted segregation as a way of life. Very few of us, I suspect, would like to have our passions and profundities at age 28 thrust in our faces at 50.

In his widely syndicated national column, Kilpatrick shunned racial politics. His signature issues became self-reliance, patriotism, a free marketplace, and skepticism of federal power. He regularly exposed overbearing local laws and judicial rulings. This campaign got under way in 1959, when a pedestrian who climbed across the hood of a car that blocked a downtown Richmond intersection was hauled into court by the driver, an off-duty policeman, and fined $25 for malicious mischief. That inspired Kilpatrick to create the Beadle Bumble Fund, named for the Dickens character in Oliver Twist who proclaimed "the law is an ass."

The fund, Kilpatrick said, was devoted to "poking fun and spoofing the hell out of despots on the bench." It paid the fines of victims of legal travesties in the Richmond area with contributions from readers.

Kilpatrick also railed against turgid prose in his book The Writer's Art (1984). In his "On Language" column in The New York Times, William Safire wrote that Kilpatrick's essays on "the vagaries of style are classic."

After leaving the News Leader in 1966, he embarked on a column for The Washington Star syndicate, "A Conservative View," which was carried by newspapers throughout the country. This column continued until 1993, when he began a weekly column, "Covering the Courts." In the 1970s, he sparred on television with Nicholas von Hoffman and later Shana Alexander on the "Point-Counterpoint" segment of "60 Minutes." The Kilpatrick-Alexander clashes on issues like the Vietnam War and the women's movement were parodied on "Saturday Night Live" by Dan Aykroyd and Jane Curtin.

Kilpatrick said that liberal critics thought of him as extremely right-wing -- "10 miles to the right of Ivan the Terrible" -- but he befriended many who were his ideological opposites. In 1998, he married Marianne Means, a liberal columnist for Hearst Newspapers. His acquaintances included the late Sen. Eugene McCarthy (D-MN), an anti-Vietnam War presidential candidate who was his neighbor in rural Virginia. McCarthy said he found Kilpatrick charming and well-versed on 17th and 18th century literature and philosophy. "The man is not locked into a mold. He's not just the curmudgeon you see on TV," he said in 1973, adding that Kilpatrick had "kind of a country manor style."

In today's journalistic arena, commentators are likely to identify themselves clearly as liberals or conservatives, Republicans or Democrats, and toe what they believe to be the proper party line. They end up often defending the indefensible and stirring controversy where real issues and principles are hardly involved. They seem always angry, and the decibel level is higher than ever. That was not the style of Jack Kilpatrick. He viewed himself as a "fiercely individualistic" writer and spoke only for himself -- not any political faction. He said he was once on television to take the side of "The Conservative's View of Watergate." He asked himself, "Just what is a conservative's view of burglary?"

Jack Kilpatrick thought for himself. Sometimes he was right, and sometimes wrong. Our national discourse is weaker and less vibrant without voices such as his.

Celebrating Young Americans for Freedom at Fifty: The Real Beginning of the Modern Conservative Movement

Fifty years ago, a small group of conservative students and young adults gathered at the family home of Bill and Jim Buckley in Sharon, Connecticut. When they left that weekend, they had formed a new organization, Young Americans for Freedom (YAF), and had agreed on a basic statement of principles they called the Sharon Statement. In many respects, this event marks the real beginning of the modern American conservative movement.

In September, that event was celebrated in Washington, D.C., as many long-time political activists who began their involvement in YAF gathered to commemorate the original meeting, held in Sharon, Connecticut, on September 9-11, 1960.

In part the Sharon Statement, largely composed by M. Stanton Evans, then the 26-year-old editor of The Indianapolis News, declared:

We as young conservatives believe . . . the foremost among the transcendent values is the individual's use of his God-given free will, whence derives his right to be free from the restrictions of arbitrary force. That liberty is indivisible and that political freedom cannot long exist without economic freedom. That the purpose of government is to protect those freedoms through the preservation of internal order, the provision of national defense and the administration of justice. That when government ventures beyond these rightful functions it accumulates power, which tends to diminish order and liberty. That the Constitution of the United States is the best arrangement yet devised for empowering government to fulfill its proper role, while restraining it from the concentration and abuse of power.

An important new book has been written about the history of YAF, A Generation Awakes: Young Americans for Freedom and the Creation of the Conservative Movement (Jameson Books). Its author, Wayne Thorburn, joined YAF in 1961 and over the next 14 years held nearly every position in the organization, including that of executive director from 1971 to 1973. Formerly a faculty member at three universities, Thorburn held appointments in the Reagan and George H. W. Bush administrations.

Since that meeting in Connecticut in 1960, writes Thorburn:

Literally thousands of young conservatives have passed through YAF and developed into leaders of the conservative and libertarian movement. . . . They were the initial force for the Goldwater campaign and the opposition to New Left violence on American campuses. They provided an intellectual challenge to the liberalism that has dominated American intellectual life. They were the grassroots force and the strategic leadership for the campaigns of Ronald Reagan and constituted a major part of the Reagan Revolution of the 1980s. Today they lead the major organizations of the 21st century Conservative movement.

While much has been written on the various liberal trends and groups that emerged in the 1960s, the history of YAF and its contributions to contemporary American society have not received the attention they deserve. Out of this organization came a Vice President of the United States, 27 members of Congress, 8 U.S. Circuit Court judges, numerous media personalities and journalists, college presidents, professors, and authors.

Lee Edwards, an early YAF leader, editor of the group's magazine The New Guard (this writer also edited The New Guard in the late 1960s) and a respected author, maintains that Senator Barry Goldwater, Russell Kirk and William F. Buckley, Jr. were the three critical pillars of the developing movement:

First came the man of ideas, the intellectual, the philosopher; then the man of interpretation, the journalist, the popularizer; and finally the man of action, the politician, the presidential candidate.

With the contributions of these men and others, the stage was set for the creation of a national political movement.

Historian Matthew Dallek notes that:

Anti-statism, economic laissez-faire, and militant anti-Communism -- three of the ideological pillars that would support a resurgence of conservatism in American life -- had been articulated by YAF on September 11, 1960.

John A. Andrew in The Other Side of the Sixties, writes:

Historians of the sixties have focused chiefly on the Left as an advocate for change, but in the early sixties, the Right was actually more active in challenging the status quo.

YAF was coming into being as a national conservative youth organization at a time when there was no "conservative movement" as it is known today. Two early conservative student leaders were David Franke, who attended George Washington University, and Doug Caddy, who was at Georgetown. As Franke was about to move to New York City to start work at National Review in the summer of 1960, Caddy organized a tribute dinner in his honor. The speakers at the event were Bill Buckley and Bill Rusher from the magazine, Reed Larson of the National Right to Work Committee, and James L. Wick, publisher of Human Events, while telegrams were read from Senator Goldwater and Governor Charles Edison of New Jersey. As Caddy noted:

About 25 persons were present. In 1960, this was the size of the Conservative movement's leadership -- 25 persons could easily fit into a small hotel dining room.

Those who joined YAF were dedicated to bringing about change in society. Their enemy was what they perceived as the Establishment -- and it was a liberal establishment that they saw in power on campus and in the Nation's Capital. Historian Matthew Dallek noted that:

YAF was the first national youth movement to shatter the silent conformity that had reigned on campuses since World War II . . . before anyone ever heard of the counter-culture.

When YAF came into existence, partisan politics were quite different from what they are today. YAF identified itself as a "conservative" organization, not a "Republican" one, "emphasizing the non-partisan conservatism of the organization." Thorburn points out:

. . . the featured speaker at an early meeting of the Greater New York Council was Connecticut Democratic Senator Thomas Dodd, an ardent anti-Communist opponent of the admission of Communist China to the United Nations. . . . The composition of congressional letter writers to The New Guard was a vivid commentary on mid-20th century American politics, as letters of praise and congratulations came from several Democrats, including Sen. Frank Lausche of Ohio, Strom Thurmond of South Carolina, and Spessard Holland of Florida, along with a number of Republican elected officials. . . . Senator Robert Byrd of West Virginia was a featured speaker at the 1971 YAF National Convention.

YAF worked eagerly for the nomination of Senator Barry Goldwater, helped to start the Conservative Party of New York in 1961, worked in the New York City mayoralty campaign of William F. Buckley, Jr., launched campaigns against East-West trade, engaged the New Left in vigorous debate over the war in Vietnam, and constituted the foot soldiers in Ronald Reagan's campaigns for the presidency.

From the very beginning, YAF rejected all forms of racial, religious, and ethnic discrimination. At the 1963 YAF National Convention, held in Ft. Lauderdale, Florida, the Galt Ocean Mile Hotel initially refused to allow Don Parker, a leader in the Kings County (Brooklyn) YAF who was black, to register. When the Brooklyn delegation, and then Bob Schuchman, the former national YAF chairman, threatened to walk out and cancel the event, the hotel withdrew its objection, and Don Parker integrated the hotel.

At the 1965 YAF national convention, J. A. Parker, who was to become a leader of black conservatives and president of the Lincoln Institute of Research and Education, was named to the National Board of Directors. He eventually served as president of the Philadelphia YAF chapter and as chairman of Pennsylvania YAF. He recalls his commitment to the Goldwater campaign, asking:

Did you know he was a lifetime member of the NAACP? . . . When he was a member of the City Council, he was the guy who led the effort to desegregate downtown Phoenix. When he took over his family's department store, Goldwater's, he was the first to hire black junior executives and start a training program for them. All this had nothing to do with the law.

Within YAF, of course, there were many disagreements and divisions, often pitting conservative traditionalists against libertarians. One successful crusade YAF entered into, with libertarians in the lead, was opposition to the military draft. David Franke, a YAF founder and editor of The New Guard, testified before the Senate Armed Services Committee. After reading the YAF Board's resolution, Franke outlined the case for a voluntary military: a system that would remove the element of compulsion as well as the inequities of the selection process then in effect, provide the military with individuals properly motivated and trained to serve, and force changes in military pay and procedures. This testimony was printed in the May 1967 issue of The New Guard, an issue almost totally devoted to the case for a volunteer military. Included were supportive quotes from a number of political, academic, business, and military leaders, as well as feature articles by Barry Goldwater ("End the Draft"), Russell Kirk ("Our Archaic Draft"), and Milton Friedman ("The Case for a Voluntary Army"). An opening editorial proclaimed:

With this issue begins a long-range educational and action program by YAF promoting voluntarism not only in the military, but in other areas of society as well.

Principle was YAF's motivation, not partisan political gain. In recent years, many of those who call themselves "conservative" seem to have other motivations. After all, the growth of government power, of deficits, and of governmental involvement in almost all aspects of society has come about as a result of actions taken by Republicans as well as Democrats.

Summing up YAF's legacy, Wayne Thorburn writes:

During the latter half of the 20th century, YAF was a major contributor to the development of a conservative movement in the U.S. The efforts of those who met in Sharon, Connecticut, resulted in the creation of grassroots cadres of dedicated supporters on campuses and in communities around the nation.

Among those speaking at the anniversary celebration in Washington were many long-time YAF leaders: former Rep. Robert Bauman (R-MD); Don Devine, former director of the U.S. Office of Personnel Management; former Rep. Barry Goldwater, Jr. (R-CA); and Richard Viguerie, who transformed American politics in the 1960s and 1970s by pioneering the use of direct-mail fundraising. J. A. Parker recited the Pledge of Allegiance, and former Senator James L. Buckley (C-NY) addressed the group. It was an historic occasion. Fortunately for the country, YAF's influence continues into the 21st century. *

"Measures which serve to abridge . . . free competition . . . have a tendency to occasion an enhancement of prices." --Alexander Hamilton

Sunday, 29 November 2015 03:47

Ramblings

Allan C. Brownfeld

An Early Conservative Leader Reflects on How America -- and the Republican Party -- Lost Its Way and How We Can Find Our Way Back

There can be little doubt that American society is now in trouble. We have huge budget deficits, two questionable wars, an ever-expanding federal government, and a battered economy.

While some blame all of these developments on the current Obama administration, the fact is that the previous Republican administration, that of George W. Bush, set all of these trends in motion. Many traditional conservatives have been critical of the departures from the Goldwater-Reagan tradition upon which President Bush embarked, with the embrace of those who called themselves "compassionate" or "big government" conservatives. Now, in an important new book, Bringing America Home: How America Lost Her Way and How We Can Find Our Way Back, an early conservative leader, Tom Pauken, examines the question of how America went from having the strongest economy in the world to facing our most serious economic crisis since the Great Depression. He asks, "What became of an American culture that once was guided by the principles of Christianity?"

Tom Pauken was elected national chairman of the College Republicans during the rise of the anti-Vietnam War protest movement. Enlisting in the U.S. Army in 1967, he served as an intelligence officer in Vietnam and later served on President Ronald Reagan's White House staff. Named director of the ACTION agency by President Reagan, he eliminated the use of federal tax dollars to fund Saul Alinsky-style leftist organizers. For his service at ACTION, Pauken was awarded the Ronald Reagan Medal of Honor. In 1994 he was elected Texas Republican state chairman and helped build a Republican majority in Texas from the grassroots. He currently serves as chairman of the Texas Workforce Commission. (This writer has known Tom Pauken since our college days.)

He asks:

How did the Bush administration squander the political capital that Goldwater-Reagan conservatives took more than three decades to build? In one sense, success has led to our downfall. When conservatives made the Republican Party the majority party in America, the opportunists, pragmatists, and phony conservatives moved in and took control of the Republican Party, and of the conservative movement itself -- all in the name of "conservatism."

What passes for conservatism in the post-Reagan era of Republican politics, writes Pauken:

. . . is barely recognizable to many of us who were grassroots activists in the early days of the conservative movement -- especially after eight years of a Republican administration headed by George W. Bush who claimed to be a conservative. The results were not pretty. . . . On the domestic scene, the Bush administration failed to act in time to stem the credit and spending excesses of the "bubble economy." While these excesses stem from bad decisions made during the Clinton years by Federal Reserve Chairman Alan Greenspan and Clinton Treasury Secretary Robert Rubin, the Bush administration did nothing to reverse those flawed policies. Now we are paying a heavy economic price for postponing action to deal with what turned out to be a slow-moving train wreck.

These developments are particularly troubling to those who, like Pauken, invested so much in developing and applying the conservative philosophy of limited government, fiscal responsibility, and a prudent foreign policy, beginning in the early 1960s.

"The Bush administration," he writes:

. . . squandered the political capital built up over three decades of hard work by the Goldwater-Reagan movement. In the process, great damage has been done to the conservative movement, the Republican Party, and our country. That capital is depleted and we conservatives have to start all over in putting together a set of principled policies to address the enormous economic, foreign policy, and cultural challenges our nation faces. . . . We should not support Republican candidates for president just because they happen to be the lesser of two evils. That has not worked out well for conservatives in the post-Reagan era of Republican politics.

Pauken is particularly critical of the emergence of neo-conservatives as a driving force in taking the U.S. into what he believes was an unnecessary "preemptive" war in Iraq. He notes that:

These former liberal Democrats turned Republicans remind me of Robert McNamara's civilian whiz kids who planned and oversaw our flawed strategy during the Vietnam War. Those intellectuals thought they were a lot smarter and knew a lot more about how to fight the war than our soldiers in Vietnam did. Indeed, the McNamara whiz kids had high IQs, but they were brilliantly wrong. I saw that firsthand as a young military intelligence officer in Vietnam. Similarly, the neo-conservatives, architects of George W. Bush's strategy to defeat militant Islam, were a group of arrogant intellectuals with very little, if any, military experience. They made things worse, not better, for the soldiers who had to carry out their plans.

The first goal for traditional conservatives, in Pauken's view:

. . . must be to recapture a Republican Party that has been taken over by Machiavellian pragmatists and neo-conservative ideologues. . . . That will not happen with slick political slogans or 30-second sound bites but with a serious assessment of how our leaders have failed us and what is required to make things right. And that assessment can be made only on the basis of principles. . . . The conservative movement has been gravely wounded by wrongheaded decisions made during the presidency of George W. Bush. The economic- and social-conservative majority, which was so much in evidence when I left the Reagan administration at the end of the President's first term, is history.

Any real progress in reversing course, Pauken argues, must address the problems of our twin deficits -- our huge budget deficits and our trade deficits. It requires, he believes, slowing the growth of government and taxation, while providing incentives to encourage savings and investment in American businesses and creating jobs at home. Beyond this, he declares, it requires policies that will return us to the "constitutional morality" of our Founding Fathers with their emphasis on checks and balances, separation of powers, and support for the principles of federalism. And it urgently requires the restoration of a culture guided by the principles of Christianity, rather than one shaped by what Pope Benedict XVI has called the "dictatorship of relativism."

Under George W. Bush, there was a surge of federal spending. A Cato Institute study refers to Bush as "the biggest spending president in 30 years." Beyond this, notes Pauken:

President Bush completely failed to exercise his veto power during his first term in office. The President even teamed up with Teddy Kennedy to ensure the passage of his "No Child Left Behind" legislation, which expanded the federal role in education beyond the wildest dreams of diehard liberals. Moreover, President Bush sought and received from Congress the first extension of entitlements (in this case, Medicare entitlements) since the Johnson administration. . . . The budget of the Department of Education more than doubled during George W. Bush's tenure.

There are ways to provide an economic "stimulus," Pauken writes:

. . . without deficit spending. We need to rekindle the American work ethic in which individuals take pride in their work and in using the talents God gave them. The early Obama policies are doing just the opposite, creating the expectation of handouts. Give a man a fish, the proverb goes, and you'll feed him for a day; teach a man to fish and you'll feed him for a lifetime. That was the message of Booker T. Washington to the students at the Tuskegee Institute, which he founded in 1881 to educate freed slaves. . . . Compare that with the socialist economic philosophy of Washington's rival for leadership of the black community at that time. W. E. B. Du Bois favored government solutions to the economic problems of the black community. President Obama seems to be listening more to the advice of W. E. B. Du Bois than that of Booker T. Washington.

Of particular concern to Pauken is the coarsening of the American society and the embrace of the imperial presidency by Republicans, who once decried excessive executive power. He has issued a clarion call to conservatives to recapture the Republican Party and move America back onto the path of limited government, fiscal integrity, and a prudent -- rather than messianic -- foreign policy. Hopefully, his eloquent voice will be heard.

National Debt Is Described as a Fiscal "Cancer" and a Threat to National Security

The co-chairmen of President Obama's debt and deficit commission called current budgetary trends a cancer "that will destroy the country from within" unless checked by strong action in Washington.

The two leaders -- former Republican senator Alan Simpson of Wyoming and Erskine Bowles, White House chief of staff under President Clinton -- sought to build support for the work of the commission, whose recommendations are due later this year.

Bowles said that unlike the current economic crisis, which was largely unforeseen before it hit in fall 2008, the coming fiscal problems are staring the country in the face. "This debt is like a cancer."

The commission leaders said that, at present, federal revenue is fully consumed by three programs: Social Security, Medicare, and Medicaid. Simpson said:

The rest of the federal government, including fighting two wars, homeland security, education, art, culture, you name it, veterans -- the whole rest of the discretionary budget is being financed by China and other countries.

Addressing the National Governors Association in July, Bowles declared:

We can't grow our way out of this. We could have decades of double-digit growth and not grow our way out of this enormous debt problem. We can't tax our way out. . . . The reality is we've got to do exactly what you all do every day as governors. We've got to cut spending or increase revenues or do some combination of that.

Michael J. Boskin, a senior fellow at the Hoover Institution and professor of economics at Stanford University, declares that:

President Obama's budget for fiscal year 2011 lays out a stunningly expensive big-government spending agenda, mostly to be paid for years down the road. In addition to the proposed increase from today's levels in capital gains, dividend, payroll, income, and energy taxes, the enormous deficits and endless accumulation of debt will eventually force growth-inhibiting income tax hikes, a national value-added tax similar to those in Europe, or severe inflation.

In the first three years of President Obama's term, federal spending rose by an average of 4.4 percent of GDP. That is far more than during President Johnson's Great Society and Vietnam War buildup and President Reagan's defense buildup combined. Spending will reach the highest level in American history (25.1 percent of GDP) except for the peak of World War II. The deficit of $1.4 trillion (9.6 percent of GDP) is more than three times the previous record, set in 2008.

Professor Boskin points out that:

Remarkably, Obama will add more red ink in his first two years than President George W. Bush -- berated by conservatives for his failure to control domestic spending and by liberals for the explosion of military spending in Iraq and Afghanistan -- added in eight. In his first fifteen months, Obama will raise the debt burden -- the ratio of the national debt to GDP -- by more than President Reagan did in eight years.

It must be remembered, argues Boskin, that:

Obama inherited a recession and a fiscal mess. Much of the deficit is the natural and desirable result of the deep recession. As tax revenues fall much more rapidly than income, those so-called automatic stabilizers cushioned the decline in after-tax income and helped natural business-cycle dynamics and monetary policy stabilize the economy. But Obama and Congress added hundreds of billions of dollars a year of ineffective "stimulus" spending more accurately characterized as social engineering and pork, when far more effective, less expensive options were available.

President Obama said in his budget message that "we cannot continue to borrow against our children's future." Yet, his budget proposal appears to do exactly that. He projects a cumulative deficit of $11.5 trillion, bringing the publicly held debt (excluding debt held inside the government, for example Social Security) to 77 percent of GDP and the gross debt to over 100 percent. Presidents Reagan and George W. Bush each ended their terms at about 40 percent.

Kenneth Rogoff of Harvard and Carmen Reinhart of the University of Maryland have studied the impact of high levels of national debt on economic growth in the U.S. and around the world. In a study presented earlier this year to the American Economic Association, they concluded that as long as the gross debt/GDP ratio is relatively modest, 30-39 percent of GDP, the negative growth impact of higher debt is likely to be modest as well. But as it rises to 90 percent of GDP, economic growth slows dramatically, by at least 1 percentage point a year. The Obama budget takes the gross debt over this line, rising to 103 percent of GDP by 2015. Such a huge debt implies immense future tax increases: balancing the 2015 budget would require a 43 percent increase in everyone's income tax.

Two other factors greatly compound the risk from the Obama budget plan, according to Dr. Boskin:

First, he is running up this debt and current and future taxes just as the baby boomers are retiring and the entitlement cost problems are growing. . . . Second, the president's programs increase the fraction of people getting more money back from the government than they pay in taxes to almost 50 percent. Demography will drive it up further. That's an unhealthy political dynamic.

To Indiana's Governor Mitch Daniels (R), the federal debt represents as much of a threat to the American way of life as terrorism. He suggests setting other, partisan issues aside to concentrate on debt reduction:

If you don't accept the premise that the American experience is mortally threatened by a couple of problems, this one (dealing with the budget) -- and I would add the problem of terrorism . . . then my suggestion may not make sense to anyone. But I personally feel that there's urgency around dealing with those problems, both as a matter of priority but also as a matter of getting a broader consensus of Americans together.

Recently, Daniels told The Weekly Standard that the next president "would have to call a truce on the so-called social issues until the country's fiscal house was in order." Speaking to a Washington summit organized by Americans for Generational Equity and the American Benefits Institute, he said:

I don't think we can solve that problem (on the debt) . . . in a sharply divided America. We're going to need to trust each other, and accept the good will, accept the sincerity of other parties to do some fundamental things.

Daniels noted that skeptics of the United States and of democracy have questioned whether U.S. leaders are good at dealing with big problems -- such as our skyrocketing budget deficits. The challenge before us is now clear, as deficits mount and entitlement spending promises to grow dramatically. Fortunately, a number of thoughtful observers are beginning to confront it. Whether our political process is able to move us forward, however, remains to be seen. *

"[T]he opinion which gives to the judges the right to decide what laws are constitutional and what not, not only for themselves, in their own sphere of action, but for the Legislature and Executive also in their spheres, would make the Judiciary a despotic branch." --Thomas Jefferson

Sunday, 29 November 2015 03:45

Ramblings

Allan C. Brownfeld

America's Ethical Decline Is Accompanied by Growing Ignorance of Our Religious Traditions

More and more, the concept of ethics in Congress, in business, in government, and in other areas of our society seems to be an oxymoron.

There has been a succession of headline stories underscoring our moral lapses. The alleged pay-to-play scheme laid out by Illinois Gov. Rod Blagojevich -- to sell Barack Obama's Senate seat to the highest bidder -- is, sadly, not much different from the usual political enterprise observed in Washington and throughout the country. "In some ways, the only thing Blagojevich did wrong was he was stupid enough to say it out loud," said Meredith McGehee of the Campaign Legal Center. "For other people, it's a wink and a nod. Verbalizing it crosses the line." Northwestern University law professor Albert Altschuler says that if Blagojevich

. . . made a quid pro quo offer for campaign funds, then he's guilty of breaking the law. There's a thin line. It's different if he said, "If I was in the Senate, I'd be in a position to raise money for you."

It may be legal for those who contribute large amounts to presidential campaigns to be named to ambassadorships and other important posts, but what exactly is the ethical difference between this and what the Illinois Governor is accused of doing?

Or consider the Ponzi scheme of Wall Street insider Bernard Madoff, formerly the head of the NASDAQ stock market. Len Fisher, author of Rock, Paper, Scissors: Game Theory in Everyday Life, writes that:

There seems to be little doubt that Bernard Madoff is a cheat. . . . But was it all Madoff's fault? I contend that the losses would have been less severe, and might not have occurred at all, if many of Madoff's investors had not been cast from the same mold that Madoff was. The facts should have been enough to make anyone suspicious. Madoff's accounts were only perfunctorily audited. . . . Above all his business returns were consistently good -- too good -- and he never reported a down month, let alone a down quarter or year . . . such oddities had to have set off alarm bells. So why did so many professionals continue to invest with him? Only one answer makes sense. Some of these investors must have suspected that he was a cheat but continued to invest because they thought they were benefiting from that cheating. In other words, they took him for a different kind of cheat from who he was -- one who was using information gained from his market-making operation to earn illegal profits rather than one who was operating a breathtakingly audacious Ponzi scheme.

Nobel Prize-winning economist Paul Krugman asks: "How different, really, is Mr. Madoff's tale from the story of the investment industry as a whole?" He declares:

The financial services industry has claimed an ever-growing share of the nation's income over the past generation, making the people who run the industry incredibly rich. Yet, at this point, it looks as if much of the industry has been destroying value, not creating it. And it's not just a matter of money: the vast riches achieved by those who managed other people's money have had a corrupting effect on our society as a whole.

Young people, observing the behavior of their elders, are exhibiting precisely the same sort of indifference to traditional moral and ethical standards.

In the past year, 30 percent of U.S. high school students have stolen from a store and 64 percent have cheated on a test, according to a new large-scale survey.

The Josephson Institute, a Los Angeles-based ethics institute, surveyed 29,760 students at 100 randomly selected high schools nationwide, both public and private. Michael Josephson, the institute's founder and president, said he was most dismayed by the findings about theft. The survey found that 35 percent of boys and 26 percent of girls -- 30 percent overall -- acknowledged stealing from a store within the past year. One-fifth said they stole something from a friend; 23 percent said they stole something from a parent or other relative.

"What is the social cost of that -- not to mention the implication for the next generation of mortgage brokers?" Mr. Josephson said: "In a society drenched with cynicism, young people can look at it and say 'Why shouldn't we? Everyone else does it.'"

Among the findings:

* Cheating in schools is rampant and getting worse. Sixty-four percent of students cheated on a test in the past year, and 38 percent did so two or more times, up from 60 percent and 35 percent in a 2006 survey.
* Thirty-six percent said they used the Internet to plagiarize an assignment, up from 33 percent in 2004.
* Despite such responses, 93 percent of the students said they were satisfied with their personal ethics and character, and 77 percent affirmed that "When it comes to doing what is right, I am better than most people I know."

Iris Murdoch, in Metaphysics as a Guide to Morals, said:

The child . . . who is led by his observation to conclude that "Do not lie" is part of an espionage system directed against himself, since the prohibition obviously means nothing to his elders, is being misled concerning the crucial position of truth in human life.

This flexibility toward the truth shows later in life. According to a recent study conducted by Who's Who Among High School Students, 78 percent of the high school students polled say they have cheated. Paul Krouse, the publisher, said, "There certainly has to be a breakdown in the ethics and integrity of young people which probably mirrors the society they are living in."

To deal with the ethics breakdown, the U.S. Marine Corps has added a "value training" course to its boot camp curriculum. One senior Marine officer summed up the situation:

The communities (new recruits) are coming from have put less emphasis on ethical standards and these kinds of core values we want to see. . . . They're not teaching values in schools. They're not learning it from church members to the extent. . . . they used to. So there is a need that must be stressed in values-based education.

A decade ago, when Rep. Dan Rostenkowski (D-IL) was sentenced to 17 months in prison after pleading guilty to defrauding Congress of $636,000 in an illegal payroll scam, U.S. Attorney Thomas Motley called it a "scheme to defraud the U.S. that stretched over more than 20 years." The judge in the case, Norma Holloway Johnson, said that, "When I think of your case . . . the one phrase that comes to mind is betrayal of trust." Yet, when Rostenkowski responded, he said: "Having pled guilty, I do not believe that I am different from the vast majority of members of Congress." Recent examples of congressional corruption indicate that Rostenkowski may have had a point.

One important reason -- perhaps the important reason -- for the breakdown in our society that is evident all around us is that we have entered an era of moral relativism, in which we hesitate to declare any action wrong and immoral, or to confront the existence of evil. Will Herberg, theologian and author of the well-known volume Protestant, Catholic, Jew, stated:

. . . the really serious threat to morality in our time consists not in the multiplying violations of an accepted moral code, but in the fact that the very notion of morality or a moral code seems to be itself losing its meaning. . . . It is here that we find a breakdown of morality in a radical sense, in a sense almost without precedent in our Western society.

The decline of religious knowledge in our society has been recorded by Professor Stephen Prothero, chairman of the Department of Religion at Boston University, in his book Religious Literacy. Although more than 90 percent of Americans say they believe in God, only a tiny portion of them knows much about religion. When he began college teaching 17 years ago, Prothero writes, he discovered that few of his students could name the authors of the Christian Gospels. Fewer could name a single Hindu Scripture. Almost no one could name the first five books of the Hebrew Bible.

"During the 1930s," writes Prothero:

the neo-orthodox theologian H. Richard Niebuhr skewered liberal Protestants for preaching "a God without wrath (who) brought men without sin into a kingdom without judgment through the ministrations of a Christ without a cross." But my students' "dogma aversion" (as one put it) goes liberal Protestantism one further. These young people aren't just allergic to dogma. They are allergic to divinity and even heaven. In the religions of their imagining, God is an afterthought at best. And the afterlife is, as one of my students told me, "on the back burner." What my students long for is not salvation after they die but happiness . . . here and now. They want less stress and more sleep.

The disconnect between young Americans and their religious tradition is, in Prothero's view, a subject about which all of us should be concerned. He hopes that Americans will have enough religious knowledge to debate ethics positions using holy texts, to understand Biblical references in political speeches, to question their own beliefs about God -- and to encourage others to question theirs. He faults priests, rabbis, imams, and ministers for not engaging the younger generation. "Far too often," he declares, "religious services in the U.S. are of the adults, by the adults, and for the adults. And don't think young people aren't noticing."

The connection between our ethical decline and the growing ignorance of our religious traditions is a subject to which our religious leaders -- and not only our religious leaders -- should turn their attention.

To Move Africa Forward, It Is Essential to Understand the Real Causes of Poverty

Poverty in Africa is widespread, but its causes are largely misunderstood. Indeed, because of such a misunderstanding men and women of good will in the Western world, seeking to alleviate such poverty, have pursued programs of massive foreign aid that, in most instances, have done little good.

Professor James Robinson of Harvard University, in a 2010 article in Defining Ideas, "Property Rights and African Poverty," declares that:

The real reasons Africa has been poor historically and is poor today have to do with property rights. In short, African countries do not have the type of property rights that are connected with economic progress in Western Europe or North America.

The evidence of widespread poverty is stark. The World Bank measures poverty levels by the number of people who live on less than $1 a day; the majority of those people, around 350 million of them, live in sub-Saharan Africa. This is the only part of the world in which the absolute number of poor people is increasing. By 2015, the number of poor people in Africa is forecast to be more than 400 million.

Before the Industrial Revolution began in Great Britain about 230 years ago, Professor Robinson points out:

Differences in the level of prosperity among countries were much smaller than they are now. Whereas today, the average income of a citizen of the United States is about 40 times that of a citizen of a country such as Ethiopia or Sierra Leone, in 1750 that difference was probably only two or three times. Between 1750 and 2009, the U.S. experienced rapid economic growth, but African countries did not.

The insecurity of property rights in Africa, Professor Robinson notes:

. . . was exacerbated by the slave trade, which distorted paths of political development and led to the emergence of states, such as the Kongo, based not on investing but on slavery. Africa's increasing backwardness made it vulnerable to colonialism, which replaced one form of absolutism with another. Later, independence did the same, with predictable consequences for property rights.

Sierra Leone was taken over by Siaka Stevens, who pulled up the railway line to the south of the country and sold off all the track and rolling stock to isolate the Mendeland region, where support for his opposition was strongest. The roads fell to pieces and schools disintegrated. National television broadcasts stopped in 1987. The Sierra Leone Produce Marketing Board expropriated farms. When the governor of the national bank complained about fiscal profligacy in 1980, he was thrown to his death from the roof of the central bank offices. In 1991, the regime collapsed into civil war. Today most of Sierra Leone is controlled by 149 chiefs. Sadly, this is all too typical.

In a recent report, "The State of Liberal Democracy In Africa," the Cato Institute states that:

Africa's transition to liberal democracy is unlikely to happen without far-reaching economic reforms; in fact, all liberal democracies are also market-oriented economies. Regrettably, many African countries are not only politically repressive but also economically dirigiste. Increased economic freedom and the emergence of a vibrant private sector can bring about direct economic benefits, such as higher incomes, and indirect benefits, such as decentralization of power.

All too often, foreign aid can be siphoned off by corrupt politicians. In the opinion of Professor Robinson:

Three things would help sub-Saharan African societies get on the right track economically. First, give Africans more economic opportunities, which doesn't mean throwing money at them. What it does mean is opening markets to African exports and trade. . . . Second, economics must play a bigger role in foreign policy. Foreign policy toward Africa has been driven too much by short-term politics without focusing on economic development, but promoting prosperity in Africa is good long-run foreign policy. Supporting dictators who are "pro-Western" risks creating an anti-Western society. Finally, development assistance must help change the political trajectories of societies. . . . Good political and economic institutions emerge from a balance of power in society. To achieve this, we should help civil society and the media promote de facto checks and balances on rulers.

There are some dim signs of hope. Sixteen years ago, the world watched in horror as 800,000 Rwandans were systematically murdered by their neighbors. In just 100 days, over 10 percent of the country's population, mostly Tutsis, the country's minority group, were slaughtered. Paul Kagame led the Rwandan Patriotic Front (RPF), the guerrilla army made up largely of Tutsi refugees, that ultimately overthrew the government and ended the bloodshed. Now he's president.

The Wall Street Journal reports that:

You might suppose that the leader of a country synonymous with genocide would be far more interested in seeking foreign aid than in talking supply-side economics. But then you probably haven't met Mr. Kagame. His agenda for improving the state of his country boils down to one goal: "spurring private investment."

According to the Wall Street Journal's Anne Jolis, Kagame told her that:

We believe in private enterprise, free markets, and competition . . . So we have to make sure there is a conducive environment for people to be creative and innovative.

Jolis writes that:

Unlike many of his peers in the Third World, his focus is not on how to beg for charity. During our entire conversation, Mr. Kagame doesn't once utter the word poverty. "We can only have ourselves to blame for our failures," he says. "We don't expect anyone to hand us any success or progress we hope to be making." That attitude makes Mr. Kagame a skeptic when it comes to foreign aid, which he faults for many of the world's ills. "It has created dependency, it has distorted the markets, it has detached people from their leaders and their values, it has created conflicts in some cases."

Rwanda has cut its dependence on aid by half in the past 15 years and has become self-sufficient in food for the first time in its history. Gradual improvements to property rights, along with government advances to farmers for fertilizer, which the farmers have since repaid with the revenue from their produce, have even allowed Rwanda to begin exporting some of its crops.

To help Africa move from poverty to prosperity, it is essential that we understand the reasons for its current dilemma -- and the proper path forward. Property rights and free and open markets represent that path. *

"Equal and exact justice to all men . . ." --Thomas Jefferson
