Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. Vice President, Members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.
On the brink of the so-called "fiscal cliff," the House voted 257-167 to approve the Senate bill that would avoid major tax hikes and spending cuts. In response, the stock market soared. What went largely unremarked was the fact that neither Republicans nor Democrats made any effort to confront the real financial challenges that lie ahead. As we seem repeatedly to say when it comes to Congress dealing with the problems facing the nation, "they kicked the can down the road."
What we face - and no one in Washington seems prepared to confront - are massive structural deficits as the baby boomers start to retire in large numbers.
In 1900, 1 in 25 Americans was over the age of 65. In 2030, 17 years from now, 1 in 5 Americans will be over 65. Because we have many programs that provide guaranteed benefits to the elderly, this has major budgetary implications.
In 1960, there were about five working Americans for every retiree. By 2025, there will be just over two workers per retiree. In 1975, Social Security, Medicare and Medicaid made up 25 percent of federal spending. Today, they add up to 40 percent. Within a decade, these programs will take over half of all federal outlays.
We have postponed the problem by borrowing heavily for decades. Our debt stands at 100 percent of GDP. Federal spending on everything other than entitlements, defense, and interest on the debt has been shrinking for many years. A recent report from the National Governors Association points out that Medicaid is now the single largest item on state budgets and has grown by over 20 percent for each of the past two years.
This trend is escalating. The Peter G. Peterson Foundation calculates, using Congressional Budget Office numbers, that by 2040 we are likely to spend 10 percent of the GDP on interest payments alone - versus 1.4 percent today.
Congress and President Obama set up the artificial "fiscal cliff" scenario that would, allegedly, force them to do the right thing. Completely ignored, however, were the deep structural reforms that will eventually be needed.
Casting the budget problem as a question of whether the richest 1 or 2 percent of the population should pay more taxes avoids the real issues before us. The real problem we face is the bipartisan promise made to Americans of both high government benefits and low taxes at the same time. This may be democracy at work: most Americans may want high benefits and low taxes. But no one in Washington seems prepared to tell them the hard truth - that we cannot afford benefits we are unwilling to pay for.
The nonpartisan Congressional Budget Office puts it this way:
With the population aging and healthcare costs per person likely to keep growing faster than the economy, the U.S. cannot sustain the federal spending programs that are now in place with the federal taxes as a share of GDP that it has been accustomed to paying.
Washington Post columnist Robert Samuelson faults both parties for the situation we face:
The main reason that we keep having these destructive and inconclusive budget confrontations is not simply that many Republicans have been intransigent on taxes. The larger cause is that Obama refuses to concede that Social Security, Medicare and Medicaid are driving future spending and deficits. So when Republicans make concessions on taxes (as they have), they get little in return. . . . Just as many Republicans don't want to raise taxes a penny, many Democrats don't want benefits cut a penny.
One reasonable example is the proposal to shift from the standard consumer price index (CPI) to a "chained" CPI in adjusting Social Security benefits. From 2013 to 2022, this change is estimated to reduce Social Security spending by $100 billion. Total Social Security benefits over that decade will run into the trillions of dollars, so the reduction would amount to a tiny percentage. Yet Democrats in Congress rejected any serious consideration of this proposal.
As the population ages and health care costs soar, to avoid the country sinking into debilitating debt, revenue must rise and spending - particularly on Medicare, Social Security, Medicaid, and military healthcare - must be brought under control. No solution that pleases extremists on either side - those who reject any increase in revenue and those who oppose any decrease in benefits - has, at this time, any chance of becoming law.
One thoughtful member of Congress, Senator Michael Bennet (D-CO), voted against the "fiscal cliff" deal because it did not include any meaningful deficit reduction. He said:
Going over the cliff is a lousy choice and continuing to ignore the fiscal realities that we face is a lousy choice. . . . The burden of proof has to shift from the people who want to change the system to the people who want to keep it the same. I think if we can get people focused to do what we need to do to keep our kids from being stuck with this debt that they didn't accrue, you might be surprised at how far we can move this conversation.
Senator Bennet laments that:
Washington politics no longer follows the example of our parents and our grandparents who saw as their first job creating more opportunities, not less, for the people who came after them. . . . The political debate now is a zero sum game that creates more problems than solutions.
Unfortunately, things will probably have to get worse than they are at the present time before either Democrats or Republicans will begin to make the hard decisions necessary to set our society on a sustainable economic path for the future. Real leadership is hard to discover in today's Washington.
The fact that our government is increasingly out of control is becoming clear to more and more Americans. We continue to embark upon huge new public programs that, regardless of whether one agrees or disagrees with their merits, we refuse to pay for. How long can any society continue to spend well beyond its means without dire consequences? Eventually, if things continue on their present trajectory, we will see.
The tendency to spend without raising funds to pay the bills continues whichever party is in power. Current discussion of a "fiscal cliff" does not confront the reality of politicians of both parties busy subsidizing the variety of special interest groups from which they raise money to attain office. As many have said, "We have the best Congress money can buy."
The Founding Fathers, if they suddenly arrived in contemporary America, would be disappointed with what they saw. But it is unlikely that they would be surprised. They were, after all, thoughtful students of human nature, and how it influences the nature of the government under which we live.
Many things have changed in society. The Framers of the Constitution could never have foreseen the creation of automobiles, airplanes, television, computers, cell phones and other elements of our modern life. But while things around us have undergone dramatic change, what has remained the same is man himself.
Human nature, for better or worse, is unchanged. If this were not true, we could not, in the 21st century, identify with the writings of Plato or Aristotle, or the characters in Shakespeare. The teachings of Moses and Jesus are as relevant to modern man as they ever were.
The Founding Fathers were not utopians. They understood man's nature. They attempted to form a government that was consistent with, not contrary to, that nature. Alexander Hamilton pointed out that:
Here we have already seen enough of the fallacy and extravagance of those idle theories that have amused us with promises of an exemption from the imperfections, weaknesses, and evils incident to society in every shape. Is it not time to awake from the deceitful dream of a golden age, and to adopt as a practical maxim for the direction of our political conduct that we, as well as the other inhabitants of the globe, are yet remote from the happy empire of perfect wisdom and perfect virtue?
Rather than viewing man and government in positive terms, the framers of the Constitution had almost precisely the opposite view. John Adams declared that, "Whoever would found a state and make proper laws for the government of it must presume that all men are bad by nature." As if speaking to those who place ultimate faith in egalitarian democracy, Adams attempted to learn something from the pages of history.
We may appeal to every page of history we have hitherto turned over, for proofs irrefragable, that the people, when they have been unchecked, have been as unjust, tyrannical, brutal, barbarous and cruel as any king or senate possessed of uncontrollable power. . . . All projects of government, formed upon a supposition of continued vigilance, sagacity, and virtue, firmness of the people when possessed of the exercise of supreme power, are cheats and delusions. . . . The fundamental article of my political creed is that despotism, or unlimited sovereignty, or absolute power, is the same in a majority of a popular assembly, an aristocratic council, an oligarchical junto, and a single emperor. Equally bloody, arbitrary, cruel, and in every respect diabolical.
That government should be clearly limited and that power is a corrupting force was the essential perception of the men who made the nation. In The Federalist Papers, James Madison wrote:
It may be a reflection on human nature that such devices should be necessary to control the abuses of government. But what is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. In framing a government which is to be administered by men over men, the great difficulty lies in this: you must first enable the government to control the governed, and in the next place oblige it to control itself.
Yes, if the Founding Fathers arrived in Washington tomorrow, they would be disappointed - but they would not be surprised.
The official national debt of $16 trillion is growing at the rate of $4 billion a day. This, together with what the government owes its various trust funds, is more than 100 percent of gross domestic product. The states' debts are about $3 trillion, and their unfunded liabilities are approximately another $4 trillion. "Debts of this magnitude," says Michael Greve, a scholar at the American Enterprise Institute, "will not be paid. . . . Our politics aims at inspiration on the cheap."
Senator Tom Coburn (R-OK), one of the few members of Congress who appears serious about cutting unnecessary spending, points out that for the past two years the Government Accountability Office (GAO) has shown Congress more than $200 billion in duplicative spending alone. There are, for example, 47 job-training programs across nine agencies that cost $18 billion but are not producing results. GAO found dozens of other areas of costly duplication that if streamlined could improve outcomes while saving money.
GAO identified 209 federal Science, Technology, Engineering, and Mathematics (STEM) programs run by 13 federal agencies, costing taxpayers more than $3 billion annually. According to Sen. Coburn:
Duplication is only part of the problem. Our budget is full of outrageous examples of waste and mismanagement. For instance, our government has borrowed $20 million from future generations in order to send millionaires unemployment checks. Funds meant to help agencies coordinate intelligence and counterterrorism efforts at the Department of Homeland Security "fusion centers" have been spent on flat-screen TVs, SUVs for personal use, and intelligence that has been described as useless and irrelevant. What Washington is lacking is not options for savings but the political courage to make specific decisions. Millions of families and individuals in America are already living in the world of hard decisions and priorities. It's long past time for Washington politicians to join them.
Often, politicians speak of cutting spending and tax breaks that, in reality, amount to savings of very little money. Mitt Romney's pledge to cut funding for PBS and Big Bird is one example. And President Obama declared that, "I want to stop giving tax breaks to companies that ship jobs and factories overseas." He discussed this in his "economic patriotism" ad. Conservative columnist Timothy Carney provided this assessment:
Sounds fine, but what is he actually talking about? Not much, it turns out. One expense businesses incur is moving labor or equipment to new locations. Obama is proposing that the cost of moving facilities out of the U.S. shouldn't be deductible. Those costs should be part of the company's profits. This targeted tax penalty would raise $14 million per year, or 0.0005 percent of the federal budget, according to the Joint Committee on Taxation.
Neither party speaks of reining in expensive government programs of subsidies for corporations or agriculture. Veronique de Rugy, a senior research fellow at the Mercatus Center at George Mason University, writes that:
The big winner of the three presidential debates is government spending. Mitt Romney singled out only a few small programs that he thinks are ripe for cutting. President Obama stayed away from any specifics. . . . But the present levels of spending are not an option. . . . Everything has to be on the table. . . . The bottom line is that there is no silver bullet for balancing the budget. We didn't get in this fiscal mess overnight, and it will take us some time to get out of it.
Clearly, everything should be put on the table - including the Pentagon, which consumes 18 percent of the federal budget. Thus far, however, politicians of both parties seem unwilling to consider cutting programs that have strong constituencies. At the present time, 30 cents of every dollar government spends is borrowed. With the coming explosion in programs such as Social Security, Medicare, and Medicaid - not to mention President Obama's new health care program - we will be drowning in debt.
How bad do things have to get before our politicians begin to pay real attention? Thus far, there are no good answers to this question.
In the past presidential campaign, there was much discussion about those who receive one form or another of government assistance and the need to reduce such aid.
The government aid we seem to focus upon is that received by those at the lower end of the economic spectrum, such as food stamps.
While it is certainly proper to review all forms of government assistance, it is surprising that neither Republicans nor Democrats have had much - if anything - to say about corporate welfare. For politicians who have bailed out Wall Street and the auto industry - and who subsidize many others - to focus all of their attention on forms of government aid other than that for corporations tells us a great deal about how our current politics avoids the real problems we face. Perhaps it would be different if both parties were not so busy raising campaign funds from the very people they subsidize.
A recent Cato Institute study finds that federal business subsidies total almost $100 billion annually. This includes subsidies to small businesses, large corporations, and industry organizations. The subsidies come from programs in many federal departments, including Agriculture, Commerce, Energy, and Housing and Urban Development. As part of the national income accounts, the Bureau of Economic Analysis calculates that the federal government handed out $57 billion in business subsidies in 2010.
A recently issued report from the Cato Institute by Chris Edwards and Tad DeHaven notes that:
There are several upsides to ending federal subsidies to businesses: it would reduce the amount of money taken from taxpayers and given to big corporations; and it would reduce the incentives for political corruption. A less obvious, but no less important, reason to end corporate welfare is that an economy that doesn't depend on subsidies from government is a more entrepreneurial economy that will grow faster.
Edwards and DeHaven provide examples of the failure of government subsidization in the energy industry, with the complicity of both parties:
An early subsidy effort was the Clinch River Breeder Reactor, which was an experimental nuclear fission power plant in Oak Ridge, Tennessee, in the 1970s. This Republican-backed boondoggle cost taxpayers $1.7 billion and produced absolutely nothing in return. Then we had the Synthetic Fuels Corporation (SFC) approved by President Jimmy Carter in 1980, who called it a "keystone" of U.S. energy policy. The government sank $2 billion of taxpayer money into this scheme that funded coal gasification and other technologies before it was closed down as a failure.
More recently, the Obama administration's failures in subsidizing green energy projects are piling up - Solyndra, Raser Technologies, Ecotality, Nevada Geothermal, Beacon Power, First Solar, and Abound Solar. These subsidy recipients have either gone bankrupt or appear to be headed in that direction. The Washington Post found that, "Obama's green-technology program was infused with politics at every level."
The flow of taxpayer money to business continues to grow, whichever party is in power. A recent New York Times article - "Ties To Obama Aided in Access for Big Utility" - documented how Exelon Corp., a Chicago-based utility, used its political support for President Obama for access and profit. Exelon was one of six utilities that received the maximum $200 million stimulus grant. It also got a $600 million renewable-energy loan guarantee for a solar project in Los Angeles. One of Exelon's subsidiaries received a $200 million grant to install "smart meters" in the Philadelphia area.
Republicans and Democrats are in this together. Sheldon Richman, a senior fellow at The Future of Freedom Foundation and editor of The Freeman, points out, with regard to Republican Vice Presidential candidate Paul Ryan that:
In the Bush years, Ryan voted for everything: No Child Left Behind (which increased the centralization of education), the Medicare drug entitlement, housing subsidies, unemployment-benefits extension, the bank bailouts, and the 2008 subsidies to failing Chrysler and GM. In voting for TARP (the Troubled Asset Relief Program), Ryan said, "Madame Speaker, this bill offends my principles, but I'm going to vote for this bill in order to preserve my principles."
William O'Keefe of the George C. Marshall Institute notes that:
"Crony capitalism," a frequently used term describing firms that seek to invest with taxpayer dollars instead of share owner dollars will not reduce unemployment, promote robust economic growth or help the United States compete in the global economy. Reform is needed shrinking the public trough, creating a level playing field, providing business confidence and providing true regulatory reform provide a good start.
Thus far, neither party has had anything to say about corporate welfare. This indicates, sadly, that neither party is serious about bringing government spending under control. It is politics as usual - just what has produced the economic morass we are now in. Americans deserve something better. Things, it seems, will have to become even worse before anyone begins to tell the truth about our problems.
Now that the 2012 election campaign has come to an end, it would be good if Americans could set partisan acrimony aside as the nation prepares to celebrate Thanksgiving.
This holiday has an interesting history and debate continues over where, in fact, the first Thanksgiving took place. Those of us who live in Virginia believe that the Old Dominion has a powerful historical case that others have tended to overlook.
This writer visited the Berkeley Plantation in Charles City County, Virginia, many years ago as the plantation prepared to host a celebration of the 350th anniversary of the first commemoration of Thanksgiving. Plantation owner Malcolm Jamieson displayed letters from President John F. Kennedy and former Massachusetts Governor John Volpe declaring that Berkeley was the site of the first formal Thanksgiving in the New World.
Berkeley is the site of other historical firsts as well. The land on which it stands was part of a grant made in 1619 by King James I to the Berkeley Company and was designated "Berkeley Hundred." On December 4, 1619, the settlers stepped ashore there and in accordance with the proprietors' instructions that "the day of our ships' arrival. . . shall be yearly and perpetually kept as a day of Thanksgiving" celebrated the first Thanksgiving Day more than a year before the Pilgrims arrived in New England.
There is much history at Berkeley. In 1781, it was plundered by British troops under Benedict Arnold. During the Civil War it served as headquarters for General McClellan after his withdrawal from the Battle of Malvern Hill; Federal troops were encamped in its fields, and transports and gunboats were anchored in the James River. While quartered here with McClellan in the summer of 1862, Gen. Butterfield composed "Taps." It is also reported that the first bourbon in America was distilled at Berkeley by an Episcopal minister.
Walking around the grounds at Berkeley is to enter another world. This is where America began. It was strong men and women who built a nation on these often inhospitable shores. They made many mistakes, as people are always wont to do, but they created a new society in which, as George Washington wrote to the Hebrew Congregation at Newport, Rhode Island, there would "be none to make men afraid." We are a young country, but we are also an old one. Our Constitution is the oldest in the world, and we have continuously maintained the freedoms to which we first paid homage. There has been no period in which freedom of religion, of the press, or of assembly was eliminated. We have weathered wars and depressions. We will also weather the difficulties in which we are now embroiled. But we will do so only if Americans begin to recall their history and their values and not give assent to those who seek only to condemn and to destroy.
Several years ago, I visited a U.S. military cemetery in Italy - near Anzio - with my son Peter and grandson Dario. This visit caused me to reflect on the unique nature of American society.
It was instructive to read the names of the American dead. Virtually all nationalities, ethnic groups and religions are represented. In the 1840s, Herman Melville wrote that, "We are heirs of all time and with all nations we divide our inheritance." If you kill an American, he said, you shed the blood of the whole world.
America is more than simply another country. Visiting New Amsterdam in 1643, French Jesuit missionary Isaac Jogues was surprised to discover that 18 languages were spoken in this town of 8,000 people. In his Letters From an American Farmer, J. Hector St. John Crevecoeur wrote in 1782:
Here individuals of all nations are melted into a new race of men, whose labors and posterity will one day cause great changes in the world.
Author Mario Puzo declared that:
What has happened here has never happened in any other country in any other time. The poor who had been poor for centuries - whose children had inherited their poverty, their illiteracy, their hopelessness, achieved some economic dignity and freedom. You didn't get it for nothing, you had to pay a price in tears, in suffering, why not? And some even became artists.
As a young man growing up in Manhattan's Lower East Side, Puzo was asked by his mother, an Italian immigrant, what he wanted to be when he grew up. When he said he wanted to be a writer, she responded that, "For a thousand years in Italy, no one in our family was even able to read." But in America, everything was possible - in a single generation.
In 1866, Lord Acton, the British Liberal historian, said that America was becoming the "distant magnet." Apart from the "millions who have crossed the ocean, who should reckon the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West?"
America has been a nation much loved. Germans have loved Germany, Frenchmen have loved France, Swedes have loved Sweden. This, of course, is only natural. America has been beloved not only by native-born Americans, but by men and women of every race and nation throughout the world who have yearned for freedom. America dreamed a bigger dream than any nation in the history of man.
As we gather for our Thanksgiving celebrations it is proper that we reflect upon that first Thanksgiving in Virginia. We have come a long way since that time, and most of that way has been good. Happy Thanksgiving!
The history of the world indicates that freedom is not natural to man, but must be carefully cultivated and taught. Through most of recorded history, man's natural state has been to live under one form of tyranny or another. Freedom must be learned and carefully transmitted from one generation to another if it is to endure.
There is little effort in our contemporary society to transmit our history, our culture, and the values upon which a free society is built. In an important new book, America's Best Colleges! Really? (Crossbooks), John Howard, at 90, continues his strenuous efforts as an educator to reverse recent trends.
This book is dedicated to Angus MacDonald, the long-time editor and publisher of The St. Croix Review and to James Crawford, the founding editor of The Herald Examiner of Freeville, New York.
John Howard has lived an extraordinary life. During World War II, he served in the 745th Tank Battalion, First Infantry Division, and received two Silver Stars and two Purple Hearts. From 1960-77 he was president of Rockford College. He then served as President of the Rockford Institute and at the present time is a Senior Fellow at the Howard Center for Family, Religion and Society.
He believes that our institutions of higher learning have let us down in carrying out their responsibility of introducing our history, culture, and values to the new generation of Americans. He quotes Aristotle: "Of all the things I have mentioned, that which contributes most to the permanence of constitutions is the adaptation of education to the form of government." And Montesquieu, in The Spirit of the Laws, analyzed various forms of government. He stated that each one had a unique relationship with the people, and if that relationship changed, that form of government would perish.
In despotism or tyranny, he argues, the government could last as long as the people were afraid of it, doing what they were told to do for fear of severe penalties. A monarchy could last as long as the people were loyal to the crown.
"But a democracy," writes Howard:
. . . or other self-governing regime, depended upon a virtuous populace, which voluntarily abided by the laws and other settled standards of behavior. This free society was the best form of government, and the hardest to achieve and sustain. America's free society was destined for success because the colonists who came to New England left England for the sole purpose of finding a land where they could practice the Christian faith. . . were already deeply committed to a virtuous life, wholly suited for the government of a free society.
John Howard believes that the Founding Fathers fully understood and supported the cardinal principle proclaimed by Aristotle:
In July 1787, the Continental Congress enacted the Northwest Ordinance. It set forth the plan for the government of the residents of the Northwest Territory and the basis on which a region might qualify for statehood. Article III begins, "Religion, morality, and knowledge being necessary to good government and the happiness of mankind. . . ." Here is an acknowledgment that our self-government is dependent on religion, morality, and education, in that order of importance. That document, and the Declaration of Independence, and the U.S. Constitution were so intelligently conceived that they reflect a breadth of knowledge and wisdom often said to be superior to the products of any other deliberative body in world history. Certainly, there have been no comparable accomplishments in recent times.
The stress on religion and morality was echoed in the main body of George Washington's inaugural address. American education's attention to the development of character among students was summarized in a 1979 report published by the Hastings Center. The author was Columbia Professor Douglas Sloan. He wrote:
Throughout the 19th century, the most important course in the college curriculum was moral philosophy, taught usually by the college president and required of all senior students. . . . The full significance and centrality of moral philosophy in the 19th century curriculum can only be understood in the light of the assumption held by American leaders and most ordinary citizens that no nation could survive, let alone prosper, without some common moral and social values. . . . However, moral philosophy did not carry the whole burden of forming the students' character and guiding their conduct; the entire college experience was meant above all to be an experience in character development and the moral life.
The wise political philosopher Edmund Burke declared that political liberty cannot exist unless it is sustained by moral behavior. This truth was embraced by our Founding Fathers. President John Adams' address to Congress in November 1800 was the first one given in the new Capitol building. He urged:
May this residence of virtue and happiness . . . here and throughout our country, may simple manners, pure morals, and true religion flourish forever.
President James Madison wrote:
We have staked the whole future of American civilization, not upon the power of government, far from it. We have staked the future of all of our political institutions upon the capacity of mankind for self-government upon the capacity of each and all of us to govern ourselves according to the Ten Commandments of God.
Alexis de Tocqueville visited America in the 1830s. His book Democracy in America is a classic description of the government and people of America: "By their practice, Americans show they feel the urgent necessity to instill morality into democracy by means of religion."
John Howard declares: "Instill morality into democracy by means of religion - De Tocqueville saw this as the only means by which liberty can be perpetuated in all democratic nations."
John Howard has dedicated his long life to promoting the values upon which a free society depends. In this book are collected a series of his speeches and essays, as well as his latest thoughts on how to preserve a free society and extend it into the future. Those who seek to understand how the values upon which such a society depends can endure into the future would do well to ponder John Howard's thoughtful words on this subject.
American education is in the grip of an epidemic of cheating on the part of students and, sad to say, teachers as well.
In August, some 125 students at Harvard University were being investigated for cheating on a final examination.
Howard Gardner, professor of cognition and education at the Harvard Graduate School of Education, conducted a study of 100 of Harvard's "best and brightest" students nearly 20 years ago. "The results of that study," he writes:
. . . surprised us. Over and over again, students told us they admired good work and wanted to be good workers. But they also told us they wanted - ardently - to be successful. They feared that their peers were cutting corners, and that if they themselves behaved ethically, they would be bested. And so they told us in effect, "Let us cut corners now and one day, when we have achieved fame and fortune, we'll be good workers and set a good example" . . . a classic case of "the end justifies the means."
During the past six years, Gardner and colleagues have conducted reflection sessions at elite colleges. They found "hollowness at the core." In one case, that of a dean who was fired because she lied about her academic qualifications, the most common student response was, "Everyone lies on their resumes." In a discussion of the movie, "Enron: The Smartest Guys in the Room," students were asked what they thought of company traders who manipulated the price of energy. Not one student condemned the traders.
The example set by professors, Gardner argues, is not good:
. . . all too often they see their professors cut corners - in their class attendance, their attention to student work, and, most flagrantly, their use of others to do research. Most embarrassingly, when professors are caught - whether in financial misdealings or even plagiarizing others' work - there are frequently no clear punishments . . .
In surveys of high school students, the Josephson Institute of Ethics has found that about three-fifths admit to having cheated in the previous year. Michael Josephson, president of the institute, states that:
Few schools place any meaningful emphasis on integrity, academic or otherwise, and colleges are even more indifferent than high schools.
Some teachers have actually encouraged students to cheat, and some have cheated themselves when reporting test scores. In July 2011, a cheating scandal erupted in school systems in and around Atlanta. Georgia state investigators found a pattern of "organized and systemic misconduct" dating back over 10 years. One hundred seventy-eight teachers, and the principals of half the system's schools, aided and abetted students who were cheating on their tests. Top administrators ignored news reports of this cheating. A New York Times story described "a culture of fear and intimidation that prevented many teachers from speaking out."
This was not an isolated incident. In a feature on school testing, CBS News reported:
New York education officials found 21 proven cases of teacher cheating. Teachers have read off the answers during a test, sent students back to correct wrong answers, photocopied secure tests for use in class, inflated scores, and peeked at questions and then drilled those topics in class before the test.
William Damon, professor of education at Stanford and a senior fellow at the Hoover Institution, notes that
It is practically impossible to find a school that treats academic integrity as a moral issue by employing revealed incidents of cheating to communicate to its student body values such as honesty, respect for rules, and trust. . . . I have noticed a palpable lack of interest among teachers and staff in discussing the moral significance of cheating with students. The problem here is the low priority of honesty in our agenda for schooling specifically and child-rearing in general.
In the past, Professor Damon points out:
. . . there was not much hesitancy in our society about using a moral language to teach children essential virtues such as honesty. For us, today, it can be a culture shock to leaf through old editions of the McGuffey Readers, used in most American schools until the mid-20th century, to see how readily educators once dispensed unambiguous moral lessons to students. . . . As the Founders of our Republic warned, the failure to cultivate virtue in citizens can be a lethal threat to any democracy. . . . Honesty is no longer a priority in many of the settings where young people are educated. The future of every society depends upon the character development of its young. It is in the early years of life, when basic virtues that shape character are acquired. . . . Honesty is a prime example of a virtue that becomes habitual over the years if practiced consistently - and the same can be said about dishonesty.
The cheating scandals among students and teachers are, of course, simply the tip of the iceberg of our society's retreat from honesty - and honor. Ethical lapses on the part of Wall Street, Congress, and other sectors of society seem to be growing. Each time a political leader speaks, the fact-checkers fill columns reporting his misstatements. Didn't anyone think that if we stopped teaching morals and ethics - and the difference between right and wrong - society would lose its moral compass? It appears no one did.
New York Times columnist James Reston once noted that writing newspaper columns about the events of the day is like making "footprints in the sand," quickly covered by something new.
Some writers, however, while chronicling the events of their own time, write for the future as well, applying their philosophy and worldview to the events of the day while drawing on timeless principles that reflect their view of the past as well as the future.
One such writer who graced late 20th century America was Joe Sobran, who died in 2010. He was referred to by Pat Buchanan as perhaps "the finest columnist of our generation."
In 1972, Sobran began working at National Review and stayed for 21 years, 18 as senior editor. He also spent 21 years as a commentator on the CBS Radio "Spectrum" program series and was a syndicated columnist, first with the Los Angeles Times and later with the Universal Press Syndicate.
In a new book, Joseph Sobran: The National Review Years, the Fitzgerald-Griffin Foundation has gathered together some of Sobran's best articles from 1974 to 1991. These cover a wide range of topics, including Christianity, the media, the Constitution, motion pictures, Shakespeare, and baseball. In the foreword, Buchanan writes that, "What is extraordinary about this book of essays is the range of Joe's interests and the quality of his insights."
One essay deals with an incident in 1987 when a gang of young toughs in Queens, New York, beat up three young men. When one of the three, trying to escape, was hit by a car and killed, Mayor Ed Koch called the crime a "racial lynching," because the perpetrators were white and the victims black. The media referred to America as an increasingly "racist" society, even though all indications pointed toward improving race relations.
In what came to be known as the "Howard Beach Incident," Sobran saw a built-in bias on the part of the media at work:
All news is "biased" in that it's the selection of information in accordance with tacit standards of relevance. We notice the bias when the news is chosen to fit a "super story" the audience doesn't necessarily subscribe to. . . . The super story behind the Howard Beach Story was Racist America. The very fact that it was empirically atypical made it all the more dramatic as a synecdoche. . . . The media are so saturated by myth that it's fair to see "news" as an early stage on the assembly line whose final product is a New York Times editorial.
In a review of the book Whatever Happened to the Human Race by Everett Koop and Francis Schaeffer, Sobran confronts the growing advocacy of abortion, infanticide, and euthanasia, what he calls the "cheapening of life." He declares that
. . . as the abortion issue shows, the definition of "defective" has quickly broadened to mean anything not wanted by people in a position to kill. There is the case of a young couple who asked for a prenatal test to determine the sex of the child they were expecting: they said they feared a boy would be a hemophiliac. When the test showed it was a girl, they admitted they had actually wanted a boy, simply because they preferred a boy. The girl was aborted.
In an essay on censorship and stereotypes, Sobran points out that
Religion is still a real and vital part of American life, but it is amazingly "underrepresented" (to use the liberal term) in mass communications. This is not a matter of conspiracy or even conscious avoidance, but of unconscious habit, much like modes of dress: religion simply isn't in the intellectual wardrobe of media people.
Sobran's 1990 essay, "The Republic of Baseball," is accompanied by a picture of the author, in Yankee uniform at Yankee Stadium, on National Review's cover. To all Americans who grew up in the mid-20th century - particularly men - baseball was central, as Sobran shows:
Not to play means missing out on the common experience of the male sex. And once you get into it, it's easy to get absorbed. In Ypsilanti, Michigan, I spent long winters studying baseball statistics to while away the endless cold grey days until the snow melted. Then, around mid-March, we started our new season in the park, or any empty field. . . . Baseball wasn't just something we played and watched. It was something we lived.
Beyond this, writes Sobran:
The statistical discreteness of individual performance, set against the game's stable history, gives achievement in baseball a permanence and stature that other sports can seldom confer. . . . The rules are really impartial. . . . There are no "racist" balls and strikes . . . only balls and strikes. . . . In politics, men are elected to bend the rules in someone's favor. It shouldn't surprise us when they break them too. A key difference between baseball and democracy is that in baseball the winners don't get to rewrite the rules. And it never occurs to the losers to blame the rules for their losses.
Sobran was an admirer of the British author G. K. Chesterton, to whom he has been compared. He reports on attending a meeting of the Chesterton Society in Toronto in 1979 and recalls Chesterton's early opposition to "the science of eugenics," whose "consequences he foresaw." Advocates of eugenics included Oliver Wendell Holmes, who supported mandatory sterilization. Of Chesterton, Sobran wrote:
His defense of the poor was rooted in a defense of the family and of liberty against those state planners who pined for population refinement. It is not hard to see the likeness to those enlightened souls who think the state should now promote contraception and abortion among the poor. . . . It reminds us that we who are alive today are the lucky survivors of Nazism and related evils; those of the next generation will be the lucky survivors of abortion "reform."
There is, of course, much more excellent writing - and many thoughtful insights - in these essays, including several advancing Sobran's thesis that the 17th Earl of Oxford was, in fact, the author of the works attributed to William Shakespeare.
In the afterword, author Ann Coulter states that
Joe could say in a sentence what most writers would need an entire column to express. His specialty was to make blindingly simple points that would cut through mountains of sophistry.
One need not agree with all of Sobran's views to appreciate the keen intelligence and moral perspective he brought to his work.
Fran Griffin and the Fitzgerald-Griffin Foundation are to be congratulated for publishing this collection of Joe Sobran's essays. One hopes that, through this book, a new generation of readers will be made aware of some of the best writing of the recent past. *
Milton Friedman, the 1976 winner of the Nobel Memorial Prize in Economic Sciences and the pre-eminent American advocate of free enterprise, was born on July 31, 1912 - a hundred years ago. This is an appropriate time to commemorate his life and reflect upon his achievements in advancing freedom.
It was Milton Friedman's belief that free enterprise was the only form of economic organization consistent with other freedoms. In his important book Capitalism and Freedom, he points out that
The kind of economic organization that provides economic freedom directly, namely competitive capitalism, also promotes political freedom because it separates economic power from political power and in this way enables one to offset the other.
It was his view that
Political freedom means the absence of coercion of a man by his fellow men. The fundamental threat to freedom is power to coerce, be it in the hands of a monarch, a dictator, an oligarchy, or a momentary majority. The preservation of freedom requires the elimination of such concentration of power to the fullest possible extent and the dispersal and distribution of whatever power cannot be eliminated - a system of checks and balances. By removing the organization of economic activity from the control of political authority, the market eliminates this source of coercive power. It enables economic strength to be a check to political power rather than a reinforcement.
Businessmen, Friedman liked to point out, believe in maximizing profits, not necessarily in genuinely free markets. He declared that
With some notable exceptions, businessmen favor free enterprise in general, but are opposed to it when it comes to themselves.
In a lecture entitled "The Suicidal Impulse of the Business Community," given in 1983, he declared that
The broader and more influential organizations of businessmen have acted to undermine the basic foundation of the free market system they purport to represent and defend.
What would Milton Friedman think of the recent bailout of failing banks, supported by both Republicans and Democrats? Wall Street Journal columnist David Wessel points out that
He didn't trust central bankers. He blamed the Bank of Japan for the deflation of the 1990s and the Fed for the Great Depression of the 1930s and the Great Inflation of the 1970s. He would, if his co-author Anna Schwartz is any clue, have condemned the bank bailouts of recent years. "They should not be recapitalizing firms that should be shut down," she told The Journal in 2008.
The issue he devoted most of his time to in his later years was school choice for all parents, and his Friedman Foundation for Educational Choice is dedicated to that cause. He used to lament that
We allow the market, consumer choice, and competition to work in nearly every industry except for the one that may matter most, education.
Friedman was also proud to have been an influential voice in ending the military draft in the 1970s. When his critics argued that he wanted a military of mercenaries, he would respond:
If you insist on calling our volunteer soldiers "mercenaries," I will call those whom you want to draft into service involuntary "slaves."
One of Friedman's former students at the University of Chicago, the respected economist Thomas Sowell, recalls that
Like many, if not most, people who became prominent opponents of the left, Professor Friedman began on the left. Decades later, looking back at a statement of his own from his early years, he said: "The most striking feature of this statement is how thoroughly Keynesian it is." No one converted Milton Friedman, either in economics or in his views on social policy. His own research, analysis, and experience converted him. As a professor, he did not attempt to convert students to his political views. I made no secret of the fact that I was a Marxist when I was a student . . . but he made no effort to change my views. He once said that anybody who was easily converted was not worth converting. I was still a Marxist after taking Professor Friedman's class. Working as an economist in the government converted me.
As a student of Friedman's in 1960, Sowell, who is black, notes that
I was struck by two things - his tough grading standards and the fact that he had a black secretary. This was years before affirmative action. People on the left exhibit blacks as mascots. But I never heard Milton Friedman say that he had a black secretary, though she was with him for decades. Both his grading standards and his refusal to try to be politically correct increased my respect for him.
In the late 1960s, Friedman explained that "there is no such thing as a free lunch." If the government spends a dollar, that dollar has to come from producers and workers in the private economy.
Friedman once said that
The true test of any scholar's work is not what his contemporaries say, but what happens to his work in the next 25 or 50 years. And the thing that I will really be proud of is if some of the work I have done is still cited in the textbooks long after I'm gone.
It seems certain that Milton Friedman will not only be cited in the textbooks but that men and women who value freedom everywhere in the world will recognize in him one of its prophetic voices. He clearly identified the intrinsic link between freedom of speech, religious freedom, the freedom to govern oneself - and economic freedom that, as he often pointed out, is simply democracy applied to the marketplace.
Honor and integrity used to be important in American society. This writer, as a student at the College of William and Mary, signed the school's Honor Code, which declared that anyone who stole or cheated would be immediately removed from the college. This was the first Honor Code at an American college. It reflected the values of that society. Professors left the room when students took exams, and dormitory rooms were often left unlocked. Honor was more valued than anything that might be gained from dishonor.
Now, our society seems to have embraced a different standard of value, or non-value. Consider just a few recent developments.
* Seventy-one students at New York's elite Stuyvesant High School were involved in cheating on the state's Regents examinations in Spanish, U.S. History, English, and Physics. Stuyvesant admits just the top tier of 8th graders. The students involved in cheating were not expelled - and were not even given a failing mark on the exam. Instead, they remain enrolled in the school and will be able to retake the exam. Commenting on this, Frank W. Abagnale, the subject of the book, movie, and Broadway musical "Catch Me If You Can," declared: "We do not teach ethics at home, and we do not teach ethics in school because the teacher would be accused of teaching morality. In most cases, we do not teach ethics in college or even instill ethics in the workplace."
* A report issued in mid-July by former FBI Director Louis Freeh, after an eight-month investigation, concluded that four of Penn State University's most powerful leaders, including head football coach Joe Paterno and the school's president, covered up allegations of sexual abuse by an assistant coach because they were concerned about negative publicity. Confronted with reports that Jerry Sandusky lured boys to the State College campus where he sexually abused them, Penn State's leadership deferred to "a culture of reverence for the football program" and repeatedly "concealed Sandusky's activities from authorities." Freeh said "Our most saddening and sobering finding is the total disregard for the safety and welfare of Sandusky's child victims by the most senior leaders at Penn State."
* Congressional ethics, we know, is an oxymoron. Recently, a report was issued by Rep. Darrell Issa (R-CA), chairman of the House Committee on Oversight and Government Reform, about how a group of lawmakers and their staff benefited from a "VIP" loan program not available to the public that waived fees, cut interest rates, and eased borrowing standards. Countrywide Financial offered the special loans in an effort to dissuade lawmakers from voting for stiffer banking regulations. The report names names, with many lawmakers still in Congress, but it did not include a letter from Issa calling for the ethics panel to investigate the matter. Without the letter, the ethics panel is not required to do a thing.
* In Washington, D.C., there are calls for the resignation of Mayor Vincent Gray. He has refused to answer questions about whether or not he knew, before or during the 2010 Democratic mayoral primary, about a secret, well-funded, and illegal "shadow campaign" on his behalf. More than $653,000 was unlawfully used to purchase materials and hire workers to secure his victory over then-Mayor Adrian Fenty two years ago, money allegedly supplied by a prominent businessman with significant contractual interests with the government. Mayor Gray's campaign slogan was "character, integrity, leadership." Three members of the D.C. Council - and a host of others in the city - have called for the mayor to resign.
Many books have been written about financial misdeeds on Wall Street, about child abuse and cover-ups within the Roman Catholic Church, and among the Orthodox Jewish community in New York. While it may be true that bad news is news while good news is not, the bad news is increasingly widespread.
Our crisis in values has been building for some time. The May-June 1988 issue of Harvard Magazine published an eleven-page essay, "Ethics, the University, and Society," by President Derek Bok. He declares that
The American nation is greatly in need of some means to civilize new generations of the people, preparing them to serve as honest, benevolent, productive citizens of a free society, and all of Harvard's deliberations and studies and initiatives and earnest concerns have not resulted in any effective means of Character Education.
Derek Bok concludes:
Despite the importance of moral development to the individual and the society, one cannot say that higher education has demonstrated a deep concern for the problem. . . . The subject is not treated as a serious responsibility worthy of sustained discussion and determined action. . . . If this situation is to change there is no doubt where the initiative must lie. Universities will never do much to encourage a genuine concern for ethical issues or to help their students to acquire a strong and carefully considered set of moral values until presidents and deans take the lead.
Needless to say, things have deteriorated a great deal since then. It is not only the values of average Americans that appear to be in free fall; those of our elites may be leading the way. Who in Washington or on Wall Street - or at Penn State - is held responsible for what they do?
New York Times columnist David Brooks laments the decline of today's elites. In the past, he writes, elites
. . . had a stewardship mentality, that they were temporary caretakers of institutions that would span generations. They cruelly ostracized people who did not live up to their codes of gentlemanly conduct and scrupulosity. They were insular and struggled with intimacy, but they did believe in restraint, reticence, and service.
Today's elite, in Brooks' view,
. . . is more talented and open but lacks a self-conscious leadership code. The language of meritocracy (how to succeed) has eclipsed the language of morality (how to be virtuous). Wall Street firms, for example, now hire on the basis of youth and brains, not experience and character. Most of their problems can be traced to this. If you read the e-mails from the Libor scandal you get the same sensation from reading the e-mails in so many recent scandals: these people are brats; they have no sense that they are guardians for an institution the world depends on; they have no consciousness of their larger social role.
How to reverse our moral decline is not a subject that is being widely discussed in our contemporary society. It should be. If it is not addressed, all of us - and our children and grandchildren - will be losers.
Government spending, everyone realizes, is out of control. During President George W. Bush's tenure from 2001 through 2009, the national debt doubled. According to Bruce Riedl of the Heritage Foundation, the prescription drug bill alone is projected to add nearly $400 billion in its first decade.
Both parties are clearly in this together. According to FactCheck.org:
Spending under President Obama remains at a level that is quite high by historical standards. Measured as a percentage of the nation's economic production, it reached the highest level since World War II in fiscal 2009.
Since 2009, the Obama administration has maintained trillion-dollar deficits. Writing in Roll Call, Dustin Siggins and David Weinberger report that
If we average spending as a percentage of GDP under Bush from 2001 to 2009, it comes to just over 20 percent. . . . If we do the same for Obama from 2010 to 2012, we get about 24 percent, quite a bit higher than the historical average.
One place to cut spending is to eliminate Depression-era farm subsidies. But because each farm state has two senators - some Democrats, some Republicans - eliminating them has been difficult. Now, in our era of economic decline and skyrocketing government debt, it is time to take a serious look at this wasteful program.
In 2012, the Department of Agriculture is projected to spend $22 billion on subsidy programs for farmers. Veronique de Rugy of the Mercatus Center at George Mason University notes that this program was introduced in the 1930s to help struggling small family farms but
. . . the subsidies have become the poster child for government welfare for the affluent. Farm households have higher incomes, on average, than do non-farm U.S. households. Figures from the USDA show that in 2010 the mean farm household income was $84,400, up 9.4 percent from 2009. This is 25 percent higher than the mean U.S. household income of $67,350 as reported by the U.S. Census Bureau for 2010.
Beyond this, farm subsidies tend to flow toward the largest and wealthiest farm businesses. According to the Environmental Working Group database, in 2010, 10 percent of farms received 74 percent of all subsidies. These recipients are large commercial farms with more than $250,000 in sales that mostly produce crops tied to political interests. The Cato Institute's Tad DeHaven and Chris Edwards calculate that more than 90 percent of all farm subsidies go to farmers of just five crops - corn, wheat, soybeans, rice, and cotton. For every federal dollar spent on farm subsidies, 19 cents goes to small farms, 19 cents goes to intermediate (middle income) farms, and 62 cents to the largest commercial farms.
In De Rugy's view
The tragedy is that while cronyism benefits the haves, all other Americans - especially those with lower incomes - suffer from the resulting distortions. Take the domestic sugar industry as an example. The government protects its producers against foreign competitors by imposing U.S. import quotas, and against low prices generally with a no-recourse loan program that serves as an effective price floor. As a result . . . U.S. consumers and businesses had to pay twice the world price of sugar on average since 1982.
According to the Center for Responsive Politics, the U.S. farm lobby contributes millions of dollars to political campaigns to maintain federal support for the subsidies. The agribusiness sector as a whole spent $124 million on lobbying in 2011. Over the past decade, the amount of money this sector has spent on lobbying has grown more than 69 percent. Between 1995 and 2009, 23 lawmakers currently serving in Congress signed up for farm subsidies.
The fact is that the U.S. has the richest, most productive agricultural sector, and the best fed population in the world. Boosted by $136.3 billion in gross sales to other countries, U.S. net farm income hit a record $98.1 billion in 2011. A new Economist Intelligence Unit report ranks the U.S. as the most "food secure" nation in the world, based on the affordability and quality of its food supply. The U.S. provides the equivalent of 3,748 calories per day for each of its roughly 314 million people. That is nearly 1,500 calories more than the minimum necessary for a healthy life.
Still, every five years Congress drafts a farm bill as if U.S. agriculture cannot possibly exist in a real free market economy. At this very moment, farm-state lawmakers and the lobbyists who swarm around them are preparing to extend this program of subsidies. The Senate has already passed a measure priced at $969 billion over the next decade. The House has gone on summer vacation without acting, as Republicans weigh the election-year political risks of proceeding with that chamber's own near-trillion-dollar measure.
Editorially, The Washington Times points out that
Like the bank bailouts and TARP, the farm bill illustrates the capture of the legislative process by special interests. The last farm bill in 2008 was the focus of $173.5 million in lobbying expenditure. . . . This is all money spent on what the Mercatus Center's Matthew Mitchell calls "unproductive entrepreneurship" where people are organizing and expending their talent to become rent seekers, and the end result is wealth redistribution, not wealth creation. Real entrepreneurship innovates in ways that are socially useful. Cronyism diverts resources . . . into a system that rewards privileges to favored groups. In the case of the 2008 farm bill, recipients of subsidies of $30,000 or more had an average household income of $210,000.
Many in Congress who speak of their belief in the free market and who decry huge government deficits, nevertheless seem ready to extend farm subsidies into the future. This tells us, unfortunately, that what we are witnessing is politics as usual. And both parties are in it together. No one needs to wonder why we can't bring government spending under control. This is why. *
The growth of presidential power in recent years represents a serious threat to representative government. The idea of the executive "executing" the laws passed by the elected representatives of the people in the Congress seems to those in power - whether Republicans or Democrats - to be an old-fashioned notion.
When President Obama unilaterally called a halt to deportation proceedings for certain unauthorized immigrants who came to the U.S. as minors, the eligibility requirements roughly tracked the requirements of the Dream Act, which was never passed by Congress.
In an interview with a panel of Latino journalists last fall, the president said:
This notion that somehow I can just change the laws unilaterally is just not true. We live in a democracy. You have to pass bills through the legislature and then I can sign it.
Gene Healy, vice president of the Cato Institute, notes that
As it happens, Obama's "royal dispensation" for young immigrants is hardly the most terrifying instance of administration unilateralism. In fact, as a policy matter, it's a humane and judicious use of prosecutorial resources. But given the context, it stinks. It looks uncomfortably like implementing parts of a bill that didn't pass and - carried out as it was with great fanfare and an eye to the impending election - the move sits uneasily with the president's constitutional responsibility to "take care that the laws be faithfully executed."
Or consider the president's claim of "executive privilege" in withholding information about the Justice Department's Operation Fast and Furious, which deliberately put assault weapons in the hands of Mexican drug cartels as part of a sting, and then lost track of hundreds of them. A Border Patrol agent was killed in 2010, apparently by one of these guns.
Executive privilege, affirmed by the Supreme Court in U.S. v. Nixon, is historically limited to the president's own discussions. President Obama has now extended it to his attorney general. This contravenes the president's promises of transparency.
Recent legislation has given the president the legal power to detain a person indefinitely on suspicion of affiliation with terrorist organizations or "associated forces" - a broad, vague authority that can be abused without real oversight from the courts or the Congress.
At the same time, American citizens can now be targeted for assassination or indefinite detention. Recent laws have also canceled the restraints in the Foreign Intelligence Surveillance Act of 1978 to allow unprecedented violations of our right to privacy through warrantless wiretapping and government mining of our electronic communications.
According to The New York Times, President Obama is personally deciding upon every drone strike in Yemen and Somalia and the riskiest ones in Pakistan, assisted only by his own aides. It is reported that suspects are now being killed in Yemen without anyone knowing their names, using criteria that have not been made public.
Editorially, The Times declares that no president
. . . should be able to unilaterally order the killing of American citizens or foreigners located far from a battlefield - depriving Americans of their due process rights - without the consent of someone outside his political inner circle. How can the world know whether this president or a successor truly pursued all methods short of assassination, or instead - to avoid a political charge of weakness - built up a tough-sounding list of kills?
To permit President Obama - or any president - to execute American citizens without judicial review and outside the theater of war gives him the power of judge, jury, and executioner without any check or balance. This is clearly an abuse of presidential power.
For many years, under both parties, the power of the executive has been growing. In his classic work, The American Presidency, written in 1956, Professor Clinton Rossiter writes:
The presidency has the same general outlines as that of 1789, but the whole picture is a hundred times magnified. The president is all the things he was intended to be, and he is several other things as well. . . . The presidency today is distinctly more powerful. It cuts deeply into the power of Congress. In fact it has quite reversed the expectations of the framers by becoming itself a vortex into which these powers have been drawn in massive amounts. It cuts deeply into the lives of the people; in fact, it commands authority over their comings and goings that Hamilton himself might tremble to behold. The outstanding feature of American constitutional development is the growth of the power and prestige of the Presidency.
He also makes the converse explicit:
The long decline of Congress has contributed greatly to the rise of the presidency. The framers . . . expected Congress to be the focus of our system of government.
That was 1956. The power of the presidency has steadily expanded since then, no matter which party was in power.
When Republican presidents have expanded the power of the presidency, Republicans in the Congress have acquiesced. When Democratic presidents expanded the power of the executive, Democrats in the Congress have embraced that expansion. The result is an executive branch increasingly unaccountable to the elected representatives of the people. That is not the system the authors of the Constitution had in mind. We would do well to return to the constitutional philosophy of checks and balances, and a division of powers. An all-powerful executive is a threat to freedom and accountability, as the Framers of the Constitution understood very well as a result of their own experience and of the experience of the world.
Government spending and government debt have been skyrocketing. Under President George W. Bush, the debt reached unprecedented levels. Under President Barack Obama, it has exploded still further. Whichever party is in power, government gets larger and government debt increases.
Our political system, sadly, rewards big spenders. Every organized special interest group in American society - the farmers, the teachers, the labor unions, manufacturers, Wall Street financiers, realtors, etc. - wants one or another form of government subsidization for itself.
They all have active political action committees, which promise rewards for those who open the government coffers to them, and penalties for those who do not. The incentive is clearly one-sided. As Democrats used to say in the New Deal days, the way to electoral success is clear: "Spend and spend, tax and tax, elect and elect." Now, Republicans too have learned this lesson. Since neither Republicans nor Democrats are too eager to antagonize voters by raising taxes to pay for their extravagant spending, the budget deficits grow each year.
In May, for example, President Obama reauthorized the Export-Import Bank, raising its lending authority 40 percent to $140 billion by 2014, one day before the 78-year-old federal bank would have been shut down had he not signed the bill. During the 2008 presidential campaign, Mr. Obama called the bank, "little more than a fund for corporate welfare."
Despite President Obama's frequent criticism of corporate jets, the bill includes $1 billion in subsidies for jet manufacturers, which have experienced a steep decline in demand in recent years. Export-Import Bank supporters in the business community - who speak of "free markets" but campaign vigorously for government subsidies - welcomed the President's support. John Murphy, vice president for international affairs at the U.S. Chamber of Commerce, said that the President's action was "Great news for thousands of American workers and businesses of all sizes." The National Association of Manufacturers - and both Republicans and Democrats in Congress - supported the reauthorization of the Export-Import Bank.
Tim Phillips, president of Americans for Prosperity, described the Bank in these terms:
In (its) nearly 80 years, the official credit export agency of the United States has financed over $450 billion in purchases. Ex-Im allows the federal government to pick winners and losers in the market, and all too often, that leads to back room deals and government cronyism. . . . It is a heinous practice that gives money to a small number of politically connected companies while leaving taxpayers with the risk. . . . The American taxpayer does not exist in order to keep businesses from failing.
Republicans and Democrats are co-conspirators in this enterprise. The incentive structure for both parties is precisely the same. Republicans may talk of the "free market" and argue that Democrats are against it, but both parties raise their funds on Wall Street and in corporate boardrooms, and both parties have supported bailouts of failed businesses and subsidies for others.
Voters say that they are against big government, and oppose deficit spending, but when it comes to their own particular share, they act in a different manner entirely. This is nothing new. Longtime Minnesota Republican Congressman Walter Judd once recalled that a Republican businessman from his district
. . . who normally decried deficit spending berated me for voting against a bill which would have brought several million federal dollars into our city. My answer was, "Where do you think federal funds for Minneapolis come from? People in St. Paul?". . . My years in public life have taught me that politicians and citizens alike invariably claim that government spending should be restrained - except where the restraints cut off federal dollars flowing into their cities, their businesses, or their pocketbooks.
If each group curbed its demands upon government, it would be easy to restore health to our economy. Human nature, however, leads to the unfortunate situation in which, under representative government, people have learned that through political pressure they can vote funds for themselves that have, in fact, been earned by the hard work of others.
This point was made 200 years ago by the British historian Alexander Tytler:
A democracy cannot exist as a permanent form of government. It can only exist until the voters discover they can vote themselves largess out of the public treasury. From that moment on, the majority always votes for the candidates promising the most benefits from the public treasury - with the result that democracy collapses over a loose fiscal policy, always to be followed by a dictatorship.
Hopefully, we can avoid fulfilling this prediction. It is an illusion to think that such a thing as "government money" exists. The only money which government has is what it first takes from citizens. Many years ago, Senator William Proxmire (D-Wisconsin) pointed out that no one ever petitions members of Congress to "leave us alone." Everyone who comes before Congress, he lamented, wants something. Members of Congress - of both parties - have the same incentive: to give each group what it wants to ensure its support for the future. The result is that government spending - and government debt - steadily grows.
Unless we find a way to change this incentive structure, it seems unlikely that we will bring government spending - and government deficits - under control. As the presidential campaign gets under way, neither party is addressing this crucial question. Politics as usual, unfortunately, will not help us to resolve the very real problems we face.
In June, the effort by labor unions and others to recall Wisconsin Governor Scott Walker was soundly defeated. Governor Walker had not committed any crime or other indiscretion. He was being recalled only because he had implemented the policies he promised during his campaign.
In February 2011, Walker announced his plan to limit the subjects covered by collective bargaining for public employees, compel them to contribute more to their healthcare and pension plans, stop government from collecting dues automatically on unions' behalf, and require public employee unions to hold annual certification elections. Once in office, he implemented these policies.
Wisconsin's recall policy is questionable and, in many ways, a threat to representative democracy. Voters already can remove officeholders at the conclusion of their terms for policy disagreements. Wisconsin has had 14 elected state government officials involved in recall elections during the past year alone. The state's largest newspaper, the Milwaukee Journal Sentinel, endorsed Governor Walker, arguing that elected officials should not be recalled simply for policy differences. Some 60 percent of voters in exit polls agreed.
Beyond this, the union effort in Wisconsin has focused a much-needed spotlight on the excesses of public pensions. Over the years, politicians have given in to union demands for higher pensions - rather than wage increases - because they knew that such pensions would be paid many years later, under someone else's watch. Now, these bills are coming due - and many states and cities are in no position to pay them.
In New Jersey, Governor Chris Christie is seeking to reform his state's sick-pay policies. Current law allows public workers to accumulate unused sick pay, which they can cash in upon retirement. "They call them 'boat checks,'" Christie says.
Know the reason they call them boat checks? It is the check they use to buy their boat when they retire - literally.
He tells the story of the town of Parsippany, where four police officers retired at one time and were owed a collective $900,000 in unused sick pay. The municipality didn't have the money and had to float a bond in order to make the payments.
Governor Christie wants to end sick-pay accumulation. "If you're sick, take your sick day," he says. "If you don't take your sick day, know what your reward is? You weren't sick - that was the reward." Newsweek notes that,
It was by the force of such arguments that Christie was able to overcome the powerful teachers' union, force educators to help pay for their healthcare costs, and win broad rollbacks in benefits for the state's huge public workforce.
Shortly after the vote in Wisconsin, there were landslide victories in San Jose and San Diego, California, of ballot measures meant to cut back public sector retirees' benefits. Warren Buffett calls the costs of public-sector retirees a "time bomb." They are, he believes, the single biggest threat to our fiscal health.
In California, total pension liabilities - the money the state is legally required to pay its public-sector retirees - are 30 times its annual budget deficit. Annual pension benefits rose by 2,000 percent from 1999 to 2009. In Illinois, pension payments are already 15 percent of general revenue and growing. Ohio's pension liabilities are now 35 percent of the state's GDP.
Commentator Fareed Zakaria notes that,
The accounting at the heart of the government pension plans is fraudulent, so much so that it should be illegal. Here's how it works. For a plan to be deemed solvent, employees and the government must finance it with regular monthly contributions as determined by assumptions about investment returns of the plan. The better the investment returns the less the state has to put in. So states everywhere made magical assumptions about investment returns.
David Crane, an economic adviser to former California Governor Arnold Schwarzenegger, points out that state pension funds have assumed that the stock market will grow 40 percent faster in the 21st century than it did in the 20th. While the market has grown 175 times during the past 100 years, state governments are now assuming that it will grow 1,750 times its size over the next 100 years.
Inflated retirement benefits are the reason for dramatic cuts in spending by state and local government for anything else. Last year, California spent $32 billion on employee pay and benefits, up 65 percent over the past 10 years. In that same time period, spending on higher education is down 5 percent. Three-quarters of San Jose's discretionary spending goes to its public safety workers alone. The city has closed libraries, cut back on park services, laid off many civil servants and asked the rest to take pay cuts. By 2014, San Jose, the 10th largest city in the country, will be served by 1,600 public workers, one-third the number it had 25 years ago.
The Pew Center on the States has quantified the problem. In 2008, the states had set aside $2.35 trillion to pay for public employees' retirement benefits while owing $3.35 trillion in promises. A year later, the trillion-dollar gap had grown by 26 percent. The massive expanding obligation cuts into the provision of government services. Former Los Angeles Mayor Richard Riordan notes that:
A lot of things are going to happen dramatically over the next couple of years and then people will listen. If you close down all the parks and all the libraries, this is political dynamite.
In Wisconsin, as a result of Governor Walker's reforms, the state has balanced its two-year budget without tax increases and local school districts have used their new bargaining power to save money without layoffs or significant increases in class size. While leading Democrats, such as President Obama and former President Clinton, supported the recall effort in Wisconsin, many others, such as the liberal Democratic mayor of San Jose, recognize that it is time to bring the excesses of public sector unions under control. Editorially, The Washington Post declared,
. . . those who voted for Mr. Walker to show approval for his policies, and not just disapproval for the recall itself, had plausible reasons for doing so. . . . Public employee union leaders are pledging to fight . . . new laws in court. . . . They would do better to engage governments in a good-faith effort to restructure and preserve public services for the long term. States and localities face genuine problems, and the unions share responsibility for them.
In recent days, with the extraordinary publicity surrounding the Trayvon Martin case in Florida and an escalation in overheated racial rhetoric, one would think that the real problems facing black Americans are a result of "white racism."
Needless to say, this overlooks the fact that race relations in America have dramatically improved in recent years and that we are well on our way to achieving a genuinely color-blind society.
Writing in The Washington Post, columnist Richard Cohen points out that most Americans
. . . do not know what a miracle has been pulled off - how a nation that once contained so much bigotry now contains so little. I am not a fool on these matters, I think, and I recognize . . . the residue of bigotry, but still the big picture is that Obama is a black man and is the president of the United States. Mamma, can you believe it?
Cohen provides this assessment:
Some insist that not much has changed. They cite a persistent racism. There are many such examples - not all that many, actually - but they are newsworthy because they are exceptions to the rule, not what we expect.
Recently, Wesley A. Brown, the first African American to graduate from the U.S. Naval Academy, died. He was the sixth black man admitted and the only one to successfully endure the racist hazing that had forced others to quit. He graduated in 1949. Cohen writes that,
When I read the obituary on Wesley A. Brown, I was shocked once again at the depth and meanness of our racism and just plain dumbstruck by how far we have come. The new field house at the Naval Academy is named for Brown. He called it, "The most beautiful building I've ever seen," but he was wrong. It's not a building. It's a monument.
This is not to say that the black community does not face many problems. These problems, however, are not a result of white racism but of internal forces at work within the black community. One such serious problem is crime.
Each year, roughly 7,000 blacks are murdered; 94 percent of the time, the murderer is another black person. According to the Bureau of Justice Statistics, between 1976 and 2011, there were 279,384 black murder victims. The 94 percent figure suggests that 262,621 were murdered by other blacks.
Though blacks are 13 percent of the national population, they account for more than 50 percent of homicide victims. Nationally, the black homicide victimization rate is six times that of whites, and in some cities, it is 32 times that of whites. Blacks are also disproportionately victimized by violent personal crimes, such as assault and robbery.
Economist Walter Williams points out that,
The magnitude of this tragic mayhem can be viewed in another light. According to a Tuskegee Institute study, between the years 1882 and 1998, 3,446 blacks were lynched at the hands of whites. Black fatalities during the Korean War (3,075), Vietnam War (7,243) and all the wars since 1980 (8,107) come to 18,425, a number that pales in comparison with black loss of life at home. Tragically, young black males have a greater chance of reaching maturity on the battlefields of Iraq and Afghanistan than on the streets of Philadelphia, Chicago, Detroit, Oakland, Newark, and other cities.
Sadly, the question is hardly ever discussed by black leaders. In Williams' view,
A much larger issue is how might we interpret the deafening silence about the day-to-day murder in black communities compared with the national uproar over the killing of Trayvon Martin. Such a response by politicians, civil rights organizations, and the mainstream news media could easily be interpreted as blacks killing other blacks is of little concern, but it's unacceptable for a white to kill a black person.
Several black leaders have started to discuss black-on-black crime. When President Obama commented about the Martin case, William Fair, president of the Urban League of Greater Miami, said that "the outrage should be about us killing each other, about black-on-black crime." He asked rhetorically:
Wouldn't you think to have 41 people shot (in Chicago) between Friday morning and Monday morning would be much more newsworthy and deserve much more outrage?
Former NAACP leader Pastor C. L. Bryant said that the rallies organized by Al Sharpton and Jesse Jackson suggest there is an epidemic of "white men killing young black men," adding, "The epidemic is truly black-on-black crime. The greatest danger to the lives of young black men are young black men."
Beyond this, argues Walter Williams,
Not only is there silence about black-on-black crime, there's silence about black racist attacks on whites - for example, the recent attacks on two Virginian-Pilot newspaper reporters set upon and beaten by a mob of young blacks (in Norfolk, Virginia). The story wasn't even covered by their own newspaper. In March, a black mob assaulted and knocked unconscious, disrobed and robbed a white tourist in downtown Baltimore. Black mobs have roamed the streets of Denver, Chicago, Philadelphia, New York, Cleveland, Washington, Los Angeles and other cities, making unprovoked attacks on whites and running off with their belongings.
This is not a new story. This writer was the author of a book (with Lincoln Review editor J. A. Parker) in 1974 entitled What The Negro Can Do About Crime (Arlington House). There was an extensive discussion of black-on-black crime and the manner in which black leaders refused to confront it.
On Page 54 is the following passage:
Criticizing those Negroes who have not spoken out against crime, Roy Wilkins, executive director of the NAACP, declared that, "except for a few voices, Negro citizens have given consent to robbery, muggings, assaults, and murder by their silence. They have been intimidated by a curious twisting of the 'us blacks together' philosophy that holds that complaining of black criminals is somehow 'betraying the race.'" This is nonsense. One can be proud of being black without embracing every black mugger, rapist, and auto thief.
For those in the black community genuinely concerned about the future prospects of its young men and women, focusing upon the black-on-black crime wave that now engulfs our inner cities, and has broken out into attacks upon the community at large, is an important place to begin. Thus far, however, this has largely been ignored in place of repeated attacks upon "white racism," which, by any standard, has receded dramatically. Such racial demagoguery ill serves the very community in whose name it is launched. It is time for a radically different direction.
We are now in an era when we are told that a proper goal for society is for "everyone" to go to college. At the same time, there is a serious mismatch between the jobs now available and the number of individuals qualified to fill them. Manufacturing companies, for example, cannot find enough high-tech machinists, and they are subsidizing tuition at local community colleges in a desperate effort to fill vacancies.
The Cato Institute's Andrew Coulson reports that we spend - in real terms - almost twice as much per student in a public school as we did in 1970. Despite this, academic achievement has remained flat or worsened. Vocational training, a long and important path to gainful employment, has been pushed aside.
Vocational education once played an important part in our schools, designed for those who were not suited for, or had no interest in, higher education. About forty years ago, it began to fall out of fashion, in part because it became a civil rights issue. As Time recently noted:
Vocational education was seen as a form of segregation, a convenient dumping ground for minority kids in Northern cities.
Former New York City schools chancellor Joel Klein says that,
This was a real problem. And the vocational education programs were pretty awful. They weren't training the kids for specific jobs or for certified skills. It really was a waste of time and money.
In an important article, "Learning That Works," Time writer Joe Klein declares that,
Unfortunately, the education establishment's response to the voc-ed problem only made things worse. Over time, it morphed into the theology that every child should go to college (a four-year liberal arts college at that) and therefore every child should be required to pursue a college-prep course in high school. The results have been awful. High school dropout rates continue to be a national embarrassment, and most high school graduates are not prepared for the world of work. The unemployment rate for recent high school graduates who are not in school is a stratospheric 33 percent. The results for even those who go on to higher education are brutal: four-year colleges graduate only about 40 percent of the students who start them, and two-year community colleges graduate less than that, about 25 percent.
Diane Ravitch, a professor of education at New York University, says that,
College for everyone has become a matter of political correctness. But according to the Bureau of Labor Statistics, less than a quarter of new job openings will require a bachelor of arts degree. We're not training our students for the jobs that actually exist.
At the same time, the U.S. is beginning to run out of welders, glaziers, and auto mechanics - jobs that actually keep things running, and cannot be outsourced.
In Arizona and a few other states, things are beginning to change. Vocational education there is now called career and technical education (CTE) and now attracts about 27 percent of students. These students, it has been found, are more likely to score well on the state's aptitude tests, graduate from high school, and go on to higher education than those who do not participate.
"It's not rocket science," says Sally Downey, superintendent of the East Valley Institute of Technology in Mesa, Arizona, 98.5 percent of whose students graduate from high school. "It's just finding something they like and teaching it to them with rigor."
At the auto shop at East Valley, there are 40 late-model cars and the latest in diagnostic equipment, donated by Phoenix auto dealers, who are in need of trained technicians. "If you can master the computer science and electronic components," Downey says, "you can make over $100,000 a year as an auto mechanic."
Carolyn Warner, a former Arizona schools chancellor, says tech track students
. . . are more focused, so they're more likely to graduate from two- and four-year colleges. Those who graduate from high school with a certificate of technical expertise in a field like auto repair or welding are certainly more likely to find jobs.
At East Valley, there are 38 programs, with more coming. There are firefighter, police, and EMT programs; a state-of-the-art kitchen for culinary services training; a radio station; and instruction in welding (which can pay $40 per hour), aeronautics, marketing, and massage therapy. Almost all of these courses lead to professional certificates. In addition to earning high school diplomas, many of the students are trained by employers for needed technical specialties.
An interesting example of business participation in technical and vocational education can be seen in the case of a new public school in Brooklyn, New York, called P-Tech, or Pathways in Technology Early College High School. Started last September, it is a partnership of the New York City department of education, the New York City College of Technology, the City University of New York and IBM, whose head of corporate social responsibility, Stanley Litow, used to be the city's deputy schools chancellor.
The goal is to create a science and tech-heavy curriculum to prepare students - some of whom would be the first in their families to graduate from high school - for entry and mid-level jobs at top tech-oriented companies. Each student gets an IBM mentor and there is also a core curriculum focused on English, math, science, and technology.
P-Tech students will graduate with not only a high school diploma but an associate's degree as well. This is important, since 63 percent of American jobs will require postsecondary training by 2018. The U.S. economy will create more than 14 million new jobs over the next decade, but only for people with at least a community college degree. These jobs - positions like dental hygienist, medical laboratory technician, aircraft mechanic and entry level software engineer - will allow millions entry into the middle class. Many of them will require serious technology skills.
Harvard Business School professor Rosabeth Moss Kanter argues that as much as a third of the increase in unemployment in recent years can be attributed to a mismatch between skills and jobs. The gap is greatest in positions that require more than a high school diploma but less than a bachelor's degree. Companies feel that schools are simply not turning out graduates with the skills they need. That was an impetus for IBM's role with New York's P-Tech.
Chicago Mayor Rahm Emanuel is setting up five new STEM schools - the acronym stands for science, tech, engineering and math - in partnership with IBM, Microsoft, Verizon, Cisco and other companies.
Vocational education deserves a serious second look by school systems across the country. Training young men and women for jobs that actually exist in our economy - something our current educational system is not doing very well - is certainly worth doing, both for the sake of the young people involved and for the health of our larger society and economy. *
The facts in the case of the killing of Trayvon Martin in Sanford, Florida remain unclear. As the trial proceeds, such facts should be revealed.
By any standard, the shooting of an unarmed 17-year-old is a tragedy. Did Martin attack the alleged shooter, George Zimmerman, who claims he was defending himself? Was Zimmerman animated by racial animus? All of this, as we move forward, will, hopefully, become known.
What we have seen, however, is a rush to judgment, particularly by those who seem to have a vested interest of their own in painting a bleak picture of race relations in the United States.
In an interview with the Los Angeles Times, the Rev. Jesse Jackson explained that with the election of President Obama:
. . . there was this feeling that we were kind of beyond racism. . . . That's not true. This victory has triggered tremendous backlash. Blacks are under attack.
The New Black Panther Party (NBPP), involved in voter fraud in Philadelphia but never prosecuted, has offered a $10,000 bounty for the capture of George Zimmerman. The Orlando Sentinel asked NBPP spokesman Mikhail Muhammad whether the call for a bounty was incitement. The response: "An eye for an eye, a tooth for a tooth." In an interview with CNN's Anderson Cooper, Muhammad said that black people were not obliged to obey "the white man's law."
On Capitol Hill, Rep. Bobby Rush (D-IL), who is black, was ousted from the House floor for violating the chamber's dress code after attempting to deliver a statement while wearing a gray hoodie with the hood pulled over his head. Rush contended that the hoodie Trayvon Martin was wearing symbolized the "racial profiling" that led to his death. "Racial profiling has got to stop," Rush said. "Just because someone wears a hoodie does not make them a hoodlum."
Writing in The Washington Post, Reniqua Allen of the New America Foundation argues that the election of a black president has made it more difficult to talk about race in America. In her view:
The Obama presidency is "post-racial" only in the sense that it gives us an excuse not to grapple with race anymore . . . I have encountered many people who seem to believe . . . that Obama's win is proof that America has reached the mountaintop. What more is there to say about race, they ask me, after this country so proudly and overwhelmingly elected a black president? They cite success stories as disparate as Oprah Winfrey, Jay-Z, and former Time Warner chief Dick Parsons. . . . Even the most well-intentioned white people, who fundamentally understand the challenges of race in America, often can't understand why race, as a subject to wrestle with, can never be "over."
There is no doubt that racial problems did not disappear overnight with the election of a black president. Still, the evidence that race relations have been steadily improving is clear, and it is not helpful for various black spokesmen - from Jesse Jackson to Al Sharpton to Spike Lee - to use any incident, such as the one in Sanford, Florida, to proclaim publicly that nothing - or very little - has, in fact, changed.
Things in the Trayvon Martin case have clearly gotten out of hand. Marcus Davonne Higgins, a Los Angeles man, sent a tweet to several celebrities including what he thought was the street address of alleged shooter George Zimmerman. Film director Spike Lee didn't check but re-tweeted the incorrect address to the 250,000 people who follow him on Twitter.
Columnist Gregory Kane, who is black, reports that:
Suddenly the Sanford home of Elaine McClain, 70, and her 72-year-old husband, David McClain, started receiving hate mail and threats. George M. Zimmerman does not and has never lived at the address that Lee and others published on Twitter. But William George Zimmerman, Elaine McClain's son from a previous marriage, lived there at one time. Higgins had tweeted the wrong address. . . . Lee, an African-American who's always trying to prove how black he is, and how down with the brothers he is, probably couldn't resist what must have come naturally to him. He decided to retweet the address, the better to make a statement about the Martin shooting. The McClains had to move from their home to a hotel. . . .
Despite the demagoguery we have seen in the wake of this incident in Florida, there are abundant signs that America is really moving in the direction of becoming a color-blind society. According to a study of U.S. Census data released late in January, residential segregation has been dramatically curtailed. The study of census results from thousands of neighborhoods, conducted by the Manhattan Institute, found that the nation's cities are more racially integrated than at any time since 1910. It was found that all-white enclaves "are effectively extinct."
"There is now much more black-white neighborhood integration than 40 years ago," said Professor Reynolds Farley of the University of Michigan's Population Studies Center. "Those of us who worked on segregation in the 1960s never anticipated such decline."
At the same time, interracial marriages in the U.S. have climbed to 4.8 million - a record 1 in 12. A Pew Research Center study, released in February, details an America where interracial unions and mixed-race children are challenging typical notions of race.
"The rise in interracial marriage indicates that race relations have improved over the past quarter-century," said Daniel Lichter, a sociology professor at Cornell University.
Mixed-race children have blurred America's color line. They often interact with others on either side of the racial divide, and frequently serve as brokers between friends and family members of different racial backgrounds.
Black Americans are optimistic about the future. A 2011 Washington Post-Kaiser-Harvard survey found that, in the midst of our economic downturn, 60 percent of blacks said they believed their children's standard of living would be better than their own, while only 36 percent of whites held this view. On the eve of President Obama's inauguration, 69 percent of black respondents told CNN pollsters that Martin Luther King's vision had been "fulfilled."
From 2002 to 2007, the number of black-owned businesses grew by 60.5 percent to 1.9 million, more than triple the national rate of 18 percent, according to the Census Bureau. Black Americans hold positions of responsibility in every aspect of our society - from President, to Governor, to Attorney General, to Supreme Court justice. We have, in recent years, had two black Secretaries of State. There is no position in our society to which black Americans cannot aspire.
Whatever facts finally emerge in the Trayvon Martin case, we must reject those racial demagogues who seek every opportunity to deny racial progress and to promote themselves as leaders of an embattled and isolated minority. Our society has made dramatic progress. Certainly, there is more progress to be made in the future. But no incident - such as the one in Florida - should be used as a means to deny that progress and paint a dark - and untrue - picture of a society that has moved dramatically to overcome the racial barriers of the past. Those who engage in such tactics are not friends of the black community but may, in the end, be doing it as much harm as genuine racists.
At the present time, many are speaking of launching a pre-emptive strike against Iran. Washington Post columnist Dana Milbank writes that:
It's beginning to feel a lot like 2003 in the capital. Nine years ago . . . there was a similar feeling of inevitability - that despite President George W. Bush's frequent insistence that "war is my last choice," war in Iraq was coming.
In the case of Iraq, one of the key reasons given for launching our attack was that Saddam Hussein was in possession of weapons of mass destruction. This, of course, turned out not to be the case. Once again, some are urging an attack upon Iran because of that country's nuclear program. Our experience in Iraq should give us pause. Not only did we go to war with a country that had not attacked us, had no weapons of mass destruction, and no connection with the terrorists who were responsible for 9/11, but we did serious damage to our economy and lost untold numbers of American lives in an effort which now seems difficult to explain and understand.
Everyone agrees that Iran does not currently have nuclear weapons. U.S. intelligence believes that Iran is several years from achieving a nuclear capacity, and it remains unclear whether Iranian leaders have made a decision to move in that direction. Those who have studied this region are most critical of those who call for war.
Gary Sick, who was the principal White House aide for Iran during that country's Islamic revolution, does not envision a situation in which Iran decides to break out and build a bomb, unless it is first attacked. Actually crossing the nuclear threshold would be "inviting an attack," Sick said, and would not be in Tehran's interest. But if Iran doesn't build a bomb, its demonstrated capability to do so, Sick explains, will make it a member of a small club of nations, such as Japan, Brazil and Sweden, that can acquire a nuclear weapon if they break away from the Non-Proliferation Treaty. In either case, Iran's goal is to assert its position as a major player in the region, one that the world should take seriously and with which it should consult.
The International Atomic Energy Agency (IAEA) has documented that Iran is putting all the pieces in place to have the option to develop nuclear weapons at some point. If Supreme Leader Ayatollah Ali Khamenei decides to produce a bomb, Iran is believed to have the technical capability to produce a testable nuclear device in a year or so and a missile-capable device in several years. But as Director of National Intelligence James Clapper told the Senate Armed Services Committee on February 16, it does not appear that Khamenei has made this decision.
Colin Kahl, an associate professor at Georgetown University's School of Foreign Service, who was deputy assistant secretary of defense for the Middle East from 2009 to 2011, argues:
Khamenei is unlikely to dash for a bomb in the near future because IAEA inspectors would probably detect Iranian efforts to divert low-enriched uranium and enrich it to weapons-grade level at declared facilities. Such brazen acts would trigger a draconian international response. Until Iran can pursue such efforts more quickly or in secret - which could be years from now - Khamenei is unlikely to act.
A full page ad in The Washington Post was headlined: "Mr. President: Say No to War of Choice With Iran." The signatories included General Joseph Hoar (USMC, Ret.), Brigadier General John H. Johns (USA, Ret.), Major General Paul Eaton (USA, Ret.), Tom Fingar, former Deputy Director of National Intelligence for Analysis, and Paul Pillar, former National Intelligence Officer for the Near East and South Asia. They declare:
The U.S. military is the most formidable military force on earth. But not every challenge has a military solution. Unless we, or an ally, are attacked, arms should be the option of last resort. Our brave servicemen and women expect you to exhaust all diplomatic and peaceful options before you send them into harm's way. Preventing a nuclear-armed Iran is rightfully your priority and your red line. Fortunately, diplomacy has not been exhausted and peaceful solutions are still possible. Military action is not only unnecessary, it is dangerous - for the United States and for Israel. We urge you to resist the pressure for a war of choice with Iran.
In Israel, opinion is sharply divided over the question of pre-emptive war. Many respected Israelis believe that a pre-emptive attack against Iran would be a serious mistake for Israel and would do it serious long-term harm. Political scientist Yehezkel Dror, an Israel Prize winner and founding president of the Jewish People Policy Institute, says that with regard to Iran, Israel needs to rely on "ultimate deterrence," that an attack on Tehran's nuclear facilities would be counterproductive, and that the real danger Israel faces is from a gradual wearing away of its staying power.
"Assuming you attack, then what?" he says.
In five years, they will recuperate with absolute determination to revenge. The idea that an Israeli attack will make Iran into a peace-loving country is not on my horizon. I don't know anything like this in history. I know the opposite from history. . . . Iran has a very low probability of being a suicidal state. They have a long culture, a long history, and they are much more involved in the Shia-Sunni conflict than the Israeli side issue. I think no one has any doubt that if Israel's future existence is in danger it will use mass killing weapons.
The Jerusalem Report notes that:
Three men once most closely involved in Israeli efforts to stop Iran - former Mossad chiefs Meir Dagan (2002-2011), Efraim Halevy (1998-2002) and Danny Yatom (1996-1998) - all see a lone Israeli military attack as a last resort, to be avoided if at all possible.
Speaking at the Hebrew University last May, Dagan derided an Israeli strike as "a stupid idea" that might not achieve its goals. It could lead to a long war and, worse, it could give Iranian leaders justification to build a nuclear weapon. In Dagan's view, precipitate Israeli action could break up the current anti-Iranian consensus, leading to less pressure on Iran, not more.
According to The Jerusalem Report:
Dagan holds that there is still time; last year he estimated that Iran would not have a nuclear weapon before 2015. . . . Efraim Halevy says Israel should recognize that it is a regional power and act like one. He says the country is too strong to be destroyed and the Israeli people should not have existential fears about Iran or anything else. . . . Israel's strategy should be to work with its allies to convince the Iranian regime to change course without force coming into play. In Halevy's view, this is achievable since the Iranian regime is dedicated primarily to its own survival and will likely back down if it feels threatened by even more crippling sanctions. Israel should be using its international connections to ratchet up pressure on the Iranian regime, while preparing a military option if, and only if, all else fails.
It seems clear that if Iran were ever to develop and use a nuclear weapon there would be massive retaliation, endangering the country's entire population. During the Cold War, the Soviet Union was armed to the teeth with nuclear weapons, and never used them precisely because of a fear of retaliation, what became known as Mutual Assured Destruction. Iran would have to be suicidal to even think of using such a weapon.
The available evidence is that Iran is not suicidal. General Martin Dempsey recently explained that he viewed Iran as a "rational actor." Although some protested this characterization, Time's Fareed Zakaria points out:
Dempsey was making a good point. A rational actor is not necessarily a reasonable actor or one who has the same goals or values that you or I do. A rational actor is someone who is concerned about his survival.
Compared with radical revolutionary regimes like Mao's China - which spoke of sacrificing half of China's population in a nuclear war to promote global Communism - the Iranian regime has been rational and calculating in its actions.
In an essay in the Washington Monthly, former senior U.S. intelligence official Paul Pillar writes:
More than three decades of history demonstrate that the Islamic Republic's rulers, like most rulers elsewhere, are overwhelmingly concerned with preserving their regime and their power - in this life, not some future one.
For the most powerful country in the world even to contemplate a pre-emptive war against a country that has not attacked us, has no nuclear weapons, and may not even have decided to pursue them would itself be an irrational act. It is time for a serious debate - and serious discussion of the consequences of any such action. And if the time comes when the U.S. decides that an attack on Iran does make sense, it should take the form of a declaration of war by the U.S. Congress, as called for in our Constitution.
Respect for private property is an essential element of a free society. In his Discourse on Political Economy, Rousseau writes that:
It should be remembered that the foundation of the social contract is property; and its first condition, that every one should be maintained in the peaceful possession of what belongs to him.
In The Prince, Machiavelli notes that, "When neither their property nor their liberty is touched, the majority of men live content."
In our own society, there have been increasing efforts to limit the rights of property owners. Fortunately, efforts are now growing to reverse such trends.
In March, the Supreme Court ruled that an Idaho couple facing ruinous fines for attempting to build a home on private property that the federal government considered protected wetlands may challenge an order from the Environmental Protection Agency.
This case was considered the most significant property rights case on the court's docket this year, with the potential to change the balance of power between landowners and the EPA in disputes over land use, development, and the enforcement of environmental regulations.
Critics called the EPA action an example of overreach, as the property in question was a small vacant lot in the middle of an established residential subdivision. The government argued that allowing EPA compliance orders to be challenged in court could severely delay actions needed to prevent imminent ecological disasters.
Justice Antonin Scalia, writing for a unanimous court, said that Michael and Chantell Sackett are entitled to appeal the EPA order, rejecting the agency's argument that allowing landowners timely challenges to its decisions would undermine its ability to protect sensitive wetlands.
In the decision, Justice Scalia wrote:
The law's presumption of judicial review is a repudiation of the principle that efficiency of regulation conquers all. And there is no reason to think that the Clean Water Act was uniquely designed to enable the strong-arming of regulated parties into "voluntary compliance" without the opportunity for judicial review - even judicial review of the question whether the regulated party is within the EPA's jurisdiction.
The EPA issues nearly 3,000 administrative compliance orders a year that call on suspected violators of environmental laws to stop what they're doing and repair the harm they have caused. Business groups, homebuilders, road builders and agricultural interests all came out in opposition to the EPA in the case.
Mr. Sackett said that the Supreme Court ruling affirmed his belief that "the EPA is not a law unto itself." He said that, "The EPA used bullying and threats of terrifying fines, and has made our life hell for the past five years."
Senator John Barrasso (R-Wyoming) said that:
This decision delivers a devastating blow to the Obama administration's "War on Western Jobs." This victory by one Western couple against a massive Washington bureaucracy will inspire others to challenge the administration's regulatory overreach.
The case stemmed from the Sacketts' purchase of a 0.63-acre lot for $23,000 near Priest Lake, Idaho, in 2005. They had begun to lay gravel on the land, located in a residential neighborhood, when they were hit by an EPA compliance order informing them that the property had been designated a wetland under the Clean Water Act. Justice Scalia noted that the property bore little resemblance to any popular concept of a wetland, protected or not.
The Pacific Legal Foundation in Sacramento, which represented the Sacketts, called it
. . . a precedent-setting victory for the rights of all property owners. . . . The Supreme Court's ruling makes it clear that EPA bureaucrats are answerable to the law in the courts like the rest of us.
There are also efforts under way to stop the abuses of the policy of eminent domain. In February, the Virginia General Assembly gave its first approval to a constitutional amendment restoring the sanctity of private property. The measure was made necessary by the 2005 Supreme Court decision in Kelo v. New London, that gave towns and cities free rein to grab land - not for public uses - but for the use and benefit of well-connected developers.
Over the years, the Supreme Court has expanded the scope of government takings by redefining "public use." The Washington Times declares that:
Originally, the term was applied to such things as parks, roads or rail lines - all of which were open for use by the entire community. The high court elasticized the concept to include land intended for a public "purpose" such as eliminating blight or other catch-all categories related to public safety. The Kelo court went further to rule that economic growth, and the tax revenue that would accrue from it, was sufficient to justify a land grab.
Virginia Attorney General Kenneth Cuccinelli and Governor Robert McDonnell have been leading the fight to reform that state's property laws over the developer interests that, until now, have succeeded in blocking the amendment from consideration. The measure clarifies that eminent domain may be used only for purposes that are truly public. Land could not be transferred by the government to private entities to generate more tax dollars.
Christina Walsh, of the Institute for Justice, which argued the Kelo case before the Supreme Court, states that:
The power of eminent domain is supposed to be for "public use" so government can build things like roads and schools. . . . But starting with the wildly unsuccessful urban renewal efforts of the 1940s and 1950s, "public use" has been stretched to mean anything that could possibly benefit the public. . . . It has been demonstrated time and again that eminent domain is routinely used to wipe out black, Hispanic, and poorer communities, with less political capital and influence in favor of developers' grand plans.
Groups across the political spectrum have recognized the need to limit this abuse of power. The diverse coalition has included the League of United Latin American Citizens, the National Federation of Independent Business and the Farm Bureau. There is now a bipartisan bill, H.R. 1433, making its way through the House that would strip a city of federal economic development funding for two years if the city takes private property to give to someone else for private use.
In all these cases - the Supreme Court decision concerning the EPA, the proposed constitutional amendment in Virginia, and the legislation now being considered in the House - we see a commitment to restore private property rights, an essential ingredient of a genuinely free society. All of these efforts should be encouraged and supported.
The question of economic inequality has become an important part of our national conversation. Recently, the Congressional Budget Office supplied hard data on the widening economic gap. Among Western countries, America stands out as the place where economic and social status is most likely to be inherited.
What is less often discussed are the reasons for this disparity. A key element has been the dramatic changes that have taken place in recent years in family life.
In 1965, the respected liberal intellectual, and later Senator from New York, Daniel Patrick Moynihan, wrote a controversial report on the perilous state of the black family, pointing out that 24 percent of births among blacks and 3 percent among whites were out of wedlock. In retrospect, we can see that the decline in the American family was only beginning. Today, out-of-wedlock births account for 73 percent of births among blacks, 53 percent among Latinos, and 29 percent among whites.
Recently, a front-page article in The New York Times reported that more than half of births to mothers under age 30 now occur out of wedlock. Many are casting aside the notion that children should be raised in a stable two-parent family.
The economic class divide that is attracting increasing attention cannot be considered outside of an understanding of the lifestyle choices of those involved. Almost 70 percent of births to high school dropouts and 51 percent to high school graduates are out of wedlock. Among those with some college experience, the figure is 34 percent and for those with a college degree, just 8 percent.
The breakdown of the family has a significant impact upon children. Children in two-parent families, University of Virginia sociologist Bradford Wilcox shows, are more likely to "graduate from high school, finish college, become gainfully employed, and enjoy a stable family life themselves."
In the new book, Coming Apart: The State of White America, 1960-2010, Charles Murray, of the American Enterprise Institute, focusing on white Americans to avoid capitalizing on the problems faced by minority groups, sees a significant decline in what he considers America's founding virtues - industriousness, honesty, marriage and religiosity - over the last 50 years.
That decline, he illustrates, has not been uniform among different segments of the white population. Among the top 20 percent in income and education, he finds that rates of marriage and church attendance, after falling marginally in the 1970s, have plateaued at a high level since then. And these people have been working longer hours than ever before.
In contrast, among the bottom 30 percent, those indicators started falling in the 1970s, and have been plunging ever since. Among this group, he reports, one-third of men age 30 to 49 are not making a living, one-fifth of women are single mothers raising children, and nearly 40 percent have no involvement in a secular or religious organization. The result is that children being raised in such settings have the odds stacked against them.
Discussing Murray's book, columnist Michael Barone declares that:
These findings turn some conventional political wisdom on its head. They tend to contradict the liberals who blame increasing income disparity on free-market economics. In fact it is driven in large part by personal behavior and choices. They also undermine the conservatives who say that a liberation-minded upper class has been undermining traditional values to which more downscale Americans are striving to adhere. Murray's complaint against upscale liberals is not that they are libertines but that they fail to preach what they practice.
Society does not, of course, move only in a single direction. Some indicators of social dysfunction have improved dramatically, even as traditional families continue to lose ground. There has, for example, been a dramatic decline in teenage pregnancies among all racial groups since 1990. There has also been a 60 percent decline in violent crime since the mid-90s.
Still, something is clearly happening to the traditional working-class family. Part of it, of course, is a reduction in the work opportunities available to less-educated men as many unskilled jobs move abroad to cheaper labor markets, such as China. Entry-level wages for male high school graduates, adjusted for inflation, have fallen steadily, and the share of such workers in the private sector receiving health benefits had dropped to 29 percent by 2009.
In 1996, sociologist William Julius Wilson published When Work Disappears: The New World of the Urban Poor, in which he argued that much of the social disruption among African-Americans popularly attributed to collapsing values was actually caused by a lack of blue-collar jobs in urban areas.
As with all complex social problems, there are many causes. Charles Murray makes an important point about the importance of marriage and family in fostering economic security and well-being, something which cannot be ignored in confronting the question of economic inequality.
Writing in Time, Rich Lowry, editor of National Review, notes that:
No one wants to be preachy about marriage when everyone knows its inevitable frustrations. . . . At the very least, though, we should provide the facts about the importance of marriage as a matter of child welfare and economic aspiration. As a society, we have launched highly effective public-education campaigns on much less momentous issues, from smoking to recycling. It's not hard to think of a spokeswoman. Michelle Obama is the daughter in a traditional two-parent family and the mother in another one that even her husband's critics admire. If she took up marriage as a cause, she could ultimately have a much more meaningful impact on the lives of children than she will ever have urging them to do jumping jacks. For now, the decline of marriage is our most ignored national crisis. As it continues to slide away, our country will become less just and less mobile.
There is growing evidence that our colleges and universities are not teaching students what they need to compete for jobs in our high-tech international economy.
A 2010 study published by the Association of American Colleges and Universities found that 87 percent of employers believe that higher-education institutions have to raise student achievement if the U.S. is to be competitive in the global market. Sixty-three percent say that recent college graduates do not have the skills they need to succeed. And, according to a separate survey, more than a quarter of employers say entry-level writing skills are deficient.
A recent book, Academically Adrift: Limited Learning on College Campuses, by Richard Arum of New York University and Josipa Roksa of the University of Virginia, points out that gains in critical thinking, complex reasoning, and writing skills are either
. . . exceedingly small or nonexistent for a large proportion of students. It has been found that 36 percent of students experience no significant improvement in learning (as measured by the Collegiate Learning Assessment) over four years of higher education.
Most universities do not require the courses considered core education subjects - math, science, foreign languages at the intermediate level, U.S. government or history, composition, literature, and economics.
The American Council of Trustees and Alumni (ACTA) has rated schools according to how many of the core subjects are required. A review of more than 1,000 colleges and universities found that 29 percent of schools require two or fewer subjects. Only 5 percent require economics. Less than 20 percent require U.S. government or history.
ACTA President Anne Neal declares:
How can one think critically about anything if one does not have a foundation of skills and knowledge? It's like suggesting that our future leaders only need to go to Wikipedia to determine the direction of our country.
Eight years ago, leaders at the University of Texas set out to do something few in higher education had thought to attempt - measure how much their students learn before graduation. The answer that emerged was: not very much. The conclusion is based on results from a 90-minute essay test, given to freshmen and seniors, that aims to gauge gains in critical thinking and communication skills. The University of Texas and several hundred other public universities have joined the growing accountability movement in higher education in an effort to quantify collegiate learning on a large scale.
Last year, University of Texas freshmen scored an average 1261 on the assessment, which is graded on a scale similar to that of the SAT. Seniors averaged 1303. Both groups scored well, but seniors fared little better than freshmen. "The seniors have spent four years there, and the scores have not gone up that much," says New York University's Richard Arum.
Needless to say, it is not only our colleges that seem not to be properly preparing our students. Our high schools have fallen dramatically behind in teaching algebra, geometry, and trigonometry. This means, writes economist Walter Williams, that:
There are certain relatively high-paying careers that are probably off-limits for life. These include careers in architecture, chemistry, computer programming, engineering, medicine and certain technical fields. For example, one might meet all of the physical requirements to be a fighter pilot, but he's grounded if he doesn't have enough math to understand physics, aerodynamics and navigation. Mathematical ability provides the disciplined structure that helps people to think, speak, and write more clearly.
Drs. Eric Hanushek and Paul Peterson, senior fellows at the Hoover Institution, compared the performance of our young people with that of their counterparts in other nations in their Newsweek article last year, "Why Can't American Students Compete?" In the latest international tests administered by the Organization for Economic Cooperation and Development, they found that only 32 percent of U.S. students ranked proficient in math - coming in between Portugal and Italy, but far behind South Korea, Finland, Canada, and the Netherlands. Seventy-five percent of Shanghai students tested proficient. In the U.S., only 7 percent could perform at an advanced level in mathematics.
In a 2009 The New York Times article, "Do We Need Foreign Technology Workers?," Dr. Vivek Wadhwa of Duke University said
. . . 49 percent of all U.S. science and engineering workers with doctorates are immigrants, as were 67 percent of the additions to the U.S. science and engineering work force between 1995 and 2006. And roughly 60 percent of engineering Ph.D. students and 40 percent of master's students are foreign nationals.
Recently, President Obama proposed making kids stay in school until they are 18. This would not do much to address the nation's educational woes, say education specialists. "It's not the slam bang that it looks like," said Russ Whitehurst, director of the Brown Center on Education Policy at the Brookings Institution. "It's not like you raise the age to 18 and they're going to go ahead and graduate - they're just going to stay in school."
There is much talk about the need for "everyone" to go to college - and very little discussion about what is actually being taught in our colleges. Professor Richard Vedder of Ohio University argues that:
The number going to college exceeds the number capable of mastering higher levels of intellectual inquiry. This leads colleges to alter their mission, watering down the intellectual content of what they do.
Simply put, colleges dumb down courses so that the students they admit can pass them.
Professor Walter Williams notes that:
Much of American education is a shambles. Part of a solution is for colleges to stop admitting students who are unprepared for real college work. That would help reveal the shoddy education provided at the primary and secondary school levels. But college administrators are more interested in larger numbers of students because they translate to more money.
Beyond this, the nation's security is also at risk if schools do not improve, warns a report by a panel led by former Secretary of State Condoleezza Rice and Joel I. Klein, a former chancellor of New York City's school system.
"The dominant power of the twenty-first century will depend on human capital," the report said. "The failure to produce that capital will undermine American security."
The report said that the State Department and intelligence agencies face critical shortages in the number of foreign-language speakers, and that fields like science, defense, and aerospace face a shortage of skilled workers that is likely to worsen as baby boomers retire.
According to the panel, 75 percent of young adults do not qualify to serve in the military because they are physically unfit, or have criminal records, or inadequate levels of education. It said 30 percent of high school graduates do not do well enough on an aptitude test to serve.
In our global, high-tech economy, we cannot afford to continue the educational system we have. It is high time that we turned our attention to making the necessary changes and reforms that would keep America competitive in the twenty-first century. *
There can be little doubt that government spending is out of hand, and that Washington's role in our society has dramatically expanded in recent years. The American people are dismayed about the manner in which our political life has deteriorated. The party out of power, whichever one it may be, seems to want the party in power to fail - so that it can take its place. The long-term best interests of the country are obscured.
Many tend to think of our problems in narrow partisan terms. Some argue, for example, that Democrats favor big government and deficit spending, while Republicans favor balanced budgets and limited government. Our choices in elections would be clear-cut if this were, in fact, the case.
More realistically, we see that whichever party is in power tends to expand government power. Deficits reached all-time highs under President George W. Bush - and have now reached even higher levels - dramatically higher - under President Obama. The dilemma we face is far more complex than partisan political spokesmen permit themselves to admit.
An important new book, Throw Them All Out: How Politicians and Their Friends Get Rich Off Insider Stock Tips, Land Deals, and Cronyism That Would Send the Rest of Us to Prison (Houghton Mifflin Harcourt) by Peter Schweizer explores the world in which our politicians - both Democrats and Republicans - live.
Three years ago, then-House Speaker Nancy Pelosi and her husband, Paul, made the first of three purchases of Visa stock - Visa was holding an initial public offering, among the most lucrative ever. The Pelosis were granted early access to the IPO as "special customers" who received their shares at the opening price, $44. They turned a 50 percent profit in just two days.
Starting on March 18, the speaker and her husband made the first of three Visa stock buys, totaling between $1 million and $5 million. "Mere mortals would have to wait until March 19, when the stock would be publicly traded, to get their shares," writes Peter Schweizer, a scholar at the Hoover Institution. He points out that the Pelosis got their stocks just two weeks after legislation was introduced in the House that would have allowed merchants to negotiate lower interchange fees with credit card companies. Visa's general counsel described it as a "bad bill." The speaker squelched it and kept further action bottled up for more than two years. During that period, the value of her Visa stock jumped more than 200 percent while the stock market as a whole dropped 15 percent.
"Isn't crony capitalism beautiful?" asks Schweizer. The book shows members of Congress enriching themselves through earmarks and unpunished insider trading, politically connected companies being given billions of dollars in stimulus funds, and public money intended to help the environment, plus many varieties of kickbacks and favors.
Sadly, most of these actions fall within the letter, if not the spirit, of the law and ethics rules governing Congress.
While Senator John F. Kerry (D-MA) was working on healthcare in 2009, he and his wife began buying stock in Teva Pharmaceuticals. The Kerrys purchased nearly $750,000 of the stock in November alone. As the bill got closer to passing, the stock value soared. Pharmaceutical companies support these legislative efforts because they would increase the demand for prescription drugs. When President Obama's healthcare bill became law, the Kerrys reaped tens of thousands of dollars in capital gains while holding onto more than $1 million in Teva shares.
Republicans join their Democratic colleagues in these and other enterprises. House Majority Leader Eric Cantor (R-VA) relentlessly attacks run-away government spending. To Cantor, an $8 billion high-speed rail connecting Las Vegas to Disneyland is wasteful "pork-barrel spending." Rep. Cantor set up the "You Cut" website to demonstrate how easy it is to slash government spending. Yet Cantor has been pressing the Transportation Department to spend nearly $3 billion in stimulus money on a high-speed rail project in his home state of Virginia. Newsweek found about five dozen of the most fiscally conservative Republicans - including Texas Governor Rick Perry and Rep. Ron Paul (R-TX) - trying to gain access to the very government spending they publicly oppose. According to Newsweek:
The stack of spending-request letters between these GOP members and federal agencies stands more than a foot tall, and disheartens some of the activists who sent Republicans to Washington the last election.
Judson Phillips, founder of the Tea Party Nation, says:
It's pretty disturbing. We sent many of these people there, and really, I wish some of our folks would get up and say, you know what, we have to cut the budget, and the budget is never going to get cut if all 535 members of Congress have their hands out all the time.
Former vice presidential candidate Sarah Palin, writing in The Wall Street Journal, declares that:
The corruption isn't confined to one political party or just a few bad apples. It's an endemic problem encompassing leadership on both sides of the aisle. It's an entire system of public servants feathering their own nests.
Now, Republican presidential candidate Newt Gingrich denounces big government. Previously, he enriched himself at its trough. Conservative columnist Timothy P. Carney notes that:
When Newt Gingrich says he never lobbied, he's not telling the truth. When he was a paid consultant for the drug industry's lobby group, Gingrich worked hard to persuade Republican congressmen to vote for the Medicare drug subsidy that the industry favored. To deny Gingrich was a lobbyist requires an Obama-like parsing over who is and who isn't a lobbyist. . . . Newt Gingrich spent the last decade being paid by big businesses to convince conservatives to support the big government policies that would profit his clients.
The fact - which partisans on both sides like to deny - is that both parties are responsible for the sad state of our political life - and our economic decline. Money-making opportunities for members of Congress are widespread. Peter Schweizer details the most lucrative methods: accepting sweetheart deals of IPO stock from companies seeking to influence legislation, practicing insider trading with nonpublic government information, earmarking projects that benefit personal real-estate holdings, and even subtly extorting campaign donations through the threat of legislation unfavorable to an industry. The list is a long one.
Congress has been able to exempt itself from the laws it applies to everyone else. That includes laws that protect whistleblowers - nothing prevents members of Congress from retaliating against staff members who expose corruption - as well as the Freedom of Information Act. Some say that it is easier to get classified documents from the CIA than from a congressional office.
To correct any problem, it is essential first to understand it properly. The problems in our political life are institutional and thinking that a simple change of parties will correct them is to misunderstand reality. Those who seek limited government, balanced budgets, and a respect for the Constitution must understand that both parties are responsible for the current state of affairs. With an appreciation of the real challenge before us, perhaps real solutions will be explored and debated. These, however, do not appear to be on today's political agenda.
Late in December, the Justice Department blocked a new South Carolina law that would require voters to present photo identification, saying the law would disproportionately suppress turnout among eligible minority voters.
The move was the first time since 1994 that the department has exercised its powers under the Voting Rights Act to block a voter identification law. It followed a speech by Attorney General Eric Holder that signaled an aggressive stance in reviewing a wave of new state voting restrictions enacted in the name of fighting fraud.
In a letter to the South Carolina government, Thomas E. Perez, the assistant attorney general for civil rights, said that allowing the new requirement to go into effect would result in "significant racial disparities."
Richard L. Hasen, an election law specialist at the University of California at Irvine, predicts that South Carolina will go to court, which could set up a "momentous" decision in the Supreme Court on whether a part of the Voting Rights Act that prevents states like South Carolina from changing their voting rules without federal permission is unconstitutional.
Governor Nikki Haley criticized the decision, accusing the Obama administration of "bullying" the state. She declared: "It is outrageous, and we plan to look at every possible option to get this terrible, clearly political decision overturned so we can protect the integrity of our electoral process and our 10th Amendment rights."
Under the Voting Rights Act, an election rule or practice that disproportionately affects minority voters is illegal - even if there is no sign of discriminatory intent. South Carolina is one of several states that, because of a history of discriminatory practices, must prove that a measure would not disproportionately discourage minority voting.
In 2011, eight states - Arkansas, Kansas, Mississippi, Rhode Island, South Carolina, Tennessee, Texas and Wisconsin - passed variations of a rule requiring photo identification for voters. It is unclear if the four states not subject to the Voting Rights Act requirement - Wisconsin, Kansas, Rhode Island, and Tennessee - will face challenges to their laws. These laws have proven popular. In November, Mississippi voters easily approved an initiative requiring a government-issued photo ID at the polls.
Artur Davis, who served in Congress from 2003 to 2011, and was an active member of the Congressional Black Caucus, once vigorously opposed voter ID laws. Now, he has changed his mind. In a commentary in the Montgomery Advertiser, Davis says that Alabama "did the right thing" in passing a voter ID law and admits, "I wish I had gotten it right when I was in political office."
As a congressman, he says, he "took the path of least resistance," opposing voter ID laws without any evidence to justify his position. He simply
. . . lapsed into the rhetoric of various partisans and activists who contend that requiring photo identification to vote is a suppression tactic aimed at thwarting black voter participation.
Today, Davis recognizes that the "most aggressive" voter suppression in the black community "is the wholesale manufacture of ballots at the polls" in some predominantly black districts.
Hans A. von Spakovsky, senior legal fellow at the Heritage Foundation and a former member of the Federal Election Commission, wrote a case study about voter-fraud prosecutions in one such district, Greene County, Alabama, which is 80 percent black. He writes that,
Incumbent black county officials had stolen elections there for years, perpetrating widespread, systematic voter fraud. The Democratic incumbents were challenged by black Democratic reformers in 1994 who wanted to clean up local government. Voter fraud ran rampant that year. Ultimately, the U.S. Department of Justice won 11 convictions of Greene County miscreants who had cast hundreds of fraudulent votes.
Spakovsky argues that,
There was no question that (fraudulent) tactics changed the election in Greene County in 1994. But the worst thing from the standpoint of the reformers who had complained to the FBI was the reaction of the NAACP and the Southern Christian Leadership Conference (SCLC). The reformers thought those civil rights organizations would be eager to help those whose elections had been stolen through fraud. Instead both organizations attacked the FBI and federal prosecutors, claiming that the voter-fraud investigation was simply an attempt to suppress black voters and keep them from the polls.
One of the black reformers, John Kennard, a local member of the NAACP, wrote a letter to then-NAACP chairman Julian Bond charging the group with "defending people who knowingly and willingly participated in an organized . . . effort to steal the 1994 election from other black candidates." Mr. Bond replied simply that "sinister forces" behind the prosecution were "part and parcel of an ongoing attempt to stifle black voting strength." The NAACP Legal Defense Fund even defended those later found guilty of fraud.
The rhetoric used by the NAACP at that time, states Spakovsky, "is exactly the same kind that is being used today by . . . the NAACP and others who oppose voter ID laws. . . . Mr. Davis was disappointed to see Bill Clinton . . . compare voter ID to Jim Crow."
In Davis's view, voter ID is "unlikely to impede a single good-faith voter - and that only gives voting the same elements of security as writing a check at the store, or maintaining a library card. The case for voter ID is a good one, and it ought to make politics a little cleaner and the process of conducting elections much fairer."
Photo IDs are required to drive a car, cash a check, collect government assistance and fly on a plane - among other things. No one suggests that the need for photo ID during such transactions is "racist." To ask voters to properly identify themselves seems to be simply common sense.
Robert Knight, a senior fellow for the American Civil Rights Union, notes that, "Article I, Section 4 of the U.S. Constitution leaves voting procedures largely to the states. The Voting Rights Act requires stricter scrutiny of some states, but the case for voter suppression has yet to be made."
What is motivating the Obama administration to embark upon a crusade against voters identifying who they are before casting their ballots is less than clear. If they think they are somehow fighting "racism," they are clearly on the wrong track.
For many years there has been an effort to read black Americans who dare to think for themselves out of the black community. To disagree with liberal politics or affirmative action is to be, in some way, rejecting one's blackness.
One of the vocal enforcers of this policy of thought control is Professor Randall Kennedy of Harvard, author of books such as Sellout: The Politics of Racial Betrayal. In Kennedy's view, there should be an expulsion option in the black community for blacks who adopt conservative views. Clarence Thomas, he argues, should turn in his black card. There should be boundaries, he declares, or else the notion of a black community bound by shared struggle disappears.
Fortunately for all of us, this point of view is now in retreat. When Professor Cornel West painted President Obama as cowardly and out of touch with black culture, he was sharply criticized by Professor Melissa Harris-Perry of Tulane. Writing in The Nation, she declared:
I vigorously object to the oft-repeated sentiment that African-Americans should avoid public disagreements and settle matters internally to present a united front. . . . Citizenship in a democratic system rests on the ability to freely and openly choose, criticize, and depose one's leaders. This must obtain whether those leaders are elected or self-appointed. It cannot be contingent on whether the critiques are accurate or false, empirical or ideological, well or poorly made. Citizenship is voice. . . . That African-Americans strenuously disagree among ourselves about goals and strategies is an ancient historical truth.
The media attention given to the criticism of President Obama by Professor West, states Harris-Perry, "can be understood only by the repeated refusal by mainstream media and broader American political culture to adequately grasp the heterogeneity of black thought."
An important new book, Who's Afraid of Post-Blackness? has just appeared. Its author, Toure, is a correspondent for MSNBC, a contributing editor at Rolling Stone, and the author of three previous books. The central point of the book is that there is no single way to be black. Justice Clarence Thomas, in his view, is no less black than Jay-Z. One of his goals, Toure writes, is "to attack and destroy the idea that there is a correct or legitimate way of doing blackness." Post-blackness, he declares, has no patience with "self-appointed identity cops" and their "cultural bullying."
What this means, according to the 105 prominent black Americans interviewed for the book, is a liberating pursuit of individuality. Black artists, like other professionals, now feel free to pursue any interest they like and are no longer burdened with the requirement to represent "the race."
Reviewing Toure's book for The New York Times, Professor Orlando Patterson of Harvard notes that:
. . . this is one of the most acutely observed accounts of what it is like to be young, black, and middle-class in contemporary America. Toure inventively draws on a range of evidence - autobiography, music, art, interviews, comedy and popular social analysis - for a performance carried through with unsparing honesty, in a distinctive voice that is often humorous, occasionally wary and defensive, but always intensely engaging.
Toure says that: "If there are 40 million black Americans, then there are 40 million ways to be black," repeating a line from Harvard University's Henry Louis Gates, Jr.:
I'm telling the self-appointed identity cops, who want to say, "This person isn't black enough," to put down their swords. Fear of post-blackness just inhibits our potential. Stop the bullying, and stop telling people they don't walk right, talk right, think right or like the right things. It's silly and ridiculous and pernicious.
When he was a student at Emory University, Toure made friends with the white students in his dormitory. Then he read The Autobiography of Malcolm X, switched his major to African-American studies, started a black-nationalist newspaper and moved into the Black Student Association's private house.
It was in this all-black house, he says, after a party, in a room full of black people, that he was "loudly and angrily told by a linebacker-sized brother: 'Shut up, Toure! You ain't black!'" This episode led to something of an epiphany, he says. "Who gave him the right to determine what is and is not blackness for me? Who made him the judge of blackness?"
An interesting phenomenon of the emerging 2012 presidential election is the success of Herman Cain among Republican candidates. Ron Christie, a former special assistant to President George W. Bush and a fellow at the Institute of Politics at Harvard's Kennedy School of Government, notes that:
Cain's candidacy is the ultimate extension of the Obama presidency. A contender for the highest office in the land can be taken seriously regardless of race. We are heading into a 2012 election cycle in which Republican and tea party conservatives appear eager to support a candidate who just happens to be black, based on his convictions and ideas.
Several members of the Congressional Black Caucus have called the tea party movement and its backers racist. In August, Rep. Andre Carson (D-Ind.) told an audience at a CBC event in Miami that "some of them in Congress right now of this tea party movement would love to see you and me . . . hanging on a tree." He likened to "Jim Crow" the efforts of the tea party and its supporters in Congress to limit the size of the federal government.
Mr. Christie, who is black, declares that:
There will always be a fringe element in this country that is unable to accept individuals based on the color of their skin. But to me, continuing to paint the tea party as racist - even as Cain is surging - is simply more race baiting by dissatisfied Democrats.
In an interview with CNN, Herman Cain said he thinks at least a third of black voters would be inclined to support his candidacy because they are "open-minded." He declared:
This whole notion that all black Americans are necessarily going to stay and vote for Obama, that's simply not true. More and more black Americans are thinking for themselves, and that's a good thing.
Sadly, for many years, freedom of speech and debate, hailed in the nation at large as an essential element of a thriving democratic society, has been discouraged in the black community in the name of "unity." As Julius Lester, a one-time black radical and later a member of the faculty of the University of Massachusetts, said almost twenty years ago:
For two decades, an honest exchange of ideas in black America has been discouraged in the name of something called unity. Public disagreements have been perceived as providing ammunition to "the Enemy," that amorphous white "they" that works with a relentlessly evil intent against blacks. . . . The suppression of dissent and differences in the name of unity evolved into a form of social fascism, especially on college and university campuses. In some instances, black students were harassed and ostracized for having white friends. . . . Thinking black took precedence over thinking intelligently. . . .
Stifling free speech in the name of "unity," Lester shows, is something quite new in black American history. He notes that:
In the first part of the 19th century, Negro national conventions were held where black leaders debated and disagreed bitterly with each other over slavery and freedom, abolitionism and separatism. Frederick Douglass, the first national leader and Martin Delaney, the first black separatist, were political adversaries and friends. Dissent and disagreement have been the hallmark of black history.
The first black to win a seat in the U.S. House of Representatives in the 20th century and the first to be elected from a Northern state was Oscar De Priest of Illinois, a Republican. He believed in limited government, hard work, and the free market.
Finally, the realization is growing that not all black Americans think alike - nor do white, Hispanic, or Asian Americans. This understanding is long overdue. *
A Washington Post-ABC News poll shows a new high - 84 percent of Americans - disapproving of the job Congress is doing, with almost two-thirds saying they "disapprove strongly." Just 13 percent of Americans approve of how things are going. It has been nearly four years since even 30 percent expressed approval of Congress.
Editorially, The Washington Examiner notes that,
Nobody can remember the last time the public approval rating of Congress was so low. That's because it's never been as low as it is now. . . . It's not hard to see why: the American people are fed up with the bipartisan corruption, endless partisan bickering, and lack of concrete action to address the nation's most pressing problems, especially out-of-control spending and the exploding national debt. . . . Both parties have presided over congressional majorities as Congress sank in public esteem during the past decade.
One reason for public dismay is the manner in which members of Congress often support while in office the interests they then go to work for once out of office. Of equal concern is the manner in which members of Congress vote for subsidies to the groups that have contributed to their political campaigns. This is true of members of Congress from both parties.
For example, soon after he retired last year as one of the leading liberals in Congress, former Rep. William D. Delahunt (D-MA) started his own lobbying firm with an office on the 16th floor of a Boston skyscraper. One of his first clients was a small coastal town that has agreed to pay him $15,000 a month for help in developing a wind energy project.
The New York Times reports that,
Amid the revolving door of congressmen-turned-lobbyists, there is nothing particularly remarkable about Mr. Delahunt's transition, except for one thing. While in Congress, he personally earmarked $1.7 million for the same energy project. So today, his firm, the Delahunt Group, stands to collect $90,000 or more for six months of work from the town of Hull, on Massachusetts Bay, with 80 percent of it coming from the pot of money he created through a pair of Energy Department grants in his final term in office.
Beyond the town of Hull, Delahunt's clients include at least three others who received millions of dollars in federal aid with his direct assistance. Barney Keller, communications director for the Club for Growth, a conservative group that tracks earmarks, says:
I cannot recall such an obvious example of a member of Congress allocating money that went directly into his own pocket. It speaks to why members of Congress shouldn't be using earmarks.
While this case may be somewhat extreme, it is repeatedly duplicated in one form or another by members of Congress. Consider former Senator Rick Santorum of Pennsylvania. A review of Santorum's many earmarks suggests that the federal money he helped direct to Pennsylvania paid off in the form of campaign cash. In just one piece of legislation, the defense appropriations bill for the 2006 fiscal year, Santorum helped secure $124 million in federal financing for 54 earmarks, according to Taxpayers for Common Sense, a budget watchdog group. In that year's election cycle, Santorum's Senate campaign committee and its "leadership PAC" took in more than $200,000 in contributions from people associated with the companies that benefited or their lobbyists. In all, Taxpayers for Common Sense estimated, Santorum helped secure more than $1 billion in earmarks during his Senate career.
Or consider former House Speaker Newt Gingrich, who speaks about being a "Reagan conservative" who supports "limited government," yet received $1.6 million from Freddie Mac over an eight-year period and gave the government-backed mortgage giant assistance in resisting reformers in Congress. Mr. Gingrich denies that he was a "lobbyist," as do some other former members of Congress. The Lobbying Disclosure Act of 1995 has three tests:
(1) Do you make more than $3,000 over three months from lobbying?
(2) Have you had more than one lobbying contact?
(3) Have you spent more than 20 percent of your time lobbying for a single client over three months?
Only a person who has met all three tests must register as a lobbyist. Thus, a former member of Congress who has many lobbying contacts and makes $1 million a year lobbying but has no single client who takes up more than 20 percent of his time would not be considered a lobbyist.
Clearly, it is time to change this rule. A task force of the American Bar Association recommended last year that the 20 percent rule be eliminated, which would require far more people to register as lobbyists, and subject them to ethics and disclosure requirements. The Center for Responsive Politics found that more than 3,000 lobbyists simply "de-registered" after Congress imposed new reporting requirements for lobbyists in 2007.
With regard to Gingrich, Washington Times columnist Don Lambro writes:
Mr. Gingrich . . . is the quintessential Washington insider, peddling influence in government. . . . He denied he was lobbying, insisting that he was hired to be a historian, when he was selling his services to one of the richest bidders in government. He was being paid well out of Freddie Mac's coffers while it was sowing the seeds of a housing scandal that resulted in an economic meltdown that has hurt millions of Americans and cost taxpayers billions of dollars. In other words, as a paid insider, he was part of the problem, not part of the solution.
Cutting the size of government, reducing our debt, and balancing the budget are embraced rhetorically by candidates for public office. Once elected, however, many become part of the system they campaigned against. The incentive structure once in office is to raise money to stay in office, and the way to do this is to vote subsidies to those groups being called upon to contribute. Both parties are engaged in this behavior, and candidates of both parties are rewarded so that special interests will have a friend in office no matter who is elected.
Sadly, the actions of Congress - and the lobbying enterprises of former members of Congress - are legal. This, of course, is because it is Congress itself that writes the laws. There was a time when members of Congress, when they retired or were defeated, returned home. Some still do. Many others, however, remain in Washington, getting rich trying to influence their former colleagues.
This enterprise, of course, is only part of why Congress is viewed in such negative terms by 84 percent of Americans. Narrow partisanship and a greater concern for politics than for the country's well-being is another. All of this is on naked display in today's Washington. The public contempt has been well earned. Whether that public dismay with our current politics can be transformed into an effective effort to alter this behavior remains to be seen. Too many in Washington have a vested interest in today's corrupt system as it exists. How to change the incentive structure for those in political life is our real challenge.
On December 31, 2011, President Obama signed the National Defense Authorization Act, which was supported by both Republicans and Democrats in the Congress. This legislation allows for the indefinite detention of American citizens within the United States - without charging them with a crime.
Under this law, those suspected of involvement with terrorism are to be held by the military. The president has the authority to detain citizens indefinitely. While Senator Carl Levin (D-MI) said that the bill followed existing law, "whatever the law is," the Senate specifically rejected an amendment that would have exempted citizens, and the administration has opposed efforts to challenge such authority in federal court. The administration claims the right to strip citizens of legal protections based on its sole discretion.
This legislation was passed by the Senate 93 to 7. "The only comparable example was Reconstruction in the South," says constitutional law scholar Bruce Fein.
That was 150 years ago. This is the greatest expansion of the militarization of law enforcement in this country since.
The opposition to this legislation assembled an unlikely coalition of liberal Democrats, the American Civil Liberties Union, constitutional conservatives, libertarians, and three Republican senators - Rand Paul (KY), Mark Kirk (IL), and Mike Lee (UT).
The law, argued Senator Paul:
. . . would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. I want to repeat that. We are talking about people who are merely suspected of a crime. And we are talking about American citizens. If these provisions pass, we could see American citizens being sent to Guantanamo Bay.
Senator Mark Udall (D-CO), who proposed a failed amendment to strip the language from the bill, said that these provisions would "authorize the military to exercise unprecedented power on U.S. soil."
Writing in The American Conservative, Kelley Beaucar Vlahos notes that:
Already the federal government has broad authority to decide whether terror suspects are detained and held by federal law enforcement agencies and tried in regular courts or carried off by the military under the Military Commissions Act. This new legislation would allow the military to take control over the detention of suspects first - which means no Miranda rights and potentially no trial even on U.S. soil, putting the front lines of the War on Terror squarely on Main Street.
Bruce Fein argues that the ambiguity of words like "associated groups" or "substantially supports" gives the military wide discretion over who is considered a terrorist. "It's a totally arbitrary weapon that can be used to silence people."
Rep. Justin Amash (R-MI), one of the leading critics of the bill in the House of Representatives, issued a fact-checking memo outlining how the language can be abused:
For example, a person makes a one-time donation to a non-violent humanitarian group. Years later, the group commits hostile acts against an ally of the U.S. Under the Senate's NDAA, if the President determines the group was "associated" with terrorists, the President is authorized to detain the donor indefinitely, and without charge or trial.
James Madison warned that, "The means of defense against foreign danger historically have become instruments of tyranny at home."
Senator Paul states that:
The discussion now to suspend certain rights to due process is especially worrisome, given that we are engaged in a war that appears to have no end. Rights given up now cannot be expected to be returned. So we do well to contemplate the diminishment of due process, knowing that the rights we lose now may never be restored. . . . This legislation would arm the military with the authority to detain indefinitely - without due process or trial - suspected al-Qaeda sympathizers, including American citizens apprehended on American soil. . . . There is one thing and one thing only protecting innocent Americans from being detained at will by the hands of a too-powerful state: our Constitution and the checks it puts on government power. Should we err and remove some of the most important checks on state power in the name of fighting terrorism, well, then the terrorists will have won.
In his dissent in Hamdi v. Rumsfeld, Justice Antonin Scalia declared:
Where the government accuses a citizen of waging war against it, our constitutional tradition has been to prosecute him in federal court for treason or some other crime. . . . The very core of liberty secured by our Anglo-Saxon system of separated powers has been freedom from indefinite imprisonment at the will of the executive.
Jonathan Turley, professor of law at George Washington University, points out that:
In a signing statement with the defense authorization bill, Obama said he does not intend to use the latest power to indefinitely imprison citizens. Yet, he still accepted the power as a sort of regretful autocrat. An authoritarian nation is defined not just by the use of authoritarian powers, but by the ability to use them. If a president can take away your freedom or your life on his own authority, all rights become little more than a discretionary grant subject to executive will.
James Madison, Turley recalls,
. . . famously warned that we needed a system that did not depend on the good intentions or motivations of our rulers: "if men were angels, no government would be necessary." Since 9/11, we have created the very government the framers feared: a government with sweeping and largely unchecked powers resting on the hope that they will be used wisely. The indefinite-detention provision in the defense authorization bill seemed to many civil libertarians like a betrayal by Obama. While the president had promised to veto the law over that provision, Senator Levin, a sponsor of the bill, disclosed on the Senate floor that it was in fact the White House that asked for the removal of an exception for citizens from indefinite detention.
Historically, those who seek to expand government power and diminish freedom always have a variety of good reasons to set forth for their purposes. In the case of Olmstead v. United States (1928), Justice Louis Brandeis warned that:
Experience should teach us to be most on our guard to protect liberty when the government's purposes are beneficent. Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers. The greatest dangers to liberty lurk in the insidious encroachment of men of zeal, well meaning but without understanding.
In recent years, in the name of ecology, racial equality, public health, and a variety of other "beneficent" purposes, the power of government has grown and the freedom of the individual has diminished, just as Justice Brandeis feared it would. But it has also diminished in the name of national security, something many conservatives, usually alert to the growth of government power, tend to support - or to acquiesce in. This is a serious mistake, as we now face the new threat of indefinite detention of American citizens. Freedom cannot be preserved by taking it away.
Developments in the Middle East remain chaotic. In the wake of the Arab Spring we have seen the overthrow of autocratic regimes in Tunisia and Egypt, a virtual civil war in Syria, and challenges to such governments as those in Bahrain and Yemen. The brutal Libyan dictator Muammar Ghaddafi has been overthrown. What comes next in this volatile region is difficult to know.
In an important new book, The Invisible Arab, Marwan Bishara, senior political analyst for Al Jazeera's English language service and the editor of its flagship show "Empire," and a former lecturer at the American University of Paris, provides a thoughtful analysis of how Arabs broke their own psychological barrier of fear to kindle one of the first significant revolutionary transformations of the 21st century.
Bishara describes how the historic takeover of Tunisia's November 7 Square, Egypt's Tahrir Square, and Bahrain's Pearl Square, among others, was the culmination of a long social and political struggle: countless sit-ins, strikes, and demonstrations by people who risked and suffered intimidation, torture, and imprisonment. It was aided by the dramatic rise of satellite television networks, including Al Jazeera, which bypass attempts by governments to censor news and information.
"Like most revolutions," he writes,
. . . this one was a long time coming. . . . They were the culmination of a long social and political struggle - countless sit-ins, strikes, pickets, and demonstrations. . . . The story begins with the young Arabs whose networking and organizations brought the people out into the streets. The youth, who make up 60 percent of all Arabs, have been looked upon as a "demographic bomb," an "economic burden," or a "reservoir for extremism." However, unlike previous generations, this group heralded change.
For decades, Bishara argues, these Arab citizens and their social and political movements
. . . have been either unfairly demonized or totally ignored by the West . . . who saw the region through the prism of Israel, oil, terrorists, or radical Islamism. But today's Arabs are presenting a stark contrast to the distortion . . . heaped upon them. Characterized as unreceptive to democracy and freedom, they are now giving the world a lesson in both.
The more difficult part of this revolutionary journey, he notes, will come as
. . . the Arabs, sooner rather than later, discover that democracy and freedom come with greater responsibility. Defeating dictators is a prerequisite for progress, but does not guarantee it, especially in the absence of functional state institutions, democratic traditions, and modern infrastructure. The prevalence of poverty, inequality, and rising regional and international competition present huge challenges.
The origins of what he calls "the miserable Arab reality" are not civilizational, economic, or philosophical per se. Instead,
. . . . The origins . . . are political par excellence. Like capital to capitalists, or individualism to liberalism, the use and misuse of political power has been the factor that defines the contemporary Arab state. Arab regimes have subjugated or transformed all facets of Arab society.
By the beginning of the 21st century, Arab autocracies represented some of the oldest dictators in the world. Zine el-Abidine Ben Ali's dictatorship in Tunisia, the most recently established in the region, ruled for 25 years, followed by 30 years for Egypt's Mubarak, 33 years for Yemen's Ali Abdullah Saleh, and 43 years for Ghaddafi in Libya. In Syria, the al-Assad dynasty has ruled for 43 years, and Saddam Hussein was removed in 2003 after 24 bloody years ruling Iraq. Only the authoritarian Arab monarchies precede these dictatorships in longevity. Bahrain, a repressive Sunni monarchy, has ruled a Shia majority since its independence from Britain in 1971.
Arab states, writes Bishara,
. . . were, for a lack of better words, turned into the private estates of the ruling families. While these regimes boasted of secular republicanism, they were run similar to the Kingdom of Saudi Arabia and the United Arab Emirates, where no political activism was allowed and where the ruling families dominated all facets of political life. . . . The energy-producing Arab states are sustained rentier-type economies, characterized by a trade-off between economic welfare and political representation. Whereas the modern democratic state was founded on the cry of "no taxation without representation" . . . the modern Arab state has turned that notion on its head. With free-flowing petro-dollars pouring into their countries, Arab leaders have been able to sell off national resources and enrich themselves without having to turn to their citizens for personal taxation. . . . It became a ritual in the wealthy monarchies for the kings, emirs, or princes to provide small sums of money to their "subjects," and the poor in particular, as a makrama or "generous gift" that was generated from the natural resources in their land.
According to the U.N. Development Program's (UNDP) first Arab Human Development Report, written exclusively by Arab experts,
. . . Arab countries have not developed as quickly as comparable nations in other regions. Indeed, more than half of Arab women are illiterate; the region's infant mortality rate is twice as high as in Latin America and the Caribbean. Over the past 20 years, income growth per capita has also been extremely low.
In virtually every Arab country, more than half the population is under 30 - more than 140 million people - while a quarter are between the ages of 15 and 29, making this generation the largest youth cohort in the history of the Middle East. This unemployed and increasingly angry demographic has given traction to the "youth bulge" theory, which posits that when population growth outstrips that of jobs, social unrest is inevitable.
The influence of the information revolution has been crucial to developments in the region. As a result, notes Bishara,
. . . The Arab youth were able to think for themselves, freely exchange ideas, and see clearly beyond their rulers' deception, vengeful jihadist violence, or cynical Western calculations. . . . At the beginning of 2011, there were 27 million Arabs on Facebook, including 6 million Egyptians. Within a few nights, 2 million more Egyptians joined, underlining the centrality of the medium to the changes in the country. More than 60 million people in the Arab world are online.
Yemeni activist and 2011 Nobel Peace Prize co-winner Tawakkol Karman described the use of social media:
The revolution in Yemen began immediately after the fall of Ben Ali in Tunisia. . . . As I always do when arranging a demonstration, I posted a message on Facebook, calling on people to celebrate the Tunisian uprising.
This new media, writes Bishara,
. . . had an important cultural, even sociological role to play in patriarchal Arab societies. It helped young people break free from social constraints. It propelled them into uncharted territory, and it helped them mold a certain type of individualism. They began to enjoy an uninhibited space where they could share information and experiences, join chat rooms, and participate with one another. Theirs is a newfound egalitarianism. . . .
Bishara laments the fact that,
Arabs have been valued not for their embrace of freedom or respect for human rights, but rather in terms of their proximity to U.S. interests. A subservient ally and energy-providing partner made for a good Arab regime, regardless of its despotic or theocratic rule. . . . Western leaders have talked in slogans . . . about democracy and Islam, but have always been as indifferent to the people of the region as their dictators.
What does the future hold? Bishara recognizes that there are great dangers:
Islamist movements, the likes of the Egyptian Brotherhood, have already opened dialogue with the military and with Western powers on the basis of mutual interest and respect. This might be seen as a positive development, that allows for a new sort of regional order on the basis of a new accommodation among Islamists, the generals, and Western leaders. However, this triangle could eventually be as oppressive and totalitarian as the previous dictatorships . . . the Islamists must make sure that they reconcile with the principles of democracy and modern statehood, not a division of labor with the military. . . . Many of the Islamists I spoke to reckon that if they have a majority they have a democratic right to change the constitution and govern as they see religiously fit. They don't recognize democracy as first and foremost a system of government based on democratic values that go beyond the right of the majority to rule, to ensure that the rights and privileges of the minorities are respected and preserved. . . .
The Invisible Arab is a thoughtful contribution to our understanding of the Middle East from one of its articulate new voices. Its author shows how the revolutions have evolved - and how it could all go terribly wrong. Marwan Bishara hopes for a free and democratic Middle East - and he has his fingers crossed. *
The summer of 2011 saw a proliferation of a phenomenon that has come to be known as "flash mobs." Organized largely through text messages and via Facebook and Twitter, the gangs of unruly youths, usually members of minority communities, have beaten and robbed citizens in Philadelphia, disrupted a fireworks display outside Cleveland, attacked fairgoers in Wisconsin, and looted a 7-Eleven in Germantown, Maryland.
Riots in London during the summer mirror some of the worst uprisings in modern U.S. history. Hundreds of stores across London, Manchester, Birmingham, and other British cities were torched or ransacked in four nights of mayhem after the police killing of a north Londoner named Mark Duggan, whose death was quickly overshadowed by the wave of recreational violence. "This is criminality, pure and simple," said Prime Minister David Cameron.
The looting was more than simply a race riot. While Duggan was black, and there are strong correlations between race and class in Britain, some of the worst violence happened in majority-white neighborhoods like Croydon. "This is much broader than race," says Caryl Phillips, a British writer with Afro-Caribbean roots. "This is about a whole group - black, white, and brown - who live just outside the law."
In the U.S., notes Jerry Ratcliffe, chairman of the Department of Criminal Justice at Temple University, and a former London police officer:
This is an old crime being organized with new tools. There's nothing new about groups of people assaulting people and robbing, but what's new is the technology. There's a fascination with the speed by which this can now take place. You can go from nothing happening to something happening in a matter of moments. Flash mobs take advantage of opportunities. Those opportunities are that the victims are outnumbered by the group and that there is an absence of law enforcement.
In Philadelphia, Mayor Michael A. Nutter, who is black, told marauding black youths, "You have damaged your own race." After imposing a strict curfew, Nutter told young people: "Take those God-darn hoodies down, especially in the summer. Pull your pants up and use a belt 'cause no one wants to see your underwear. . . ."
Mayor Nutter moved up the weekend curfew for minors to 9 p.m. and told parents that they would face increased fines for each time their child is caught violating the curfew.
The head of the Philadelphia chapter of the NAACP, J. Whyatt Mondesire, said it "took courage" for Mr. Nutter to deliver the message. "These are majority African-American youths and they need to be called on it."
In the past two years, Philadelphia has been the scene of a number of flash mobs in which youths meet at planned locations by texting one another and then commit assorted mayhem. In one episode, teens knocked down passersby on a Center City street and entered an upscale department store where they assaulted shoppers. In another incident, about 20 to 30 youths descended on Center City after dark, then punched, beat, and robbed bystanders. One man was kicked so savagely that he was hospitalized with a fractured skull. Police arrested four people, including an 11-year-old.
Speaking from the pulpit of his Baptist Church, Mr. Nutter delivered a 30-minute sermon about black families taking responsibility for the behavior. He said:
The Immaculate Conception of our Lord Jesus Christ took place a long time ago, and it didn't happen in Philadelphia. So every one of these kids has two parents who were around and participating at the time. They need to be around now.
The Mayor told parents:
If you're just hanging out out there, maybe you're sending them a check or bringing some cash by. That's not being a father. You're just a human ATM. . . . And if you're not providing the guidance and you're not sending any money, you're just a sperm donor.
Columnist Gregory Kane, who is black, writes:
What is particularly instructive in this instance is where the 11-year old (arrested in Philadelphia) ended up: in the custody of his grandmother. We don't know what the boy's mother and father are doing right about now, but we know what they aren't doing: parenting their son. . . . Excuses for the flash mobbers, many of whom are black, with some attacking whites at random . . . have been coming fast and furious. They need jobs, the excuse makers tell us. They need recreational facilities. What they need are parents who don't hesitate to put a foot squarely in their derrieres when a foot in that spot is needed.
In Mayor Nutter's view, the collapse of the black family is a key element in the problems we face.
Let me speak plainer: That's part of the problem in the black community. . . . We have too many men making too many babies they don't want to take care of, and then we end up dealing with your children.
In the U.S. at the present time, out-of-wedlock births are now at 41 percent of overall births, but there is a tremendous variation in illegitimate births by race. Such births are the norm in both the black (72 percent) and Hispanic (53 percent) communities, but less than a third of white births (29 percent) are illegitimate.
It is clear that there has been a racial component in the flash mob events this past summer. Columnist Gregory Kane states:
I don't know what else to call it when mobs of blacks single out whites to attack. But there still exists this notion that blacks can't be racists. Racism requires power, the thinking goes. Since blacks have no power, they can't be racists. Such nonsense is bad enough when left-wing loonies and black nationalist types parrot it. But the Rev. Jesse Jackson is a prominent black leader. He, at least, should know better. Alas, he does not. This, he declares, "Is nonsense."
The respected economist Thomas Sowell disputes the idea that the violence of flash mobs can be explained by disparities in income. In his view:
Today's politically correct intelligentsia will tell you that the reason for this alienation and lashing out is that there are great disparities and inequities that need to be addressed. But such barbarism was not nearly as widespread two generations ago, in the middle of the 20th century. Were there no disparities or inequities then? Actually there were more. What is different today is that there has been - for decades - a steady drumbeat of media and political hype about differences in income, education and other outcomes, blaming these differences on oppression against those with fewer achievements or lesser prosperity.
The fact that so many black voices are now being heard about the decline of the black family and the manner in which that decline has led to such events as the flash mobs of this past summer is a hopeful sign. No problem can be resolved unless it is properly understood. Hopefully, that understanding will grow and the real problems we face can then be addressed.
Crony capitalism - the close alliance of big business with government - leads not to free enterprise but its opposite, in which government, not the market, chooses winners and losers through subsidies and other forms of government largesse. Adam Smith, the great philosopher of capitalism, understood that businessmen want to maximize profits, and how it is done is of secondary interest. Indeed, he once said that when two businessmen get together, the subject of discussion is how to keep the third out of the market. Adam Smith - and more recent philosophers of the free market such as Hayek, Ludwig von Mises, and Milton Friedman - believed deeply in capitalism. Many businessmen, and many on Wall Street, do not.
Consider some of the recent manifestations of this phenomenon.
The U.S. Government guaranteed a $535 million loan for Solyndra, LLC, the now bankrupt California company that was the centerpiece of President Obama's "clean energy" future. There are at least 16 more such loan guarantees worth in excess of $10 billion.
From e-mails made public in mid-September by the House Energy and Commerce subcommittee on Oversight and Investigation, it is clear that key Solyndra loan decisions were guided primarily by political considerations.
President Obama was not in the White House when the proposal to back the company initially appeared in Washington, but two weeks before President George W. Bush left office, an Energy Department review panel unanimously recommended against making the loan. Even after Obama decided to support the proposal, career employees at the Office of Management and Budget cautioned against doing so. One predicted that Solyndra would run out of money and file for bankruptcy by September 2011. A Government Accountability Office report said that the Energy Department had circumvented its own rules at least five times to make the loan. The leading investors in Solyndra were two investment funds with ties to George B. Kaiser, a major fundraising "bundler" for Obama.
Both Republicans and Democrats supported the loan-guarantee program for alternative energy companies, which was created as part of the Energy Policy Act of 2005, approved by the Republican-controlled Congress. The act was sponsored by Rep. Joe Barton (R-TX), who has been a leader in the congressional probe of Solyndra's ties to the Obama administration.
Similarly, Senator Jim DeMint (R-SC) said in the Senate that the Solyndra case exposed the "unintended results when our government tries to pick winners and losers." This, of course, is quite true. Yet DeMint himself had been a supporter of the loan-guarantee legislation in 2005.
The fact is that solar companies are not the only energy companies getting federal loan guarantees. The power giant Southern Co. won a $3.4 billion loan guarantee from the Energy Department last summer. Yet, even some Republican critics of big government have supported this huge expenditure. Rep. Phil Gingrey (R-GA) declared that it was wrong to compare Southern to Solyndra because "Southern Co. owns Mississippi Power, Alabama Power, Georgia Power, among others, and employs literally thousands of people."
Washington Examiner columnist Timothy Carney notes that:
The implication was clear: Federal subsidies to big, established companies are fine. It's the handouts to these upstarts that are objectionable. So Gingrey is embracing the heart of Obamanomics - the proposition that government ought to be an active partner in shaping the economy and helping business. . . . If Republicans were willing to broaden their attack beyond criticizing this one (Solyndra) deal, they could indict the whole practice of government-business collusion.
Or consider the Export-Import Bank, supported by both Republicans and Democrats, which is a government agency that subsidizes U.S. exporters. Recently, it broke its record for the most subsidy dollars provided in a single year, primarily to Boeing.
Members of both parties have voted to bail out failed banks, auto companies, and other enterprises considered "too big to fail." Now, business interests are lining up to influence the work of the new congressional "supercommittee" that will help decide whether to impose massive cuts in spending for defense, health-care, and other areas. Nearly 100 registered lobbyists for big corporations used to work for members of the committee and will be able to lobby their former employers to limit the effect of any reductions. They are representing defense companies, health-care conglomerates, Wall Street banks, and others with a vested interest in the outcome of the panel's work. Three Democrats and three Republicans on the panel also employ former lobbyists on their staff.
The 12-member committee is tasked with identifying $1.5 trillion in spending reductions over a decade. "When the committee sits down to do its work, it's not like they're in an idealized platonic debating committee," said Bill Allison, editorial director of the Sunlight Foundation, which is tracking ties between lobbyists and the panel. "They're going to have in mind the interests of those they are most familiar with, including their big donors and former advisers."
General Electric, for example, has been awarded nearly $32 billion in federal contracts over the past decade, with much of that business going to lucrative defense and health-care subsidiaries. General Electric's chief executive, Jeffrey Immelt, also heads President Obama's Council on Jobs and Competitiveness. At least eight GE lobbyists used to work for members of the supercommittee.
Top donors to the deficit committee members include AT&T, $562,045; BlueCross/Blue Shield, $460,02; General Electric, $452,999; American Bankers Association, $421,883; Citigroup, $443,006; and National Association of Realtors, $418,000. Needless to say, they contribute to both parties.
A study last year from the London School of Economics found 1,113 lobbyists who had formerly worked in the personal offices of lawmakers. At least nine members of the 12-member supercommittee have scheduled fundraisers this fall, putting them in a position to take money from industry donors at the same time they are helping to decide what to cut from government spending. The most active fundraiser on the panel appears to be Rep. James Clyburn (D-SC), who has at least five donor events scheduled before the panel's Thanksgiving deadline. According to the Sunlight Foundation, contributions given during the time the supercommittee is meeting will not be disclosed to the Federal Election Commission until January - well after the final decision is made.
Sadly, free markets are genuinely embraced more often by intellectuals than businessmen. All too often, businessmen seek government subsidy, bailout, and intervention to keep competitors out of the market. When Congress acted to eliminate the Civil Aeronautics Board and the Interstate Commerce Commission and open up the airline and trucking industries to real competition, it was the industries themselves that opposed deregulation, for they had found a way to control the government agencies involved in their own behalf.
The old warning by the economist Friedrich Hayek that socialism in its radical form is not nearly as dangerous as socialism in its conservative form is worthy of serious reconsideration. When the advocates of state power and the advocates of corporate bigness become allies, government involvement in the economy - a form of socialism - is inevitable. The result is the crony capitalism we now face. *
In recent days, our country has been embroiled in three wars - Iraq, Afghanistan, and Libya.
Article I, Section 8, of the U.S. Constitution clearly gives Congress - not the executive - the power to declare war. Since the Constitution was signed in 1787, Congress has declared war five times: the War of 1812, the Mexican War, the Spanish-American War, and World Wars I and II. Yet, since 1787, the U.S. has been involved in numerous military conflicts without a declaration.
In the case of the Korean War, President Truman sent some 1.8 million soldiers, sailors, and airmen over a period of just three years, and 36,000 lost their lives - but he never sought or received a congressional declaration of war. Congress has not declared war since World War II, despite dozens of conflicts since then.
In 1973, Congress passed the War Powers Resolution, which was meant to counteract what Presidents Nixon and Johnson had done in Vietnam. Congress felt deceived, particularly since it was later discovered that the Gulf of Tonkin incident that precipitated a larger war had never, in fact, taken place.
The law, however, hardly reasserts Congress' very clear constitutional power to declare war. Instead, it simply asks for an authorization letter and then gives the President a three-month deadline. It requires the President to withdraw U.S. forces from armed hostilities if Congress has not given its approval within 60 days.
Even fulfilling the requirements of the War Powers Resolution appears to be too much for the Obama Administration. In fact, the President rejected the views of top lawyers at the Pentagon and the Justice Department when he decided that he had the legal authority to continue American military participation in the air war in Libya without congressional authorization.
Jeh C. Johnson, the Pentagon general counsel, and Caroline D. Krass, the acting head of the Justice Department's Office of Legal Counsel, told the White House that they believed that the U.S. military's activities in the NATO-led air war amounted to "hostilities" under the War Powers Resolution, which would require Mr. Obama to terminate or scale back the mission after May 20.
The President, however, adopted the legal analysis of the White House counsel, Robert Bauer, and several others who argued that the military's activities in Libya fell short of "hostilities." Under that view, Obama needed no permission from Congress to continue the mission unchanged.
Late in June, the House rejected a bill to authorize the U.S. military operations in Libya. The resolution to support the mission failed 295 to 123, with 70 Democrats joining Republicans in a rebuff to the President. Still, the House also defeated a measure that would have limited financing to support these efforts.
Rep. Jason Chaffetz (R-UT) said:
It didn't go far enough. Under that resolution, the president is still going to be engaged in the war. We've been inept and irrelevant on the war actions. We have not lived up to our constitutional duty.
In Libya, the goal of our mission appears to have changed from month to month. In March, the President said that U.S. intervention would be confined to implementing a no-fly zone. He declared that, "Broadening our mission to include regime change would be a mistake." By May, the mission was to make Libyans "finally free of 40 years of tyranny." By June, after more than 10,000 sorties, including those by attack helicopters, the strategy seems to boil down to an effort to eliminate Gaddafi himself.
While some have charged that opponents of the conflict in Libya are "isolationists," conservative columnist George Will notes that:
Disgust with this debacle has been darkly described as a recrudescence of "isolationism" as though people opposing this absurdly disproportionate and patently illegal war are akin to those who, after 1938, opposed resisting Germany and Japan. Such slovenly thinking is a byproduct of shabby behavior.
While men and women of good will may disagree about the merits of the U.S. intervention in Libya - or Afghanistan and Iraq - the larger question is whether one man, the President, can take the country to war without a congressional declaration, as clearly called for in the Constitution.
What we are dealing with is the dangerous growth of executive power. During the years of the New Deal, when the power of the President was dramatically expanded, Republicans, who were in the opposition, objected to the growth of such power as a threat to freedom. Later, when Republicans held the power of the Presidency, they, too, expanded executive power, and Democrats, now in opposition, objected. This has been characterized as argument from circumstance, not principle. If you hold power, you expand it. No one in power has an incentive to cede back the power that has been assumed.
Even at the beginning of the Republic, perceptive men such as John Calhoun predicted that government would inevitably grow: those in power would advocate a "broad" use of power, those out of power would argue for a "narrow" use of power, and no one would ever turn back government authority once it had been embraced.
Calhoun was all too prophetic when he wrote the following in "A Disquisition On Government":
. . . . Being the party in possession of government, they will . . . be in favor of the powers granted by the Constitution and opposed to the restrictions intended to limit them. As the major and dominant parties, they will have no need of these restrictions for their protection. . . . The minor or weaker party, on the contrary, would take the opposite direction and regard them as essential to their protection against the dominant party. . . . But where there are no means by which they could compel the major party to observe the restrictions, the only resort left then would be a strict construction of the Constitution. . . . To this the major party would oppose a liberal construction . . . one which would give to the words of the grant the broadest meaning of which they were susceptible.
It would then be construction against construction - the one to contract and the other to enlarge the powers of the government to the utmost. But of what possible avail could the strict construction of the minor party be, against the liberal interpretation of the major party, when the one and the other be deprived of all means of enforcing its construction? In a contest so unequal, the result would not be doubtful. The party in favor of the restrictions would be overpowered. . . . The end of the contest would be the subversion of the Constitution. . . the restrictions would ultimately be annulled and the government be converted into one of unlimited powers.
Our history shows that this is true. Republicans opposed big government when Democrats were in power, but spoke of concepts such as "executive privilege" when their own party held positions of authority. The Democrats have done exactly the same thing. The growth of government power has been a steady process, regardless of who was in office.
Those who want to rein in government power, to return to the federal system set forth in our Constitution, with its clearly defined separation of powers and checks and balances, would do well to turn their attention to the question of who has the power to take America to war. The Constitution did not give one man that power, although events in Afghanistan, Iraq, and Libya show us that this seems no longer to be the case. Concern over developments in Libya is a healthy sign that more and more Americans seem to be paying attention to the question of the war-making power.
Mounting evidence of a dramatic decline in American public education is leading to a renewed push for voucher programs across the country.
Details of decline are all around us. Some of the New York City high schools that received the highest grades under the Education Department's school assessment system are graduating students who are not ready for college. Of the 70 high schools that earned an "A" on the most recent city progress report and have at least one third of graduates attending college at City University of New York (CUNY), 46 posted remediation rates above 50 percent, according to reports sent to the city's high schools. Remediation rates - the percentage of students who fail a CUNY entrance exam and require remediation classes - rose to 49 percent in 2010 from 45 percent in 2007.
About three quarters of the 17,500 freshmen at CUNY community colleges this year have needed remedial instruction in reading, writing, or math, and nearly a quarter of the freshmen have required such instruction in all three subjects.
Fewer than half of all New York state students who graduated from high school in 2009 were prepared for college or careers, as measured by state Regents tests in English and math. In New York City, that number was 23 percent.
At LaGuardia Community College in Queens, where 40 percent of the math classes are remedial, faculty member Jerry G. Ianni says:
Most students have serious challenges remembering the basic rules of arithmetic. The course is really a refresher, but they aren't ready for a refresher. They need to learn how to learn.
About 65 percent of all community college students nationwide need some form of remedial education, with students' shortcomings in math outnumbering those in reading two to one, said Thomas R. Bailey, director of the Community College Research Center at Teachers College, Columbia University.
The New York State Education Department released new data in June showing that only 37 percent of students who entered high school in 2006 left four years later adequately prepared for college, with even smaller percentages of minority graduates and those in the largest cities meeting that standard. In New York City, 21 percent of those who started high school in 2006 graduated last year with high enough scores on state math and English tests to be deemed ready for higher education or well-paying careers. In Rochester, it was 6 percent; in Yonkers, 14.5 percent.
Nearly one fourth of the students who try to join the U.S. Army fail its entrance exam, painting a grim picture of an educational system that produces graduates who can't answer basic math, science, and reading questions. The report by the Education Trust bolsters a growing worry among military and education leaders that the pool of young people qualified for military service will grow too small.
"Too many of our high school students are not graduating ready to begin college or a career - and many are not eligible to serve in our armed forces," Education Secretary Arne Duncan said. "I am deeply troubled by the national security burden created by America's underperforming education system."
The report found that 23 percent of recent high school graduates don't get the minimum score needed on the enlistment test to join any branch of the military. Questions are often basic, such as: "If 2 plus X equals 4, what is the value of X?"
The military exam results are also of concern because the test is given to a limited pool of people. Pentagon data shows that 75 percent of those aged 17 to 24 don't even qualify to take the test because they are physically unfit, have a criminal record, or don't graduate from high school.
"It's surprising and shocking that we still have students who are walking across the stage who really don't deserve to and haven't earned that right," said Tim Callahan with the Professional Association of Georgia Educators, a group that represents more than 80,000 educators.
The study shows wide disparities in scores among white and minority students, similar to racial gaps on other standardized tests. Nearly 40 percent of black students and 30 percent of Hispanics don't pass, compared with 16 percent of whites. The average score for blacks is 39 and for Hispanics is 44, compared to whites' average score of 55.
The decline in American public education has led to a renewed campaign for a voucher system which would give middle-class and poor parents the same freedom of choice of where to send their children to school that only well-to-do parents now have.
Early in May, Indiana Governor Mitch Daniels signed what is probably the broadest voucher law in the country. A few days later, Oklahoma approved tax credits for contributions to a privately funded "opportunity scholarship" program for private schools. In New Jersey, in May, a voucher bill was approved by a Senate committee with bipartisan support. In Washington, D.C., the voucher program, which was killed by the Democratic majorities in the last Congress, is all but certain to be restored. In Wisconsin, Governor Scott Walker is pushing hard to broaden Milwaukee's voucher program to other cities and many more children.
According to the Foundation for Educational Choice, a pro-voucher group that lists Milton Friedman as its patriarch, more than 50 bills have emerged this year, some passed, some still pending, in 36 states - among them Arizona, Florida, Ohio, Oregon, and Pennsylvania - providing funding for vouchers, tax credits, or other tax-funded benefits for private education. "No year in recent memory," said foundation president Robert Enlow, "has provided better opportunities for the cause."
Writing in The Nation, Peter Schrag, a liberal, declares that, "Milton Friedman's vision for school choice is becoming a reality around the country."
Early in April, a divided Supreme Court further heartened the movement by upholding Arizona's law providing tax credits for contributions to "school tuition organizations" - scholarship funds for private and religious schools.
Many forget that vouchers have never been an exclusively conservative issue. In the 1960s, liberal school reformers like Paul Goodman and John Holt, pushing for "free schools," the "open school," and other escapes from what they regarded as "over-bureaucratized, lockstep" school structures, embraced vouchers as a way of getting there.
Later, liberals like Berkeley law professor John Coons, who helped launch lawsuits seeking equity in school spending, became strong voucher advocates as a way to allow poor and minority children some way out of the ghetto schools.
Clearly, the time seems to have come for a voucher system - and genuinely free choice for parents with regard to where to send their children to school.
The U.S. Supreme Court, in a 7-2 ruling late in June, declared, in a decision written by Justice Antonin Scalia, that a California law barring the sale of extremely violent video games to children violated children's First Amendment right to buy interactive games in which they vicariously steal, rape, torture, and decapitate people to score points.
Justice Scalia said that the state had no compelling interest in limiting the sale of such violent video games. He made light of studies showing that violent video games correlate with aggressive behavior in some children and denied that reading about violence is different from participating in a full-color, sound-filled interactive depiction in which the children themselves commit the violence.
Justices Stephen G. Breyer, a liberal, and Clarence Thomas, a conservative, filed the only dissents, arguing that the law was intended to empower parents, not erode the First Amendment. The law targets adults who sell this material to children. The goal, clearly, was not to disempower children but to curb predators.
In a concurring opinion, Justice Samuel Alito, joined by Chief Justice John G. Roberts, argued that the law should be struck down because of vagueness, but added that:
The Court is far too quick to dismiss the possibility that the experience of playing video games (and the effects on minors of playing violent video games) may be very different from anything that we have seen before. . . . In some of these games, the violence is astounding. Victims by the dozens are killed with every imaginable implement . . . dismembered, decapitated, disemboweled, set on fire and chopped into little pieces. They cry out in agony and beg for mercy. Blood gushes, splatters, and pools. Severed body parts and gobs of human remains are graphically shown. In some games, points are awarded based, not only on the number of victims killed, but on the killing technique employed.
In his dissent, Justice Thomas declared that:
The Framers could not possibly have understood the freedom of speech to include a qualified right to speak to minors. Specifically, I am sure that the founding generation would not have understood "the freedom of speech" to include a right to speak to children without going through their parents.
In his dissent, Justice Breyer quoted from a 1944 case, where the court recognized that the "power of the state to control the conduct of children reaches beyond the scope of its authority over adults."
Most adult Americans probably have no idea of the nature of the video games children are playing - and which the Supreme Court has now embraced as free speech. One such graphic game involves the player torturing a girl as she pleads for mercy, urinating on her, dousing her with gasoline and setting her on fire.
Among the most popular games is "Bloody Day," described this way:
Back alley butchering has never been so much fun. It's like having your own barrel with moderately slow moving fish. How many kills can you rack up?
Another is "Boneless Girl," which is presented in these terms:
Poke and pull this scantily clad babe all over bubble-land. You'll be amazed by the small spaces she can fit through, and throwing her across the screen never gets old.
Sadly, notes Joel Bakan, author of the forthcoming book Childhood Under Siege: How Big Business Targets Children, children:
. . . don't need to rent or buy casual games. They are available on computers, tablets, and cellphones - free. (California's law wouldn't have applied to these games, even if it had survived the court's scrutiny, because they are not rented or sold.) Many popular casual games contain as much violence as notorious video games like Postal 2 and Grand Theft Auto, if not more. But they tend to exist under the radar; they're part of an obscure world into which teenagers and children escape and about which parents are often in the dark. (I learned about them only after I asked my 12-year-old son what he liked to do online.)
Bakan reports that,
Nickelodeon's www.addictinggames.com, a premier casual game site, calls itself "the largest source of the best free online games." It attracts 20 million unique monthly users, mostly children and teens. . . . Like other leading casual game sites, www.addictinggames.com makes money by running advertisements. According to Viacom, the site's corporate owner, the aptly named site allows "junkies" to "gorge themselves" and to "fuel their addiction." Viacom's interest in promoting addiction helps explain why Nickelodeon, the award-winning children's network, might want to push brutal, violent entertainment. Violence sells. And it continues to sell to children, teens, and tweens "hooked" at an early age and hungry for more. . . . The games' use of graphic violence to generate profit is strategic and calculated.
In the 1949 case of Terminiello v. Chicago, Justice Robert H. Jackson, in a famous dissent, warned that the Constitution must not be converted "into a suicide pact." He wrote:
The choice is not between order and liberty. It is between liberty with order and anarchy without either. There is danger that, if the Court does not temper its doctrinaire logic with a little practical wisdom, it will convert the constitutional Bill of Rights into a suicide pact.
Discussing the California video game case, Brown v. Entertainment Merchants Association, Robert Knight, senior fellow for the American Civil Rights Union, notes that:
The Constitution is the greatest political document in history and the guarantor of our God-given rights. The First Amendment has proved foundational to maintaining all of our freedoms. Exceptions should be few and necessary. But in the hands of America's ruling lawmakers and jurists, the First Amendment is sometimes misapplied as a free pass for dysfunction and decadence.
It is apparently perfectly within the law for movie theaters to refuse to sell tickets to minors to see R- or X-rated movies. What is the difference when it comes to violent video games? To protect children from material of this kind has always been viewed as a sign of civilization. The evidence that exposure to violent material has a serious impact upon young people is widespread. Consider the role violent video games played in the lives of the perpetrators of the massacre at Columbine.
Why would the Supreme Court turn its back on such evidence - and normal common sense - to issue a ruling such as the one it did? This is difficult to understand, and we are fortunate that Justices Breyer and Thomas dissented. Their dissents, hopefully, provide a basis for revisiting this decision in the future.
Truth seems to be an increasingly rare commodity in contemporary American society.
In our political life, the lies are legion. In June, after ten days of adamant denials, Rep. Anthony Weiner (D-NY) finally admitted to having sent sexually explicit photographs to various young women. After telling the nation that he "did not have sexual relations with that woman," former President Clinton finally admitted the truth. Former Senator John Ensign (R-NV) denied the facts of his relationship with a married staff member and the payoff by his parents to the woman's husband. Former Senator John Edwards (D-NC) had a staff member claim to be the father of his mistress's child.
But lack of truth goes far beyond the personal lives of our politicians. Where were the weapons of mass destruction we were told Saddam Hussein possessed - and which were used as a justification for launching the war in Iraq? We now know that the Gulf of Tonkin incident, the precipitating event which led to President Johnson's launching the Vietnam War, did not really happen. Sadly, the list is a long one.
And it is not only in our political life that truth is hard to find. In an important new book, Tangled Webs: How False Statements Are Undermining America: From Martha Stewart to Bernie Madoff, James B. Stewart warns of the risks from an epidemic of perjury that has "infected nearly every aspect of society."
Citing prosecutors who speak of a recent surge of deliberate lying by sophisticated individuals, often represented by the best lawyers, he focuses on four cases involving well-known people: business executive and lifestyle guru Martha Stewart, convicted of lying to investigators about the reasons her ImClone stock was sold; former Dick Cheney adviser Lewis "Scooter" Libby, found guilty of perjury in conjunction with the leak of CIA operative Valerie Plame's identity; baseball star Barry Bonds, indicted for perjury related to illegal use of steroid drugs; and Bernard Madoff, who, while conducting the greatest Ponzi scheme in history and lying to investors and investigators, was never actually indicted for perjury.
Stewart is particularly outraged when it comes to the failure to indict Madoff for perjury. It was clear to Securities and Exchange Commission investigators in 2005 that he was lying about his investment business, but their superiors decided not to press the issue:
At the time of his sworn testimony in 2006, Madoff purported to have approximately $20 billion under management. By the time his scheme collapsed, he had $65 billion. Failing to pursue his lies cost innocent victims another $45 billion.
Stewart believes that lying is on the rise, threatening to swamp the legal system and sow cynicism nationwide. In the end, he argues, "it undermines civilization itself."
Consider the case of Greg Mortenson. His best-selling books Three Cups of Tea and Stones into Schools are full of lies and evasions. He tells the story of how, in 1993, he stumbled into the tiny Pakistani village of Korphe after a failed attempt at climbing K2. He explains how the kind villagers nursed him back to health with many cups of tea and how, as payment for their generosity, he returned to build a school. That school then became hundreds of schools across Pakistan and Afghanistan. Millions were inspired by the idea that a man could make such a profound difference in a desperate part of the world. Mortenson was nominated three times for the Nobel Prize. He was called a secular saint.
In April, as a result of an investigative report by bestselling author Jon Krakauer and a "60 Minutes" expose, we learned that Mortenson may very well be a charlatan. The most significant passages in the book seem to be fictitious, including the whole story about his recovery in Korphe. The "Taliban abductors" described in "Three Cups of Tea" were, in fact, friendly villagers protecting him as a guest of honor. It was reported that his charity is apparently badly mismanaged and that many of its schools stand empty, some of them serving as storage sheds for hay.
In 2009, only 41 percent of donations to Mortenson's charity went to its work in Afghanistan and Pakistan. Much of the rest, charge Krakauer and "60 Minutes," went to Mortenson himself - to chartered jets, massive purchases of his books (at retail, so he would get the royalties and keep them on the bestseller list), and advertisements for them in The New Yorker at more than $100,000 each time.
More and more Americans are also claiming military honors they never earned. Joseph Brian Cryer, for example, is a former candidate for City Council in Ocean City, Maryland. He claimed to be an elite U.S. Navy SEAL and bragged online about having "77 confirmed kills" in 102 hours during a Libyan operation in 1986. To prove his bona fides, he displayed a government ID card identifying him as 100 percent disabled and a Navy commander.
But Cryer is a fraud, said Don Shipley, a retired SEAL who makes it his business to expose false ones. Shipley has access to a database of all Navy SEALs since 1947. Since Navy SEAL Team 6 took out Osama bin Laden in April, he said he has received about 50 requests each day to investigate people who claim to be SEALs.
The list of those criminally charged for falsifying their military service is a long one. In one case, Command Sgt. Maj. Stoney Crump, the senior enlisted man at Walter Reed Army Medical Center, was fired for faking his record and wearing numerous unauthorized awards and decorations. He was sentenced to six months in prison.
In another case, former Marine Corps Sgt. David Budwah was sentenced in 2009 to 18 months confinement and fined $25,000 for pretending to be an injured war hero to get free seats at rock concerts and professional sporting events.
"Every society in history, since the caveman days, has revered its warriors," said B. G. Burkett, author of Stolen Valor. He has uncovered thousands of suspected fakes and says most lie out of lack of self-esteem. "They haven't done anything in their lives," he said. "But the second they say they're a warrior, everybody sees them in a different light."
Congress passed the Stolen Valor Act in 2006. The law makes it illegal for someone to falsely claim to hold military honors or decorations. But some of those who have faced criminal charges claim the law is unconstitutional, arguing that it violates the First Amendment. The law "has every good intention," said Ken Paulson, president of the First Amendment Center. "But courts have been reluctant to outlaw lying in America. It's just too prevalent to legislate."
Thus far, federal courts have split on the law's constitutionality. A federal judge in Virginia ruled this year that the First Amendment doesn't protect the false claims the act makes illegal. But the California-based 9th Circuit Court of Appeals found the law unconstitutional last year.
In May, Rep. Joe Heck (R-NV) introduced a revised Stolen Valor Act that would make it a crime of fraud to benefit, or intend to benefit, from lying about military awards. "It's not O.K. to misrepresent yourself as a physician and practice medicine," Mr. Heck said.
It's not O.K. to misrepresent yourself as a police officer. Why should you be able to misrepresent yourself as a member of the military, specifically if you're trying to gain something of value?
The widespread telling of untruths - and the claim that people have a legal right to engage in lying about basic credentials - is an indication of our society's current moral standards. In the end, more is involved than simply immoral behavior. Such behavior is, in fact, a threat to democratic self-government.
Edmund Burke, in his letter to a member of the French National Assembly in 1791, made a point we might well ponder today:
Men are qualified for civil liberty in exact proportion to their disposition to put chains upon their own appetites; in proportion as their love of justice is above their rapacity; in proportion as their soundness and honesty of understanding is above their vanity and presumption; in proportion as they are more disposed to listen to the counsels of the wise and good in preference to the flattery of knaves. Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less there is of it within, the more of it there must be without. It is ordained in the eternal constitution of things that men of intemperate minds cannot be free. Their passions forge their fetters.
American students are less proficient in their nation's history than in any other subject, according to results of a nationwide test released in June, with most fourth graders unable to say why Abraham Lincoln was an important figure, and few high school seniors able to identify China as the North Korean ally that fought American troops during the Korean War.
Overall, 20 percent of fourth graders, 17 percent of eighth graders and 12 percent of high school seniors demonstrated proficiency on the exam, the National Assessment of Educational Progress. Fewer than a third of eighth graders could answer what was described as a "seemingly easy question," asking them to identify an important advantage American forces had over the British during the Revolution, the government's statement on the results said.
Diane Ravitch, an education historian who was invited by the national assessment's governing board to review the results, said she was particularly disturbed by the fact that only 2 percent of 12th graders correctly answered a question concerning Brown v. Board of Education, which she called "very likely the most important decision of the U.S. Supreme Court in the past seven decades."
Students were given an excerpt from the ruling, including the following passage, and were asked what social problem the 1954 decision was supposed to correct:
We conclude that in the field of public education the doctrine of "separate but equal" has no place. Separate educational facilities are inherently unequal.
"The answer was right in front of them," Ms. Ravitch said. "This is alarming."
"The results tell us that, as a country, we are failing to provide children with a high-quality, well-rounded education," said Education Secretary Arne Duncan.
The evidence of our failure to teach our history is abundant. Fewer than half of American eighth graders knew the purpose of the Bill of Rights on the most recent national civics examination, and only one in 10 demonstrated acceptable knowledge of the checks and balances among the legislative, executive, and judicial branches, according to the test results released in April.
"These results confirm that we have a crisis on our hands when it comes to civics education," said Sandra Day O'Connor, the former Supreme Court justice, who last year founded icivics.org, a nonprofit group that teaches students civics through web-based games and other tools.
"The results confirm an alarming and continuing trend that civics in America is in decline," said Charles N. Quigley, executive director of the Center for Civic Education. "During the past decade or so, educational policy and practice appear to have focused more and more upon developing the worker at the expense of developing the citizen."
"We face difficult challenges at home and abroad," said Justice O'Connor.
Meanwhile, divisive rhetoric and a culture of sound bites threaten to drown out rational dialogue and debate. We cannot afford to continue to neglect the preparation of future generations for active and informed citizenship.
Historian David McCullough says that:
We're raising young people who are, by and large, historically illiterate. I know how much these young people - even at the most esteemed institutions of higher learning - don't know. It's shocking.
McCullough, who has lectured on more than 100 college campuses, tells of a young woman who came up to him after a lecture at a renowned university in the Midwest. "Until I heard your talk this morning, I never realized the original 13 colonies were all on the East Coast," she said.
Some years ago, when 111 ninth graders in a Honolulu school were asked to write the Pledge of Allegiance, no one could do it correctly. One response described the United States as a nation "under guard" and dedicated "for richest stand." A teacher, who asked not to be identified so her students would not be embarrassed, called the results frightening. She said all the students had spelling problems and had little grasp of what the pledge words meant. The word "indivisible," for example, came out as "in the visible." The teacher said that 12 students had trouble spelling the word "America." The word appeared in some papers as "Americain," "Americai," "Amereca," "Amicra," and "Amica." The teacher said, "I'm sick. I don't know what to do or where to turn."
These trends were hardly new. More than twenty years ago, writing in Public Opinion magazine, author Ben Stein reported:
Recently a 19 year-old junior at the University of Southern California sat with me while I watched "Guadalcanal Diary" on T.V. It goes without saying that the child had never heard of Guadalcanal. More surprisingly, she did not know whom the U.S. was fighting against in the Pacific. ("The Germans?") She was genuinely shocked to learn that all those people were Japanese and that the U.S. had fought a war against them. ("Who won?") Another student at USC did not have any clear idea when World War II was fought. . . . She also had no clear notion of what had begun the war for the U.S. Even more astounding, she was not sure which side Russia was on and whether Germany was on our side or against us. In fact, I have not yet found one single student in Los Angeles, in either college or in high school, who could tell me the years when World War II was fought. Nor have I found one who knew when the American Civil War was fought.
Stein laments that:
Unless our gilded, innocent children are given some concept of why the society must be protected and defended, I fear that they will learn too soon about a whole variety of ugly ideas they did not want to know about. . . . People who do not value what they have rarely keep it for long, and neither will we.
Things have gotten far worse since Stein wrote those words. One reason for students' poor showing on recent tests is the neglect shown to the study of history by federal and state policy makers - both Republicans and Democrats - especially since the 2002 No Child Left Behind Act began requiring schools to raise scores in math and reading, but in no other subject. This federal accountability law (surprisingly embraced by Republicans who previously argued that education was a state and local - not a federal - matter) has given schools and teachers an incentive to spend most of their time teaching to the math and reading tests, while ignoring history altogether.
"History is very much being shortchanged," said Linda K. Salvucci, a history professor in San Antonio who is chairwoman-elect of the National Council for History Education.
Historian Paul Johnson points out that:
The study of history is a powerful antidote to contemporary arrogance. It is humbling to discover how many of our glib assumptions, which seem to us novel and plausible, have been tested before, not once but many times and in innumerable guises; and discovered to be, at great human cost, wholly false.
Free societies are rare in history. If their history and values are not transmitted to the next generation, their survival is questionable. As Cicero (106-43 B.C.) understood:
To remain ignorant of things that happened before you were born is to remain a child. What is human life worth unless it is incorporated into the lives of one's ancestors and set in a historical context?
As immigration problems - particularly among the large North African and Middle Eastern populations in France, Germany, the Netherlands, Great Britain, and other West European countries - rise to the surface, the idea of "multi-culturalism" is coming under increasing criticism.
Germany's chancellor, Angela Merkel, called it "a total failure," and France's president, Nicolas Sarkozy, told an interviewer that immigrants should "melt into a single community." In a speech in Munich, Britain's Prime Minister, David Cameron, traced the problem of homegrown Islamist alienation and terrorism to "a question of identity."
"A passively tolerant society," Cameron said, "stands neutral between different values." But "a generally liberal country . . . says to its citizens, this is what defines us as a society: to belong here is to believe in these things."
The things Cameron went on to cite were freedom of speech and worship, democracy, the rule of law, and equal rights. Much of this is not new, as concern over multiculturalism has been growing for some time. A year after the London bombings of July 2005, Ruth Kelly, then the Labour Party minister in charge of community policies, asked whether - in its anxiety to avoid imposing a single British identity on diverse communities - multiculturalism had encouraged "separateness."
In December 2006, Tony Blair gave a speech on multi-culturalism which included many of Prime Minister Cameron's points. Both prime ministers called for tighter controls on Muslim groups receiving public funds, an entry ban on foreign preachers with extremist views, a tougher position on forced marriages, and an expectation that all British citizens support common values, from the rule of law to a rejection of discrimination.
French president Sarkozy declared:
If you come to France, you accept to melt into a single community, which is the national community, and if you do not want to accept that, you cannot be welcome in France. Of course, we must respect all differences, but we do not want . . . a society where communities coexist side by side.
Europe's dilemma is real, as is its need for immigrants. Deaths are expected to outnumber births this year in 10 of the European Union's 27 member states. As of 2015 the EU as a whole will experience negative natural population growth, demographers say, and the gap will grow to one million excess deaths a year by 2035. By 2050 the EU will have 52 million fewer people of working age, the European Commission warns. Businesses across Europe are already facing severe shortages of engineers, technicians, craftspeople, and other skilled professionals, with four million unfilled jobs across the continent.
For decades, most European countries have consigned immigrants to the margins. In Germany - which, until recently, continued to proclaim it was "not an immigrant society" - some professions were restricted to German citizens well into the 1990s, while eligibility for citizenship itself was based on bloodlines until a landmark reform in 2001. Millions of refugees were legally barred from working, which forced them into welfare dependency. Muslims, in particular, remain unintegrated and ghettoized in many European countries.
The attention now being focused on the need to integrate immigrants into European society is a hopeful sign. We have had this same debate in the U.S. for some time, but, for a variety of reasons, have done a better job in integrating immigrants into our society. Fortunately, our "melting pot" tradition has served us well.
What Americans have in common is not a common racial, ethnic, or religious background, but, instead, a commitment to the concept of individual freedom in a society established by the U.S. Constitution, which protects and preserves it.
We have, of course, had our advocates of "bi-lingual," "Afro-centric," and other forms of multi-cultural education. According to the multiculturalist worldview, notes Linda Chavez:
African-Americans, Puerto Ricans, and Chinese Americans living in New York City have more in common with persons of their ancestral group living in Lagos or San Juan or Hong Kong than they do with other New Yorkers who are white. Culture becomes a fixed entity, transmitted, as it were, in the genes, rather than through experience.
Historian Arthur M. Schlesinger, Jr. declared that:
Multiculturalists would have our educational system reinforce, promote, and perpetuate separate ethnic communities and do so at the expense of the idea of a common culture and a common national identity.
Afro-centric education and other forms of separate education for separate groups are the opposite of the traditional goal of civil rights leaders, who wanted only to open up American education to all students, regardless of race. The distinguished black leader of the early years of the last century, W. E. B. DuBois, disputed the multiculturalists of his own day. He said:
I sit with Shakespeare and he winces not . . . . Across the color line I move arm in arm with Balzac and Dumas. I summon Aristotle and Aurelius and what soul I will, and they come all graciously with no scorn or condescension. So, wed with Truth, I dwell above the veil.
To him, the timeless wisdom of the classical works of Western civilization spoke to all people and races, not just to whites of European ancestry.
Professor Seymour Martin Lipset of the Hoover Institution at Stanford University declares:
The histories of bilingual and bicultural societies that do not assimilate are histories of turmoil, tension, and tragedy. Canada, Belgium, Malaysia, Lebanon - all face crises of national existence in which minorities press for autonomy, if not independence. Pakistan and Cyprus have divided. Nigeria suppressed an ethnic rebellion. France faces difficulties with its Basques, Bretons, and Corsicans.
European societies will resolve their difficulties with today's immigrants when they adopt the American model: recognizing that their societies will no longer be homogeneous, and that their goal should be to assimilate new immigrants into the culture and civilization of France, England, and the other Western European countries.
Some of today's Islamic immigrants may pose a greater challenge, but this should not prove insurmountable if the proper policies are adopted. Finally, Western European leaders seem to have come to the understanding that multiculturalism is not the way.
Immigrants leave their native countries for the West because there is something in the West they want. It is the responsibility of these Western European countries to transmit to their new immigrants the values, culture, and civilization of their societies. This has been going on in our own country for more than 300 years with great success. The first female black candidate for president, Rep. Shirley Chisholm (D-NY) once said, "We came over on different ships, but we're in the same boat now." Multiculturalism is a dangerous detour from the assimilation and acculturation of immigrants which should be the primary goal of those societies now receiving large numbers of new residents. Abandoning the notion that Western European countries are "not immigrant societies" is an important step forward. *