Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. vice president, members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
This year marks the 40th anniversary of the 1968 publication of The Great Terror by Robert Conquest. In a preface to the 40th anniversary edition of the book, Conquest notes that:
The history of the period covered by "The Great Terror" sees the enforcement of Stalin's totally intolerant belief system -- with terror as the decisive argument. Terror meant terrorizing. Mass terror means terrorizing the whole population, and must be accompanied by the most complete exposure of the worst enemies of the people, of the party line, and so of the truth. We know the results.
Recalling 1968, when the book first came out, Conquest notes that:
It was still true that, as the great historian Francois Furet noted, after the war and the demise of fascism, "all the major debates on postwar ideas revolved around a single question: the nature of the Soviet regime." He adds the paradox that Communism had two main embodiments -- as a backward despotism, and as a constituency in the West that had to be kept unaware of the other's reality. And, up to the last, this was often accompanied by a view of the Cold War as an even exchange -- with the imputation that any denigration of the Soviet regime was due to peace-hating prejudices.
Since the end of the Cold War, the reality of Communism's terror and brutality has been widely discussed. In 1999, for example, The Black Book of Communism, an 846-page academic study that blames Communism for the deaths of between 85 million and 100 million people worldwide, became a bestseller. It estimates that the ideology claimed 45 million to 72 million in China, 20 million in the Soviet Union, between 1.3 million and 2.3 million in Cambodia, 2 million in North Korea, 1.7 million in Africa, 1.5 million in Afghanistan, 1 million in Vietnam, 1 million in Eastern Europe, and 150,000 in Latin America.
Through all those years, many intellectuals in the West insisted on disassociating Communism from the crimes committed in its name. Incredibly, in retrospect, we see many Western academics, clergymen, journalists, and literary figures not resisting Communist tyranny, but embracing it, defending it, and apologizing for it.
Consider the German playwright Bertolt Brecht, who created the modern propaganda play. When he visited the Manhattan apartment of American philosopher Sidney Hook in 1935, Stalin's purges were just beginning. Hook, raising the cases of Zinoviev and Kamenev, asked Brecht how he could bear to work with the American Communists who were trumpeting their guilt. Brecht replied that the U.S. Communists were no good -- nor were the Germans either -- and that the only body that mattered was the Soviet party. Hook pointed out that they were all part of the same movement, responsible for the arrest and imprisonment of innocent former comrades. Brecht replied:
As for them, the more innocent they are, the more they deserve to be shot.
Hook asked, "What are you saying?" Brecht repeated: "The more innocent they are, the more they deserve to be shot." Hook asked, "Why, why?" Brecht did not answer. Hook got up, went into the next room, and brought Brecht his hat and coat.
During the entire course of Stalin's purges, Brecht never uttered a word of protest. When Stalin died, Brecht's comment was:
The oppressed of all five continents . . . must have felt their heartbeats stop when they heard that Stalin was dead. He was the embodiment of their hopes.
Consider the French philosopher Jean-Paul Sartre. In a July 1954 interview with Libération, Sartre, who had just returned from a visit to Russia, said that Soviet citizens did not travel, not because they were prevented from doing so, but because they had no desire to leave their wonderful country. "The Soviet citizens," he declared, "criticize their government much more and more effectively than we do." He maintained that "There is total freedom of criticism in the Soviet Union."
Another intellectual defender of tyranny was Lillian Hellman, the American playwright. She visited Russia in October 1937, when Stalin's purge trials were at their height. On her return, she said she knew nothing about them. In 1938 she was among the signatories of an ad in the Communist publication New Masses that approved the trials. She supported the 1939 Soviet invasion of Finland, saying:
I don't believe in that fine, lovable little Republic of Finland that everyone is so weepy about. I've been there and it looks like a pro-Nazi little republic to me.
There is absolutely no evidence that Hellman ever visited Finland -- and her biographer states that this is highly improbable.
The Quaker H. T. Hodgkin provided this assessment:
As we look at Russia's great experiment in brotherhood, it may seem to us some dim perception of Jesus' way, all unbeknown, is inspiring it.
The case of New York Times correspondent Walter Duranty, who covered the Soviet Union in the 1930s, is also instructive. In the midst of the enforced famine in the Ukraine, Duranty visited the region and denied that starvation and death were rampant. In November 1932, Duranty reported that "there is no famine or actual starvation nor is there likely to be."
In the Times of August 23, 1933, Duranty wrote:
Any report of a famine in Russia is today an exaggeration or malignant propaganda. . . . The food shortage which has affected almost the whole population last year . . . has, however, caused heavy loss of life.
He estimated the deaths at nearly four times the usual rate, but did not blame Soviet policy. What Americans got was not the truth but false reporting -- and its influence was widespread. What Walter Duranty got was the highest honor in journalism -- the Pulitzer Prize for 1932, awarded for "dispassionate, interpretive reporting of the news from Russia." The citation declared that Duranty's dispatches -- which the world now knows to have been false -- were "marked by scholarship, profundity, impartiality, sound judgment, and exceptional clarity."
Walter Duranty was only one of many correspondents and writers in the 1920s and 1930s who fed their readers in the West a steady diet of disinformation about the Soviet Union. Louis Fischer, who wrote for The Nation magazine, was also reluctant to tell his readers about the flaws in Soviet society. He, too, glossed over the searing famine of 1932-33. He once referred to what we now know as the "Gulags" as "a vast industrial organization and a big educational institution." In 1936 he informed his readers that the dictatorship was "voluntarily abdicating" in favor of democracy.
Liberal intellectuals who were harsh in their judgment of the American society eagerly embraced the ruthless dictatorship of Joseph Stalin.
Concerning the forced collectivization of Soviet agriculture, author Upton Sinclair wrote:
They drove rich peasants off the land -- and sent them wholesale to work in lumber camps and on railroads. Maybe it cost a million lives -- maybe it cost five million -- but you cannot think intelligently about it unless you ask yourself how many millions it might have cost if the changes had not been made.
Journalist I. F. Stone, lionized by the media in recent years as the quintessential model of a newsman, commented on the new Soviet constitution of 1936:
There is only one party, but the introduction of the secret ballot offers the workers and peasants a weapon against bureaucratic and inefficient officials and their policies.
W. E. B. Du Bois, the black intellectual, thought that "He (Stalin) asked for neither adulation nor vengeance. He was reasonable and conciliatory."
From the Russian Revolution of 1917 onward, the world was engaged for many years in a struggle between freedom and tyranny. Now that the reality of Communism's horrors is widely known, it is only proper that we remember those who defended liberty and those who did not. In that battle, sadly, many in the U.S. and other Western countries used their considerable abilities to advance not freedom but tyranny. In the foreword to the 40th anniversary edition of "The Great Terror," Robert Conquest writes that:
One of the strongest notions put forward about Stalinism is that in the interests of "objectivity" we must be -- wait for it -- "nonjudgmental." But to ignore, or downplay, the realities of Soviet history is itself a judgment, and a very misleading one.
Maureen Faulkner's husband, Philadelphia police officer Danny Faulkner, was shot between the eyes on a cold December night in 1981. Mumia Abu-Jamal was unanimously convicted of the crime by a racially mixed jury, based on the testimony of several eyewitnesses, his ownership of the murder weapon, matching ballistics, and his own confession.
After his conviction, a national anti-death penalty crusade was started to "Free Mumia." Mike Farrell, Ed Asner, Whoopi Goldberg, and Jesse Jackson rallied on his behalf. While on death row, Abu-Jamal published several books, delivered radio commentaries, was a college commencement speaker, and was named an Honorary Citizen of France.
In a new book, Murdered by Mumia (The Lyons Press), Maureen Faulkner and popular radio talk show host and journalist Michael Smerconish carefully lay out the case against Abu-Jamal and those who have elevated him to the status of political prisoner. Smerconish, an attorney, has provided pro bono legal counsel to Faulkner for over a decade, as appeal after appeal was brought by Abu-Jamal's lawyers. Smerconish declares that:
My reading of five thousand pages of trial transcripts starkly revealed that Abu-Jamal murdered Danny Faulkner in cold blood and that the case tried in Philadelphia in 1982 bears no resemblance to the one being home-cooked by the Abu-Jamal defense team.
The facts of the case, as determined in court, are clear. At 3:45 a.m., Dec. 9, 1981, Philadelphia police officer Daniel Faulkner stopped a beat-up Volkswagen driven by William Cook. Cook got out of the car and while officer Faulkner was using a flashlight to examine what was probably Cook's driver's license, Cook struck Faulkner. Faulkner responded by smacking Cook with his flashlight, spinning him around, and starting to frisk him.
As Faulkner searched Cook -- who was allegedly driving the wrong way on a one-way street -- a cab driver, one-time radio journalist, and Black Panther activist named Mumia Abu-Jamal (born Wesley Cook) came up behind him and opened fire with a .38 revolver at close range. Faulkner was hit in the back, but he managed to return fire and hit his attacker in the lower chest. As Faulkner writhed on the ground, Abu-Jamal stood over the wounded officer and executed him with a point-blank shot to the head.
These facts led a Philadelphia jury to convict Mumia Abu-Jamal and sentence him to death. But for the last 26 years, these facts have been disputed by a powerful cadre of lawyers, liberal and radical politicians, the virulently anti-police radical group MOVE, and anti-death penalty celebrities. And for 26 years Maureen Faulkner has been fighting back with every bit of energy and resources she could muster.
In the foreword to the book, Michael Smerconish notes that:
. . . there has always been plenty of evidence available to me and anyone else with a modem to suggest that Abu-Jamal murdered her (Maureen's) husband: There were several eyewitnesses; people of color were part of the jury; and Abu-Jamal was a known agitator who had advocated violence toward law enforcement (he wrote "Let's Write Epitaphs for Pigs, Signed Mumia" in a Black Panther publication in April of 1970). Moreover, Abu-Jamal has never explained what took place that night (which is certainly one of the most puzzling aspects of the case if one is inclined to side with him) and his own brother, William Cook, who was present at the murder, has himself never testified on Abu-Jamal's behalf. Common sense dictates that if one brother is on death row for a crime the other brother knows he didn't commit (because the second brother was himself present), he'd say so. But not William Cook. His silence has been deafening. Incredibly, this has never seemed to matter to Abu-Jamal's celluloid supporters.
The embrace of Mumia Abu-Jamal by many in the media, the academic world, Hollywood, and among political leaders, is incredible. Consider some of that support.
On August 9, 1995, a full-page ad appeared in The New York Times. It prominently listed such Hollywood supporters of Abu-Jamal as Alec Baldwin, Mike Farrell, Spike Lee, Susan Sarandon, and Oliver Stone. On July 14, 1995, author E. L. Doctorow wrote a column of support. The Times ad also included the following signatories: Shana Alexander, Maya Angelou, Russell Banks, Derrick Bell, Noam Chomsky, Kerry Kennedy Cuomo, Ronald V. Dellums, David Dinkins, Henry Louis Gates, Danny Glover, Günter Grass, Charles Rangel, Gloria Steinem, Alice Walker, and Cornel West.
During a 1995 court hearing of an appeal by Abu-Jamal, The Philadelphia Inquirer reported that:
Outside the courtroom, Harvard philosophy and religion professor Cornel West likened Abu-Jamal to jazz great John Coltrane and the Rev. Martin Luther King, Jr. . . . West compared the atmosphere in the courtroom to "Mississippi."
Judge Albert Sabo ruled that Mumia did not deserve a new trial.
National Public Radio decided to offer airtime to Mumia. His prison essays about life on death row were to be carried by the "All Things Considered" program. A public backlash ensued. Arnold Gordon, the First Assistant District Attorney of Philadelphia, wrote to Delano Lewis, president and CEO of NPR, on the day that Abu-Jamal's commentaries were set to be aired. He declared:
You have rewarded the murderer of a 25-year-old police officer who left a grieving widow, and a mother, by giving him a platform from which to address perhaps millions of listeners. Who is your next media star -- Sirhan Sirhan? John Hinckley? Jeffrey Dahmer? Have you no sense of decency, no sense of what is right and wrong?
NPR scrapped the project, but Pacifica Radio, a radical media outlet, decided to air the already taped segments that NPR had dropped. To this day, Abu-Jamal remains a commentator on Pacifica Radio.
HBO, on July 7, 1996, aired a one-hour documentary about Mumia entitled "A Case for Reasonable Doubt." On March 25, 1997, the Santa Cruz, California, City Council passed a formal resolution calling for a new trial for Abu-Jamal. Maureen Faulkner reports that:
The City of San Francisco . . . joined the pro-Abu Jamal parade and actually honored the man who murdered my police officer husband. And they did it in grand style. Three thousand supporters gathered at Mission High School's auditorium on Aug. 16, 1997, for the event. The key speakers at the function were Geronimo Ji Jaga (Pratt), a former Black Panther who spent 27 years behind bars for murdering a couple (a "political prisoner" if you believe the pro-Abu-Jamal literature), and author Alice Walker. The event raised $30,000 to help pay Abu-Jamal's continuing defense bills. San Francisco Mayor Willie Brown, Jr. presided at the event.
The certificate of honor read to the crowd declared:
The Board of Supervisors of the City and County of San Francisco hereby issues, and authorizes the execution of, the Certificate of Honor in appreciative public recognition of distinction and merit for outstanding service to a significant portion of the people of the city and county of San Francisco by: Mumia Abu-Jamal. In recognition of his struggle for justice, and the community rally calling for his freedom from imprisonment, and honor this struggle, designate August 16, 1998, Mumia Abu-Jamal Day in San Francisco.
Signed by Supervisor Tom Ammiano.
Abu-Jamal has even been a commencement speaker. Students at Evergreen State College in Olympia, Washington, are given the opportunity to select their own commencement speakers. The class of 1999 wanted Abu-Jamal. One year after his address (on tape) at Evergreen State, students at Antioch College in Yellow Springs, Ohio, asked Abu-Jamal to be a speaker at their commencement on April 29, 2000.
Maureen Faulkner traveled to Antioch and, after holding a vigil, recalls:
. . . we were ushered to the actual graduation to be quarantined in a specific area, far from the actual ceremony, cordoned off by a blue ribbon. I had planned on attending a graduation ceremony at a college, but (I'm not exaggerating) it was like one of the rallies the Nazis staged in Nuremberg. The buildings surrounding the open-air stage and spectator seats were adorned with streaming banners. Oversized posters with Abu-Jamal's haunting grimace were everywhere and "Free Jamal" banners waved in the wind.
On December 4, 2001, the Paris City Council voted to name Pennsylvania's famous death row inmate an "Honorary Citizen" of Paris. The last time such an honor had been bestowed was on artist Pablo Picasso, in 1971.
When Pulitzer Prize-winning journalist Buzz Bissinger (author of Friday Night Lights) profiled Danny Faulkner's murder for Vanity Fair in the summer of 1999, he asked actor Ed Asner, a vocal Mumia supporter, if he'd read the original trial transcript. Asner replied, "Could I stay awake?" Maureen Faulkner writes:
That answer speaks volumes. . . . I have always been shocked by the readiness of Asner and others from the Hollywood left to attach their names to a murder case without reading every scrap of paper suggesting who did it. I think it is essential to read and appreciate the evidence in order to understand the extent to which the myth of Mumia Abu-Jamal separates from reality.
On June 1, 1995, Pennsylvania Governor Tom Ridge finally signed Abu-Jamal's death warrant. Maureen Faulkner recalls that:
There was an unprecedented level of combative Abu-Jamal support in the summer of 1995. The president of France and the foreign minister of Germany made public appeals on Abu-Jamal's behalf. In Rome, 100,000 people signed a petition to stop his execution. And four American cities -- Cambridge, MA, Ann Arbor and Detroit, MI, and Madison, WI -- passed resolutions demanding a new trial for Abu-Jamal. His defenders, motivated by a tenacious brand of unfounded conviction, threatened to burn down Philadelphia if Abu-Jamal himself "burns." "Fire in the Skies if Mumia Dies" was the banner many of them held.
The murder was in 1981. In 1982, Abu-Jamal was convicted and sentenced to death. In 1989, his conviction and sentence were upheld by the Pennsylvania Supreme Court. The Commonwealth's highest court also rejected subsequent appeals in 1995, 1996, and 1997. Now, with state appeals exhausted, Abu-Jamal has turned his attention to the federal courts. Maureen Faulkner states that:
I never could have imagined that seven years into the next century my family and I would still be taking time from our lives to attend appeals hearings. This process is obscene in the way it taints survivors' lives for so long. You can never move on. There's never any closure; just endless rounds of hearings and motions. . . .
This book tells the story of a courageous woman, a flawed criminal justice system with its endless appeals, and a body of elite opinion, both in the U.S. and abroad, that is willing to overlook the facts of a criminal case and make a martyr and "political prisoner" of a cold-blooded killer. The proceeds from this book are going to a charity Maureen Faulkner started to fund scholarships for the children of murder victims, and for the children of people incapacitated by violent crime.
This is the first book to carefully lay out the case against Mumia Abu-Jamal and those who have elevated him to the status of political prisoner. As Abu-Jamal's lawyers contemplate their final appeal, this never-before-told account of one fateful night is compelling reading. *
"It is the madness of folly, to expect mercy from those who have refused to do justice; and even mercy, where conquest is the object, is only a trick of war; the cunning of the fox is as murderous as the violence of the wolf." --Thomas Paine
In 2005, the Republican-led Congress and President Bush backed a bill that required widespread ethanol use in motor fuels. This year, the Democratic-led Congress passed, and President Bush signed, energy legislation that boosted the mandate for minimum corn-based ethanol use to 15 billion gallons, about 10 percent of motor fuel, by 2015. This was strongly supported by farm-state lawmakers and those worried about energy security and eager to substitute a home-grown energy source for a portion of U.S. petroleum imports. To help this process, motor-fuel blenders receive a 51-cent subsidy for every gallon of corn-based ethanol used through the end of 2010; this year production could reach 8 billion gallons.
Now, however, we are seeing what can be called the unintended consequences of political interference with the marketplace. With food riots erupting in many parts of the world, a weak dollar, high energy costs, and low crop yields in places such as Australia, the subsidization of ethanol -- diverting food to fuel -- has clearly contributed to and exacerbated the situation.
"The price of grain is now directly tied to the price of oil," says Lester Brown, president of Earth Policy Institute, a Washington research group. "We used to have a grain economy and a fuel economy. But now they're beginning to fuse."
Those who use corn to feed cattle, hogs, and chickens are being squeezed by high corn prices. In April, Tyson Foods reported its first loss in six quarters and said that its corn and soybean costs would increase by $600 million this year. Egg producers are passing high corn prices on to consumers. The wholesale price of eggs in the first quarter soared 40 percent from a year earlier, according to the Agriculture Department. The retail prices of many food items, from cereal to salad dressing, are moving upward because of more expensive ingredients such as corn syrup and cornstarch.
The problem with subsidizing the production of ethanol is multi-dimensional, leading not only to increased food costs but also to increased energy costs. Economist Walter Williams declares that:
Ethanol contains water that distillation cannot remove. As such, it can cause major damage to automobile engines not specifically designed to burn ethanol. The water content of ethanol also risks pipeline corrosion and thus it must be shipped by truck, rail car, or barge. These are far more expensive than pipelines. Ethanol is 20 to 30 percent less efficient than gasoline, making it more expensive per highway mile. It takes 450 pounds of corn to produce the ethanol to fill one SUV tank. That's enough to feed one person a year. Plus, it takes more than one gallon of fossil fuel to produce one gallon of ethanol. After all, corn must be grown, fertilized, harvested, and trucked to ethanol producers -- all of which are fuel-using activities.
In Williams' view:
Ethanol is costly and it wouldn't make it in a free market. That's why Congress has enacted major ethanol subsidies . . . which is no less than a tax on consumers. In fact, there's a double tax -- one in ethanol subsidies and another in handouts to corn farmers to the tune of $9.5 billion in 2005 alone.
A quarter of American corn is now turned into ethanol, and that is set to rise. Last year the federal government mandated that ethanol production grow five-fold by 2022. Ed Feulner, president of the Heritage Foundation, points out that:
The food crisis should surprise no one. When 25 percent of a staple crop is taken off the table, shortages result. . . . Unfortunately, the cornfield isn't the only place where federal policy causes troubles. Our country is also seeing a shortage of wheat -- partly because many wheat farmers have switched to corn, and partly because Washington pays them whether they grow wheat or not. . . . Corn is the answer to our food problems, not our fuel problems.
Last year, two economics professors predicted the current food shortage. C. Ford Runge and Benjamin Senauer wrote in Foreign Affairs:
By putting pressure on global supplies of edible crops, the surge in ethanol production will translate into higher prices for both processed and staple foods around the world. Biofuels have tied oil and food prices together in ways that could profoundly upset the relationships between food producers, consumers, and nations in the years ahead, with potentially devastating implications for both global poverty and food security.
There are also potential negative implications for the environment in government subsidization of ethanol. Dr. William Laurance, a scientist with the Smithsonian Tropical Research Institute, reports that "Biofuel from corn doesn't seem very beneficial when you consider its full environmental costs." He notes that the $11 billion a year American taxpayers spend to subsidize corn producers "is having some surprising global consequences." Those consequences include Amazon forests being clear-cut so that farmers can plant soybeans.
Scientists are showing that ethanol will exacerbate greenhouse gas emissions. A February report in the journal Science found that "corn-based ethanol, instead of producing a savings, nearly doubles greenhouse emissions over 30 years. . . . Biofuels from switchgrass, if grown on U.S. corn lands, increase emissions by 50 percent." Princeton's Timothy Searchinger and colleagues at Iowa State University found that markets for biofuels encourage farmers to level forests and convert wilderness into cropland.
Editorially, the Wall Street Journal declares that:
Congress' ethanol subsidies are merely force-feeding an industry that is doing more harm than good. The results included distorted investment decisions, higher carbon emissions, higher food prices for Americans, and an emerging humanitarian crisis in the developing world. The last thing the poor of Africa and the taxpayers of America need is another scheme to conjure gasoline out of corn and tax credits.
The state of Texas is now in official opposition to the federal ethanol mandate. Governor Rick Perry has petitioned the Environmental Protection Agency for a one-year reprieve. Because of the federal mandate to add ethanol to gasoline, Texas ranchers are being forced into bidding wars with ethanol plants for the grains they feed their cattle. Governor Perry calculates that the mandate for ethanol may push the price of corn to $8 a bushel (it's at $6 now, up from $2 in 2004), and could cost the Texas economy nearly $3.6 billion this year.
In May, Senator Kay Bailey Hutchison (R-TX) called for a freeze on ethanol mandates and quickly got the support of two dozen of her Republican Senate colleagues, among them Senator John McCain -- who has traditionally opposed ethanol subsidies. Needless to say, farm-state senators -- of both parties -- led by Senators Charles Grassley (R-IA) and Tim Johnson (D-SD), are defending ethanol as representing only a small fraction of the rise in food prices. Senator Hillary Clinton (D-NY) opposed ethanol subsidies in the past, but embraced them prior to the Iowa caucuses. Senator Barack Obama (D-IL) proposed mandating a staggering 65 billion gallons a year of ethanol. His energy plan calls for "expanding federal tax credit programs" for ethanol and proposes "an additional subsidy per gallon of ethanol" for locally-funded ethanol plants. By mid-May, as the facts of ethanol's real impact upon the economy and food supply became increasingly clear, Obama suggested that perhaps helping "people get something to eat" was a higher priority than biofuels.
Finally, some lawmakers are moving to suspend the law mandating the growth of ethanol production. Neither economically nor morally can we afford to subsidize the burning of so much corn while people go hungry. Did no one who crafted the ethanol subsidy think ahead and foresee that farmers would plant a lot of new corn on acres where they once grew other food crops such as soybeans, and that they would sell all of the new corn to ethanol distilleries? The result, which should have been anticipated but was not, is that there are fewer acres devoted to food crops, and there is less corn available for feeding livestock at a time when worldwide demand for meat and milk is rising. Lower supply plus greater demand, they seem to have ignored, equals higher prices.
Congress -- both Republicans and Democrats -- has created an artificial demand for ethanol to satisfy the farm lobby -- and business interests such as Archer Daniels Midland (ADM), the country's largest producer of ethanol. Soaring food prices have sent farmers' incomes to record heights, yet Congress lavished additional welfare upon them by passing a new five-year, $280 billion farm bill.
"When millions of people are going hungry," Palaniappan Chidambaram, India's finance minister declared, "it's a crime against humanity that food should be diverted to biofuels."
Though black conservatives are becoming increasingly prominent voices within African American politics and culture, few realize that the black conservative tradition predates the Civil War and is an intellectual movement with deep historical roots.
In an important new book, Saviors or Sellouts (Beacon Press), Professor Christopher Alan Bracey, who teaches law and African and African American Studies at Washington University in St. Louis, traces the evolution of black conservative thought from its origins in antebellum Christian evangelism and entrepreneurialism to its contemporary expression in policy debates over affirmative action, law enforcement practices, and the corrosive effects of urban black artistic and cultural expression.
Dr. Bracey, no conservative himself, but a fair-minded scholar, examines black neoconservatives such as Shelby Steele and John McWhorter and reveals the philosophies of prominent political conservatives such as Clarence Thomas, Colin Powell, and Condoleezza Rice, as well as intellectuals such as Thomas Sowell, Anne Wortham, and Walter Williams. He has a revealing chapter on the infotainment effect of Bill Cosby, Chris Rock and a number of bloggers.
"Black conservatives are quickly becoming the most visible and prominent voices within African American politics, culture and society," writes Bracey.
The rising tide of black conservatism will invariably shape policy that will define the social, political, and economic future of African Americans as well as other socially disfavored groups. . . . I believe it vital that we all understand and appreciate the historic role of black conservatism in promoting racial empowerment in this country and give this tradition its proper respect.
The most prominent writings on American race relations in recent years, in Bracey's view, are by self-proclaimed black conservatives. Among them are: John McWhorter's Losing the Race: Self-Sabotage in Black America; Shelby Steele's The Content of Our Character: A New Vision of Race in America and White Guilt: How Blacks and Whites Together Destroyed the Promise of the Civil Rights Era; and Thomas Sowell's Black Rednecks and White Liberals. Beyond this, notes Bracey:
Even comedian Bill Cosby seems to have embraced conservatism with newfound fervor, offering a scathing critique of African American cultural practices that he deems destructive to the black community. Enter the blogosphere, and one finds a small but vibrant and growing community of black conservatives eager to present and exchange ideas on conservative strategies for racial empowerment. . . . For an increasing number of African Americans, conservatism has become a credible, compelling alternative to traditional liberal modes of political, cultural and economic empowerment.
Those who say that black conservatism is a fringe and inauthentic voice of the African American community, Bracey argues, ignore the real historical context:
This prevailing view . . . obscures the important, and often overlooked legacy of black conservatives that has existed since the Founding of the Republic. Indeed, from the Founding until the early 20th century -- nearly 150 years -- conservatism was the dominant mode of black political engagement with white society. Black conservatism, like any other intellectual movement, is perhaps best understood as a shifting, organic mood or consciousness developed over time by African Americans in response to specific lived conditions. . . . Black conservatism has proved remarkably durable and consistent over the years, and modern black conservatism ought to be viewed as part of the important black political tradition.
The touchstone of black conservative discourse, Bracey points out, has been the African American Protestant ethic -- a kind of middle-class morality. The foundations for success, in this view, are respectability, proper deportment, and a serious commitment to a healthy and productive lifestyle. In the early years of the Republic, Richard Allen, whose leadership rested upon his role as pastor and his status as a successful businessman, repeatedly told his congregation that hard work and "middle class propriety" were vital to free blacks, and that blacks were morally and spiritually obliged to make good use of the privileges of freedom.
"Perhaps the earliest example," writes Bracey:
. . . is David Walker's 1829 APPEAL, in which he recounts the various forms of white and black "wretchedness." Recent works of Shelby Steele, Thomas Sowell, and comments by comedian Bill Cosby express similar sentiments. What each of these exponents of the genre share is a twofold conviction that problems and obstacles faced by blacks can be best mitigated or resolved by blacks themselves, and that white racism is just an irritant that lacks the determinant power to define African Americans individually or collectively.
The Rev. Jupiter Hammon developed and advanced the precepts of black conservatism a full century before the best-known proponent of this philosophy, Booker T. Washington. Hammon, a slave his entire life, was born in 1711 and lived and died at Lloyd Manor House, an estate located in Oyster Bay, a small Long Island cove. Slaves owned by Long Island members of the Anglican Church were schooled by a handful of British missionaries. Hammon was the first published African American writer. Beginning in 1760, Hammon published four poems, two essays and one sermon that addressed the full range of sociopolitical questions facing African Americans.
According to Hammon, free blacks bore the responsibility to uphold moral standards and remain industrious in order to dispel prevailing notions about the natural inferiority of blacks and the concomitant inability to manage their personal affairs. For Hammon, living an ethical and productive life was the surest path to exposing the hypocrisy of white America's disrespect for blacks and failure to live up to its own ethical and religious ideals. "In Hammon," notes Bracey:
. . . and in much of black conservatism, one finds a preference for a slow, organic, and moralistic program of black improvement premised upon cooperation rather than confrontation and conflicts with whites.
In a direct response to the suggestion to colonize Africa offered by William Thornton on behalf of the American Colonization Society, Richard Allen openly declared his patriotism and desire to pursue racial empowerment on American soil:
. . . however unjustly her (Africa's) sons have been made to bleed, and her daughters to drink of the cup of affliction, still we who have been born and nurtured on this soil, we, whose habits, manners and customs are the same in common with other Americans, we can never consent to take our lives in our hands, and be the bearers of the redress offered by that Society to that much afflicted country.
In 1795, Allen opened a day school for 60 children, and in 1804 founded the "Society of Free People of Colour for Promoting the Instruction and School Education of Children of African Descent." By 1811 there were no fewer than 11 black schools in the city of Philadelphia.
Later, writes Bracey:
. . . a new and distinct form of black conservatism -- one grounded in the "southern way of life" -- emerged and eventually supplanted its northern counterpart as the dominant political philosophy in African American life in the early 20th century. The leading exponent of this new southern black conservatism was Booker T. Washington. Born a slave in Hale's Ford, Virginia, Washington worked in salt furnaces and coal mines as a child to help support his family after emancipation. At age 16 Washington left home and began formal schooling at the Hampton Institute in Virginia, where he supported himself by working as the school janitor. As a child of Reconstruction, Washington was imbued with a deep skepticism of political and legal rights. . . . After 1877, it became increasingly clear to black southerners that the bestowal of rights was far more limited and, indeed, mutable, than liberal proponents cared to admit. The gap between northern idealism and southern reality grew, with the erosion of newly acquired rights and the rise of racial terror and violence toward blacks.
For Washington, economic advancement seemed a surer, less reversible means for blacks to progress. He saw progress for blacks taking place within southern black institutions, which by definition were less reliant upon the favor of whites. By the time of his death, Washington left behind a network of institutions that preserved his views on racial advancement. Of particular note were the Tuskegee Institute and the National Negro Business League. The Tuskegee Institute embodied Washington's pragmatic, anti-utopian philosophy of self-help, education, morality, entrepreneurship, and hard work. The National Negro Business League, founded in 1900, served as a coordination center for his vast network of confidants, political operatives, and business leaders to spread black conservatism in both the North and South. With the blessings of leading conservative white philanthropists such as John D. Rockefeller, Collis P. Huntington, and Julius Rosenwald, Washington expanded his web of influence and power with the creation of the National Teachers Association, the National Negro Press Association, the Negro Farmers Association, and the Negro Organization Society. The last organization, which he founded in 1913, epitomized Washington's commitment to self-help. Its motto was, "Better homes, better schools, better health, and better farms."
While many historians contend that Washingtonian black conservatism faded with his death in 1915, Bracey shows that though the focus of black political thought shifted from the accommodationism of Washington to the NAACP in the North and the rise of the civil rights movement, "the prevailing narrative fails to account for the extended legacy of black conservatism."
Bracey reports that black conservatives were skeptical, during the years of the Harlem Renaissance, of the argument that art produced by Negroes was racially distinctive in any meaningful sense. He writes:
This argument was put most forcefully by conservative George S. Schuyler, a man who became the dominant voice of black conservatism in the middle of the 20th century.
In an article titled "The Negro Art Hokum," Schuyler argued that the category of Negro art was, at bottom, an act of self-deception. According to Schuyler, there is nothing "expressive of the Negro soul" in the work of black artists whose way of life was not so different from that of other Americans. Schuyler argued:
He is not living in a different world as some whites, and a few Negroes, would have us believe. When the jangling of his Connecticut alarm clock gets him out of his Grand Rapids bed to a breakfast similar to that eaten by his white brother across the street . . . it is sheer nonsense to talk about "racial differences" between the American black and the American white man.
In Bracey's view:
The conservative cultural critique of the Harlem Renaissance in some ways foreshadowed modern conservative criticism of gangsta rap and other forms of urban artistic and cultural expression. . . . John McWhorter, a conservative black social critic, notes that "rap's core message seems to encourage a young black man to nurture a sense of himself as an embattled alien in his own land. It is difficult to see how this can lead to anything but dissention and anomie.". . . Juan Williams' observation that "behind the thumping beat is a message that building a family, a community, and political coalitions are all bad bets" and that rap also projects the idea that "parents nurturing children and believing in education as a long-term investment is also for suckers" reveals a normative commitment to racial empowerment that is strongly rooted in black conservative values -- values championed by traditional black conservatives nearly a century ago . . .
Bracey discusses a host of prominent black conservatives in American politics in recent years -- Clarence Thomas, Condoleezza Rice, Colin Powell, and Edward Brooke, among others. When he was the only black in the U.S. Senate, Senator Brooke (R-MA) maintained that racial empowerment could be brought about only when blacks developed the skills necessary to compete effectively with whites. "Unless a program is specifically designed to encourage outcasts to engage in individual competition," argued Brooke, "it cannot be successful." He also chastised liberals for ignoring the importance of individual self-development and focusing instead on group-based relief.
Discussing the post-civil rights era black conservatives, Bracey declares that they:
. . . retained the inward focus of early black conservative thought, directing much of their energy to the task of self-critique of the black community. In contrast to the liberal focus on external causes of black disempowerment, black conservatives of this era preferred to identify areas within the black community that could be strengthened spiritually, morally, and/or culturally. . . . Black conservatives of this era genuinely believed that progress would only come about when blacks learned to do for themselves.
Professor Bracey is not a conservative, but the reader gets the feeling that as he pursued his subject he became increasingly positive in his assessment of the role black conservatism has played in history. He concluded that:
The longevity of black conservative thought and the increasing prominence of modern black conservatives in the American public sphere are indicative of the attractiveness of modern black conservatism. . . . First and foremost, black conservatism vindicates the deeply-held desire of blacks to view themselves as architects of their own destiny. . . . Many blacks today are weary of being viewed as victims and the perennial object of liberal charity. White liberals and the civil rights establishment remain deeply invested in the idea that blacks continue to suffer under the weight of racial oppression. Many blacks are increasingly turned off by this image of black society. . . . Liberals can no longer afford to dismiss modern black conservatism as marginal or inconsequential to public conversation on racial issues. . . . It is imperative to move beyond ideological wrangling and acknowledge that both liberals and conservatives possess a rich arsenal of ideas for racial empowerment. *
"Nothing is more terrible than ignorance in action." --Goethe
We would like to thank the following people for their generous support of this journal (from 5/7/08 to 7/10/08): William D. Andrews, Lee R. Ashmun, A. D. Baggerley, D. J. Cahill, Edward J. Cain, James R. Gaines, John Geismar, John H. Hearding, Norman G. P. Helgeson, Thomas E. Humphreys, E. J. Jacobson, Ralph Kramer, James A. Lee, Paul W. McCracken, Thomas J. McGreevy, Delbert H. Meyer, Clark Palmer, Samuel R. Putnam, David Renkert, Irene L. Schultz, Richard H. Segan, David L. Smith, Carol C. Weimann, John V. Westberg, Robert C. Whitten.
William F. Buckley, Jr., the intellectual father of the modern American conservative movement, died at his desk in his Stamford, Connecticut, home on February 27 -- working on a column.
In 1949, six years before the founding of National Review, critic Lionel Trilling wrote in The Liberal Imagination that:
In the United States at this time liberalism is not only the dominant but even the sole intellectual tradition. For it is the plain fact nowadays that there are no conservative or reactionary ideas in general circulation.
This would soon change -- largely because of Bill Buckley. In 1951, he first came to public attention with his book God and Man at Yale, which critiqued his alma mater for its hostility to capitalism and religion. Four years later, at the age of 29, he founded National Review.
In its first editorial, Buckley promoted "the superiority of capitalism to socialism, (and) of republicanism to centralism." He vowed the magazine would be different: "It stands athwart history, yelling 'Stop,' at a time when no one is inclined to do so, or have much patience with those who urge it."
It was Buckley's achievement to create a united or "fusionist" conservative movement by bringing together its divergent constituencies -- anti-Communists like Whittaker Chambers, libertarians like Frank Meyer, and traditionalists such as Russell Kirk. He was instrumental in the founding of Young Americans for Freedom, and all of this led to the nomination of Barry Goldwater in 1964 and the election of Ronald Reagan in 1980.
William Rusher, publisher of National Review for 31 years, said:
Unquestionably, he was the principal founder of the modern American conservative movement, who had a major influence on the country, the party, and the world. He was a wonderfully vivacious, effervescent friend, full of fun, a great sense of humor. He just changed the entire image of American conservatism.
In remarks at National Review's 30th anniversary in 1985, President Reagan joked that he picked up his first issue of the magazine in a plain brown wrapper and still anxiously awaited his copy every two weeks -- "without the wrapper." He said:
You didn't just part the Red Sea -- you rolled it back, dried it up and left exposed, for all the world to see, the naked desert that is statism. And then, as if that weren't enough, you gave the world something different, something its weariness desperately needed, the sound of laughter and the sight of the rich, green uplands of freedom.
Buckley spread his ideas through National Review, his television program "Firing Line," dozens of books, and thousands of columns. "Before Bill Buckley, there was nothing -- there was no conservative movement," said William Kristol, editor of The Weekly Standard. "No Bill Buckley, no President Reagan. You can't overstate his importance."
President Bush, a target of Buckley criticism in recent years over the Iraq war, budget deficits, and the growth of executive power, said Buckley "helped lay the intellectual foundations for America's victory in the Cold War and for the conservative movement that continues to this day."
Sam Tanenhaus, who is writing a biography of Buckley, said that he brought together thinkers, activists, cultural figures, and politicians to form a movement of historic impact. At the same time, Tanenhaus said, Buckley fought against isolationism and anti-Semitism, which had marred conservative politics before the 1950s. "He was the man who made conservatism serious and respectable in America," Tanenhaus said.
Buckley's life had many facets. In 1965, he ran for mayor of New York City under the Conservative Party banner. Asked what he would do if he won, Buckley said, "demand a recount." His works included a series of spy novels in which he reinterpreted the history of the Cold War through the adventures of CIA operative Blackford Oakes. It was at age 50, when he crossed the Atlantic Ocean in his sailboat, that he decided to become a novelist. Among his books is a historical novel with Elvis Presley as a significant character, and one about the Nuremberg trials.
Unlike his brother James, who served in the U.S. Senate from New York, Buckley generally avoided government posts. He did serve from 1969 to 1972 as a presidential appointee to the U.S. Advisory Commission on Information and as a member of the U.S. delegation to the United Nations in 1973. He was a recipient of the Presidential Medal of Freedom.
While so much of today's political debate is contentious and often personalized, Bill Buckley was able to disagree with others without being disagreeable. He became close friends with a number of his intellectual adversaries such as John Kenneth Galbraith and Murray Kempton. Editorially, The New York Times declared:
There are not many issues on which Mr. Buckley and this page agreed or would agree -- except, perhaps, the war in Iraq. . . . Yet despite his uncompromising beliefs, Mr. Buckley was firmly committed to civil discourse and showed little appetite for the shrillness that plagues far too much of today's political discourse.
In his last years, Buckley was increasingly discouraged with the drift of American conservatism. In an interview with The Wall Street Journal in 2005, he noted that "I think conservatism has become a little bit slothful." Editorially, The Wall Street Journal notes that:
In private, his contempt was more acute. Part of it, he believed, was that what used to be living ideas had become mummified doctrines to many in the conservative political class.
At the Yale Political Union in November 2006, his last public audience, Buckley called for a "sacred release from the old rigidities" and "a repristinated vision." The challenge, he argued, was to adapt eternal principles to new realities.
Buckley's view of the world had an important transcendent component. In God and Man at Yale he wrote:
I believe that the duel between Christianity and atheism is the most important in the world. I further believe that the struggle between individualism and collectivism is the same struggle reproduced on another level.
In Buckley's view, those who did not possess a spiritual dimension and a sense of awe at God's creation lacked the proper imagination to make sense of the world. In his book The Fish Can Sing, the Icelandic writer Halldór Laxness confronts one of his characters with a young man who believes in neither ghost stories nor anything unseen. In response, he states:
Mankind's spiritual values have all been created from a belief in all the things the philosophers reject. . . . How are you going to live if you reject not only the Barber of Seville but also the cultural value of ghost stories? If it were to be proved scientifically or historically or even judicially that the Resurrection is not particularly well authenticated by evidence -- are you then going to reject the B-minor Mass? Do you want to close St. Peter's Cathedral because it has come to light that it is a symbol of a mistaken philosophy and would be more useful as a stable? What a catastrophe that Giotto and Fra Angelico should have become enmeshed in a false ideology as painters, instead of adhering to realism. The story of the Virgin Mary is obviously just another falsehood invented by knaves and any man is a fraud who allows himself to sigh, "Pietra Signor."
Too many men and women in contemporary America are guilty of what Quaker writer Elton Trueblood called "the sin of contemporaneity," believing that modern thought has replaced that of the past, and that the proper question to ask of a proposition is whether it is "modern" or "progressive," not whether it is valid. Bill Buckley rejected such an approach, agreeing instead with C. S. Lewis when he said:
We must condemn . . . the uncritical acceptance of the intellectual climate common to our own age and the assumption that whatever has gone out of date is on that account discredited. You must find out why it went out of date. Was it ever refuted? And if so by whom, where, and how conclusively? Or did it merely die away as fashions do? If the latter, this tells us nothing about its truth or falsehood. From seeing this, one passes to the realization that our own age is also a "period" and certainly has like all periods its own characteristic illusions.
Any search for truth, Buckley believed, should not discount the wisdom of the past, what G. K. Chesterton called "the democracy of the dead." In his book Orthodoxy, Chesterton wrote:
If we attach great importance to the opinion of ordinary men in great unanimity when we are dealing with daily matters, there is no reason why we should disregard it when we are dealing with history. . . . Tradition may be defined as an extension of the franchise. Tradition means giving votes to the most obscure of all classes, our ancestors. It is the democracy of the dead. Tradition refuses to submit to the small and arrogant oligarchy of those who merely happen to be walking around.
This writer first met Bill Buckley in the late 1950s and had many pleasant encounters with him over the years. One of these occurred in 1972, when we met by chance in Hong Kong at a shop selling "Silk for Siam." Few individuals in 20th- and 21st-century American life have had as lasting and beneficial an influence upon our political and intellectual life as Bill Buckley. The Economist made the point that:
Few intellectuals change the political weather. Even the most successful . . . usually tilt into the prevailing wind and enjoy the sail. William F. Buckley . . . was a weather-changer.
The extreme declarations of the Rev. Jeremiah Wright, former pastor of Senator Barack Obama, have led to a widespread discussion of how familiar the senator was with Wright's views and how he could have remained a member of such a church for so long. There is, beyond this, the need to focus attention upon the black church and liberation theology which characterizes some of its most outspoken clergymen.
Some of the Rev. Wright's statements -- charging the U.S. Government with developing HIV/AIDS in order to destroy black Americans, or declaring that the CIA engaged in distributing drugs within the nation's inner cities to harm the black community -- are demonstrably false. His overall thesis, that ours is a racist society which prevents black Americans from advancing, represents something of a time warp, as if segregation had not been eliminated long ago, as if civil rights legislation -- going back to 1964 -- had not been passed and ensured the rights of all Americans.
"I've know preachers like the Rev. Jeremiah Wright, Jr.," wrote Jonetta Rose Barras, a political analyst for National Public Radio.
Like many of them, he no doubt sees his congregation as full of victims. . . . Once upon a time, I saw myself as a victim, too, destined to march in place. In the 1970s and 1980s, as a clenched-fist-pumping black nationalist. . . . I reflected the self-contempt in my speech. . . . More than a few times, I, too, damned America loudly, for its treatment of blacks.
As time went by, however, Barras declares that:
I turned away from such rhetoric. . . . That other African-Americans and I were able to overcome seemingly insurmountable hurdles is undeniably due, in part, to Wright-like prophetic speech. Like Negro spirituals, it helped us organize, motivate and empower ourselves. But just as spirituals eventually lost their relevance and potency . . . so, I believe, has Wright-speak lost its place. It's harmful and ultimately can't provide healing. And it's outdated in the 21st century.
Barras notes that she came to the realization gradually:
As I expanded my associations and experiences. . . . I came to know that we are all more alike than different. I saw that our dreams sat inside each other. All of us wanted a better America, not so much for ourselves as for our children, and their children. Achieving this meant we had to get beyond our past segregated lives and work together, inspiring the best in ourselves -- not the bitterness and the biases. . . . Today there is an entire generation of young people who know nothing of segregation, who see one another as individuals, not as symbols of a dark past. They do not look into white faces and see, as I once did, a burning cross, a white sheet, and a vicious dog on a police officer's leash. This is the coalition pushing for a new America.
Economist Walter Williams points out that Senator Obama's
. . . success is truly a remarkable commentary on the goodness of Americans and how far we've come in resolving matters of race. I'm 72 years old. For almost all my life, a black having a real chance at becoming president of the United States was at best a pipe dream. Mr. Obama has convincingly won primaries in states with insignificant black populations. As such, it further confirms what I've often said: The civil rights struggle is over and it's won.
In Williams' view:
While not every single vestige of racial discrimination has disappeared, Mr. Obama and Mr. Wright are absolutely wrong in suggesting racial discrimination is anywhere near the major problem confronting a large segment of the black community. The major problems are: family breakdown, illegitimacy, fraudulent education, and a high rate of criminality. Confronting these problems, that are not the fault of the larger society, requires political courage . . .
Discussing Afro-centric ministers such as Jeremiah Wright, E. Ethelbert Miller, an Afro-American studies expert, points out that "Some of these ministers are like some hip-hop artists. Their language is not healing."
Beyond failing to heal, the notion that racial progress in recent years has not been dramatic, and that our society remains mired in the racism and divisions of earlier periods, is simply wrong.
The poverty rate for black men and women who finish high school or college, take a job and hold it, and have children only after 21 and married, is 6.4 percent. The overall poverty rate, based on 2002 census data, for black Americans was 21.5 percent. By 2004, the poverty rate for those who follow the formula of education, work, and marriage, was 5.8 percent while the overall poverty rate for blacks was 24.7 percent. In fact, white Americans have a higher poverty rate than blacks who finished high school, married, and worked for at least a year.
The problems of the inner-city black community are the result not of white racism but of a breakdown of values within that community. In 2002, most black children -- 68 percent -- were born to unwed mothers. These numbers have real consequences. Thirty-five percent of black women who had children out of wedlock live in poverty. Only 17 percent of married black women are in poverty.
In 2005, 1.1 million black Americans over age 25 had advanced degrees -- compared to about 677,000 in 1995. In their recent book Come on People, comedian Bill Cosby and psychiatrist Alvin Poussaint declare that "The doors of opportunity are no longer locked and we have to walk through." In 2002, the number of black-owned businesses stood at 1.1 million -- a rise of 45 percent since 1997.
Sheryl McCarthy, a Distinguished Lecturer in Journalism at Queens College of the City of New York, declares that:
The amount of progress African-Americans have made in this country over time is phenomenal. The indicators include the steady increase in the number of blacks with high school diplomas, who have college degrees or are attending college; the decline in African-Americans living in poverty; the increase in the number of black elected officials and the number who have held cabinet positions; the presence of four African-Americans as CEOs of Fortune 500 companies; the dominance of black athletes in pro-sports; and, whether or not one likes their music, or the roles they play, the high visibility of black performers in the music, television, and movie industries.
With regard to the education gap between blacks and whites, Prof. McCarthy notes that:
It's important to acknowledge that the education gap is at least as much the result of the reluctance of blacks to engage in positive ways with the system as it is the result of underfunding or school officials' disdain for black children. And the vaunted "crisis" among black men is as much a personal development and community values issue as it is a discrimination issue.
Discussing the 40th anniversary of the murder of the Rev. Martin Luther King, Jr., Washington Post columnist Eugene Robinson argues that:
We sometimes talk about race in America as if nothing has changed. The truth is that everything has changed -- mostly for the better -- and that if we're ever going to see King's dream fulfilled, first we have to acknowledge that this is not an America he would have recognized.
Robinson points out that:
On April 4, 1968, it was possible to make the generalization that being black in this country meant being poor: fully 40 percent of black Americans lived below the poverty line. . . . Today, about 25 percent of African-Americans are mired in poverty. In many ways, being black and poor is a more desperate and hopeless condition now than it was 40 years ago. For those who managed to enter the middle class, however, most of the old generalizations no longer apply. There remains a significant income gap between whites and blacks in this country, although it shrinks when educational life is factored in. . . . Still, African-Americans control an estimated $800 billion in purchasing power. If that were translated into gross domestic product, a sovereign "Black America" would be the 15th or 16th richest nation on earth. Forty years ago, not even 2 percent of black households earned the equivalent of $100,000 a year, in today's dollars. Now, about 10 percent of black households have crossed that threshold. George and Louise Jefferson aren't so lonely anymore in that "deluxe apartment in the sky."
The Rev. Jeremiah Wright is, in many ways, an inauthentic figure. He presents himself as a spokesman for those who have suffered from poverty, deprivation and racism. Yet his own background was decidedly privileged. He lived in a large stone house in a tree-lined neighborhood of Philadelphia's Germantown section. Wright's father was a prominent pastor and his mother was a teacher and later vice principal of Philadelphia's High School for Girls, a distinguished school. Both of Wright's parents held earned Ph.D.s. Wright attended the highly selective, and overwhelmingly white, Central High School.
This inauthenticity is not unique. Middle-class blacks sometimes claim to be from the inner city to achieve success and "authenticity." Rapper Russell Jones -- known by a variety of names including ODB (Ol' Dirty Bastard) -- died at the age of 35. The New York Times reported that:
As ODB he was . . . uncomfortable spinning a public mythology, saying, for example, that he had grown up on welfare, or that he had not known his father.
Neither was true. "Our brother looked at things as selling records," said his sister Monique Jones. "So he dismissed whatever lies he told as just a way of getting publicity."
Jeremiah Wright's message has little relationship to the reality of today's America, which is far more open, tolerant, and accessible to men and women of all races than his parishioners would ever imagine from those of his sermons that have been made public. *
"An unlimited power to tax involves, necessarily, a power to destroy; because there is a limit beyond which no institution and no property can bear taxation." --John Marshall
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
America appears to be in the process of moving beyond race, which is a healthy manifestation of the progress we have made in recent years.
U.S. News and World Report points out that there is a new generation of post-civil rights black political leaders:
. . . such as Sen. Barack Obama, Newark's Mayor Cory Booker, and former Rep. Harold Ford. While they are ready to combat racism, they choose to accentuate the positive. In the context of dramatically reduced racial resentment, they espouse the traditional American virtues of self-reliance and personal responsibility.
Improved prosperity in the black community has been dramatic. In 1940, 58 percent of black women with jobs worked as maids; today it is only one percent. At the same time, the median income for black females has jumped from 36 percent of that of a white woman to about 95 percent today. In the case of men, median income has gone from 41 percent to about 72 percent of white earnings today. Blacks are now represented in the top echelons of American business -- over 25,000 of them CEOs.
Still, there remain serious problems in certain segments of the black community. There is a growing black underclass trapped in poverty by failed schools, broken families, and endemic crime. Today, some 70 percent of black children are born to single mothers compared with about 25 percent 40 years ago. In urban areas, more than 50 percent of black men do not complete high school.
An important new book, Come on People (Thomas Nelson) by comedian Bill Cosby and Alvin F. Poussaint, professor of psychiatry at Harvard Medical School, addresses and confronts these very real problems.
For more than three years, Cosby and Poussaint have been listening to community call-outs in cities across the country. Combining messages of personal responsibility with practical solutions, Come on People retells the stories shared at these call-outs. It tells the story of strong, resilient people who have overcome poverty and mistreatment.
With the demise in the 1960s of officially sanctioned forms of segregation and discrimination, there was a widespread feeling that black males would have greater access to the mainstream of American society. Cosby and Poussaint note that society:
. . . had fully expected that these young men would be in a better position in every way -- financially, psychologically, legally -- to sustain viable marriages and families. Instead, the overall situation has continued to go downhill among the poor who are mostly shut out from the mainstream of success. There is one statistic that captures the bleakness. In 1950, five out of every six black children were born into a two-parent home. Today, that number is less than two out of six. In poor communities, that number is lower still. There are whole blocks with scarcely a married couple, whole blocks without responsible males to watch out for wayward boys, whole neighborhoods in which little boys and girls come of age without seeing up close a committed partnership and perhaps never having attended a wedding.
A house without a father is a challenge, the authors note, but a neighborhood without fathers is a catastrophe. Government can hardly solve a problem of this magnitude:
We saw what happened in New Orleans when people waited for the government to help. "Governments" are things. Governments don't care. People care, and no people care like parents do -- well, except maybe grandparents and other caregivers, and thank God for them. . . . A mother can usually teach a daughter how to be a woman. But as much as mothers love their sons, they have difficulty showing a son how to be a man. A successful man can channel his natural aggression. Without that discipline, these sons often get into trouble at school because many teachers find it difficult to manage their "acting out" behavior.
The statistics of disarray within inner city black communities are stark:
* Homicide is the number one cause of death for black men between fifteen and twenty-nine years of age and has been for decades.
* Of the roughly 16,000 homicides in this country each year, more than half are committed by black men. A black man is seven times more likely to commit a murder than a white man, and six times more likely to be murdered.
* Ninety-four percent of all black people who are murdered are murdered by other black people.
* Although black people make up just 12 percent of the general population, they make up nearly 44 percent of the prison population.
The authors declare that:
This is madness. Back in 1950, there were twice as many white people in prison as black. Today, there are more black people than white in prison. We're not saying there is no discrimination or racial profiling today, but there is less than there was in 1950. These are not "political" criminals. These are people selling drugs, stealing, or shooting their buddies over trivia.
The media culture in which all Americans live is of particular concern to Cosby and Poussaint. They write that:
The Ku Klux Klan could not have devised a media culture as destructive as the one our media moguls, black and white, have created for black America. . . . What do record producers think when they churn out that gangsta rap with antisocial, women-hating messages? Do they think that black male youth won't act out what they have heard repeated since they were old enough to listen? Oh yes, then there's nigga a thousand times a day, every day. Martin and Malcolm and Medgar Evers must be turning over in their graves. They put their lives on the line. Why? So our young people can pick up where white people left off and debase themselves instead of being debased? Talk about lowering self-esteem.
Looking back historically, the authors point out that:
For all the woes of segregation, there were some good things to come out of it. One was that it forced us to take care of ourselves. When restaurants, laundries, hotels, theaters, groceries, and clothing stores were segregated, black people opened and ran their own. Black life insurance companies and banks thrived, as well as black funeral homes. African Americans also owned and prospered on family farms. Such successes provided jobs and strength to black economic well-being. They also gave black people that gratifying sense of an interdependent community with people working to help each other. In the era before welfare checks and food stamps and subsidized housing and Medicaid, families were strong too. They had to be. And if the nuclear family faltered, as sometimes happened, the extended family . . . reached in and lent a helping hand, because if they didn't no one else would. This was the world your authors were born into. It was far from free and even further from perfect, but it worked. As we all know, however, something happened, and it wasn't necessarily good.
Cosby and Poussaint are critical of those black spokesmen who tend to blame all problems facing the black community upon "white racism." They write:
Blaming white people can be a way for some black people to feel better about themselves, but it doesn't pay the electric bills. There are more doors of opportunity open for black people today than ever before in the history of America. Black people who thus far have not achieved must be made to realize that these doors are tall enough and wide enough for them to walk through with their heads held high.
Children today, they write:
. . . are coming into a country that, while imperfect, is one of the freest, most prosperous and most diverse in the history of the world. But with that freedom come temptations. . . . Yet, despite this opportunity we see so many of our black youth squandering their freedom. Crime, drugs, alcohol, murder, teen pregnancies, and drop-out rates in city high schools of 50 percent or more are devastating not only to black children but also to black communities, and the entire nation for that matter.
Education, Cosby and Poussaint point out, was once greatly valued in the black community:
On our path to victory, we have wandered off course. We were so busy worrying about the white man, we stopped paying attention to the black man. We remember the injustice of how slavers brought our people to America, but we have forgotten the brilliance of our response -- how we sneaked around late at night and taught ourselves to read, taught ourselves secret signals to resist, taught ourselves pride and will and love. We have to draw on that history of persistence. Black Americans have always used education as the chief weapon in their struggle for equal access to American society and civil rights. Education still provides -- as Malcolm X said -- our greatest hope for the future of African-American people. But illiteracy keeps people in chains -- our ancestors in real chains, our children in emotional ones. In either case, people who cannot read and write are more easily oppressed and are handicapped in their fight for freedom.
One of the complaints the authors commonly hear from children is that school is "boring." But, they ask:
. . . boring in relation to what? The Power Rangers? Our suspicion is that it is boring compared to the intense entertainment they are exposed to very early in life. Many kids have become entertainment junkies because the media have become such relentless pushers of addictive junk. . . . Excessive media violence plays a role in societal violence, but it is hardly a complete explanation. Many other factors -- like poverty, fatherlessness, motherlessness, unemployment, and the easy accessibility of guns -- contribute to the violence epidemic. Still, media violence is like a pollutant; it amplifies the toxic atmosphere that gives support to violence, and it undermines efforts at violence prevention.
Those in the gangsta rap industry, both black and white, are, the authors note:
. . . calling this a "culture." This so-called culture promotes the moral breakdown of the family. It deliberately influences women to become pregnant before they have finished their education and influences men to shuck their responsibility when this happens. . . . The gangsta rapper is saying, "I am somebody because my mother is a drug addict, and I don't know who my father is. I have been in three or four foster homes, and I have been in trouble, and that is okay because the rappers are saying that's who I am." The truth behind this kind of antisocial swagger is that swagger is all there is. It is no more than a cover for a life of sadness and frustration.
Sadly, as racism has declined, the dissolution of the inner city black community has steadily advanced. In 1954, the year of the Brown v. Board of Education decision, about 98,000 African Americans were in prison. Today, there are nearly ten times as many black people in prison. According to the Sentencing Project, 32 percent of the black men born today will go to prison at some point in their lifetime.
In the view of Cosby and Poussaint:
A lack of basic education severely limits your life options. No one can stop you from getting educated other than yourself. . . . Parents and caregivers, have you heard a kid say, "Well, I can either flip burgers or go out here and make some real money selling drugs"? When you hear that, do you stop that child and say, "Wait a minute, fool. You don't flip burgers for the rest of your life. You flip them to become the manager of the place. You flip burgers to move from manager to owner of the franchise"? You have to say this to your kids more than once. So do their teachers. . . . Please remind young people that there is no shame in hard work. . . . An unpleasant job usually leads to a better job, as young people develop working skills that are useful on any job. . . . The unemployment rate for black people is twice that of white people -- this has to change.
The notion that "poverty" has driven young people to violent crime is, Cosby and Poussaint declare:
. . . too much like those folks in times past who would claim that "the devil made me do it." The victim posture -- gussied up with words like "disadvantaged" and "at-risk" -- leads people to deny personal responsibility for self-defeating behaviors. Such attitudes overlook the great advances made by black communities when they have adopted the philosophy of self-help even as they fought racism. We have too many examples over the centuries of black achievement under hardship to deny our own capabilities and to embrace a victim mentality.
Addressing the black community in particular, Cosby and Poussaint state that:
The most important thing that is within the reach of just about everyone is to make sure that every black child has two active parents. . . . A two-parent home is less likely to be poor, and the children it produces are much less likely to end up in prison. If, a generation from now, every black child grew up in a functional two-parent home, the problems of crime and poverty in black communities would greatly diminish. . . . This is the base we build on. Children who are loved will have the confidence to succeed in school, to succeed on the job, to succeed in life. . . . By doing things we can do, we can make the future much brighter for poor black youth, much brighter for everyone. No more excuses, no more delays. Come on people.
This is a call to arms well worth heeding from two distinguished black Americans who have devoted their lives to helping men, women, and children take advantage of the opportunities now provided by our free society to all of its citizens. All of us will be the losers if the steps they urge are not taken.
To visit post-Communist Poland, as this writer recently did, is to witness a long-repressed society coming to life and embracing the freedom, democracy, and openness it has long been denied.
Poland, as many have pointed out, has had the geographic misfortune of being located between Germany and Russia. Polish society lives in the shadow of World War II and the post-war Communist era. Rather than acquiesce in the Nazi occupation, Poles fought back. On August 1, 1944, the Polish Home Army attacked the German garrison in Warsaw. The uprising lasted 63 days, with the Russian Army passively observing from across the Vistula River. When the Home Army capitulated on October 2, 1944, civilian casualties amounted to around 180,000, and 85 percent of the city had been destroyed. No nation in Europe was to suffer more than Poland during World War II; twenty-five percent of its population perished.
In the Russian-occupied part of Poland, the NKVD began arrests of the "enemies of the people," including officers of the Polish Army. They were sent to Russian equivalents of German concentration camps. More than 1.5 million Poles were deported to remote parts of Russia in what we would now call "ethnic cleansing."
The Russians captured 15,570 Polish officers, including 800 medical doctors. All but 448 were murdered in April 1940 on the orders of the Soviet government. Two years later, the bodies of the officers were accidentally discovered in a forest near the village of Katyn. The Soviet government denied any complicity in what has come to be regarded as the greatest single war crime of modern history. It was only after the collapse of the Soviet Union that the Russian authorities admitted responsibility for the crime.
Now, with democracy progressing, with a free press and freedom of religion, Poland is emerging as a modern European society. The re-emergence of a vibrant Jewish life in Poland -- where more than 3 million Jews were killed during the Holocaust, the most of any country in Europe -- is one important example of the progress which is being made as Poles confront their long, complex, and often tragic history.
Poland is now witnessing what the Washington Post has called "a small but remarkable renaissance of Jewish life." Poland now has a chief rabbi, Michael Schudrich, a native New Yorker who has returned to the country of his ancestors. When he moved to Warsaw in 1990, he described the country's Jews as "a broken population." Many doubted whether Warsaw -- home to 393,000 Jews prior to the Nazi invasion of Poland, but only 5,000 in 1945, after the Nazis were driven out -- would ever have a visible Jewish population again. Slowly, however, with the end of Communism, Poland's Jews have rediscovered themselves.
Though community leaders are reluctant to provide estimates, it is believed that there are perhaps 30,000 to 40,000 Poles who identify themselves as Jews. A Jewish primary school has opened in Warsaw, as have several Jewish kindergartens, youth centers, and summer camps across the country. Eight rabbis have been assigned to Poland to serve the revived population.
In many ways, Poland is moving to embrace its Jewish past. The government, alongside the Polish Jewish community, is planning to build a $58 million museum of the history of Polish Jews in Warsaw. Ewa Junczyk-Ziomecka says:
Our goal is to return to the light of memory the thousand-year history of Jews in Poland which has almost been forgotten. . . . The museum will show that not only were people killed, but their culture was killed, and even their memory was destroyed.
The Financial Times reports that:
The museum is part of a wider trend in Poland of nostalgia towards the Jewish presence, coupled with a decline in anti-Semitism and anti-Semitic incidents. . . . Krakow is now home to one of the most vibrant festivals of Jewish culture in the world, and museums are springing up across the country, most of them founded and run by non-Jews to remember that Jews were once an integral part of Poland. . . . For centuries Poland's relatively free society made it home to the largest Jewish community in the world and took in immigrants who had been expelled from Spain, England, Russia, and other European countries.
Feliks Tych, head of the Warsaw-based Institute of Jewish History, argues that critics have too harshly judged the role of Poles during the Second World War. In Warsaw, some 30,000 Jews were hidden by Poles during the war, implying that some 100,000 Poles risked execution to help hide them. However, most of the 300,000 Polish Jews who survived the war left the country when they were faced with anti-Semitism by Poles who did not want to return Jewish property obtained during the war.
In June, President Lech Kaczynski broke ground on the Museum of the History of Polish Jews. The Jewish newspaper The Forward reported that:
The scene would have been hard to imagine just a generation ago, when Israeli Prime Minister Yitzhak Shamir famously quipped that Poles suck anti-Semitism in with their mothers' milk. Indeed, the convergence of Jewish and Polish support for a museum dedicated to Polish-Jewish history -- built in large part with public Polish funds, no less on the former site of the Warsaw Ghetto -- has drawn little protest from two peoples long distrustful of each other. In a land that for many Jews is synonymous with Auschwitz, the common vision for the Museum of the History of Polish Jews extends all the way to the decision to give minimal curatorial attention to the Shoah. The Museum's mission even passes muster with Marek Edelman, the man with perhaps the greatest claim to guardianship over the museum site.
Edelman, the last surviving leader of the Warsaw Ghetto uprising, states:
I don't know what percent -- 15 to 20 percent -- of the museum's exhibits should focus on the Holocaust, but I know it has to be a museum about the entire long history of Polish Jews. The rich history has to be recalled, not only the disaster.
"The museum will help people all over the world learn not just how we died in Poland, but how we lived for nearly 1,000 years," said Stephen Solender, co-chairman and president of the museum's North American Council. "Visitors will also learn what contribution Polish Jews made to Poland and the world."
Former Foreign Minister Wladyslaw Bartoszewski says, "I remember the time when 30 percent of Warsaw's population was Jewish. The world must know that this museum is being raised by Polish hands."
The history of Jews in Poland is long and complex. In his book Heritage: Civilization and the Jews, Abba Eban writes that:
Jews must have been among Poland's 12th century German colonizers: excavations in Poland have unearthed coins from the period with Hebrew inscriptions. This suggests that at least some Polish Jews must have enjoyed considerable wealth and influence. Another general upsurge of German immigration came after the Mongols overran Poland in 1240-41. No doubt many Germans and Jews at this time crossed the border into Poland because there was no centralized authority to stop them.
But the Jews also came by invitation. In 1264, the Polish King Boleslav V issued a charter protecting the Jews and guaranteeing their right to take part in commerce. The charter was ratified and then extended by Casimir III ("Casimir the Great," 1333-1370), a formidable administrator who befriended Jewish settlers, as well as the peasantry. According to legend, Casimir had a Jewish mistress and their daughters were raised as Jews.
The role Jews played in Poland placed them in an often tenuous position, between the nobility and the peasants. The fact that Polish noblemen nearly always had Jews as managers of their estates and collectors of their taxes drew down upon the Jews the resentment and hatred of the peasants. When the Ukrainian Cossack leader Chmielnicki attacked Poland and defeated the Polish Army in 1648, the half-enslaved peasants took the opportunity to rebel. Among their first victims were the Jews, whom they treated with inhuman cruelty, venting all the hate they felt for their lords and the lords' Jewish agents. Jews were massacred by the thousands.
The epidemic of religious hatred penetrated into Poland. During the 15th century, regional church councils periodically issued anti-Jewish decrees that forbade Jews to have any social intercourse with Christians and forced them to pay a special tax for the support of the churches. As in other parts of Europe, the church laws forced the Jews into ghettos.
Poland's history has been a tragic one. In 1654, Russia invaded Poland and expelled or slew all Jews in the towns she conquered. In the west, Poland was attacked by another enemy, Charles X of Sweden. The Swedes had no special enmity to the Jews, but Jews suffered from both the Russians and the Poles. In the ten years of warfare from 1648 to 1658, over 100,000 Jews lost their lives. Poland itself was doomed. Its king was nearly powerless, its Diet (or Congress of Noblemen) was inefficient. In the 19th century, the occupying powers abolished many of the laws that protected the Jews, but the patriotic feelings toward Poland of most Jews prevailed and they supported the struggle for independence, fighting in the Kosciuszko Uprising (1794), the November Uprising (1830-31), and the People's Spring (1848-49).
Out of this bleak physical and spiritual landscape, Abba Eban writes:
. . . arose a Jewish religious survival that would change the face of Judaism as no sectarian movement had since Roman times. The movement was Hasidism -- literally pietism -- and its father was Israel ben Eliezer (c. 1700-1760), better known as the Ba'al Shem Tov ("Master of the Good Name"). . . . Hasidic tradition has it that he was born in Okop, a small town of Podolia, a region then part of southeastern Poland and today within the Ukraine. The legends emphasize his love of nature and of solitary contemplation. . . . Unlike the rabbis of his day, he believed that even a simple unlearned man could approach God directly through prayer and worship. Some recent writers have pointed to the influence of Polish peasant ways on his thinking.
The most influential of all the yeshivot (Jewish religious schools) in the world at the time of the Renaissance was the one in Krakow, made famous by the educator Moses Isserles (1520-1572). A man of considerable secular culture and strong character, he defied the religious fundamentalism of Polish Jewry in his time. He introduced to the curriculum not only the study of astronomy, history, and mathematics, but also the highly controversial Aristotelian philosophic method of Maimonides, the foremost Jewish thinker of the Middle Ages, for whom he had unbounded admiration.
The tragic events of World War II had a dramatic impact upon the Jews of Poland. Both heroic efforts to rescue Jews and harsh anti-Semitic acts were witnessed. The revelation in 2001 that the Jewish residents of the town of Jedwabne were killed by Polish villagers, not occupying Nazis, shocked Polish society. Some Polish intellectuals say that the country has now started to face its past. "This really happened with Jedwabne," said Stanislaw Krajewski, a professor of logic at Warsaw University and a member of the Union of Polish Jewish Communities.
Everything has been said, there are no taboos. All the things of Poles murdering their neighbors have been discussed, and no one can say "I haven't heard about it," as they could have even two years ago.
In 2001, President Aleksander Kwasniewski traveled to Jedwabne to make a formal apology to the Jews on behalf of Poland.
Poland has honored its citizens who tried to aid victims of Auschwitz. At a ceremony outside the former concentration camp in January 2007, residents of Oswiecim, where Auschwitz was located, and Holocaust survivors listened to a letter from President Lech Kaczynski praising the efforts of those who risked their lives to help those persecuted by the Nazis. "World public opinion has often held that the residents of the area were completely indifferent to the fate of the prisoners," Kaczynski said in a letter read on the 62nd anniversary of the liberation of Auschwitz. A presidential aide awarded medals to some 40 people from Oswiecim and surrounding villages to honor them for trying to help Nazi victims.
In March 2007, a 97-year-old woman credited with saving 2,500 Jewish children during the Holocaust was honored by Parliament at a ceremony during which Poland's president said she deserved the Nobel Peace Prize. Irena Sendler, who lives in a nursing home in Warsaw, was too frail to attend the special session in which senators unanimously approved a resolution honoring her and the Polish Underground's Council for Assisting Jews.
The new movement of awareness of Poland's rich Jewish heritage comes, in part, in Rabbi Schudrich's view, from the teachings of Pope John Paul II against anti-Semitism, and Poland's admiration of the United States, where anti-Semitism is scorned. "A third reason," he says:
. . . is more speculative. Among the younger generation there is a rejection of everything their parents and grandparents stood for. They believe the opposite of the older generation, which was Communist and anti-Semitic. . . . There is a growing understanding that the Jews were part of the Polish landscape and that the Germans killed them, to some extent with Polish collaboration. This also leads to a feeling of obligation to perpetuate Jewish memory.
Jewish life is experiencing new growth not only in Warsaw and Krakow but also in Lodz, Katowice, Wroclaw, and several other cities. As Poland emerges from the occupation of the Nazis during World War II and the tyranny of Communism during the post-war years, let us hope that a new and hopeful chapter in Poland's often tragic history is now under way. And we in the U.S. and other Western countries should do what we can to assist its progress. *
"The government solution to any problem is usually at least as bad as the problem." --Milton Friedman
We would like to thank the following people for their generous support for this journal (from 1/11/08 to 3/7/08): Charles A. Bacon, Reid S. Barker, Douglas W. Barr, Gordon D. Batcheller, Arnold Beichman, Erminio Bonacci, Priscilla L. Buckley, William G. Buckner, David G. Budinger, George F. Cahill, Cliff Chambers, W. Edward Chynoweth, William D. Collingwood, Linda Driedger, Neil Eckles, Edwin J. Feulner, Reuben M. Freitas, Jerome C. Fritz, Gary D. Gillespie, Joseph H. Grant, Hollis J. Griffin, Alene D. Haines, Daniel J. Haley, Ted L. Harkins, Bernhard Heersink, John A. Howard, Thomas E. Humphreys, David Ihle, Burleigh Jacobs, Robert R. Johnson, Steven D. Johnson, Robert E. Kelly, Robert E. Kersey, Robert M. Kubow, John S. Kundrat, Eric Linhof, Herbert London, Ronald B. Maddox, Curtis Dean Mason, John Nickolaus, King Odell, Harold Olson, Harold B. Owens, David Pohl, Gregory J. Pulles, Jeanne I. Reisler, Steven B. Roorda, Matthew J. Sawyer, Mr. & Mrs. Richard P. Schonland, Irene L. Schultz, Richard L. Sega, Thomas E. Snee, John A. Sparks, John R. Stevens, Carl G. Stevenson, Charles B. Stevenson, Clifford W. Stone, Jack E. Turner, Johanna Visser, James J. Whelan, Robert L. Wichterman Sr., Gaylord T. Willett, Charles W. Wilson, Robert W. Wilson.
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
The story of the life of Clarence Thomas, as set forth in his memoir, My Grandfather's Son (Harper), is destined to become an American classic, not dissimilar to the autobiographies of Frederick Douglass and Booker T. Washington.
This book chronicles an extraordinary life and describes, as well, the education of an inquiring mind, seeking to understand the complex reality around him, and to make sense of the racial politics and ideological divisions which confronted him during the turbulent 1960s and 1970s.
In an era when "Identity" politics dictated a particular political, economic, and social stance for black Americans, those individuals who persisted in thinking for themselves and followed an often lonely path to discover their own view of reality and truth were isolated and bitterly attacked.
Describing Clarence Thomas as "the freest black man in America," Shelby Steele, a fellow at the Hoover Institution, notes that:
For minorities . . . their group identity will often be the enemy of their individuality. In its insecurity, the group is naturally threatened by the impulse of some of its members to think for themselves. . . . People who veer from the group masks -- who evolve by their own lights -- start to lose their moral authority as blacks. This is why President Bush got no credit for having two black secretaries of state. Naively, he selected two black individuals. Still, the black individual is now emerging as something of a new archetype in American life -- not someone who disowns his group but someone who rejects it as master. Today there is no more quintessential embodiment of this new archetype than Supreme Court Justice Clarence Thomas.
Clarence Thomas was born in rural Georgia on June 23, 1948. He was abandoned by his father and didn't even meet him until he was eleven years old. His mother was left to raise him and his brother and sister on the ten dollars a week she earned as a maid. At the age of seven Thomas and his six-year-old brother were sent to live with his mother's father, Myers Anderson, and her stepmother in their Savannah home. This was a move that would change Thomas' life.
His grandfather, whom he called "Daddy," had a strict work ethic. He owned his own fuel-oil business and he immediately subjected the two boys to a regime of sacrifice -- no school sports, very little television, self-development and hard work. His response to the poverty and segregation of black Savannah was the American ethic of self-help, faith in God, delayed gratification and individual initiative.
Thomas writes:
In every way that counts, I am my grandfather's son. I even called him Daddy because it was what my mother called him. He was the one hero in my life. What I am is what he made me.
Before he went to live with his grandfather, the squalor of Thomas' life was overwhelming:
Nowadays most people know Savannah from reading Midnight in the Garden of Good and Evil. To them it is an architectural wonderland full of well-heeled eccentrics and beautifully preserved 18th and 19th century houses. I didn't live in that part of town. When I was born, Savannah was hell. . . . The only running water in our building was downstairs in the kitchen, where several layers of old linoleum were all that separated us from the ground. The toilet was outdoors in the muddy backyard.
After moving in with his grandfather, life changed:
In return for submitting to Daddy's iron will . . . I lived a life of luxury, at least by comparison. . . . The home that Daddy and Aunt Tina built for themselves (and in which my mother now lives) had two bedrooms, one bathroom, separate living and dining rooms, a den and a kitchen. . . . We had our own beds, plenty to eat. . . . I had never before seen a house with such conveniences, or with an indoor porcelain toilet that worked. I flushed it as often as I could in my first months on East 32nd Street.
Daddy had become a Roman Catholic, and sent Clarence and his brother to St. Benedict the Moor Grammar School. It was run, Thomas writes:
. . . by the Missionary Franciscan Sisters of the Immaculate Conception, most of whom were Irish immigrants. . . . They expected our full attention and made sure they got it, dispensing corporal punishment whenever they saw fit. . . . We learned that God made us to know, love, and serve Him in this world, and to be happy with Him in the next. The sisters also taught us that God made all men equal, that blacks were inherently equal to whites, and that segregation was morally wrong. This led some people to call them "the nigger sisters."
A few months before his 16th birthday, Thomas decided he wanted to enter St. John Vianney to prepare for the priesthood. At that time, the schools in Savannah were still segregated, including St. John Vianney. The Catholic Church, Thomas recalls:
. . . maintained separate parishes and parochial schools for blacks, and the local church fathers seemed in no hurry to lift the color bar. No doubt they thought they were being prudent, but before long I would come to see their caution as cowardice.
Once admitted to the seminary, Thomas began to slowly adjust to the new, predominantly white, environment. He observed racism on the part of his fellow seminarians:
. . . a small group of students was given permission to move the school t.v. into a classroom to watch the Cleveland Browns play. Those were the days when Jim Brown was on the team. Midway through one of his celebrated runs, one student yelled, "Look at that nigger go." I felt as if my soul had been pierced. . . . Worse yet was the time when the same student who'd called Jim Brown a "nigger" passed me a folded note during history class. "I like Martin Luther King . . ." it said on the outside. I unfolded the piece of paper. Inside was a single word: ". . . dead."
Despite these incidents, Thomas' grades improved steadily and toward the end of the first term he won a Latin bee. At the end of his sophomore year, he asked the other seminarians to sign his yearbook. One senior wrote, "Keep on trying, Clarence, one day you'll be as good as us."
The Church's failure to take a strong stand on racism became of growing concern to Thomas:
It seemed self-evident . . . that the treatment of blacks in America cried out for the unequivocal condemnation of a righteous institution that proclaimed the inherent equality of all men. Yet the Church remained silent, and its silence haunted me. I have often thought that my life might well have followed a different route had the church been as adamant about ending racism then as it is about ending abortion now.
What finished off Thomas' religious vocation was the day Martin Luther King was shot and a fellow seminarian said, "That's good. I hope the son of a bitch dies." Daddy, however, could not accept the decision to leave the seminary. "You've let me down," he said. "I'm finished helping you. You'll have to figure it out yourself. You'll probably end up like your no-good daddy."
On his own, Thomas entered the College of the Holy Cross in Massachusetts, one of six blacks in his class. He went through many political transformations -- from altar boy to seminary student to campus radical and racial militant -- before coming back to the values his grandfather taught him and an understanding of society which he acquired on his own.
He slowly came to oppose race-based affirmative action programs:
I didn't think it was a good idea to make poor blacks, or anyone else, more dependent on government. That would amount to a new kind of enslavement, one that ultimately relied on the generosity -- and the ever-changing self-interests -- of politicians and activists. It seemed to me that the dependency it fostered might ultimately prove as diabolical as segregation, permanently condemning poor people to the lowest rungs of the socioeconomic ladder by cannibalizing the values without which they had no long-term hope of improving their lot. . . . I began to suspect that Daddy had been right all along, the only hope I had of changing the world was to change myself first.
Thomas remembers that:
The more I read, the less inclined I was to conform to the cultural standards that blacks imposed on themselves and on one another. Merely because I was black, it seemed, I was supposed to listen to Hugh Masekela instead of Carole King, just as I was expected to be a radical, not a conservative. I no longer cared to play that game. . . . The black people I knew came from different places and backgrounds . . . yet the color of our skin was somehow supposed to make us identical in spite of our differences. I didn't buy it. Of course we had all experienced racism in one way or another but did that mean we had to think alike?
After Holy Cross, Thomas attended Yale Law School, graduating in 1974. His education continued, not only in the law but also in the racial political environment around him. "Like every other black law student," he writes:
. . . I was uncomfortably aware that blacks failed to pass the bar exam at a much higher rate than whites, and that the NAACP Legal Defense and Education Fund had filed lawsuits alleging that the exams they took were racially discriminatory. Lani Guinier, one of my classmates, was involved with the Legal Defense Fund, so I asked her to supply me with information about the extent of the problem. . . . At first I assumed that the disproportionate black failure rate was conclusive evidence of racial discrimination, but the more closely I looked at the facts, the more apparent it became that I was wrong. At that time each question on the bar exam was graded separately by a different scorer and each completed exam identified solely by number, thus making it impossible for the graders to tell which examinees, if any, were black.
The Legal Defense Fund's "adverse impact theory" held that if a neutral examination produced disparate results among the races, then it could be considered discriminatory. Thomas responds:
But I didn't buy that . . . knowing that no measurement of any part of our lives ever produced identical results for all racial or ethnic groups. To argue otherwise, I thought, diverted attention from the real culprits, the people who were responsible for the useless education these young people had received.
After law school Thomas went to work for John Danforth, who was serving as Missouri's attorney general. When Danforth was elected to the U.S. Senate, Thomas followed him to Washington and later worked at the Department of Education and as head of the Equal Employment Opportunity Commission, before being named a federal judge.
Along the way he discovered the writings of leading black conservatives such as Thomas Sowell and Walter Williams. He reports that:
One of the first people in Washington who talked sense to me about race was Jay Parker, the editor of a new magazine called The Lincoln Review. . . . Jay was friendly, energetic, unflappable and unapologetically conservative. I'd never known a black person who called himself a conservative, and it surprised me that we rarely disagreed about anything of substance.
Thomas provides this assessment of the black conservatives who had influenced his thinking and became his friends:
Walter Williams, Thomas Sowell and Jay Parker were all smart, courageous, independent-minded men who came from modest backgrounds. Politics meant nothing to them. All they cared about was truthfully describing urgent social problems, then finding ways to solve them. Unhampered by partisan allegiances, they could speak their minds with honesty and clarity. They were my kind of black men. . . . I'll never forget the time when Jay reminded me that freedom came from God, not Ronald Reagan. For Jay politics was a part of life, not a way of life. It was an attitude I sought to emulate.
There is, of course, much in this book about Clarence Thomas' personal life, his difficult first marriage, raising his son, and his happy and fulfilling marriage to Virginia. There is, as well, a lengthy description of the Supreme Court confirmation hearings in the Senate, and the charges of sexual harassment by Anita Hill, which he convincingly refutes. It is his view, which the evidence supports, that he was targeted with such a vicious assault precisely because he was a black man who persisted in thinking for himself, and rejected the political correctness and liberal orthodoxy which the civil rights establishment and white liberals sought to impose upon him.
"The more I reflected on what was happening," he writes:
. . . the more it astonished me. As a child of the Deep South, I'd grown up fearing the lynch mobs of the Ku Klux Klan; as an adult, I was starting to wonder if I'd been afraid of the wrong white people all along. My worst fears had come to pass not in Georgia but in Washington, D.C., where I was being pursued not by bigots in white robes but by left-wing zealots draped in flowing sanctimony. For all the fear I'd known as a boy in Savannah, this was the first time I'd found myself at the mercy of people who would do whatever they could to hurt me -- and institutions that once prided themselves on bringing segregation and its abuses to an end were aiding and abetting in the assault.
During the hearings, journalist Juan Williams wrote in The Washington Post:
Here is indiscriminate, mean-spirited mud-slinging, supported by the so-called champions of fairness: liberal politicians, unions, civil rights groups, and women's organizations. They have been mindlessly led into mob action against one man by the Leadership Conference on Civil Rights. . . . To listen to or read some news reports on Thomas over the past month is to discover a monster of a man, totally unlike the human being full of sincerity, confusion, and struggles whom I saw as a reporter who watched him for some ten years. He has been conveniently transformed into a monster about whom it is fair to say anything, to whom it is fair to do anything. President Bush may be packing the court with conservatives, but that is another argument, larger than Clarence Thomas. In pursuit of abuses by a conservative president, the liberals become the abusive monsters.
Fortunately, Clarence Thomas survived the assault upon him and triumphed over his adversaries. If the goal of our society is for each individual to go as far as his ability will take him, for each person to come to his own conclusions about political, social, and economic issues, then Clarence Thomas has lived this American Dream. This book is an eloquent testimonial to both that life and that dream, and should be read not only by those who agree with Clarence Thomas' views but, more important, by all who cherish a society in which genuine independence of thought is respected and in which excellence is given its proper reward. *
"These hostile hungers have taken their turn in dominating the history of modern man: the hunger for liberty, to the detriment of equality, was the recurrent theme of the nineteenth century in Europe and America; the hunger for equality, at the cost of liberty, has been the dominant aspect of European and American history in the twentieth century." --Will Durant
We would like to thank the following people for their generous support of this journal (from 11/15/07 to 1/11/08): John G. Barrett, Carol & Bud Belz, Charles Benscheidt, Aleatha W. Berry, Veronica A. Binzley, Jan F. Branthaver, Terry Cahill, William C. Campion, Mark T. Cenac, Garry W. Croudis, John D'Aloia, Peter R. DeMarco, Richard A. Edstrom, Nicholas, Falco, The Anderson Foundation, James R. Gaines, Jane F. Gelderman, Robert C. Gerken, Robert L. & Carol J. Gilmore, Lee E. Goewey, Joyce Griffin, Violet H. Hall, Paul J. Hauser, John H. Hearding, Richard L. Herreid, Arthur H. Ivey, Marilyn P. Jaeger, David A. Jones, Edgar Jordan, Michael Kaye, Allyn M. Lay, Alan Lee, Gregor MacDonald, Bernard L. Poppert, Richard O. Ranheim, Howard J. Romanek, Harry Richard Schumache, Dave Smith, Frank T. Street, John West Thatcher, Clifford F. Thies, Julian Tonning, Elizbeth E. Torrance, Thomas Warth, Robert D. Wells, Eric B. Wilson.
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
As the Bush Administration enters its final period, more and more conservatives are expressing dismay about its legacy.
Alan Greenspan, who served as Federal Reserve chairman for 18 years and was the leading Republican economist for the past three decades, levels harsh criticism at President Bush and the Republican Party in his new memoir, The Age of Turbulence: Adventures in a New World. He argues that Bush abandoned the central conservative principle of fiscal restraint:
My biggest frustration remained the president's unwillingness to wield his veto against out-of-control spending. Not exercising the veto power became a hallmark of the Bush presidency. . . . To my mind, Bush's collaborate-don't-confront approach was a major mistake.
Greenspan accuses the Republicans who presided over the party's majority in the House until 2006 of being too eager to tolerate excessive federal spending in exchange for political opportunity. The Republicans, he says, deserved to lose control of the Senate and House in the 2006 elections. "The Republicans in Congress lost their way," he writes. "They swapped principle for power. They ended up with neither."
He singles out J. Dennis Hastert, the Illinois Republican who was House Speaker, and Tom DeLay, the Texas Republican who was majority leader until he resigned after being indicted for violating campaign finance laws in his home state. Greenspan writes:
House Speaker Hastert and House majority leader Tom DeLay seemed readily inclined to loosen the federal purse strings any time it might help add a few more seats to the Republican majority.
When Bush and Cheney won the 2000 election, Greenspan declares:
I thought we had a golden opportunity to advance the ideals of effective, fiscally conservative government and free markets. . . . I was soon to see my old friends veer off to unexpected directions. Little value was placed on rigorous economic policy debate or the weighing of long-term consequences.
He notes that the large, anticipated federal budget surpluses that were the basis for Bush's initial $1.35 trillion tax cut "were gone six to nine months after George W. Bush took office..." So Bush's goals "were no longer entirely appropriate. He continued to pursue his presidential campaign promises nevertheless."
Greenspan provides this assessment of the Bush presidency:
I'm just very disappointed. Smaller government, lower spending, lower taxes, less regulation -- they had the resources to do it, they had the knowledge to do it. They had the political majorities to do it, and they didn't. In the end, political control trumped policy, and they achieved neither political control nor policy.
In a column entitled "The Republican Collapse," conservative New York Times columnist David Brooks explains how philosophical drift had led to political decline:
Modern conservatism begins with Edmund Burke. What Burke articulated was not an ideology or a creed, but a disposition, a reverence for tradition, a suspicion of radical change. . . . Over the past six years, the Republican Party has championed the spread of democracy in the Middle East. But the temperamental conservative is suspicious of rapid reform, believing that efforts to quickly transform anything will have, as Burke wrote, "pleasing commencements" but "lamentable conclusions."
Brooks notes that:
The world is too complex, the Burkean conservative believes, for rapid reform. Existing arrangements contain latent functions that can be neither seen nor replaced by the reformer. The temperamental conservative prizes epistemological modesty, the awareness of the limitations on what we do and can know, what we can and cannot plan. Over the past six years, the Bush administration has operated on the assumption that if you change the political institutions in Iraq, the society will follow. But the Burkean conservative believes that society is an organism; that custom, tradition, and habit are the prime movers of that organism; and that successful government institutions grow gradually from each nation's unique network of moral and social restraints.
In recent years, the vice president and the former attorney general have sought to expand executive power as much as possible in the name of protecting Americans from terror. This has also produced a reaction from conservatives who have long feared the power of government and have been committed to the constitutional system of checks and balances and divided authority between the executive, the legislature, and the judiciary.
In October 2003, Jack Goldsmith, a legal scholar with strong conservative credentials, was hired to head the Justice Department's Office of Legal Counsel, which advised the president and the attorney general about the legality of presidential actions. As he was briefed on counterterrorism measures the Bush administration had adopted in the wake of September 11, Goldsmith says he was alarmed to discover that many of those policies "rested on severely damaged legal foundations," that the legal opinions that supported these counterterrorism operations were, in his view, "sloppily reasoned, overbroad, and incautious in asserting extraordinary constitutional authorities on behalf of the president."
Mr. Goldsmith eventually withdrew several key department opinions -- including two highly controversial "torture memos," dealing with the authority of the executive branch to conduct coercive interrogation -- but only after contentious battles with administration hardliners led by David Addington, then Vice President Cheney's legal adviser and now chief of staff.
In his book The Terror Presidency, Goldsmith recounts how he and his Justice Department colleagues, in consultation with lawyers from the State Department, the Defense Department, the CIA, and the National Security Council, reached a consensus in 2003 that the Fourth Geneva Convention (which governs the duties of an occupying power and the treatment of civilians) affords protection to all Iraqis, including those who are terrorists. When he delivered this decision to the White House, he recalls, Addington exploded: "The president has already decided that terrorists do not receive Geneva Convention protections. You cannot question his decision."
Mr. Goldsmith, who resigned from the Office of Legal Counsel in June 2004 -- only nine months after assuming the post -- describes a highly insular White House obsessively focused on expanding presidential power and loath to consult with Congress; a White House that sidelined Congress in its policymaking and pursued a "go-it-alone approach" based on "minimal deliberation, unilateral action, and legalistic defense."
Noting that "the president and the vice president always made clear that a central administration priority was to maintain and expand the president's formal legal powers," Goldsmith says that lawyers soon realized that they "could gain traction for a particular course of action -- usually going it alone -- by arguing that alternative proposals would diminish the president's power."
Mr. Goldsmith is also critical of the manner in which the Bush administration went about side-stepping the 1978 Foreign Intelligence Surveillance Act, which required the president and government agencies to obtain warrants from a special court before conducting electronic surveillance of people suspected of being terrorists or spies. Although he says he shared many of the administration's concerns on this issue, he "deplored the way the White House went about fixing the problem."
He quotes Mr. Addington as saying of the special court created by the surveillance act: "We're one bomb away from getting rid of that obnoxious court." And he observes that top Bush officials dealt with the act "the way they dealt with other laws they didn't like: they blew through them in secret based on flimsy legal opinions that they guarded closely so no one could question the legal basis for the operations."
Mr. Goldsmith concludes with the observation that unlike Lincoln and Franklin D. Roosevelt -- two presidents who also presided over the nation at times of crisis -- President Bush has relied only on "the hard power of prerogative," ignoring:
. . . the soft factors of legitimation -- consultation, deliberation, the appearance of deference, and credible expressions of public concern for constitutional and international values -- in his dealing with Congress, the courts, and allies.
As a result, Goldsmith says, even if President Bush's "accomplishments are viewed more charitably by future historians than they are viewed today," they will "likely always be dimmed by our knowledge of his administration's strange and unattractive views of presidential power."
The combination of deficit spending, the growth of executive power, and the increasingly unpopular war in Iraq has left many conservatives completely disillusioned. "I've never seen conservatives so downright fed up," says long-time conservative leader Richard Viguerie.
The Economist notes that:
Mr. Bush has . . . presided over the biggest expansion in government spending since his fellow Texan, Lyndon Johnson, provoking fury on the right. His prescription-drug benefit was the largest expansion of government entitlements in 40 years. He has increased federal education spending by about 60 percent and added some 7,000 pages of new regulations. Pat Toomey, the head of the Club for Growth, says the conservative base feels "disgust with what appears to be a complete abandonment of limited government."
William F. Buckley, a leading conservative spokesman and the founding editor of National Review, says that if Mr. Bush were the leader of a parliamentary system, "it would be expected that he would retire or resign." Bruce Bartlett, a former Reagan administration economist, accuses him of "betraying the conservative movement."
In The Economist's view:
The Republicans have failed the most important test of any political movement -- wielding power successfully. They have botched a war. They have splurged on spending. And they have alienated a huge section of the population.
As the 2008 presidential campaign gets under way, Republican contenders are keeping their distance from the Bush presidency. The Washington Post reports:
For all the candidates, the unspoken problem is the same: how to establish a clear break from the legacy of President Bush and his sagging poll numbers without alienating the party faithful. . . . All the candidates have sought to make subtle distinctions with Bush. Thompson and McCain say the U.S. should have mobilized more troops for the invasion of Iraq, while Romney and McCain say the response to Hurricane Katrina should have been handled better -- leaving no doubt that they have some concerns about the Bush presidency, but stopping short of attacking his leadership. They have railed against excessive federal spending and say they would be willing to veto spending bills, in contrast to Bush, who this week vetoed a bill over spending concerns for only the second time in his presidency.
While there is a broad consensus that the party must find a way to move beyond Bush's legacy, according to The Post:
. . . there are widespread divisions within the GOP over the solutions. Some see a priority in the need to address ballooning spending, while others view social issues such as abortion and traditional marriage as paramount in determining the party's course.
Peter Wehner, a former White House official who worked under Karl Rove as director of strategic initiatives, says that:
Conservatism is going through an interesting moment. It's still a center-right country, but conservatism is going through an interesting moment when it's got to consider fresh the issues it's going to put forward and the politics it's going to advance.
He described his party as "at sea" on domestic policy such as health care.
Historian George H. Nash says that "American conservatism has become middle-aged," and that with middle age has come a midlife crisis. In the book The Future of Conservatism (edited by Charles W. Dunn, ISI Books), Nash identifies four distinct strands of modern American conservatism. Traditionalists value continuity, order, and hierarchy; libertarians prize personal freedom and social spontaneity; neoconservatives blend the New Deal's idealistic spirit with conservatism's muscular nationalism; and religious conservatives fight relativism, secularism, and immorality.
Jonathan Rauch, a senior writer with National Journal, and a guest scholar at the Brookings Institution, writes that:
Given their differences, the surprise is that these four heads ever joined atop one political beast. Yet Ronald Reagan, Soviet Communism, and hostility to the excesses of the 1960s brought together a vibrant coalition. Today Reagan and the Soviet Union are gone, and conservatism in power has produced excesses of its own, bringing the movement's cultural contradictions to the fore. Libertarians and traditionalists disagree on the relative importance of liberty and virtue; many neocons care not a fig about abortion, while religious conservatives seem to care about little else.
In Rauch's view:
Unexpectedly, George W. Bush, Reagan's would-be heir, has divided the conservative movement. . . . In his obsession with marginalizing the Democrats, and in his determination to be a "transformational" president, Bush embraced an activism that unmoored the party from its libertarian preference for small government and its traditionalist preference for orderly incrementalism. Libertarians' disenchantment has become obvious; less widely appreciated is that "there is now an apparently unbridgeable divide between traditional conservatives and the Bush administration on major policy matters," writes George E. Carey, a professor of government at Georgetown.
In the area of foreign policy, Bush broke decisively with the more cautious conservatism of Dwight D. Eisenhower and Bush's own father. Political scientist Daniel J. Mahoney argues that Bush and the neoconservatives place a misguided, one-sided emphasis on democracy -- their democratic monomania:
. . . marks a break with an older conservative tradition which always insisted that Western liberty draws on intellectual and spiritual resources broader and deeper than that of modern democracy. . . . The best conservative thinkers of the last two centuries have been wary of unalloyed democracy.
Social and religious conservatives are also unhappy. Their rise to political power has redirected cultural currents much less than they had hoped, and Bush's Republican Party, the family scholar Allan C. Carlson complains, has sided with big business over families.
Where American conservatism is headed after two terms of an administration which proclaimed itself "conservative" but presided over huge budget deficits, a growth in government power, increased claims of executive authority, and a costly war which the majority of Americans believe was unnecessary -- all policies which are the opposite of what conservatives have always advocated -- is difficult to say.
At the present time, disillusionment among traditional conservatives is growing, as is their sense that the Republican Party can no longer serve as a vehicle for advancing smaller, limited government, balanced budgets, and a prudent foreign policy. Some are now speaking of launching a third-party alternative.
Whatever happens, it is clear that the American political landscape has been dramatically altered as conservatives ponder their -- and the nation's -- future.
A Virginia state panel has sharply criticized decisions made by Virginia Tech before and after last April's murder of 32 people, saying university officials could have saved lives by notifying students and faculty members earlier that there had been killings on campus. Because university officials misunderstood federal privacy laws as forbidding any exchange of a student's mental health information, the report concludes, they missed numerous indications of the gunman's mental health problems.
The Virginia Tech tragedy makes it clear that our privacy laws must be carefully examined and that changes are in order. When Seung Hui Cho, the shooter, was in high school, Fairfax County, Virginia, school officials determined that he suffered from an anxiety disorder so severe that they put him in special education, and devised a plan to help, but Virginia Tech was never told of the problem. Fairfax officials were forbidden from transmitting this information to Virginia Tech by federal privacy and disability laws that prohibit high schools from sharing with colleges private information such as a student's special education coding or disability. Those laws also prohibit colleges from asking for such information.
At Virginia Tech itself, it became clear that Cho had mental problems. Campus authorities were aware of his troubled mental state 17 months before the massacre. More than one professor reported bizarre behavior. Campus security tried to have him committed involuntarily to a mental institution. There were complaints that Cho made unwelcome phone calls and stalked female students. Walter Williams, professor of economics at George Mason University, states that:
Given the university's experiences with Cho, they should at least have expelled him, and their failure or inability to do so was the direct cause of the massacre.
Beyond this, argues Williams, we must consider a federal law known as the Family Educational Rights and Privacy Act of 1974 (FERPA). As Virginia Tech's registrar reports:
Third Party Disclosures are prohibited by FERPA without the written consent of the student. Any persons other than the student are defined as Third Party, including parents, spouses, and employers.
College officials are required to secure written permission from the student prior to the release of any academic record information.
Williams writes:
That means a mother, father, or spouse who might have intimate historical knowledge of a student's mental, physical, or academic problems, who might be in a position to render assistance in a crisis, is prohibited from being notified of new information. . . . Alternatively, should the family member wish to initiate an inquiry as to whether there have been any reports of mental, physical, or academic problems, they are prohibited from access by FERPA.
Under our present law, the only way Virginia Tech officials could have known about Cho's mental problems would have been from Cho himself. Yet, experts point out, asking for help is almost impossible for someone with his condition, which has been described as selective mutism. Robert Schum, a clinical psychologist and expert in selective mutism says that:
Children with selective mutism don't want to be the center of attention. They don't like to sit on Santa's lap. They don't like their photo taken on picture day. They don't want kids to sing to them at their birthday celebration. They just want to be left alone. So when you put the responsibility on them and ask them to draw attention to themselves by asking for help . . . that's really tough.
Cho's parents, although cooperative with Fairfax school officials, might not have fully understood what was wrong, and that their son needed help in college as well. As recently as the summer before the April shooting, Cho's mother had sought out members of One Mind Church in Woodbridge, Virginia, to purge him of what the pastor there called the "demonic power" possessing him.
Richard Crowley, coordinator of guidance services for Fairfax County, said high schools generally send transcripts to colleges with only a student's courses, grades, and test scores. Even the number of times a student has been suspended is considered an optional piece of information. Moreover, many colleges say they don't want to know because of the potential liability. Barmak Nassirian, with the American Association of Collegiate Registrars and Admissions Officers, says that:
In soliciting a student's history of psychiatric treatment or diagnoses by treating physicians, you basically open a Pandora's box. Even if you should decide, for reasons that have nothing to do with medical circumstances, not to accept a student, you most certainly will have a case that will be litigated.
Washington Post columnist Marc Fisher notes that:
I've read the 260 pages filed by the review panel . . . and I understand that the university should have stepped in to help Seung Hui Cho but didn't, and that the state mental health system should have acted more decisively but didn't. . . . In the hundreds of interviews the panel conducted, why didn't they ask all those people whose job it is to care for students one question: how would you have handled Cho if you had let your conscience, not privacy laws, guide you? . . . If the mental health professionals, police and college administrators, had acted as if their own child were involved, there might not have been any need for an investigation.
In Fisher's view:
The culprit here is the culture of privacy that we have allowed to pervade certain areas of life, especially health and education. . . . By walling off mental illness, we prevent the power of light from reaching those who are suffering. Privacy laws leave everyone, from health workers to college administrations, confused and defensive about what they may do and say. They react by doing less than they would if left to their own empathy and common sense. . . . Ultimate responsibility for the shootings rests squarely on Cho. But that does not absolve others of the need to act when something goes very wrong. Parents, as Virginia Governor Tim Kaine said, cannot "just drop your child off on campus." Rather, they must seek out resident advisers and counselors and say "Let me tell you about my precious child." And colleges must exhibit the same care toward young adults that parents, friends, or good bosses do -- no matter how much the law may seek to separate us from our human obligations.
Governor Kaine said it was hard to understand why more was not done about a student who once showed fascination with the 1999 Columbine High School shootings and considered the two students who carried them out to be martyrs. "Look, I'm troubled that a student who had talked about Columbine at an earlier point in his life, that that information was unknown to anybody on the Tech campus."
The Virginia Tech Review Panel concludes that, "The current state of information privacy law and practice is inadequate. The privacy laws need amendment and clarification."
While colleges require students to submit immunization records, all records of emotional problems are sealed. "Perhaps students should be required to submit records of emotional or mental disturbance . . . after they have been admitted, but before they enroll," the report says. "Maybe there should be some form of 'permanent record.'"
In a paper Virginia Tech did not disclose until The Washington Post revealed its existence, Cho wrote for an English class:
I hate this. I hate all these frauds. I hate my life. . . .This is it. . . . This is when you damn people die with me.
Panel member Roger Depue, a longtime FBI profiler, says, "Just writing fantasies isn't the problem. It's the combination of disturbing writing and all the other danger signs."
There can be no doubt that public safety must trump privacy rights, particularly in a university setting. Cho's dysfunction had been noted and treated by his high school counselors -- but this was never communicated to Virginia Tech. As his problems intensified at college, his parents were never alerted. Cho spoke to employees in the campus counseling center three times in fifteen days in late 2005 and early 2006, but they failed to follow up and treated his case in an indifferent manner.
Virginia Tech Provost Mark G. McNamee said the university will push for changes in the privacy laws: "I think we are moving into a new era, a new national dialogue" about how individual privacy rights are weighed against public safety, McNamee said. "Safety has clearly risen to a higher profile."
It is sad indeed that we need a tragedy of this magnitude to restore a common sense approach to our treatment of troubled and potentially violent students on the nation's college campuses. *
"He who steals from a citizen, ends his days in fetters and chains; but he who steals from the community ends them in purple and gold." --Marcus Porcius Cato
Heading::: Conservatives and Neo-Conservatives: The Foreign Policy Differences Are Significant and Worthy of Serious Debate
As the presidential election campaign of 2008 gets under way, and as America's role in the world undergoes increasing scrutiny both at home and abroad, a serious exploration of what that role should be in the post-Cold War world would be useful.
The "democratic globalism" which has been promoted by those who call themselves neo-conservatives is quite different from the traditional conservative approach to foreign policy. The war in Iraq and the effort to spread "democracy" to the Middle East are the products of this world view. The results of this crusade are less than clear. The intellectual underpinnings of this effort, however, are far from what conservatives have generally believed about foreign policy. George Will, the conservative political commentator, notes that:
On foreign policy, conservatism begins, and very nearly ends, by eschewing abroad the fatal conceit that has been liberalism's undoing domestically -- hubris about controlling what cannot, and should not, be controlled. Conservatism is realism about human nature and government's competence . . .
Those who would base U.S. foreign policy upon the internal governmental organization of a particular country rather than its foreign policy and international actions, traditional conservatives argue, may misunderstand the very nature of what U.S. policy is meant to achieve. Pat Buchanan notes that:
The point here is quite simple: Because a nation has a free press, free elections, and a bicameral legislature does not alone make it a valued ally of the United States; and because a nation is ruled by an autocrat, a king, or a general does not make it an enemy. When Americans were dying in Vietnam, one recalls, NATO merchant ships were hauling goods to Hanoi, and Swedish diplomats were harassing us at the United Nations. Meanwhile, South Korean soldiers were fighting alongside ours. Not all our friends are democratic; and not all democrats are our friends.
On June 7, 1988, a conference entitled "Promoting Democracy Abroad" was held at the American Enterprise Institute in Washington, D.C. It was sponsored by the Philanthropic Roundtable, a group of foundations and grant-makers. Writing in Philanthropy (Spring 1987), Carl Gershman of the National Endowment for Democracy declared that:
The job of assisting democratic political developments cannot be left solely to governments. Democracy cannot be sustained without the existence of countless private organizations and institutions. . . . Beyond its ability to mobilize financial resources, private philanthropy can engage private institutions and individuals whose knowledge and technical skills are vital resources for those seeking to develop new democratic structures.
Presenting a conservative rejoinder to the philosophy of promoting global democracy was Paul Gottfried, professor of political science at Elizabethtown College and author of a number of important books, including The Conservative Movement. Dr. Gottfried stated at the outset that he shared the views of the other speakers concerning the "inhumanity endemic to Communist control anywhere" and noted that he would "find nothing wrong with having our leaders call attention to particularly heinous acts committed by other states or shamelessly tolerated within their borders," in the case of non-Communist regimes. He rejected the idea that opposition to the notion of spreading democracy around the world was rooted in "ethical relativity."
In this regard, Gottfried states that:
. . . generations of statesmen who believed in revealed religion and prescribed morality accepted a politically pluralistic universe. The 18th century Whig reformer and advocate of the American colonies, Edmund Burke, wrote to the sheriffs of Bristol in 1777 on the affairs of America: "I was never wild enough to conceive that one method would serve for the whole, that the natives of Hindustan and those of Virginia could be ordered in the same manner. . . . I was persuaded that government was a practical thing, made for the happiness of mankind, and not to furnish out a spectacle of uniformity to gratify the schemes of visionary politicians.". . . Burke held a basic unquestioned assumption of Western political thought from Aristotle to Montesquieu and beyond: that there are different good or at least tolerable regimes adapted to the needs and histories of different peoples.
Those who advocate a global democratic revolution, Gottfried argued:
. . . reject that axiom not because they preach a higher morality than did Burke or Montesquieu. They are addicted to a 20th century enthusiasm that denies respect to all forms of government but their own.
To the argument that the U.S. should export democracy throughout the world because democratic regimes will feel a natural affinity for the U.S., Gottfried points to a long historic record that would challenge such a thesis:
It was Athens' bullying of a fellow-democracy, Corinth, that contributed to the entry of Sparta, Corinth's ally, into the Peloponnesian War. Democracies and, in the early 20th century, constitutional monarchies -- e.g., England and Imperial Germany and Habsburg Austria-Hungary -- have in fact fought each other.
Because, on particular occasions, the U.S. may "sometimes be called upon to make political-structural changes in far-off lands in pursuit of our national interest," Gottfried concluded, "it does not follow that we should dictate the constitutional-social arrangements by which the entire world must live."
Traditional conservatives point to the manner in which the U.S. policy of "democratization" brought the Sandinistas to power in Nicaragua. Almost from the beginning of his presidency, Jimmy Carter tightened the screws on Nicaragua. By executive decree, the president prohibited the sale of military equipment. The president's representative on the International Monetary Fund twice blocked badly-needed standby credits for Nicaragua. When financing for Nicaragua's hydro-electric dam project was obtained through other nations, President Carter pressured those nations to cancel the financing arrangements.
Under orders from the White House, the U.S. Department of Agriculture gave instructions to beef inspectors to shut down Nicaraguan beef exports to the U.S. The U.S. Embassy in Managua called businessmen from the opposition political party and advised them to transfer their dollars from Nicaragua to the U.S. At one point, the U.S. Ambassador to Nicaragua, William Bowdler, told President Somoza that "the Carter policy was to see that all of the right-wing governments in Central America were replaced and that Nicaragua would be the first to go."
Under serious attack from rebels armed by Cuba, President Somoza was unable to purchase needed military supplies from the U.S. When he was finally able to purchase vital ammunition from Israel, Somoza later said that as the ship approached the Nicaraguan coast:
. . . it turned back to Israel. We suspected the reason for the sudden change in shipping plans, and later our suspicions were verified. U.S. intelligence had learned the destination of this ship and the cargo she carried. Under extreme pressure applied by the U.S., Israel made the decision to return the ship. . . . When Carter says the U.S. played no role in the death of my government, and when he says he didn't know international law was being violated, he is lying. . . . At the time of my departure, we must have had close to 20,000 men who wanted to fight the enemy. These men were never defeated by international invaders; they simply did not have the means with which to fight.
Thus, in the view of traditional conservatives, it was U.S. policy that put the Sandinistas in power -- just as it was U.S. policy which assisted the Ayatollah Khomeini in coming to power in Iran and Robert Mugabe in assuming power in Zimbabwe.
More recently, of course, particularly under the influence of neo-conservatives, the Bush administration has embarked upon an ideological crusade of "democratizing" the Middle East, most prominently in Iraq.
Not only have the reasons for going to war in Iraq proven to be less than persuasive, but the Bush administration's assessment of what was necessary for success appears to have been misleading and far from reality.
On May 1, 2003, President Bush gave a speech aboard the aircraft carrier U.S.S. Lincoln beneath a large "Mission Accomplished" banner. Kenneth Adelman, head of the Arms Control and Disarmament Agency during the Reagan administration, predicted that the mission would be "a cakewalk." Other advocates of the war were equally optimistic. It would be like Paris in 1944, with the Iraqis greeting American troops as liberators, not occupiers. In April 2003, columnist Mark Steyn predicted that "in a year's time Baghdad and Basra will have a lower crime rate than most British cities." Furthermore, there would be "no widespread resentment or resistance of the Western military presence."
Those who worried about the deep ethno-religious divisions in Iraq were summarily dismissed. On April 1, 2003, William Kristol, editor of The Weekly Standard, wrote that:
. . . there's been a certain amount of pop sociology in America . . . that the Shi'a can't get along with the Sunni, and the Shi'a in Iraq just want to establish some kind of Islamic fundamentalist regime. There's almost no evidence of that at all.
In May, Washington Post columnist Charles Krauthammer stated, "The U.S. is in a position to bring about a unique and potentially revolutionary development in the Arab world: a genuinely pluralistic, open and free society." Department of Defense planners assumed that U.S. troop levels would be down to 50,000, or even lower, by the end of 2003. Some military experts, however, warned that such optimism was unwarranted. Gen. Eric Shinseki, the Army chief of staff, predicted that the occupation would require "several hundred thousand troops" for a period of "many years." Deputy Secretary of Defense Paul Wolfowitz flatly rejected Shinseki's assessment in congressional testimony and, for his candor, Shinseki was pressured into early retirement.
Wolfowitz also rejected the idea that the occupation would be a financial drain. He predicted that Iraq's oil revenues would pay for the entire cost of reconstruction. Those officials who did not share such an optimistic view were removed from office. Larry Lindsey, chairman of the President's Council of Economic Advisers, warned that the cost of the Iraq occupation would exceed $200 billion. He was quickly pressured out of his post. Even Lindsey's estimate was low: the Iraq war has cost far more -- as of March 2007, $350 billion and counting. That figure does not include long-term, indirect costs, such as the continuing medical care and rehabilitation expenses for more than 22,000 service personnel who have been wounded. Former Rep. Lee Hamilton, co-chairman of the Iraq Study Group, has stated that the costs could exceed one trillion dollars in the near term.
In January, 2002, more than a year before U.S. troops entered Iraq, Ted Galen Carpenter, vice president for defense and foreign policy studies at the Cato Institute, cautioned that:
No matter how emotionally satisfying removing a thug like Saddam may seem, Americans would be wise to consider whether that step is worth the price. The inevitable U.S. military victory would not be the end of America's military troubles in Iraq. . . . Washington would become responsible for Iraq's political future and the U.S. would be entangled in an endless nation-building mission beset by intractable problems.
As war grew nearer, other experts in the traditional conservative "realist" school of foreign policy echoed such warnings. On September 26, 2002, 33 prominent foreign affairs scholars published an advertisement in The New York Times under the headline "War in Iraq Is Not in America's National Interest." They noted that the administration of George H. W. Bush "did not try to conquer Iraq in 1991 because it understood that doing so could spread instability in the Middle East. . . . This remains a valid concern today." They added: "Even if we win easily, we have no plausible exit strategy. Iraq is a deeply divided society that the U.S. would have to occupy for many years to create a viable state." Those who signed that ad included University of Chicago Professor John Mearsheimer, MIT Professor Barry Posen, Columbia University Professor Richard K. Betts, and the dean of Harvard University's Kennedy School of Government, Stephen M. Walt.
Writing in Chronicles, the Cato Institute's Ted Galen Carpenter notes that:
Not only did the administration and other proponents of war ignore such warnings, they refused later on to recognize growing evidence that the mission was going badly. Even as the security environment deteriorated, the chorus of optimism scarcely diminished. In early 2005, Vice President Dick Cheney confidently asserted that the insurgency was "in its last throes." By late 2006, though, the evidence of massive disorder in Iraq was irrefutable. Instead of admitting error, most of the hawks have redoubled their efforts to give advice about future strategy. . . . The increasingly shrill neo-conservatives argue that the Bush administration had launched the mission with too few troops -- even though most of the lobbyists for war had argued exactly the opposite at the time. (Indeed, some of them, including Wolfowitz, had proposed going in with even lighter force -- no more than 40,000 or 50,000 troops). Now, they insist that even the existing force of 145,000 is insufficient.
In Carpenter's view:
Except when the survival of the nation is at stake, all military missions must be judged according to a rigorous cost-benefit calculation. Iraq has never come close to being a war for America's survival. Even the connection of the Iraq mission to the larger war against Islamic terrorism was always tenuous at best. For all of his odious qualities, Saddam Hussein was a secular tyrant, not an Islamic radical. Indeed, radical Muslims expressed nearly as much hatred for Saddam as they did for the U.S. Iraq was an elective war -- a war of choice, and a bad choice at that. . . . Alarm bells should be ringing when the people who pushed America into the folly of a nation-building mission in Iraq are now advocating a redoubled effort.
Assessing the role of neo-conservatives in the formulation of U.S. policy toward the Middle East, Professor Andrew J. Bacevich of Boston University, writing in The American Conservative, declares:
Neo-conservatives . . . believe that the U.S. is called upon to remake the Middle East, bringing the light of freedom to a dark quarter of the world. Pseudo-realists like Baker (James Baker, co-chairman of the Iraq Study Group) believe that the U.S. can manipulate events in the Middle East, persuading others to do our bidding. Both views, rooted in the conviction that Providence has endowed America with a unique capacity to manage history, are pernicious.
In Bacevich's view:
The way forward requires abandoning that conviction in favor of a fundamentally different course. A sound Middle East strategy will restore American freedom of action by ending our dependence on Persian Gulf oil. It will husband our power by using American soldiers to defend America rather than searching abroad for dragons to destroy. A sound strategy will tend first to the cultivation of our own garden. A real course change will require a different compass, different navigational charts, and perhaps above all different helmsmen, admitting into the debate those who earn their livelihoods far from the imperial city on the Potomac.
The tragedy of neo-conservatism, argues The Economist,
. . . is that the movement began as a critique of the arrogance of power. Early neocons warned that government schemes to improve the world might end up making it worse. They also argued that social engineers are always plagued by the law of unintended consequences. The neocons have not only messed up American foreign policy by forgetting their founders' insights. They may also have put a stake through the heart of their movement.
Needless to say, all Americans should hope for as successful an outcome in Iraq as possible. However mistaken the arguments presented on behalf of the war, it is in the interest of our own country and our friends in the region that Iraq be left better than we found it. Beyond the events of the moment, however, what is required is a careful revisiting of the different foreign policy perspectives of traditional conservatives and the neo-conservatives who have been so influential in the current administration. That debate has been postponed for far too long.
Life in Zimbabwe continues its dramatic decline as Robert Mugabe's cult of personality continues to grow.
In June, Mugabe threatened to seize foreign companies, including mines, which have raised prices and cut output in what he called an economic "dirty tricks" campaign to oust his government. Zimbabwe's businesses -- including a dwindling number of local subsidiaries of multinational companies, older white-owned firms, and black-owned companies that prospered after independence in 1980 -- are already struggling in what the World Bank calls the fastest-shrinking economy for a country not at war.
Analysts said that approval of legislation authorizing such seizures could deepen the economic crisis, which has pushed Zimbabwe to the brink of collapse, with inflation now believed to be more than 10,000 percent a year.
Yet, as Zimbabwe's people are struggling to survive in the face of growing food and fuel shortages, and the prospect of power cuts for up to 20 hours a day, President Robert Mugabe, who has led the country since 1980, is spending almost $4 million on a grandiose project -- a monument to himself. Work has already begun on a museum dedicated to his life in his home district of Zvimba. Construction of the grand edifice, which will cover the area of a soccer field and has been dubbed the "Mugabe Shrine," is being supervised by the local government minister, Ignatius Chomvo. Mugabe's extravagance is well known. Besides his five official residences, he owns a number of private houses, including the most recent addition, a palatial three-story, 25-bedroom, $15.8 million residence in the exclusive Harare suburb of Borrowdale.
A recently published book, My Life with an Unsung Hero by Vesta Sithole, the widow of an early leader in the fight for Zimbabwean independence, the Rev. Ndabaningi Sithole, places some of the latest developments in that country in historical perspective.
Vesta Sithole was a participant in the battle for independence and democracy in the former Rhodesia from the very beginning. She helped to recruit people for military training, sheltered freedom fighters, and raised funds. As a result of her active participation in Zimbabwe's road to freedom, the author was harassed, jailed, and subjected to mistreatment by both the Rhodesian forces and her fellow citizens. In 1967, she married Tanzanian banker and economist Jackson Mwakalyelya, with whom she had four children. She was widowed in 1972 and in 1980 married the liberation fighter and founder of the Zimbabwe African National Union (ZANU), the late Rev. Sithole.
We see from this memoir that Robert Mugabe was committed from the beginning to total control and that his verbal commitment to democracy and freedom was far from the reality of what he had in mind.
Mrs. Sithole writes that:
March 3, 1979, was the day Rev. Sithole, Bishop Muzorewa, Chief Chirau, and Mr. Smith signed the Internal Agreement. After the agreement was signed, an interim government was put in place and the country's name officially changed to Zimbabwe/Rhodesia. Mr. Smith and his delegation could not imagine the country without the name "Rhodesia" in it. . . . The Executive Council was made up of Rev. Sithole, Bishop Muzorewa, Chief Chirau, and Ian Smith. The Executive Council would rotate every month, and each leader would be the chairman of the council and run the affairs of the state before the scheduled election took place. It was not a perfect constitution, but it was a beginning.
Robert Mugabe, then outside the country, fought the internal settlement, in part because it did not elevate him to the power he sought. In the end, after a series of assassinations of his opponents, Robert Mugabe was sworn in as prime minister of independent Zimbabwe. Mrs. Sithole notes that:
Rev. Sithole was not even invited to the independence celebrations held in the country. . . . However, he did hope that Mr. Mugabe would give the Zimbabwean people what they fought and died for -- freedom to speak, freedom to associate, and freedom to be Zimbabweans. He publicly said this at our Waterfalls home soon after Mr. Mugabe was sworn in. He added that he would oppose Mr. Mugabe if he did not rule well.
While living in the U.S., the Rev. Sithole wrote two books, The Secret of American Success -- Africa's Great Hope and Hammer and Sickle Over Africa. He was convinced that the example of American freedom and free enterprise was the path Africa should take, not that of Communism. He declared:
It should be noted that what is Africa's great hope is not America, but the secret of American success. This is to say the principles that explain America's success are Africa's great hope. The constant success factors that have served the U.S.A. during the last 210 years are Africa's/Zimbabwe's great hope if she will adopt the same. . . . Allow the people free enterprise and they will succeed beyond belief. Deny them free enterprise, and you kill real success among them. Africa/Zimbabwe is more suited to a free economy than a nationalized economy, which tends to benefit mostly the ruling elites, and to retard ordinary people.
For Sithole, individual freedom was the path for Zimbabwe's future success:
Let everyone in Africa/Zimbabwe have his own dream, not another's dream. Let him become what he himself wants to become. In colonial days people were forced to become subjects of the colonial powers. In present-day Africa people are still forced to become Marxists, Marxist-Leninists, Communists, or Socialists. In other words, they are forced not to become themselves, but carbon copies of others.
After returning to Zimbabwe, the Sitholes suffered under Robert Mugabe's rule. Their home and property were confiscated, and the Rev. Sithole was arrested and charged with treason. Finally, for health reasons, he was permitted to leave for medical care in the U.S., where he died.
Mrs. Sithole notes that:
The arrest of my husband on treason charges was neither the beginning nor the end of arrests by Mr. Mugabe of his political opponents. It is my observation that it was, and still is, Mr. Mugabe's modus operandi to charge with treason those who oppose him. Many will recall that Joshua Nkomo was accused of trying to overthrow Mr. Mugabe's government. . . . Fearing for his life, Mr. Nkomo escaped to Botswana disguised as a woman, and later made it to the United Kingdom. . . . Bishop Abel Muzorewa was imprisoned for 10 months after being accused of trying to kill Mr. Mugabe. Even Edgar Tekere, once his secretary general and right-hand man, was accused of trying to kill Mr. Mugabe when they became estranged.
Now living in the U.S., Vesta Sithole remains committed to speaking out against the injustices her fellow citizens of Zimbabwe have suffered at the hands of Robert Mugabe. Having devoted her life to the struggle for a free Zimbabwe, she believes that the people of her country deserve the freedom they fought for. The world that was eager for Zimbabwean independence, however, now seems indifferent to its fate.
In August, Southern African leaders did not urge Zimbabwean President Robert Mugabe to enact reforms in his country during a regional meeting in Lusaka, Zambia, and appeared satisfied with his human rights record.
Zambian President Levy Mwanawasa, then the new chairman of the Southern African Development Community (SADC), said the group of countries had relied on a report submitted by South Africa on Zimbabwe's crisis and had not raised the issue with Mr. Mugabe.
South African President Thabo Mbeki, who has been mediating between the Zimbabwean government and its opposition, submitted the report. "We are quite happy that Mr. Thabo Mbeki was capable enough and was moving in the right direction," Mr. Mwanawasa said.
Asked if the SADC had pressed Mugabe on accusations of widespread human rights abuses during the summit, Mr. Mwanawasa said, "We have discussed them (abuse claims) and we are satisfied with the answers which were given."
It is clear that Southern African nations do not have the resolve to confront the regime of Robert Mugabe. Earlier, Mr. Mwanawasa had raised hopes that African states would break their public silence on Robert Mugabe when he became the first leader of the continent to publicly criticize him. But his description of Zimbabwe as a "sinking Titanic" during a trip to Namibia earlier this year has faded from memory, and he has since softened his position.
Despite his brutal regime, Robert Mugabe, somehow, remains immune to criticism from his neighbors. More than this, the New York Times reported that at the Lusaka meeting:
Mr. Mugabe arrived . . . to a fusillade of cheers and applause from attendees that . . . overwhelmed the polite welcomes for other heads of state. . . . Mr. Mugabe was unrepentant and the comments of leaders of neighboring states about Zimbabwe's descent were notably bland.
Zimbabwe's seven-year decline is a result of increasingly repressive policies imposed by the Mugabe regime that have driven away foreign investment and stoked hyperinflation. In June, the government ordered all businesses to roll back prices, effectively declaring inflation illegal. Since then, factories have curtailed production, workers have been laid off, and store shelves have emptied of basic goods. In Bulawayo, Zimbabwe's second-largest city, a security guard and a child were reported to have been killed in August after a line of shoppers awaiting a rare delivery of sugar mobbed a storefront, toppling a brick wall on top of them.
While the rest of Africa -- and most of the world -- turns away, Zimbabwe is in a state of collapse. Water shortages have worsened because of pump breakdowns, and a senior government official said kidney patients were dying for lack of dialysis machines. Power, water, health, and communications systems are collapsing, and there are acute shortages of staple foods and gasoline. Unemployment is around 80 percent.
As 11 million people in Zimbabwe descend into destitution, a tiny slice of the population is becoming ever more powerful and wealthy at their expense. Zimbabwe, critics charge, is fast becoming a kleptocracy, and the government's seemingly inexplicable policies are in fact preserving and expanding it.
"Their sole interest is in maintaining power by any means," said David Coltart, a white opposition member of Parliament. "I think their calculation is that the rest of Africa is not going to do anything to stop them, and the West is distracted by Iraq and Afghanistan. The platinum mines can keep the core of the elite living in the manner they're accustomed to -- just in a sea of poverty."
According to New York Times correspondent Michael Wines, writing from Zimbabwe:
In interviews, Mr. Coltart's view was widely shared by blacks and whites alike, many with no political ax to grind. Even a governing party politician allowed that whatever the aims of Mr. Mugabe's policies, their execution had gone terribly awry. Zimbabwe's farm seizures destroyed the nation's rich agriculture industry, and, as a form of patronage, vast tracts of land were handed over to party elites with little experience or interest in farming. The looming takeover of businesses is expected to produce the same result.
Zimbabwe's falling currency -- 200,000 Zimbabwe dollars now buy a single American dollar on the black market -- has rendered the salaries of working Zimbabweans all but worthless. Yet the official exchange rate is not 200,000 to 1, but 250 to 1. Those with connections to the government's reserve bank are widely said to buy American dollars cheap and sell them dear -- and reap an 800-fold profit on currency transactions.
Things have become so bad that the Archbishop of Bulawayo, Pius Ncube, has called on Britain to invade his country and engage in regime change against Robert Mugabe. Archbishop Ncube warns that millions of his countrymen face starvation without outside intervention. There is a "massive risk to life," says the leader of Zimbabwe's one million Roman Catholics. "People in our mission hospitals are dying of malnutrition," says Archbishop Ncube.
We had the best education in Africa and now our schools are closing. Most people are earning less than their bus fares. There's no water or power. Is the world just going to let everything collapse in on us?
Thus far, sadly, the answer appears to be yes. And those in the U.S., Britain, and elsewhere who so eagerly embraced Robert Mugabe in 1980, and bear some responsibility for his ascent to power, seem totally indifferent to the results of their efforts. *
"Liberty cannot be caged into a charter or handed on ready-made to the next generation. Each generation must recreate liberty for its own times. Whether or not we establish freedom rests with ourselves." --Florence Ellinwood Allen
In the 1960s, Daniel Patrick Moynihan, later a Democratic senator from New York but then Assistant Secretary of Labor, produced a report entitled "The Negro Family: The Case for National Action." He found that a quarter of all black children were born to unmarried women and the percentage was rising. The tangle of poverty and despair was bleak, and Moynihan predicted that it would get worse. It has. Today, among non-Hispanic blacks, the out-of-wedlock birth rate has reached 69.5 percent. Beyond this, trends he found troubling within the black community are now to be found in the larger society as well. The illegitimacy rate for Hispanics has reached 47.9 percent and the rate for non-Hispanic whites now exceeds 25 percent.
For his efforts, Moynihan was sharply criticized by liberals for "blaming the victim," a catchphrase that was coined by his critics and has now entered the lexicon. Now, however, that notion is being increasingly discredited. In an important new book, Ghettonation (Doubleday), Cora Daniels -- an award-winning journalist who is a contributing writer for Essence, has served as a commentator on CNN, BET, and NPR, and is the author of the widely praised book Black Power, Inc. -- argues that it is the "ghetto mindset" that is harming the future of residents of the nation's inner cities, and that corporate America bears a share of the responsibility for promoting this destructive mindset. Daniels joins a long list of thoughtful black critics of the prevailing ghetto culture, among them comedian Bill Cosby and author Juan Williams.
For Daniels, "ghetto" is a condition -- an addiction, even -- that has spread through American popular culture. It is an impoverished mindset defined by conspicuous consumption and irresponsibility. She writes that:
Ghetto no longer refers to where you live; it is how you live. It is a mind-set. . . . The jump from an impoverished physical landscape to an impoverished mental one is harder to trace. . . . there is no denying that these days ghetto, as it is used, has indeed made that leap. I did not reposition ghetto from noun to adjective -- we all did that. . . . As a black woman surviving, and drowning, in Ghettonation, I am defining ghetto as a mindset. . . . A mindset that thinks it is fine to bounce, baby, bounce, in some video, as if that makes it any different from performing such a display on a table, on a pole, on some john's lap, or on the corner. And a mindset that thinks a record deal and a phat beat in the background makes it okay to say . . . well I do know what bad language is, so I won't say. Most of all, ghetto is a mindset that embraces the worst. It is the embodiment of expectations that have gotten dangerously too low.
A number of academic studies show how this mindset has had harmful ramifications. Professors from M.I.T. and the University of Chicago wanted to see what was in a name. They answered thousands of real newspaper want ads, sending in resumes that were identical except for the name: one batch went out under names like Latoya, Shamika, and LaShawn, while another went to the same companies under names like Brad and Kristen. The Brads and Kristens were 50 percent more likely to be called back for an interview than the Shamikas and LaShawns with the exact same resumes. Daniels writes:
The researchers thought with the results of their LaShawn and Shamika resumes they were making a statement about race [but in reality] they really uncovered the side effects of the ghetto.
Several years ago, anthropologist John Ogbu, who coined the term "acting white" to explain why some black students seemed to shun doing well in school, released an even more explosive study about black middle-class students in suburban Shaker Heights, Ohio, outside of Cleveland. In pointing the finger for poor performance in school back at parents instead of at "the system," the late scholar drew criticism from both his colleagues and the community. It is Daniels' view that there is much to learn from Ogbu's efforts.
She notes that
Ogbu was invited to Shaker Heights not by school district administrators or teachers but by black parents themselves. They wanted to know why their children -- the sons and daughters of doctors, lawyers, judges, business execs -- were doing so much worse in school than their white classmates. Although black students in the middle-class district were some of the best students in the state, they were still lagging behind whites in the district in terms of grade point averages, standardized test scores, and enrollment in advanced-placement courses. . . . The situation sounded like my own high school experience. While my good school was approximately 60 percent black, I would typically find myself among the same small bunch of black students in all of my AP and honors classes. By the time I was a senior, when my schedule was filled with nothing but demanding classes, I had virtually no black classmates, even though I was attending a black school.
In his 2003 book, Black American Students in an Affluent Suburb: A Study of Academic Disengagement, John Ogbu concluded that black students in Shaker Heights weren't doing as well as their white counterparts because their parents didn't stress education enough. It was disengagement of the worst kind. While not discounting other factors such as low teacher expectations and prejudiced personnel, Ogbu faulted the efforts of the black students and their parents. Despite parents' obvious concerns, illustrated by the fact that they invited the anthropologist to come and have a look to begin with, Ogbu concluded that many of the same black parents did not stress homework, attend teacher conferences, or push their children to enroll in the most challenging classes as much as their white counterparts did. In addition, he suggested that the black students suffered from what he termed "low effort syndrome," meaning that they didn't work as hard even though they knew how much work was needed to succeed in the Shaker Heights schools.
Ogbu said:
What amazed me is that these kids who come from homes of doctors and lawyers are not thinking like their parents; they don't know how their parents made it. . . . They are looking at rappers in ghettos as their role models; they are looking at entertainers. The parents work two jobs, three jobs, to give their children everything, but they are not guiding their children.
A 2004 study by the Civil Rights Project at Harvard University and the Urban Institute concluded that only 50 percent of black students and 53 percent of Hispanic students graduate from high school. For male students, the figures are even worse: only 43 percent of black males and 48 percent of Hispanic males graduated. The reality, Daniels shows, may be even worse:
Researchers concluded that the official data on dropout rates were misleading, arguing that in reality there are actually more dropouts than schools report. The study charged that school districts and states routinely try to depress their dropout numbers by pushing out and eliminating problem students from school rosters, especially right before state tests, because higher scores translate into extra funding and certification credentials. In a study of the 100 biggest school districts in the country, in almost half the schools sampled, the size of the senior class had shrunk by more than half compared to the class size back in the ninth grade, four years earlier. In one year in Chicago, 15,653 students graduated from high school while 17,404 dropped out.
Sadly, corporate America has devoted a great deal of its resources to promoting the ghetto mindset. Daniels notes that:
Madison Avenue has certainly put its cash behind the tomorrow-doesn't-matter message. . . . The "I am What I am" billboard that I saw most often . . . featured 50 Cent with his stale . . . frown. His quote, displayed against a police fingerprint sheet, read: "Where I'm from there is no Plan B. So take advantage of today because tomorrow is not promised.". . . In Ghettonation, living within your means just isn't done. There is no need to when you think tomorrow doesn't matter.
Middle-class blacks often claim to be from the inner city in order to achieve success. Rapper Russell Jones -- known by a variety of names including ODB (Ol' Dirty Bastard) -- died at the age of 35. In its multi-day coverage of his death, The New York Times reported that, as ODB, he was comfortable spinning a public mythology, saying, for example, that he had grown up on welfare, or that he had not known his father. Neither was true. "Our brother looked at things as selling records," said his sister Monique Jones. "So he dismissed whatever lies he told as just a way of getting publicity."
Cora Daniels points out that:
Hip-hop is the music that thumps through black neighborhoods and encompasses the black images that are spit across the world. It is what our children are bouncing their heads to. So as a black woman living within earshot of what is coming out of the mouths of young black folks over the radio these days, I can't afford not to say something. Especially when the ho-ing of black women has become big business.
In addition to hip-hop music, there is "street fiction," which has been a constant strand in black literature for decades. The first of such writers was probably Iceberg Slim, who, after being released from ten months of solitary confinement in Cook County Jail, penned Pimp: The Story of My Life, published in 1969. Graphic in both language and subject matter, the book broke narrative ground by capturing Slim's life as a pimp in Chicago in the 1950s. In 2003, Pimp briefly graced U.P.I.'s top ten mass-market paperback list alongside To Kill A Mockingbird, The Hobbit, and Fahrenheit 451. Donald Goines, also a pioneer in street lit, first read Iceberg Slim while doing his last stint behind bars. The result for Goines, a heroin addict who pimped and robbed to support his habit, was the birth of his first book, Dopefiend, an instant ghetto classic published in 1970 that still sells 200,000 copies a year.
James Fugate, owner of Eso Won Books, a black bookstore in South Central Los Angeles, has one word for ghetto lit: disturbing. "I'm sick of talking about it," he says.
To me people can read what they want to read. I've never been opposed to books by Donald Goines and Iceberg Slim. But those books were bridges to other literature.
The ghetto lit being written today, he says, is mostly "mindless garbage about murder, killing, thuggery."
Dr. Todd Boyd, a member of the faculty at the School of Cinema and Television at the University of Southern California, was asked why ghetto lit is the fantasy so many readers are choosing. "The ghetto is drama," he said. "The ills of poverty are far more dramatic than the angst of middle-class life."
Daniels writes that:
It struck me just how universal Boyd's truism was when author James Frey's credibility began to shatter into a million pieces in the winter of 2006, when his best-selling memoir, A Million Little Pieces, about his drug addiction and rehab struggles, was found to be soaked with untruths. The interesting thing about Frey's embellishments is that he didn't lie to make his life seem better but to make it seem worse. "I am an alcoholic and I am a drug addict and I am a criminal," he wrote repeatedly. One of the most blatant lies that Frey wrote about was the criminal part. He claimed he did a three-month stint in jail for beating up a cop. It never happened. . . . Remember when folks used to lie their way up? Like claiming we made more money than we did, had better jobs than we had, were the hero instead of the sidekick, raised extraordinary beings instead of average ones? Now folks are lying their way downward. And why not? Frey's book was the second biggest seller of 2005, behind only the new Harry Potter. Being a "criminal" sells. Ghetto!
All too often, the black establishment has embraced those who promote this ghetto mentality. The NAACP, for example, nominated R. Kelly for an Image Award after the singer had already been charged with child pornography. In 2005, one of the most celebrated independent films was "Hustle & Flow," a movie about a pimp turned rapper in Memphis. The film earned Terrence Howard an Oscar nomination for playing the pimp. Daniels writes that:
In one of the clearest signs of pimp praise, "Hustle & Flow's" title song, "It's Hard Out Here for a Pimp," by Three 6 Mafia, won an Academy Award in 2006 for best song. Upon receiving their Oscar, Three 6 Mafia had to be bleeped by network censors several times during their acceptance speech.
Fortunately, Daniels reports, more and more prominent black figures are beginning to speak out against the ghetto mindset. Professor Orlando Patterson, a sociologist at Harvard, says that it is a culture of self-destructiveness that is holding black men back. According to Patterson, a so-called "cool-pose culture" that includes "hanging out on the street, dressing sharply, sexual conquests, party drugs, hip-hop music" was just too gratifying to give up. Culture was making the pull of the ghetto attractive. "Not only was living this subculture immensely fulfilling," wrote Patterson, "it also brought them great respect from white youths."
To those who decry black spokesmen challenging the ghetto mindset as washing the community's "dirty laundry" in public, comedian Bill Cosby responds:
Let me tell you something, your dirty laundry gets out of school at 2:30 every day, it's cursing and calling each other nigger as they're walking up and down the street. They think they're hip. They can't read; they can't write. They're laughing and giggling and they're going nowhere.
Like Bill Cosby's initial and much-discussed comments about the problems within the black community today, Cora Daniels' book should trigger widespread interest and heated debate. And it is not only the black community that is involved. She laments that "Ghetto is also packaged in the form of music, T.V., books, and movies, and then sold around the world . . . ghetto is contagious and no one is immune."
Germany today boasts the fastest growing population of Jews in Europe. The streets of Berlin abound with signs of a revival of Jewish culture. In September, 2006, Germany ordained its first rabbis since World War II, an event hailed as a milestone in the rebirth of Jewish life in the country where the Holocaust began. German President Horst Koehler declared:
After the Holocaust many people could never have imagined that Jewish life in Germany could blossom again. That is why the first ordination of rabbis in Germany is a very special event indeed.
In an important new book, Being Jewish in the New Germany (Rutgers University Press), Professor Jeffrey N. Peck of Georgetown University, who is also a senior fellow in residence at the American Institute for Contemporary German Studies, explores the diversity of contemporary Jewish life and the complex struggles within the community over history, responsibility, culture, and identity. He provides a glimpse of an emerging, if conflicted, multicultural country and examines how the development of the European Community, globalization, and the post-9/11 political climate play out in this context.
Today there are more than 100,000 registered members of the official German Jewish community and many more Jews who are not affiliated. While the Jewish population is still relatively small in relation to the total German population, its moral and political significance outweighs its size. In 1933, it was estimated that Germany had about 500,000 Jews. At the end of World War II, Germany's and Europe's Jewish population was decimated to a mere remnant of survivors.
"Now in the new millennium," writes Professor Peck:
. . . there is all the more reason to celebrate the triumph, as many see it, over Hitler's Final Solution. Germany's Jewish population has gained prominence as the fastest-growing Jewish community in Europe and the third largest overall. Jewish Berlin has become a popular tourist site and home of major international and Jewish organizations. The leaders of institutions, like the American Jewish Committee (AJC), understand the importance of a sustained Jewish life in Germany. Far ahead of the Jewish public, they have established positive relations with Germany for decades. Yet, American Jews are often unable to overcome old stereotypes and prejudices. Many American Jews still feel uncomfortable traveling to Germany or even buying German products. In their minds, all Germans, even those born after the war, are tainted by genocide.
Today, Peck argues:
Being "German" and being "Jewish," separated identities that have a long history, especially since the Holocaust, are no longer mutually exclusive. Before the Second World War, most German Jews thought of themselves as Germans. . . . After the war, the terminology separating Germans and Jews connoted the alienation and separation for those Jews remaining, most of whom were not "German" but displaced persons from Eastern Europe who came to be known by those ignominious initials as DPs. Then, it was simple, the Germans were the perpetrators and the Jews were the victims. As a postwar Jewish community took shape, albeit until recently very small, the term "Jews in Germany" became the dominant description of a people who were not fully comfortable or integrated into the society around them. My own prognosis looks toward a potentially new categorization, a "new" German Jewry that will represent a different status in both historical and contemporary terms.
At the end of the war, about 6,500 Jews survived through marriage to non-Jews, living underground, and other means. About 2,000 returned to Berlin from the concentration camps. Most importantly, approximately 200,000 Jews came to Germany as DPs. Housed in camps established by the U.S. military authority, most of these people did not want to stay in Germany. They were only in transit on their way to Israel, the U.S., or Canada. The few thousand who remained formed the basis of the postwar Jewish community.
In January 1991, after German reunification, Soviet Jews were admitted under the quota refugee law, which granted them rights that, ironically, had otherwise been accorded only to the so-called ethnic Germans, whose relationship to contemporary Germany, after many generations of living in remote areas of the Soviet Union, was more imagined than real. This legislation was a turning point for Soviet Jewish immigration, since it allowed masses of Jews to enter Germany as immigrants rather than on tourist visas, as had previously been the informal practice.
Most of the newcomers, an estimated 80 percent, are not "Jewish" by Orthodox law, meaning they do not have Jewish mothers or do not fulfill other requirements, such as Orthodox conversion. The intolerance of Germany's organized Jewish community is something that Professor Peck laments and finds counterproductive:
Religion, which defines the Jewish Community, allows little diversity; Orthodoxy dominates, some Liberal (Conservative) synagogues fill out the picture, and no official Reform offerings are available, except for the controversial World Union of Progressive Judaism (WUPJ) still fighting for recognition. . . . I know of one story of an esteemed young Jew who was dismissed from a religious Jewish institution in Berlin when it was discovered that his mother's conversion was not properly Orthodox. Whatever the reasons . . . halakhic (Orthodox) regulation remains . . . powerful in a community that some might say can ill afford to be so conservative.
Peck is clearly optimistic about the future of Jewish life in Germany. He writes of the symbolism involved in the official opening of the Jewish Museum of Berlin in September 2001, under the directorship of Michael Blumenthal, an American Jew who escaped Germany through Shanghai as a child and served as Secretary of the Treasury in the Carter Administration. In his "Welcome" printed in the book Stories of an Exhibition: Two Millennia of German Jewish History, the official documentation of the museum, Blumenthal states:
The Jewish Museum of Berlin is no ordinary museum. As a major institution supported by the Federal Government, the State of Berlin, all political parties, and a broad cross-section of the public, its mission has sociopolitical meaning that far transcends the story it tells of the 2000-year history of German Jewry. It symbolizes, in fact, a widely-shared determination to confront the past and to apply its lessons to societal problems of today and tomorrow.
By presenting a chronological history of the Jews from their earliest settlement in the areas that became the German-speaking lands through the present, the museum reminds visitors, especially non-Jewish Germans, Peck points out:
. . . that the Jew is not an "other," an exoticized foreigner who does not belong, but an integral part of a historical German identity. While the urge to harmonize German-Jewish identity is understandable, I would suggest that this exhibition is also a contentious site for definitions of what it means to be German as well as Jewish.
Throughout the exhibition, Jews in their many historical and religious incarnations are presented as an integral part of the German tradition, one that stretches back thousands of years to the time, where the exhibition begins, when the "Children of Israel were expelled from the Holy Land and first came to the Germanic lands." One sees not only how Jews tried to participate in German society, creating the so-called German-Jewish symbiosis that was destroyed by the Nazis, but also that this interrelationship was longstanding. The exhibit declares that "From the beginning, the history of what is now Germany was a German-Jewish history" and that "Jewish merchants were among the first inhabitants of medieval German cities. Often they were among their founders."
Germany's Jewish community, Peck believes,
. . . has a symbolic value beyond the numbers. Although the new community may not be enough to guarantee that Jews will never be the targets of prejudice or attack . . . its mere presence carries weight and makes a powerful statement. It represents the defeat of Hitler's Final Solution, and hope for acceptance of diversity in a country that, unlike the U.S., defined itself for a long time as only white and Christian.
On a January 1996 visit to Germany, Israeli President Ezer Weizman declared that he "cannot understand how 40,000 Jews can live in Germany," and asserted that "The only place for Jews is in Israel. Only in Israel can Jews live full Jewish lives."
Professor Peck rejects such a notion, and shows how German Jews reacted to it. Ignatz Bubis, then head of Germany's main Jewish organization, responded, "I have lived here since 1945 and have met two new generations who simply do not identify with the Nazis. This is a new generation."
Arguing that a Jewish presence in Germany prevents Hitler from achieving his posthumous victory of a "Judenrein" Germany, Bubis declared:
The full revival of the Jewish community in post-war Germany is important. . . . There is no reason to say that Jews cannot live in Germany.
While a proud Jew, he was also proud to be, as the title of his memoir emphasizes, "a German citizen of the Jewish faith." Peck writes:
As Israelis like Weizman tried to isolate German Jews, Bubis often had to remind non-Jewish Germans that he did not want to be turned from a German into an Israeli simply because he was Jewish.
Alice Brauner, one of a group called "The Young Jews of Berlin," exemplifies this positive attitude: "I won't emigrate. On the contrary, our roots in this country cannot be broken." The daughter of film producer Arthur Brauner, who returned to Germany after the war, she calls herself a "Jewish Berliner with German citizenship." Brauner calls Germany home: "We stay because we are at home here and feel at home here."
The optimism of those who are working to enhance Jewish life in today's Germany comes through in the many people with whom Jeffrey Peck spoke in preparing this book. The Lauder Foundation, now housed in its own Lehrhaus (learning center) -- named after an institution founded by Franz Rosenzweig in the 1920s -- is headed by a young rabbi, Josh Spinner, who was born in Baltimore and grew up in Hamilton, Ontario. Peck reports that:
He was lively, engaged, energetic, and clearly committed to the future of Jewish life in Germany. He felt that God had brought the Jews to Germany, and it was his job to do what he could to make a Jewish life possible, no matter where it might be.
Another rabbi active in Berlin is Yehuda Teichtel, a native of Brooklyn who heads Chabad Lubavitch. Teichtel's plans for an enlarged Chabad Center in a new building testify to the future he sees for a flourishing Jewish life in Germany. He believes that the greatest service to the six million killed is the establishment of Jewish life on German soil to prove Hitler wrong.
Interviews with prominent leaders of the Jewish community in Germany and with American directors of important Jewish institutions now located in Berlin convince Peck that:
Jewish life, even with its many problems . . . offers optimism and potentially a vehicle for improving transatlantic relations. The sheer presence of a Jewish population with a variety of religious orientations . . . paints a picture of a thriving and vibrant community.
Transatlantic relations might be improved if more Americans, especially Jewish Americans, knew more about the complexities of Jewish life in Germany and the role of German politics and culture in European-wide relations. At the Organization for Security and Cooperation in Europe (OSCE) 2004 conference in Berlin on anti-Semitism, the fact that Berlin had been the capital of Nazi Germany and was now the location of such a meeting was cited by almost all of the prominent participants. German Foreign Minister Joschka Fischer welcomed the guests with these words:
The German government has invited you all to this conference in Berlin -- in our capital, in the city that almost seventy years ago, not far from here, the destruction of European Jewry was decided, planned and instituted. We, as hosts, want to acknowledge the historical and moral responsibility of Germany for the Shoa. The memory of this monstrous crime against humanity will also influence German politics in the future.
Dr. Peck laments that:
Unfortunately, Germany is still often identified in the U.S. exclusively with past Nazi horrors rather than with its postwar democratic and liberal successes. The site of the OSCE Conference was to demonstrate dramatically Germany's commitment to combating anti-Semitism.
In his description of the rebirth of the German Jewish community, Jeffrey Peck argues that there is, indeed, a vibrant and significant future for Jews in Germany. He also speculates that contemporary European Jewry can transform Judaism to be more inclusive, which he feels would be an important step forward. It is essential, Peck declares, that Americans in general and American Jews in particular see today's Germans and contemporary Germany as they really are, and not as reflected in memories of the Nazi era. It is also essential, he believes, that Israelis abandon the idea that Jews living outside Israel's borders are, somehow, in "exile," and that genuine Jewish life cannot exist in the larger world and, in particular, in a country so burdened with history as Germany.
Sharing the idea that a thriving and growing Jewish community in Germany is, indeed, a final defeat for Hitler, Jeffrey Peck is optimistic about the future. His book deserves widespread attention. *
"Many a time I have wanted to stop talking and find out what I really believed." --Walter Lippmann
We would like to thank the following people for their generous support of this journal (from 5/11/07 to 6/9/07): Nancy M. Bannick, Reid S. Barker, James M. Broz, David G. Budinger, D. J. Cahill, Robert T. Cristensen, Michael D. Detmer, Nicholas Falco, John B. Gardner, Robert W. Garhwait, Joyce Griffin, Daniel J. Haley, Weston N. Hammel, Mrs. Thomas E. Heatley, Mr. & Mrs. Ken E. Kampfe, Gloria Knoblauch, Daniel Maher, Paul Maxwell, Thomas J. McGreevy, Delbert H. Meyer, Clark Palmer, David Renkert, Patrick L. Risch, Heidi Schumache, Frank T. Street, Thomas H. Webster, Gaylord T. Willett.
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
The surge in violent crime that began in 2005 accelerated in the first half of 2006, the FBI reported in December, providing the clearest signal yet that the historic drop in the U.S. crime rate had ended and was being reversed.
Reports of homicides, assaults, and other violent offenses surged by nearly 4 percent in the first six months of 2006 compared with the same period in 2005, according to the FBI's Uniform Crime Report. The numbers included an increase of nearly 10 percent for robberies, which many criminologists consider a leading indicator of coming trends. The results follow a 2.5 percent jump in violent crime for 2005, which at the time represented the largest increase in 15 years.
Homicides increased 20 percent or more in cities including Boston, Cincinnati, Cleveland, Hartford, Memphis, and Orlando. Robberies went up more than 30 percent in places including Detroit, Fort Wayne, and Milwaukee. Aggravated assaults with guns were up more than 30 percent in cities like Boston, Sacramento, St. Louis, and Rochester. In fact, 71 percent of the cities surveyed had an increase in homicides, 80 percent had an increase in robberies, and 67 percent reported an increase in aggravated assaults with guns.
According to criminal justice experts, many communities, particularly those in urbanized areas, may be headed into a period of sustained crime increases. David A. Harris, a law professor who studies crime trends at the University of Toledo, says:
This confirms what law enforcement has been seeing and saying on a more anecdotal level: that crime is on the way up. While it is still too early to be sure, you've certainly got things pointing in one direction.
Police officials say that arguments that 20 years ago would have led to fistfights now lead to guns. "There's really no rhyme or reason with these homicides," said Edward Davis, the police commissioner in Boston.
An incident will occur involving disrespect, a fight over a girl. Then there's a retaliation aspect where if someone shoots someone else, their friends will come back and shoot at the people who did it.
Chris Magnus, the police chief in Richmond, California, north of San Francisco, said he would often go to the scene of a crime and discover that 30 to 75 rounds had been fired. "It speaks to the level of anger, the indiscriminate nature of the violence," he said.
I go to meetings, and start talking to some of the people in the neighborhoods about who's been a victim of violence, and people can start reciting: "One of my sons was killed, one of my nephews. . . ." It's hard to find people who haven't been touched by this kind of violence.
These numbers come amid heightened criticism of the federal government from many police chiefs and state law enforcement officials who complain that the federal government has retreated from fighting traditional crime in favor of combating terrorism and protecting homeland security. Justice Department officials dispute those contentions and point to an ongoing study designed to identify solutions to the rise in violent crime.
The Justice Department inspector general's office has reported sharp declines in the number of FBI agents and investigators dedicated to traditional crime since the Sept. 11, 2001, terrorist attacks. In addition, the International Association of Chiefs of Police says that law enforcement programs at the Justice Department have been cut by more than $2 billion since 2002 and that overall funding for such programs has been reduced to levels of a decade ago.
James Alan Fox, a criminologist at Northeastern University in Boston who has been critical of the Bush administration's crime-fighting strategies, said the overall rise in violent crime should be expected given dramatic cuts in assistance to local police and simultaneous increases in the population of males in their teens and 20s. He said:
We have many high-crime areas where gangs have made a comeback, where police resources are down, and where whatever resources there have been have shifted to anti-terrorism activity.
Justice Department officials have repeatedly rejected such criticism, arguing that the causes and trajectory of the crime increase are still unclear. Still, Attorney General Alberto Gonzales has launched a series of anti-drug and anti-gang initiatives at Justice, and he acknowledged at a crime conference in Boston in December that local police are struggling with "increased responsibilities" since Sept. 11, 2001.
One factor in increasing crime rates may well be the decline of traditional family life, particularly in inner-city neighborhoods. Married couples with children now occupy fewer than one in every four households -- a share that has been cut in half since 1960 and is the lowest ever recorded by the census. As marriage with children becomes an exception rather than the norm, social scientists say, it is also becoming the self-selected province of the college-educated and the affluent. Isabel V. Sawhill, an expert on marriage and a senior fellow at the Brookings Institution, said:
The culture is shifting, and marriage has almost become a luxury item, one that only the well-educated and well-paid are interested in.
Out-of-wedlock births exceeded 1.5 million in 2005 for the first time ever, representing 36.8 percent of all births in the U.S. Among non-Hispanic blacks, the out-of-wedlock birth rate reached a staggering 69.5 percent. For non-Hispanic whites, it reached a new milestone, exceeding 25 percent. The illegitimacy rate for Hispanics reached 47.9 percent. The rise in unwed births is "disastrous, about as big a leap as we've ever had," said Robert Rector, welfare analyst at the Heritage Foundation. He noted that unwed birth figures leveled off and seemed to stabilize for a time after Congress passed welfare reform in 1996. However, recent increases in these numbers "clearly show that the impact of welfare reform is now virtually zero, and we are going back to the way things were before welfare reform."
The percentage of children in households with two parents continues to fall. The National Marriage Project's "State of Our Unions 2006" shows that children in two-parent households have dropped to 67 percent. In minority communities, the majority of children live in one-parent households. There is, it seems clear, a correlation between the decline in family life and the rise in crime. By 2004, federal data showed that black Americans--13 percent of the population--accounted for 37 percent of the violent crimes, 54 percent of arrests for robbery, and 51 percent of murders. Most of the victims of these violent criminals were their fellow black Americans.
In an article in Philadelphia Magazine (November 2006), reporter Gregory Gilderman rode along with police officer Dennis Stephens in crime-ridden North Philadelphia's 22nd district. Gilderman writes:
Everything you need to know about Philadelphia's current murder wave -- the out-of-control nature of it, the futility of our response to it -- may be encapsulated in this fact: within 24 hours of Mayor Street's emergency televised address last July about the city's surging homicide rate, in which he urged the city's youth to put down their guns, five murders were committed. . . . It's been that kind of year in Philadelphia.
According to Gilderman:
One thing that makes fighting crime difficult in North Philadelphia, as well as in the nation's other inner cities, is the hostility felt toward the police and the lack of citizen cooperation. This is one of the more demoralizing aspects of policing this city: the culture of the street hates the cops. Never mind that most of the officers are African-American, and that more than a few of them grew up in this neighborhood. Perfectly law-abiding teenagers wear "STOP SNITCHIN" t-shirts, cops are taunted for being sellouts or "trying to be white," and witnesses and victims won't talk at crime scenes, let alone show up at court. Because of this, a vast swath of the criminal element -- muggers, rapists, even murderers -- sees charges dropped or reduced to the one crime for which a police officer's testimony alone just might provide leverage for plea-bargained prison time: possession of a firearm. This is especially frustrating for veteran officers. A dangerous police district is like a small town: very few new faces show up, and the same career criminals are arrested over and over. They are returned to the street over and over.
Professor Pamela Smock of the University of Michigan, co-author of a recent review of marriage patterns, points out that class is a better tool than race for predicting whether Americans marry. "The poor aren't entering into marriage very much at all," said Smock. She reports that young people from these backgrounds often do not think they can afford marriage. Arguments that marriage can mean stability do not seem to change their attitudes, she said, noting that many of them have parents with troubled marriages.
To reverse the latest trends in crime, we must consider not only the role of law enforcement agencies but also the culture out of which such crime emerges. One key element, particularly in minority communities, is the breakdown of the family and the increasing out-of-wedlock birth rate. Unless current trends are reversed, the increase in crime is likely to continue.
Bowing to a national outcry and internal protest, CBS Radio decided to end Don Imus' morning program after he called the Rutgers University women's basketball team "nappy-headed hos."
The Imus case reflects the coarsening of American culture that is all around us. It reflects as well an extraordinary level of hypocrisy, both on the part of CBS and such black spokesmen as Al Sharpton and Jesse Jackson.
Columnist Jay Ambrose provided this assessment:
The president of CBS has explained that he fired Don Imus because of how "deeply upset and repulsed" the network has been by the vile, on-air statements of the radio host about a group of fine young women, and you will excuse me while I roll on the floor laughing at the hypocrisy. . . . Don Imus has been saying mean, bigoted things about women, gays, Jews, blacks and others for years now, and CBS didn't give a hoot. Neither did advertisers or guests. The advertisers were getting a good return on their investment. CBS was pulling in an estimated $20 million a year in revenue and the guests were having their names, causes, and agendas promoted in front of 10 million listeners. Imus' latest words . . . are not newly offensive. It is not as if he had previously been Mr. Decorum. . . . He has long been a tasteless shock jock engaged in mass nastiness for the sake of attracting mass audiences, and he has been abetted by advertisers, networks and dozens of radio stations for the very simple reason that it meant money in the bank.
What changed, argues Ambrose, is not that the Imus content:
. . . led a conscience-stricken CBS to contemplate "the effect language like this has on our young people," in the words of the CBS president, but that the comment began to get lots of attention, public anger began to grow, lots of people began to complain -- and sponsors saw that their Imus ads could do them more harm than good. A show without advertisers is not what keeps network executives employed or gets them bonuses.
Washington Post columnist David Broder noted that:
CBS Radio and MSNBC fired the millionaire talk-show host only after criticism of his foul-mouthed assault on the Rutgers women's basketball team mounted and advertisers canceled their contracts. It showed no courage on the part of those organizations, which have put up with similar slurs for years and counted themselves lucky to have such a moneymaking act in their stable.
The hypocrisy, however, does not end with CBS and MSNBC. Among the loudest critics of Don Imus were Jesse Jackson and Al Sharpton. It was Al Sharpton's radio program that Imus chose as the place to extend his apologies for his remarks. Yet, Sharpton, Jackson, and too many other black leaders seem offended by racist and sexist remarks only when they come from white spokesmen. When they emanate from blacks, silence has been their response.
Rap and hip-hop music, promoted by black radio stations around the country, repeatedly describes black women in a manner as offensive as, if not more offensive than, that used by Don Imus. Women are degraded as "whores" and "bitches." Violence, murder, and self-hatred are marketed as true blackness -- authentic black identity.
It is rare indeed to hear criticism of this music. Comedian Bill Cosby, speaking to a Milwaukee audience about rap music, asked how many of the women in the audience considered themselves "bitches and hos." When no one raised a hand, Cosby asked, "If you're not a bitch or a ho, why do you dance to that music?" Cosby said that the result is rap filling radio and television with distorted images of black people that have nothing to do with a history of self-determination and pride:
In order for hip-hop, with all that misogyny and gangster violence and "don't study" to exist, you've got to know nothing about history, struggle, what it takes to get ahead.
In his book Enough, the respected black journalist Juan Williams writes that:
The consequence of black leaders failing to speak out against the corruption of rap . . . resulted in real damage to the most vulnerable of black America: poor children, boys and girls, often from broken homes. As a group, they were desperately searching for black pride in the sea of images being thrown at them on TV, on the radio, on the Internet, and in advertising. What those children found was a larger-than-life rapper who was materialistic, sexist, and violent, and used the word nigger as a casual description of all black people. It was a musical minstrel show that would have been a familiar delight to 19th century slave owners. In fact, there are similarities between the economics of slavery and the modern rap industry. Cheap labor, slaves, made it possible for the Southern plantation to make money. All that was required was silent assent to a hellish compromise with the obvious immorality of slavery by the politicians, the religious leaders, the bankers, and the newspaper editors. Cosby is particularly critical of The New York Times for a "liberal, patronizing attitude" toward black culture in which they promote hip-hop to show "they are so cool" but fail to write about its negative impact on the black community.
Occasionally, black critics of rap and hip-hop have emerged. One of the most outspoken was the late C. Delores Tucker, who crusaded for a decade against "gangster rap" pollution, including buying stock in major record companies in order to protest at stockholders meetings. Students at Spelman College, an historically black women's liberal arts college, forced the rapper Nelly to cancel a charity fund-raising visit to the school a few years ago in protest over one of his sexist music videos. Dr. William Banfield, head of the American Cultural Studies program at the University of St. Thomas, said of rappers:
They are the biggest sellouts of all time because they allow the white media structure to lessen the potential of a balanced picture of black people in contemporary American cultural projection.
Harry Belafonte, the singer, said much the same when he described rappers as "caught in a trick bag because it's a way to make unconscionable sums of money and a way to absent yourself from any sense of moral responsibility."
These, sadly, are isolated voices. The black churches have been largely silent, as have the major civil rights organizations, and the Congressional Black Caucus. Critics such as Al Sharpton and Jesse Jackson, eager to pounce on Don Imus, who, needless to say, has earned the criticism he has received, are indifferent to the racism and sexism of black rappers and show business personalities.
Journalist Michelle Malkin posted on her web site several videos from artists currently on the Billboard Hot Rap Tracks chart, including Mims, R. Kelly, and Bow Wow. Language resembling that used by Don Imus figured in each clip, prompting Malkin to ask whether Imus critics such as Al Sharpton and Jesse Jackson are "truly committed to cleaning up cultural pollution that demeans women and perpetuates racial epithets."
For Cynthia Neal Spence, an associate professor of sociology at Spelman College, the Imus controversy has been a teachable moment. "My students have been so hurt by all of this--and particularly their own role in it," she says.
Professor Spence was hardly shocked to learn that all of her female students recall having been called by the same noun that Mr. Imus used. What surprised her is that many of them acknowledged having themselves used the word. She said:
There's been a desensitization process that's had a profound effect on our choices of language, especially for our young people, who are so influenced by media culture. . . . These young people are growing up in a generation where everything goes.
The "shock jocks" and the rappers and hip-hop artists are hardly alone. Hollywood, television networks, and other media outlets have long provided Americans with a steady diet of sex and violence. Our culture has become increasingly coarsened, to the detriment of all of us. In this sense, Don Imus has ignited a useful national debate. It is time for the leaders of the black community to enter that debate and use the same standard in assessing the lyrics of rap and hip-hop songs as they do the words of white "shock jocks." And it is time for all of us to speak out against the coarse language, racism, sexism, promotion of promiscuity and drug use that, all too often, is to be found in the media. We must ask ourselves what kind of society we want to live in. We have received many wake-up calls but, thus far, have failed to take heed. *
"There is but one straight course, and that is to seek truth and pursue it steadily" --George Washington
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
The Republican defeat in the November election, and the decision of voters to give Democrats a majority in both the House and Senate, has been described by some as a defeat for conservatism. Nothing could be further from the truth.
There is, in fact, nothing conservative about the policies of the Bush administration and the Republican Congress that was rejected by the voters. In his book Buck Wild: How Republicans Broke the Bank and Became the Party of Big Government, Stephen Slivinski of the Cato Institute shows how earmarks -- or pork-barrel projects -- multiply for each home district or state. In the last Democratic Congress, earmarks numbered 1,549. The Republican Party in its first year got the number down to 958. But in 2005 and again in 2006 the yearly total zoomed to over 15,000, or an annual average of some 30 earmarks per member of Congress.
Such earmarks are also a gateway to corruption. Mr. Slivinski notes that indicted Republican lobbyist Jack Abramoff once called the Appropriations Committee, birthplace of most earmarks, a "favor factory." Rep. Jeff Flake (R-AZ) refers to earmarks as "the currency of corruption." Former Rep. Randy Cunningham, a California Republican who confessed to taking bribes for promises of earmarks, got a jail sentence of eight years and four months.
At the National Review summit of conservatives, held in Washington in January, former Florida Governor Jeb Bush told his audience that Republicans lost the 2006 elections because they abandoned their principles of limited government and fiscal responsibility. David Boaz, executive vice president of the Cato Institute, points out that:
The Republican Congress came to power in 1994 promising "the end of government that is too big, too intrusive, and too easy with the public's money." But for the past six years, with Republicans controlling both the White House and Congress, they have instead delivered the biggest spending increases and the biggest expansion of entitlements since Lyndon Johnson, the federalization of education, the McCain-Feingold restrictions of political speech, and the Sarbanes-Oxley regulatory burden. When you combine that with a misguided war and a series of scandals that remind voters why no party should stay in power too long, is it any wonder that conservatives were dispirited in the 2006 elections?
David Keene, chairman of the American Conservative Union, notes that:
The historic core of the (conservative) movement has revolved around the relationship of the citizen to the state with conservatives of most, if not all, stripes arguing that a small government that is minimally involved in running the economy and the way people live their lives is superior to a larger government that wants to do more and more "for the people." In power, however, conservative politicians have tried to retain the rhetoric of small government while governing in a way barely distinguishable from their Democratic opponents.
Discussing the decline of conservatism and conservative ideas, Paul M. Weyrich, chairman of the Free Congress Research and Education Foundation, and William S. Lind, director of its Center for Cultural Conservatism, write in The American Conservative:
Conservatism has become so weak in ideas that during the presidency of George W. Bush, the word "conservative" could be and was applied with scant objection to policies that were starkly anti-conservative. Americans witnessed "conservative" Wilsonianism, if not Jacobinism, in foreign policy and an unnecessary foreign war; record "conservative" de-industrialization and dispossession of the middle class in the name of Ricardian free trade and Benthamite utilitarianism. No wonder the American people are confused and disillusioned by conservatism if these are its actions when in power. . . . If conservatism is to be re-established as an intellectual force, and not merely a label for whatever the establishment does to its own benefit, it must first reawaken intellectually.
A supreme irony of today's Big Spending-Big Government Republicans, argues William H. Peterson, an adjunct scholar at the Heritage Foundation and the Ludwig von Mises Institute:
. . . is their run-in with the anti-Big Government thinking of the American voters themselves. For according to polls such as ABC News/Washington Post and CBS/New York Times, American voters prefer a smaller state.
Indeed, poll numbers for the last 28 years on Americans opting for smaller government trend upward -- from 44 percent for smaller government against 41 percent for larger government in 1978 to 64 percent for smaller government against only 22 percent for larger government in 2004.
In Stephen Slivinski's view:
It seems there is a large constituency that would respond favorably to a political party that can enunciate a clear program to make the federal government smaller, less powerful and less intrusive. It's those sorts of voters -- Republicans, Democrats and independents alike -- who catapulted Reagan to the White House. Those voters are still up for grabs. The Republican Party cannot take them for granted anymore.
Before the 2006 election, conservative commentators Kate O'Beirne and Rich Lowry, writing in National Review, had one word to describe the Republican Congress' approach to spending and the resulting big deficits: "Incontinence." They argued that the relevant question for conservatives was not "Can this Congress be saved?" but "Is it worth saving?"
Not long before his death, Nobel Prize winning economist Milton Friedman said that the previous four years of Republican spending increases were "disgraceful" and a betrayal of the party's principles. "I'm disgusted by it," he declared:
For the first time in many years, the Republicans have control of Congress. But once in power, the spending limits were off and it's disgraceful because they went against their principles.
Federal spending as a share of the entire economy was 18.4 percent when Mr. Bush took office in 2001. Since then, the government's annual spending levels have grown by $610 billion or to 20.2 percent of the economy, according to figures compiled by the Heritage Foundation. "This is not a happy time for fiscal conservatives. We have had way too much spending," said John F. Cogan, an economist at the Hoover Institution who has been a frequent adviser to the Bush White House.
The critiques of the Bush administration and the Republican Congress have been increasingly harsh -- and perhaps the harshest have come from conservatives. Columnist George Will, discussing the administration's Iraq policy, wrote: "This administration cannot be trusted to govern if it cannot be counted on to think and, having thought, to have second thoughts." Robert Kagan, a neoconservative supporter of the Iraq war, wrote:
All but the most blindly devoted Bush supporters can see that Bush administration officials have no clue about what to do in Iraq tomorrow, much less a month from now.
In a book published in 2004, former Bush Treasury secretary Paul H. O'Neill described Bush as "a blind man in a room full of deaf people" and said that policymakers put politics before sound policy judgments. O'Neill said that "the biggest difference" between his time in government in the 1970s and in the Bush administration:
. . . is that our group was mostly about evidence and analysis, and Karl (Rove), Dick (Cheney), (Bush communications strategist) Karen Hughes and the gang seemed to be mostly about politics.
The growth of government power, the diminution of individual freedom, and a soaring deficit are not the policies one would expect from a self-proclaimed conservative president and Congress. Still, perhaps we should not be too surprised.
The Founding Fathers understood very well that freedom was not man's natural state. Their entire political philosophy was based on a fear of government power and the need to limit and control that power very strictly. It was their fear of total government that initially caused them to rebel against the arbitrary rule of King George III. In the Constitution they tried their best to construct a form of government that, through a series of checks and balances and a clear division of powers, would protect the individual. They believed that government was a necessary evil, not a positive good.
Yet the Founding Fathers would not be surprised to see the many limitations upon individual freedom that have come into existence. In a letter to Edward Carrington, Thomas Jefferson wrote that "The natural progress of things is for liberty to yield and government to gain ground." He noted that:
One of the most profound preferences in human nature is for satisfying one's needs and desires with the least possible exertion; for appropriating wealth produced by the labor of others, rather than producing it by one's own labor . . . the stronger and more centralized the government the safer would be the guarantee of such monopolies; in other words, the stronger the government, the weaker the producer, the less consideration need be given him and the more might be taken away from him.
The written and spoken words of the men who led the Revolution give us numerous examples of their fear and suspicion of power and the men who held it. Samuel Adams asserted that:
There is a degree of watchfulness over all men possessed of power or influence upon which the liberties of mankind much depend. It is necessary to guard against the infirmities of the best as well as the wickedness of the worst of men.
Therefore, "Jealousy is the best security of public liberty."
Conservatives, if they are sincere in their advocacy of limited government and fiscal responsibility, must be as vigilant when Republicans are in power as when Democrats are in control. There is a tendency for the party in power -- whichever party it may be -- to expand that power and build upon it. We have seen this tendency in full bloom with the Bush administration. Finally, perhaps too late, conservatives have now come to understand that reality.
In October, 2006, according to the U.S. Census Bureau, the population of the United States reached 300 million, behind only those of China and India.
One key ingredient in this population growth has been immigration. Over the past four decades, immigrants, primarily from Mexico and Latin America, have reshaped the country's ethnic makeup. Of the newest 100 million Americans, according to the Pew Hispanic Center, 53 percent are either immigrants or their descendants.
Nearly half of the nation's children under 5 belong to a racial or ethnic minority. The face of the future is clear in our schools. Writing in Smithsonian Magazine, Joel Garreau notes that:
Our kindergartens now prefigure the country as a whole, circa 2050 -- a place where non-Hispanic whites are a slight majority. . . . The numerical study of who we are and how we got that way does have a refreshing habit of focusing our attention on what's important, long-term, about our culture and values -- where we're headed and what makes us tick.
Many believe that the changing racial and ethnic makeup of the nation signals a fundamental change for our society. Political historian Michael Barone disagrees. In his important book, The New Americans, Barone, a senior writer at U.S. News and World Report, reminds us that the U.S. has never been a homogeneous, monoethnic nation:
The American colonies, as historian David Hackett Fischer teaches in "Albion's Seed," were settled by distinctive groups from different parts of the British Isles, with distinctive folkways, distinctive behaviors in everything from politics to sexual behavior. And this is not to mention the German immigrants who formed 40 percent of Pennsylvania's population in the Revolutionary years and who, Benjamin Franklin feared, would never be assimilated. Many different religious groups -- Catholics and Mennonites, Shakers and Jews -- established communities and congregations, making the thirteen colonies and the new nation more religiously diverse than any place in Europe. We were already, in John F. Kennedy's phrase, a nation of immigrants.
Barone shows how the new Americans of today can be interwoven into the fabric of American life just as immigrants have been interwoven throughout our history. He believes, however, that it is essential we heed the lessons of America's past, and avoid misguided policies and programs -- such as bilingual education -- that hinder rather than help assimilation. The melting pot, he believes, can work as well today as it has in the past.
"The minority groups of 2000," writes Barone:
. . . resemble in important ways immigrant groups of 1900. . . . America, in the future, will be multiracial and multiethnic, but it will not -- or should not -- be multicultural in the sense of containing ethnic communities marked off from and adversarial to the larger society, any more than today's America consists of unassimilated and adversarial communities of Irish, Italians, or Jews. . . . We are not in a wholly new place in American history. We've been here before.
While the American society of a century ago sought to assimilate immigrants and make sure that they were taught the English language and the history, culture, and values of their new country, many in today's society, particularly among the nation's elites, have abandoned that goal.
In Barone's view,
In the last third of the twentieth century . . . elite Americans have not been preoccupied with immigration and have tended to regard "Americanization" as an uncouth expression of nationalistic pride or a form of bigotry. . . . Elites came to see Americanization as the unfair subjection of members of other races and cultures. They came to celebrate . . . an America that would be made up of separate and disparate "multicultural" groups, fenced off in their own communities, entitled to make demands on the larger society, but without any responsibility to assimilate to American mores.
Programs that have been adopted in recent years, Barone argues, have hindered the integration of newer immigrants into the American society:
By stepping back from the prevalent view of the immigrant and minority groups, we see how misguided some of our policies and programs are. It is absurd, for instance, to grant immigrants quotas and preferences that are based on past discrimination because, as John Miller points out, "foreign-born newcomers almost by definition cannot have experienced a past history of discrimination in the United States." Even more absurd and counterproductive have been the so-called bilingual education programs, which have kept Latino immigrants' children in Spanish-language instruction and denied them knowledge of English that they need to advance in American society. What these immigrants need is what Americanization supplied the immigrants a hundred years ago -- a knowledge of English and basic reading and mathematics skills, an appreciation of the American civic culture, a fair chance of moving ahead as far as their abilities will take them. We need to learn the good lessons our forebears taught, even as we strive to avoid their mistakes.
Most immigrants, Barone shows, are hard-working and are committed to making better lives for themselves in the American society. They are not the problem. He believes that:
The greatest obstacle to the interweaving of blacks, Latinos, and Asians into the fabric of American life is not so much the immigrants themselves or the great masses of the American people; it is the American elite. The American elite of a century ago may have looked on immigrants with distaste. . . . But it also championed the cause of Americanization and promoted assimilation of immigrants into the mainstream. . . . What is important now is to discard the notion that we are at a totally new place in American history, that we are about to change from a white-bread nation to a collection of peoples of color. On the contrary, the new Americans of today, like the new Americans of the past, can be interwoven into the fabric of American life. In many ways, that is already happening, and rapidly. It can happen even more rapidly if all of us realize that interweaving is part of the basic character of the country and that the descendants of the new Americans of today can be as much an integral part of their country, and as capable of working their way into its highest levels, as the descendants of the new Americans of a hundred years ago.
Clearly, the time has come to fire up the melting pot. Former Colorado Governor Richard Lamm makes this point:
The U.S. is at a crossroads. If it does not consciously move toward greater integration, it will inevitably drift toward more fragmentation. Cultural divisiveness is not a bedrock upon which a nation can be built. It is inherently unstable. . . . America can accept additional immigrants, but must be sure that they become Americans. We can be Joseph's coat of many nations, but we must be united. One of the common glues that hold us together is language -- the English language. We should be color-blind but linguistically cohesive. We should be a rainbow but not a cacophony. We should welcome different peoples but not adopt different languages. We can teach English through bilingual education, but we should take great care not to become a bilingual society.
Professor Seymour Martin Lipset points out that:
The histories of bilingual and bicultural societies that do not assimilate are histories of turmoil, tension and tragedy. Canada, Belgium, Malaysia, Lebanon -- all face crises of national existence in which minorities press for autonomy, if not independence. Pakistan and Cyprus have divided. Nigeria suppressed an ethnic rebellion. France faces difficulties with its Basques, Bretons and Corsicans.
Remembering the way American public schools once served to bring children of immigrants into the mainstream, Fotine Z. Nicholas, who taught for 30 years in the New York City schools and for many years wrote an education column for a Greek-American weekly, notes:
I recall with nostalgia the way things used to be. At P.S. 82 in Manhattan, 90 percent of the students had European-born parents. Our teachers were mostly of Irish origin, and they tried hard to homogenize us. We might refer to ourselves as Czech or Hungarian or Greek but we developed a sense of pride in being American. . . . There were two unifying factors: the attitude of our teachers and the English language. . . . After we started school, we spoke only English to our siblings, our classmates and our friends. We studied and wrote in English, we played in English, we thought in English.
Discussing recent bilingual education programs, Mrs. Nicholas declares that:
It was a simple concept at first: Why not teach children English by means of the home language? A decade later, "disadvantaged" children were still being taught in their parents' language. As federal money poured into the program, it gradually became self-perpetuating. . . . Bilingual education seems to be developing into a permanent means of ethnic compartmentalization. Cultural pluralism may be the norm for a multi-ethnic nation, but it is the family's role to build a cultural identity in children. The school's role is to help them enter the mainstream of school life, and eventually, the mainstream of the United States of America.
America has been a nation much beloved. Germans have loved Germany. Frenchmen have loved France. Swedes have loved Sweden. This, of course, is only natural. Yet America is not simply another country. To think so is to miss the point of our history. America has been beloved not only by native-born Americans, but by men and women throughout the world who have yearned for freedom.
America dreamed a bigger dream than any nation in the history of man. It was a dream of a free society in which a man's race, religion, or ethnic origin would be completely beside the point. It was a dream of a common nationality in which the only price to be paid was a commitment to fulfill the responsibilities of citizenship.
In the 1840s, Herman Melville wrote that "We are the heirs of all time and with all nations we divide our inheritance." If you kill an American, he said, you shed the blood of the entire world.
At a celebration in New York several years ago of the 150th anniversary of Norwegian immigration, news commentator Eric Sevareid, whose grandfather emigrated from Norway, addressed the group -- in the form of a letter to his grandfather. He said:
You knew that freedom and equality are not found but created. . . . This grandson believes this is what you did. I have seen much of the world. Were I now asked to name some region on earth where men and women lived in a surer climate of freedom and equality than that Northwest region where you settled--were I so asked I could not answer. I know of none.
In 1866, Lord Acton, the British Liberal leader, said that America was becoming the "distant magnet." Apart from the "millions who have crossed the ocean, who shall reckon the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West?"
Our new immigrants must be taught our history and must understand that what drew them to America will be lost if it is replaced by an ethnic and racial Balkanization, which some appear to seek. The melting pot worked well in the past. It will work well in the future if we will permit it to do so. *
"Our major obligation is not to mistake slogans for solution." -Edward R. Murrow