Allan C. Brownfeld is the author of five books, the latest of which is The Revolution Lobby (Council for Inter-American Security). He has been a staff aide to a U.S. vice president, members of Congress, and the U.S. Senate Internal Security Subcommittee. He is associate editor of The Lincoln Review, and a contributing editor to Human Events, The St. Croix Review, and The Washington Report on Middle East Affairs.
Allan C. Brownfeld is a syndicated columnist and associate editor of the Lincoln Review, a journal published by the Lincoln Institute of Research and Education, and editor of Issues, the quarterly journal of the American Council for Judaism.
The story of the life of Clarence Thomas, as set forth in his memoir, My Grandfather's Son (Harper), is destined to become an American classic, not dissimilar to the autobiographies of Frederick Douglass and Booker T. Washington.
This book chronicles an extraordinary life and describes, as well, the education of an inquiring mind, seeking to understand the complex reality around him, and to make sense of the racial politics and ideological divisions which confronted him during the turbulent 1960s and 1970s.
In an era when "identity politics" dictated a particular political, economic, and social stance for black Americans, those individuals who persisted in thinking for themselves and followed an often lonely path to discover their own view of reality and truth were isolated and bitterly attacked.
Describing Clarence Thomas as "the freest black man in America," Shelby Steele, a fellow at the Hoover Institution, notes that:
For minorities . . . their group identity will often be the enemy of their individuality. In its insecurity, the group is naturally threatened by the impulse of some of its members to think for themselves. . . . People who veer from the group masks -- who evolve by their own lights -- start to lose their moral authority as blacks. This is why President Bush got no credit for having two black secretaries of state. Naively, he selected two black individuals. Still, the black individual is now emerging as something of a new archetype in American life -- not someone who disowns his group but someone who rejects it as master. Today there is no more quintessential embodiment of this new archetype than Supreme Court Justice Clarence Thomas.
Clarence Thomas was born in rural Georgia on June 23, 1948. He was abandoned by his father and didn't even meet him until he was eleven years old. His mother was left to raise him and his brother and sister on the ten dollars a week she earned as a maid. At the age of seven Thomas and his six-year-old brother were sent to live with his mother's father, Myers Anderson, and her stepmother in their Savannah home. This was a move that would change Thomas' life.
His grandfather, whom he called "Daddy," had a strict work ethic. He owned his own fuel-oil business and he immediately subjected the two boys to a regime of sacrifice -- no school sports, very little television, self-development and hard work. His response to the poverty and segregation of black Savannah was the American ethic of self-help, faith in God, delayed gratification and individual initiative.
In every way that counts, I am my grandfather's son. I even called him Daddy because it was what my mother called him. He was the one hero in my life. What I am is what he made me.
Before he went to live with his grandfather, the squalor of Thomas' life was overwhelming:
Nowadays most people know Savannah from reading Midnight in the Garden of Good and Evil. To them it is an architectural wonderland full of well-heeled eccentrics and beautifully preserved 18th and 19th century houses. I didn't live in that part of town. When I was born, Savannah was hell. . . . The only running water in our building was downstairs in the kitchen, where several layers of old linoleum were all that separated us from the ground. The toilet was outdoors in the muddy backyard.
After moving in with his grandfather, life changed:
In return for submitting to Daddy's iron will . . . I lived a life of luxury, at least by comparison. . . . The home that Daddy and Aunt Tina built for themselves (and in which my mother now lives) had two bedrooms, one bathroom, separate living and dining rooms, a den and a kitchen. . . . We had our own beds, plenty to eat. . . . I had never before seen a house with such conveniences, or with an indoor porcelain toilet that worked. I flushed it as often as I could in my first months on East 32nd Street.
Daddy had become a Roman Catholic, and sent Clarence and his brother to St. Benedict the Moor Grammar School. It was run, Thomas writes:
. . . by the Missionary Franciscan Sisters of the Immaculate Conception, most of whom were Irish immigrants. . . . They expected our full attention and made sure they got it, dispensing corporal punishment whenever they saw fit. . . . We learned that God made us to know, love, and serve Him in this world, and to be happy with Him in the next. The sisters also taught us that God made all men equal, that blacks were inherently equal to whites, and that segregation was morally wrong. This led some people to call them "the nigger sisters."
A few months before his 16th birthday, Thomas decided he wanted to enter St. John Vianney to prepare for the priesthood. At that time, the schools in Savannah were still segregated, including St. John Vianney. The Catholic Church, Thomas recalls:
. . . maintained separate parishes and parochial schools for blacks, and the local church fathers seemed in no hurry to lift the color bar. No doubt they thought they were being prudent, but before long I would come to see their caution as cowardice.
Once admitted to the seminary, Thomas began to slowly adjust to the new, predominantly white, environment. He observed racism on the part of his fellow seminarians:
. . . a small group of students was given permission to move the school t.v. into a classroom to watch the Cleveland Browns play. Those were the days when Jim Brown was on the team. Midway through one of his celebrated runs, one student yelled, "Look at that nigger go." I felt as if my soul had been pierced. . . . Worse yet was the time when the same student who'd called Jim Brown a "nigger" passed me a folded note during history class. "I like Martin Luther King . . ." it said on the outside. I unfolded the piece of paper. Inside was a single word: ". . . dead."
Despite these incidents, Thomas' grades improved steadily and toward the end of the first term he won a Latin bee. At the end of his sophomore year, he asked the other seminarians to sign his yearbook. One senior wrote, "Keep on trying, Clarence, one day you'll be as good as us."
The Church's failure to take a strong stand on racism became of growing concern to Thomas:
It seemed self-evident . . . that the treatment of blacks in America cried out for the unequivocal condemnation of a righteous institution that proclaimed the inherent equality of all men. Yet the Church remained silent, and its silence haunted me. I have often thought that my life might well have followed a different route had the church been as adamant about ending racism then as it is about ending abortion now.
What finished off Thomas' religious vocation was the day Martin Luther King was shot and a fellow seminarian said, "That's good. I hope the son of a bitch dies." Daddy, however, could not accept the decision to leave the seminary. "You've let me down," he said. "I'm finished helping you. You'll have to figure it out yourself. You'll probably end up like your no-good daddy."
On his own, Thomas entered the College of the Holy Cross in Massachusetts, one of six blacks in his class. He went through many political transformations -- from altar boy to seminary student to campus radical and racial militant -- before coming back to the values his grandfather taught him and an understanding of society which he acquired on his own.
He slowly came to oppose race-based affirmative action programs:
I didn't think it was a good idea to make poor blacks, or anyone else, more dependent on government. That would amount to a new kind of enslavement, one that ultimately relied on the generosity -- and the ever-changing self-interests -- of politicians and activists. It seemed to me that the dependency it fostered might ultimately prove as diabolical as segregation, permanently condemning poor people to the lowest rungs of the socioeconomic ladder by cannibalizing the values without which they had no long-term hope of improving their lot. . . . I began to suspect that Daddy had been right all along, the only hope I had of changing the world was to change myself first.
Thomas remembers that:
The more I read, the less inclined I was to conform to the cultural standards that blacks imposed on themselves and on one another. Merely because I was black, it seemed, I was supposed to listen to Hugh Masekela instead of Carole King, just as I was expected to be a radical, not a conservative. I no longer cared to play that game. . . . The black people I knew came from different places and backgrounds . . . yet the color of our skin was somehow supposed to make us identical in spite of our differences. I didn't buy it. Of course we had all experienced racism in one way or another but did that mean we had to think alike?
After Holy Cross, Thomas attended Yale Law School, graduating in 1974. His education continued, not only in the law but also in the racial political environment around him. "Like every other black law student," he writes:
. . . I was uncomfortably aware that blacks failed to pass the bar exam at a much higher rate than whites, and that the NAACP Legal Defense and Education Fund had filed lawsuits alleging that the exams they took were racially discriminatory. Lani Guinier, one of my classmates, was involved with the Legal Defense Fund, so I asked her to supply me with information about the extent of the problem. . . . At first I assumed that the disproportionate black failure rate was conclusive evidence of racial discrimination, but the more closely I looked at the facts, the more apparent it became that I was wrong. At that time each question on the bar exam was graded separately by a different scorer and each completed exam identified solely by number, thus making it impossible for the graders to tell which examinees, if any, were black.
The Legal Defense Fund's "adverse impact theory" held that if a neutral examination produced disparate results among the races, then it could be considered discriminatory. Thomas responds:
But I didn't buy that . . . knowing that no measurement of any part of our lives ever produced identical results for all racial or ethnic groups. To argue otherwise, I thought, diverted attention from the real culprits, the people who were responsible for the useless education these young people had received.
After law school Thomas went to work for John Danforth, who was serving as Missouri's attorney general. When Danforth was elected to the U.S. Senate, Thomas followed him to Washington and later worked at the Department of Education and as head of the Equal Employment Opportunity Commission, before being named a federal judge.
Along the way he discovered the writings of leading black conservatives such as Thomas Sowell and Walter Williams. He reports that:
One of the first people in Washington who talked sense to me about race was Jay Parker, the editor of a new magazine called The Lincoln Review. . . . Jay was friendly, energetic, unflappable and unapologetically conservative. I'd never known a black person who called himself a conservative, and it surprised me that we rarely disagreed about anything of substance.
Thomas provides this assessment of the black conservatives who had influenced his thinking and became his friends:
Walter Williams, Thomas Sowell and Jay Parker were all smart, courageous, independent-minded men who came from modest backgrounds. Politics meant nothing to them. All they cared about was truthfully describing urgent social problems, then finding ways to solve them. Unhampered by partisan allegiances, they could speak their minds with honesty and clarity. They were my kind of black men. . . . I'll never forget the time when Jay reminded me that freedom came from God, not Ronald Reagan. For Jay politics was a part of life, not a way of life. It was an attitude I sought to emulate.
There is, of course, much in this book about Clarence Thomas' personal life, his difficult first marriage, raising his son, and his happy and fulfilling marriage to Virginia. There is, as well, a lengthy description of the Supreme Court confirmation hearings in the Senate, and the charges of sexual harassment by Anita Hill, which he convincingly refutes. It is his view, which the evidence supports, that he was targeted with such a vicious assault precisely because he was a black man who persisted in thinking for himself, and rejected the political correctness and liberal orthodoxy which the civil rights establishment and white liberals sought to impose upon him.
"The more I reflected on what was happening," he writes:
. . . the more it astonished me. As a child of the Deep South, I'd grown up fearing the lynch mobs of the Ku Klux Klan; as an adult, I was starting to wonder if I'd been afraid of the wrong white people all along. My worst fears had come to pass not in Georgia but in Washington, D.C., where I was being pursued not by bigots in white robes but by left-wing zealots draped in flowing sanctimony. For all the fear I'd known as a boy in Savannah, this was the first time I'd found myself at the mercy of people who would do whatever they could to hurt me -- and institutions that once prided themselves on bringing segregation and its abuses to an end were aiding and abetting in the assault.
During the hearings, journalist Juan Williams wrote in The Washington Post:
Here is indiscriminate, mean-spirited mud-slinging, supported by the so-called champions of fairness: liberal politicians, unions, civil rights groups, and women's organizations. They have been mindlessly led into mob action against one man by the Leadership Conference on Civil Rights. . . . To listen to or read some news reports on Thomas over the past month is to discover a monster of a man, totally unlike the human being full of sincerity, confusion, and struggles whom I saw as a reporter who watched him for some ten years. He has been conveniently transformed into a monster about whom it is fair to say anything, to whom it is fair to do anything. President Bush may be packing the court with conservatives, but that is another argument, larger than Clarence Thomas. In pursuit of abuses by a conservative president, the liberals become the abusive monsters.
Fortunately, Clarence Thomas survived the assault upon him and triumphed over his adversaries. If the goal of our society is for each individual to go as far as his ability will take him, for each person to come to his own conclusions about political, social, and economic issues, then Clarence Thomas has lived this American Dream. This book is an eloquent testimonial to both that life and that dream, and should be read not only by those who agree with Clarence Thomas' views but, more important, by all who cherish a society in which genuine independence of thought is respected and in which excellence is given its proper reward.
"These hostile hungers have taken their turn in dominating the history of modern man: the hunger for liberty, to the detriment of equality, was the recurrent theme of the nineteenth century in Europe and America; the hunger for equality, at the cost of liberty, has been the dominant aspect of European and American history in the twentieth century." --Will Durant
As the Bush Administration enters its final period, more and more conservatives are expressing dismay about its legacy.
Alan Greenspan, who served as Federal Reserve chairman for 18 years and was the leading Republican economist for the past three decades, levels harsh criticism at President Bush and the Republican Party in his new memoir, The Age of Turbulence: Adventures in a New World. He argues that Bush abandoned the central conservative principle of fiscal restraint:
My biggest frustration remained the president's unwillingness to wield his veto against out-of-control spending. Not exercising the veto power became a hallmark of the Bush presidency. . . . To my mind, Bush's collaborate-don't-confront approach was a major mistake.
Greenspan accuses the Republicans who presided over the party's majority in the House until 2006 of being too eager to tolerate excessive federal spending in exchange for political opportunity. The Republicans, he says, deserved to lose control of the Senate and House in the 2006 elections. "The Republicans in Congress lost their way," he writes. "They swapped principle for power. They ended up with neither."
He singles out J. Dennis Hastert, the Illinois Republican who was House Speaker, and Tom DeLay, the Texas Republican who was majority leader until he resigned after being indicted for violating campaign finance laws in his home state. Greenspan writes:
House Speaker Hastert and House majority leader Tom DeLay seemed readily inclined to loosen the federal purse strings any time it might help add a few more seats to the Republican majority.
When Bush and Cheney won the 2000 election, Greenspan declares:
I thought we had a golden opportunity to advance the ideals of effective, fiscally conservative government and free markets. . . . I was soon to see my old friends veer off to unexpected directions. Little value was placed on rigorous economic policy debate or the weighing of long-term consequences.
He notes that the large, anticipated federal budget surpluses that were the basis for Bush's initial $1.35 trillion tax cut "were gone six to nine months after George W. Bush took office..." So Bush's goals "were no longer entirely appropriate. He continued to pursue his presidential campaign promises nevertheless."
Greenspan provides this assessment of the Bush presidency:
I'm just very disappointed. Smaller government, lower spending, lower taxes, less regulation -- they had the resources to do it, they had the knowledge to do it. They had the political majorities to do it, and they didn't. In the end, political control trumped policy, and they achieved neither political control nor policy.
In a column entitled "The Republican Collapse," conservative New York Times columnist David Brooks explains how philosophical drift led to political decline:
Modern conservatism begins with Edmund Burke. What Burke articulated was not an ideology or a creed, but a disposition, a reverence for tradition, a suspicion of radical change. . . . Over the past six years, the Republican Party has championed the spread of democracy in the Middle East. But the temperamental conservative is suspicious of rapid reform, believing that efforts to quickly transform anything will have, as Burke wrote, "pleasing commencements" but "lamentable conclusions."
Brooks notes that,
The world is too complex, the Burkean conservative believes, for rapid reform. Existing arrangements contain latent functions that can be neither seen nor replaced by the reformer. The temperamental conservative prizes epistemological modesty, the awareness of the limitations on what we can and cannot know, what we can and cannot plan. Over the past six years, the Bush administration has operated on the assumption that if you change the political institutions in Iraq, the society will follow. But the Burkean conservative believes that society is an organism; that custom, tradition, and habit are the prime movers of that organism; and that successful government institutions grow gradually from each nation's unique network of moral and social restraints.
In recent years, the vice president and the former attorney general have sought to expand executive power as much as possible in the name of protecting Americans from terror. This has also produced a reaction from conservatives who have long feared the power of government and have been committed to the constitutional system of checks and balances and divided authority between the executive, the legislature, and the judiciary.
In October 2003, Jack Goldsmith, a legal scholar with strong conservative credentials, was hired to head the Justice Department's Office of Legal Counsel, which advised the president and the attorney general about the legality of presidential actions. As he was briefed on counterterrorism measures the Bush administration had adopted in the wake of September 11, Goldsmith says he was alarmed to discover that many of those policies "rested on severely damaged legal foundations," that the legal opinions that supported these counterterrorism operations were, in his view, "sloppily reasoned, overbroad, and incautious in asserting extraordinary constitutional authorities on behalf of the president."
Mr. Goldsmith eventually withdrew several key department opinions -- including two highly controversial "torture memos," dealing with the authority of the executive branch to conduct coercive interrogation -- but only after contentious battles with administration hardliners led by David Addington, then Vice President Cheney's legal adviser and now chief of staff.
In his book The Terror Presidency, Goldsmith recounts how he and his Justice Department colleagues, in consultation with lawyers from the State Department, the Defense Department, the CIA, and the National Security Council, reached a consensus in 2003 that the Fourth Geneva Convention (which governs the duties of an occupying power and the treatment of civilians) affords protection to all Iraqis, including those who are terrorists. When he delivered this decision to the White House, he recalls, Addington exploded: "The president has already decided that terrorists do not receive Geneva Convention protections. You cannot question his decision."
Mr. Goldsmith, who resigned from the Office of Legal Counsel in June 2004 -- only nine months after assuming the post -- describes a highly insular White House obsessively focused on expanding presidential power and loath to consult with Congress; a White House that sidelined Congress in its policymaking and pursued a "go-it-alone approach" based on "minimal deliberation, unilateral action, and legalistic defense."
Noting that "the president and the vice president always made clear that a central administration priority was to maintain and expand the president's formal legal powers," Goldsmith says that lawyers soon realized that they "could gain traction for a particular course of action -- usually going it alone -- by arguing that alternative proposals would diminish the president's power."
Mr. Goldsmith is also critical of the manner in which the Bush administration went about side-stepping the 1978 Foreign Intelligence Surveillance Act, which required the president and government agencies to obtain warrants from a special court before conducting electronic surveillance of people suspected of being terrorists or spies. Although he says he shared many of the administration's concerns on this issue, he "deplored the way the White House went about fixing the problem."
He quotes Mr. Addington saying of the surveillance court: "We're one bomb away from getting rid of that obnoxious court." And he observes that top Bush officials dealt with the act "the way they dealt with other laws they didn't like: they blew through them in secret based on flimsy legal opinions that they guarded closely so no one could question the legal basis for the operations."
Mr. Goldsmith concludes with the observation that unlike Lincoln and Franklin D. Roosevelt -- two presidents who also presided over the nation at times of crisis -- President Bush has relied only on "the hard power of prerogative," ignoring:
. . . the soft factors of legitimation -- consultation, deliberation, the appearance of deference, and credible expressions of public concern for constitutional and international values -- in his dealing with Congress, the courts, and allies.
As a result, Goldsmith says, even if President Bush's "accomplishments are viewed more charitably by future historians than they are viewed today," they will "likely always be dimmed by our knowledge of his administration's strange and unattractive views of presidential power."
The combination of deficit spending, the growth of executive power, and the increasingly unpopular war in Iraq has left many conservatives completely disillusioned. "I've never seen conservatives so downright fed up," says long-time conservative leader Richard Viguerie.
The Economist notes that:
Mr. Bush has . . . presided over the biggest expansion in government spending since his fellow Texan, Lyndon Johnson, provoking fury on the right. His prescription-drug benefit was the largest expansion of government entitlements in 40 years. He has increased federal education spending by about 60 percent and added some 7,000 pages of new regulations. Pat Toomey, the head of the Club for Growth, says the conservative base feels "disgust with what appears to be a complete abandonment of limited government."
William F. Buckley, a leading conservative spokesman and the founding editor of National Review, says that if Mr. Bush were the leader of a parliamentary system, "it would be expected that he would retire or resign." Bruce Bartlett, a former Reagan administration economist, accuses him of "betraying the conservative movement."
In The Economist's view:
The Republicans have failed the most important test of any political movement -- wielding power successfully. They have botched a war. They have splurged on spending. And they have alienated a huge section of the population.
As the 2008 presidential campaign gets under way, Republican contenders are keeping their distance from the Bush presidency. The Washington Post reports:
For all the candidates, the unspoken problem is the same: how to establish a clear break from the legacy of President Bush and his sagging poll numbers without alienating the party faithful. . . . All the candidates have sought to make subtle distinctions with Bush. Thompson and McCain say the U.S. should have mobilized more troops for the invasion of Iraq, while Romney and McCain say the response to Hurricane Katrina should have been handled better -- leaving no doubt that they have some concerns about the Bush presidency, but stopping short of attacking his leadership. They have railed against excessive federal spending and say they would be willing to veto spending bills, in contrast to Bush, who this week vetoed a bill over spending concerns for only the second time in his presidency.
While there is a broad consensus that the party must find a way to move beyond Bush's legacy, according to The Post:
. . . there are widespread divisions within the GOP over the solutions. Some see a priority in the need to address ballooning spending, while others view social issues such as abortion and traditional marriage as paramount in determining the party's course.
Peter Wehner, a former White House official who worked under Karl Rove as director of strategic initiatives, says that:
Conservatism is going through an interesting moment. It's still a center-right country, but conservatism is going through an interesting moment when it's got to consider fresh the issues it's going to put forward and the politics it's going to advance.
He described his party as "at sea" on domestic policy such as health care.
Historian George H. Nash says that, "American conservatism has become middle-aged," and that with middle age has come a midlife crisis. In the book The Future of Conservatism (edited by Charles W. Dunn, ISI Books), Nash identifies four distinct strands of modern American conservatism. Traditionalists value continuity, order, and hierarchy; libertarians prize personal freedom and social spontaneity; neoconservatives blend the New Deal's idealistic spirit with conservatism's muscular nationalism; and religious conservatives fight relativism, secularism, and immorality.
Jonathan Rauch, a senior writer with National Journal, and a guest scholar at the Brookings Institution, writes that:
Given their differences, the surprise is that these four heads ever joined atop one political beast. Yet Ronald Reagan, Soviet Communism, and hostility to the excesses of the 1960s brought together a vibrant coalition. Today Reagan and the Soviet Union are gone, and conservatism in power has produced excesses of its own, bringing the movement's cultural contradictions to the fore. Libertarians and traditionalists disagree on the relative importance of liberty and virtue; many neocons care not a fig about abortion, while religious conservatives seem to care about little else.
In Rauch's view:
Unexpectedly, George W. Bush, Reagan's would-be heir, has divided the conservative movement. . . . In his obsession with marginalizing the Democrats, and in his determination to be a "transformational" president, Bush embraced an activism that unmoored the party from its libertarian preference for small government and its traditionalist preference for orderly incrementalism. Libertarians' disenchantment has become obvious; less widely appreciated is that "there is now an apparently unbridgeable divide between traditional conservatives and the Bush administration on major policy matters," writes George E. Carey, a professor of government at Georgetown.
In the area of foreign policy, Bush broke decisively with the more cautious conservatism of Dwight D. Eisenhower and Bush's own father. Political scientist Daniel J. Mahoney argues that Bush and the neoconservatives' "misplaced one-sided emphasis on democracy" -- their "democratic monomania":
. . . marks a break with an older conservative tradition which always insisted that Western liberty draws on intellectual and spiritual resources broader and deeper than that of modern democracy. . . . The best conservative thinkers of the last two centuries have been wary of unalloyed democracy.
Social and religious conservatives are also unhappy. Their rise to political power has redirected cultural currents much less than they had hoped, and Bush's Republican Party, the family scholar Allan C. Carlson complains, has sided with big business over families.
Where American conservatism is headed after two terms of an administration which proclaimed itself "conservative" but presided over huge budget deficits, a growth in government power, increased claims of executive authority, and a costly war which the majority of Americans believe was unnecessary -- all policies which are the opposite of what conservatives have always advocated -- is difficult to say.
At the present time, disillusionment among traditional conservatives is growing, as is their sense that the Republican Party can no longer serve as a vehicle through which to advance the program of smaller, limited government, balanced budgets, and a prudent foreign policy. Some now speak of launching a third-party alternative.
Whatever happens, it is clear that the American political landscape has been dramatically altered as conservatives ponder their -- and the nation's -- future.
A Virginia state panel has sharply criticized decisions made by Virginia Tech before and after last April's murder of 32 people, saying university officials could have saved lives by notifying students and faculty members earlier that there had been killings on campus. Because university officials misunderstood federal privacy laws as forbidding any exchange of a student's mental health information, the report concludes, they missed numerous indications of the gunman's mental health problems.
The Virginia Tech tragedy makes it clear that our privacy laws must be carefully examined and that changes are in order. When Seung Hui Cho, the shooter, was in high school, Fairfax County, Virginia, school officials determined that he suffered from an anxiety disorder so severe that they put him in special education, and devised a plan to help, but Virginia Tech was never told of the problem. Fairfax officials were forbidden from transmitting this information to Virginia Tech by federal privacy and disability laws that prohibit high schools from sharing with colleges private information such as a student's special education coding or disability. Those laws also prohibit colleges from asking for such information.
At Virginia Tech itself, it became clear that Cho had mental problems. Campus authorities were aware of his troubled mental state 17 months before the massacre. More than one professor reported bizarre behavior. Campus security tried to have him committed involuntarily to a mental institution. There were complaints that Cho made unwelcome phone calls and stalked female students. Walter Williams, professor of economics at George Mason University, states that:
Given the university's experiences with Cho, they should at least have expelled him, and their failure or inability to do so was the direct cause of the massacre.
Beyond this, argues Williams, we must consider a federal law known as the Family Educational Rights and Privacy Act of 1974 (FERPA). As Virginia Tech's registrar reports:
Third Party Disclosures are prohibited by FERPA without the written consent of the student. Any persons other than the student are defined as Third Party, including parents, spouses, and employers.
College officials are required to secure written permission from the student prior to the release of any academic record information.
That means a mother, father, or spouse who might have intimate historical knowledge of a student's mental, physical, or academic problems, who might be in a position to render assistance in a crisis, is prohibited from being notified of new information. . . . Alternatively, should the family member wish to initiate an inquiry as to whether there have been any reports of mental, physical, or academic problems, they are prohibited from access by FERPA.
Under our present law, the only way Virginia Tech officials could have known about Cho's mental problems would have been from Cho himself. Yet, experts point out, asking for help is almost impossible for someone with his condition, which has been described as selective mutism. Robert Schum, a clinical psychologist and expert in selective mutism, says that:
Children with selective mutism don't want to be the center of attention. They don't like to sit on Santa's lap. They don't like their photo taken on picture day. They don't want kids to sing to them at their birthday celebration. They just want to be left alone. So when you put the responsibility on them and ask them to draw attention to themselves by asking for help . . . that's really tough.
Cho's parents, although cooperative with Fairfax school officials, might not have fully understood what was wrong, and that their son needed help in college as well. As recently as the summer before the April shooting, Cho's mother had sought out members of One Mind Church in Woodbridge, Virginia, to purge him of what the pastor there called the "demonic power" possessing him.
Richard Crowley, coordinator of guidance services for Fairfax County, said high schools generally send transcripts to colleges with only a student's courses, grades, and test scores. Even the number of times a student has been suspended is considered an optional piece of information. Moreover, many colleges say they don't want to know because of the potential liability. Barmak Nassirian, with the American Association of Collegiate Registrars and Admissions Officers, says that:
In soliciting a student's history of psychiatric treatment or diagnoses by treating physicians, you basically open a Pandora's box. Even if you should decide, for reasons that have nothing to do with medical circumstances, not to accept a student, you most certainly will have a case that will be litigated.
Washington Post columnist Mark Fisher notes that:
I've read the 260 pages filed by the review panel . . . and I understand that the university should have stepped in to help Seung Hui Cho but didn't, and that the state mental health system should have acted more decisively but didn't. . . . In the hundreds of interviews the panel conducted, why didn't they ask all those people whose job it is to care for students one question: how would you have handled Cho if you had let your conscience, not privacy laws, guide you? . . . If the mental health professionals, police, and college administrators had acted as if their own child were involved, there might not have been any need for an investigation.
In Fisher's view:
The culprit here is the culture of privacy that we have allowed to pervade certain areas of life, especially health and education. . . . By walling off mental illness, we prevent the power of light from reaching those who are suffering. Privacy laws leave everyone, from health workers to college administrations, confused and defensive about what they may do and say. They react by doing less than they would if left to their own empathy and common sense. . . . Ultimate responsibility for the shootings rests squarely on Cho. But that does not absolve others of the need to act when something goes very wrong. Parents, as Virginia Governor Tim Kaine said, cannot "just drop your child off on campus." Rather, they must seek out resident advisers and counselors and say "Let me tell you about my precious child." And colleges must exhibit the same care toward young adults that parents, friends, or good bosses do -- no matter how much the law may seek to separate us from our human obligations.
Governor Kaine said it was hard to understand why more was not done about a student who once showed fascination with the 1999 Columbine High School shootings and considered the two students who carried them out to be martyrs. "Look, I'm troubled that a student who had talked about Columbine at an earlier point in his life, that that information was unknown to anybody on the Tech campus."
The Virginia Tech Review Panel concludes that, "The current state of information privacy law and practice is inadequate. The privacy laws need amendment and clarification."
While colleges require students to submit immunization records, all records of emotional problems are sealed. "Perhaps students should be required to submit records of emotional or mental disturbance . . . after they have been admitted, but before they enroll," the report says. "Maybe there should be some form of 'permanent record.'"
In a paper Virginia Tech did not disclose until The Washington Post revealed its existence, Cho wrote in an English class:
I hate this. I hate all these frauds. I hate my life. . . . This is it. . . . This is when you damn people die with me.
Panel member Roger Depue, a longtime FBI profiler, says, "Just writing fantasies isn't the problem. It's the combination of disturbing writing and all the other danger signs."
There can be no doubt that public safety must trump privacy rights, particularly in a university setting. Cho's dysfunction had been noted and treated by his high school counselors -- but this was never communicated to Virginia Tech. As his problems intensified at college, his parents were never alerted. Cho spoke to employees in the campus counseling center three times in fifteen days in late 2005 and early 2006, but they failed to follow up and treated his case in an indifferent manner.
Virginia Tech Provost Mark G. McNamee said the university will push for changes in the privacy laws: "I think we are moving into a new era, a new national dialogue" about how individual privacy rights are weighed against public safety, McNamee said. "Safety has clearly risen to a higher profile."
It is sad indeed that we need a tragedy of this magnitude to restore a common sense approach to our treatment of troubled and potentially violent students on the nation's college campuses. *
"He who steals from a citizen, ends his days in fetters and chains; but he who steals from the community ends them in purple and gold." --Marcus Porcius Cato
Conservatives and Neo-Conservatives: The Foreign Policy Differences Are Significant and Worthy of Serious Debate
As the presidential election campaign of 2008 gets under way, and as America's role in the world undergoes increasing scrutiny both at home and abroad, a serious exploration of what that role should be in the post-Cold War world would be useful.
The "democratic globalism" which has been promoted by those who call themselves neo-conservatives is quite different from the traditional conservative approach to foreign policy. The war in Iraq and the effort to spread "democracy" to the Middle East are the products of this world view. The results of this crusade are less than clear. The intellectual underpinnings of this effort, however, are far from what conservatives have generally believed about foreign policy. George Will, the conservative political commentator, notes that:
On foreign policy, conservatism begins, and very nearly ends, by eschewing abroad the fatal conceit that has been liberalism's undoing domestically -- hubris about controlling what cannot, and should not, be controlled. Conservatism is realism about human nature and government's competence . . .
Those who would base U.S. foreign policy upon the internal governmental organization of a particular country rather than its foreign policy and international actions, traditional conservatives argue, may misunderstand the very nature of what U.S. policy is meant to achieve. Pat Buchanan notes that:
The point here is quite simple: Because a nation has a free press, free elections, and a bicameral legislature does not alone make it a valued ally of the United States; and because a nation is ruled by an autocrat, a king, or a general does not make it an enemy. When Americans were dying in Vietnam, one recalls, NATO merchant ships were hauling goods to Hanoi, and Swedish diplomats were harassing us at the United Nations. Meanwhile, South Korean soldiers were fighting alongside ours. Not all our friends are democratic; and not all democrats are our friends.
On June 7, 1988, a conference entitled "Promoting Democracy Abroad" was held at the American Enterprise Institute in Washington, D.C. It was sponsored by the Philanthropy Roundtable, a group of foundations and grant-makers. Writing in Philanthropy (Spring 1987), Carl Gershman of the National Endowment for Democracy declared that:
The job of assisting democratic political developments cannot be left solely to governments. Democracy cannot be sustained without the existence of countless private organizations and institutions. . . . Beyond its ability to mobilize financial resources, private philanthropy can engage private institutions and individuals whose knowledge and technical skills are vital resources for those seeking to develop new democratic structures.
Presenting a conservative rejoinder to the philosophy of promoting global democracy was Paul Gottfried, professor of political science at Elizabethtown College and author of a number of important books, including The Conservative Movement. Dr. Gottfried stated at the outset that he shared the views of the other speakers concerning the "inhumanity endemic to Communist control anywhere" and noted that, in the case of non-Communist regimes, he would "find nothing wrong with having our leaders call attention to particularly heinous acts committed by other states or shamelessly tolerated within their borders." He rejected the idea that opposition to the notion of spreading democracy around the world was rooted in "ethical relativity."
In this regard, Gottfried states that:
. . . generations of statesmen who believed in revealed religion and prescribed morality accepted a politically pluralistic universe. The 18th century Whig reformer and advocate of the American colonies, Edmund Burke, wrote to the sheriffs of Bristol in 1777 on the affairs of America: "I was never wild enough to conceive that one method would serve for the whole, that the natives of Hindustan and those of Virginia could be ordered in the same manner. . . . I was persuaded that government was a practical thing, made for the happiness of mankind, and not to furnish out a spectacle of uniformity to gratify the schemes of visionary politicians.". . . Burke held a basic unquestioned assumption of Western political thought from Aristotle to Montesquieu and beyond: that there are different good or at least tolerable regimes adapted to the needs and histories of different peoples.
Those who advocate a global democratic revolution, Gottfried argued:
. . . reject that axiom not because they preach a higher morality than did Burke or Montesquieu. They are addicted to a 20th century enthusiasm that denies respect to all forms of government but their own.
To the argument that the U.S. should export democracy throughout the world because democratic regimes will feel a natural affinity for the U.S., Gottfried points to a long historic record that would challenge such a thesis:
It was Athens' bullying of a fellow-democracy, Corinth, that contributed to the entry of Sparta, Corinth's ally, into the Peloponnesian War. Democracies and, in the early 20th century, constitutional monarchies -- e.g., England and Imperial Germany and Habsburg Austria-Hungary -- have in fact fought each other.
Because, on particular occasions, the U.S. may "sometimes be called upon to make political-structural changes in far-off lands in pursuit of our national interest," Gottfried concluded, "it does not follow that we should dictate the constitutional-social arrangements by which the entire world must live."
Traditional conservatives point to the manner in which the U.S. policy of "democratization" brought the Sandinistas to power in Nicaragua. Almost from the beginning of his presidency, Jimmy Carter tightened the screws on Nicaragua. By executive decree, the president prohibited the sale of military equipment. The president's representative on the International Monetary Fund twice blocked badly-needed standby credits for Nicaragua. When financing for Nicaragua's hydro-electric dam project was obtained through other nations, President Carter pressured those nations to cancel the financing arrangements.
Under orders from the White House, the U.S. Department of Agriculture gave instructions to beef inspectors to shut down Nicaraguan beef exports to the U.S. The U.S. Embassy in Managua called businessmen of the opposition political party and advised them to transfer their dollars from Nicaragua to the U.S. At one point, the U.S. Ambassador to Nicaragua, William Bowdler, told President Somoza that "the Carter policy was to see that all of the right-wing governments in Central America were replaced and that Nicaragua would be the first to go."
Under serious attack from rebels armed by Cuba, President Somoza was unable to purchase needed military supplies from the U.S. When he was finally able to purchase vital ammunition from Israel, Somoza later said that as the ship approached the Nicaraguan coast:
. . . it turned back to Israel. We suspected the reason for the sudden change in shipping plans, and later our suspicions were verified. U.S. intelligence had learned the destination of this ship and the cargo she carried. Under extreme pressure applied by the U.S., Israel made the decision to return the ship. . . . When Carter says the U.S. played no role in the death of my government, and when he says he didn't know international law was being violated, he is lying. . . . At the time of my departure, we must have had close to 20,000 men who wanted to fight the enemy. These men were never defeated by international invaders; they simply did not have the means with which to fight.
Thus, in the view of traditional conservatives, it was U.S. policy that put the Sandinistas in power -- just as it was U.S. policy which assisted the Ayatollah Khomeini in coming to power in Iran and Robert Mugabe in assuming power in Zimbabwe.
More recently, of course, particularly under the influence of neo-conservatives, the Bush administration has embarked upon an ideological crusade of "democratizing" the Middle East, most prominently in Iraq.
Not only have the reasons for going to war in Iraq proven to be less than persuasive, but the Bush administration's assessment of what was necessary for success appears to have been misleading and far from reality.
On May 1, 2003, President Bush gave a speech aboard the aircraft carrier U.S.S. Lincoln beneath a large "Mission Accomplished" banner. Kenneth Adelman, head of the Arms Control and Disarmament Agency during the Reagan administration, predicted that the mission would be "a cakewalk." Other advocates of the war were equally optimistic. It would be like Paris in 1944, with the Iraqis greeting American troops as liberators, not occupiers. In April 2003, columnist Mark Steyn predicted that "in a year's time Baghdad and Basra will have a lower crime rate than most British cities." Furthermore, there would be "no widespread resentment or resistance of the Western military presence."
Those who worried about the deep ethno-religious divisions in Iraq were summarily dismissed. On April 1, 2003, William Kristol, editor of The Weekly Standard, wrote that:
. . . there's been a certain amount of pop sociology in America . . . that the Shi'a can't get along with the Sunni, and the Shi'a in Iraq just want to establish some kind of Islamic fundamentalist regime. There's almost no evidence of that at all.
In May, Washington Post columnist Charles Krauthammer stated, "The U.S. is in a position to bring about a unique and potentially revolutionary development in the Arab world: a genuinely pluralistic, open and free society." Department of Defense planners assumed that U.S. troop levels would be down to 50,000, or even lower, by the end of 2003. Some military experts, however, warned that such optimism was unwarranted. Gen. Eric Shinseki, the Army chief of staff, predicted that the occupation would require "several hundred thousand troops" for a period of "many years." Deputy Secretary of Defense Paul Wolfowitz flatly rejected Shinseki's assessment in congressional testimony and, for his candor, Shinseki was pressured into early retirement.
Wolfowitz also rejected the idea that the occupation would be a financial drain. He predicted that Iraq's oil revenues would pay for the entire cost of reconstruction. Those officials who did not share such an optimistic view were removed from office. Larry Lindsey, chairman of the President's Council of Economic Advisers, warned that the cost of the Iraq occupation would exceed $200 billion. He was quickly pressured out of his post. Even Lindsey's estimate was low. The Iraq war has cost far more: as of March 2007, $350 billion and counting. That figure does not include long-term, indirect costs, such as the continuing medical care and rehabilitation expenses for more than 22,000 service personnel who have been wounded. Former Rep. Lee Hamilton, co-chairman of the Iraq Study Group, has stated that the costs could exceed one trillion dollars in the near term.
In January, 2002, more than a year before U.S. troops entered Iraq, Ted Galen Carpenter, vice president for defense and foreign policy studies at the Cato Institute, cautioned that:
No matter how emotionally satisfying removing a thug like Saddam may seem, Americans would be wise to consider whether that step is worth the price. The inevitable U.S. military victory would not be the end of America's military troubles in Iraq. . . . Washington would become responsible for Iraq's political future and the U.S. would be entangled in an endless nation-building mission beset by intractable problems.
As war grew nearer, other experts in the traditional conservative "realist" school of foreign policy echoed such warnings. On September 26, 2002, 33 prominent foreign affairs scholars published an advertisement in The New York Times under the headline "War in Iraq Is Not in America's National Interest." They noted that the administration of George H. W. Bush "did not try to conquer Iraq in 1991 because it understood that doing so could spread instability in the Middle East. . . . This remains a valid concern today." They added: "Even if we win easily, we have no plausible exit strategy. Iraq is a deeply divided society that the U.S. would have to occupy for many years to create a viable state." Those who signed that ad included University of Chicago Professor John Mearsheimer, MIT Professor Barry Posen, Columbia University Professor Richard K. Betts, and the dean of Harvard University's Kennedy School of Government, Stephen M. Walt.
Writing in Chronicles, the Cato Institute's Ted Galen Carpenter notes that:
Not only did the administration and other proponents of war ignore such warnings, they refused later on to recognize growing evidence that the mission was going badly. Even as the security environment deteriorated, the chorus of optimism scarcely diminished. In early 2005, Vice President Dick Cheney confidently asserted that the insurgency was "in its last throes." By late 2006, though, the evidence of massive disorder in Iraq was irrefutable. Instead of admitting error, most of the hawks have redoubled their efforts to give advice about future strategy. . . . The increasingly shrill neo-conservatives argue that the Bush administration had launched the mission with too few troops -- even though most of the lobbyists for war had argued exactly the opposite at the time. (Indeed, some of them, including Wolfowitz, had proposed going in with an even lighter force -- no more than 40,000 or 50,000 troops). Now, they insist that even the existing force of 145,000 is insufficient.
In Carpenter's view:
Except when the survival of the nation is at stake, all military missions must be judged according to a rigorous cost-benefit calculation. Iraq has never come close to being a war for America's survival. Even the connection of the Iraq mission to the larger war against Islamic terrorism was always tenuous at best. For all of his odious qualities, Saddam Hussein was a secular tyrant, not an Islamic radical. Indeed, radical Muslims expressed nearly as much hatred for Saddam as they did for the U.S. Iraq was an elective war -- a war of choice, and a bad choice at that. . . . Alarm bells should be ringing when the people who pushed America into the folly of a nation-building mission in Iraq are now advocating a redoubled effort.
Assessing the role of neo-conservatives in the formulation of U.S. policy toward the Middle East, Professor Andrew J. Bacevich of Boston University, writing in The American Conservative, declares:
Neo-conservatives . . . believe that the U.S. is called upon to remake the Middle East, bringing the light of freedom to a dark quarter of the world. Pseudo-realists like Baker (James Baker, co-chairman of the Iraq Study Group) believe that the U.S. can manipulate events in the Middle East, persuading others to do our bidding. Both views, rooted in the conviction that Providence has endowed America with a unique capacity to manage history, are pernicious.
In Bacevich's view:
The way forward requires abandoning that conviction in favor of a fundamentally different course. A sound Middle East strategy will restore American freedom of action by ending our dependence on Persian Gulf oil. It will husband our power by using American soldiers to defend America rather than searching abroad for dragons to destroy. A sound strategy will tend first to the cultivation of our own garden. A real course change will require a different compass, different navigational charts, and perhaps above all different helmsmen, admitting into the debate those who earn their livelihoods far from the imperial city on the Potomac.
The tragedy of neo-conservatism, argues The Economist,
. . . is that the movement began as a critique of the arrogance of power. Early neocons warned that government schemes to improve the world might end up making it worse. They also argued that social engineers are always plagued by the law of unintended consequences. The neocons have not only messed up American foreign policy by forgetting their founders' insights. They may also have put a stake through the heart of their movement.
Needless to say, all Americans should hope for as successful an outcome in Iraq as possible. However mistaken the arguments presented in behalf of the war, it is in the interest of our own country and our friends in the region that Iraq be left better than we found it. Beyond the events of the moment, however, what is required is a careful revisiting of the different foreign policy perspectives of traditional conservatives and the neo-conservatives who have been so influential in the current administration. That debate has been postponed for far too long.
Life in Zimbabwe continues its dramatic decline as Robert Mugabe's cult of personality continues to grow.
In June, Mugabe threatened to seize foreign companies, including mines, which have raised prices and cut output in what he called an economic "dirty tricks" campaign to oust his government. Zimbabwe's businesses -- including a dwindling number of local subsidiaries of multinational companies, older white-owned firms, and black-owned companies that prospered after independence in 1980 -- are already struggling in what the World Bank calls the fastest-shrinking economy for a country not at war.
Analysts said that approval of a bill authorizing such seizures could deepen the economic crisis, which has pushed Zimbabwe to the brink of collapse, with inflation now believed to be more than 10,000 percent a year.
Yet, as Zimbabwe's people are struggling to survive in the face of growing food and fuel shortages, and the prospect of power cuts for up to 20 hours a day, President Robert Mugabe, who has led the country since 1980, is spending almost $4 million on a grandiose project -- a monument to himself. Work has already begun on a museum dedicated to his life in his home district of Zvimba. Construction of the grand edifice, which will cover the area of a soccer field and has been dubbed the "Mugabe Shrine," is being supervised by the local government minister, Ignatius Chomvo. Mugabe's extravagance is well known. Besides his five official residences, he owns a number of private houses including the most recent addition, a palatial three-story, 25-bedroom, $15.8 million residence in the exclusive Harare suburb of Borrowdale.
A recently published book, My Life with an Unsung Hero by Vesta Sithole, the widow of an early leader in the fight for Zimbabwean independence, the Rev. Ndabaningi Sithole, places some of the latest developments in that country in historical perspective.
Vesta Sithole was a participant in the battle for independence and democracy in the former Rhodesia from the very beginning. She helped to recruit people for military training, sheltered freedom fighters, and raised funds. As a result of her active participation in Zimbabwe's road to freedom, the author was harassed, jailed, and subjected to mistreatment by both the Rhodesian forces and her fellow citizens. In 1967, she married Tanzanian banker and economist Jackson Mwakalyelya, with whom she had four children. She was widowed in 1972 and in 1980 married the liberation fighter and founder of the Zimbabwe African National Union (ZANU), the late Rev. Sithole.
We see from this memoir that Robert Mugabe was committed from the beginning to total control and that his verbal commitment to democracy and freedom was far from the reality of what he had in mind.
Mrs. Sithole writes that:
March 3, 1979, was the day Rev. Sithole, Bishop Muzorewa, Chief Chirau, and Mr. Smith signed the Internal Agreement. After the agreement was signed, an interim government was put in place and the country's name officially changed to Zimbabwe/Rhodesia. Mr. Smith and his delegation could not imagine the country without the name "Rhodesia" in it. . . . The Executive Council was made up of Rev. Sithole, Bishop Muzorewa, Chief Chirau, and Ian Smith. The Executive Council would rotate every month, and each leader would be the chairman of the council and run the affairs of the state before the scheduled election took place. It was not a perfect constitution, but it was a beginning.
Robert Mugabe, then outside the country, fought the internal settlement, in part because it did not elevate him to the power he sought. In the end, after a series of assassinations of his opponents, Robert Mugabe was sworn in as prime minister of independent Zimbabwe. Mrs. Sithole notes that:
Rev. Sithole was not even invited to the independence celebrations held in the country. . . . However, he did hope that Mr. Mugabe would give the Zimbabwean people what they fought and died for -- freedom to speak, freedom to associate, and freedom to be Zimbabweans. He publicly said this at our Waterfalls home soon after Mr. Mugabe was sworn in. He added that he would oppose Mr. Mugabe if he did not rule well.
While living in the U.S., the Rev. Sithole wrote two books, The Secret of American Success -- Africa's Great Hope and Hammer and Sickle Over Africa. He was convinced that the example of American freedom and free enterprise was the path Africa should take, not that of Communism. He declared:
It should be noted that what is Africa's great hope is not America, but the secret of American success. This is to say the principles that explain America's success are Africa's great hope. The constant success factors that have served the U.S.A. during the last 210 years are Africa's/Zimbabwe's great hope if she will adopt the same. . . . Allow the people free enterprise and they will succeed beyond belief. Deny them free enterprise, and you kill real success among them. Africa/Zimbabwe is more suited to a free economy than a nationalized economy, which tends to benefit mostly the ruling elites, and to retard ordinary people.
For Sithole, individual freedom was the path for Zimbabwe's future success:
Let everyone in Africa/Zimbabwe have his own dream, not another's dream. Let him become what he himself wants to become. In colonial days people were forced to become subjects of the colonial powers. In present-day Africa people are still forced to become Marxists, Marxist-Leninists, Communists, or Socialists. In other words, they are forced not to become themselves, but carbon copies of others.
After returning to Zimbabwe, the Sitholes suffered under Robert Mugabe's rule. Their home and property were confiscated, and the Rev. Sithole was arrested and charged with treason. Finally, for health reasons, he was permitted to leave for medical care in the U.S., where he died.
Mrs. Sithole notes that:
The arrest of my husband on treason charges was neither the beginning nor the end of arrests by Mr. Mugabe of his political opponents. It is my observation that it was, and still is, Mr. Mugabe's modus operandi to charge with treason those who oppose him. Many will recall that Joshua Nkomo was accused of trying to overthrow Mr. Mugabe's government. . . . Fearing for his life, Mr. Nkomo escaped to Botswana disguised as a woman, and later made it to the United Kingdom. . . . Bishop Abel Muzorewa was imprisoned for 10 months after being accused of trying to kill Mr. Mugabe. Even Edgar Tekere, once his secretary general and right-hand man, was accused of trying to kill Mr. Mugabe when they became estranged.
Now living in the U.S., Vesta Sithole remains committed to speaking out against the injustices her fellow citizens of Zimbabwe have suffered at the hands of Robert Mugabe. Having devoted her life to the struggle for a free Zimbabwe, she believes that the people of her country deserve the freedom they fought for. The world that was eager for Zimbabwean independence, however, now seems indifferent to its fate.
In August, Southern African leaders did not urge Zimbabwean President Robert Mugabe to enact reforms in his country during a regional meeting in Lusaka, Zambia, and appeared satisfied with his human rights record.
Zambian President Levy Mwanawasa, the new chairman of the Southern Africa Development Community (SADC), said the group of countries had relied on a report submitted by South Africa on Zimbabwe's crisis and had not raised the issue with Mr. Mugabe.
South African President Thabo Mbeki, who has been mediating between the Zimbabwean government and its opposition, submitted the report. "We are quite happy that Mr. Thabo Mbeki was capable enough and was moving in the right direction," Mr. Mwanawasa said.
Asked if the SADC had pressed Mugabe on accusations of widespread human rights abuses during the summit, Mr. Mwanawasa said, "We have discussed them (abuse claims) and we are satisfied with the answers which were given."
It is clear that Southern African nations do not have the resolve to confront the regime of Robert Mugabe. Earlier, Mr. Mwanawasa had raised hopes that African states would break their public silence on Robert Mugabe when he became the first leader of the continent to publicly criticize him. But his description of Zimbabwe as a "sinking Titanic" during a trip to Namibia earlier this year has faded from memory, and he has softened his position.
Despite his brutal regime, Robert Mugabe, somehow, remains immune to criticism from his neighbors. More than this, the New York Times reported that at the Lusaka meeting:
Mr. Mugabe arrived . . . to a fusillade of cheers and applause from attendees that . . . overwhelmed the polite welcomes for other heads of state. . . . Mr. Mugabe was unrepentant and the comments of leaders of neighboring states about Zimbabwe's descent were notably bland.
Zimbabwe's seven-year decline is a result of increasingly repressive policies imposed by the Mugabe regime that have driven away foreign investment and stoked hyperinflation. In June, the government ordered all businesses to roll back prices, effectively declaring inflation illegal. Since then, factories have curtailed production, workers have been laid off, and store shelves have emptied of basic goods. In Bulawayo, Zimbabwe's second-largest city, a security guard and a child were reported to have been killed in August after a line of shoppers awaiting a rare delivery of sugar mobbed a storefront, toppling a brick wall on top of them.
While the rest of Africa -- and most of the world -- turns away, Zimbabwe is in a state of collapse. Water shortages have worsened because of pump breakdowns, and a senior government official said kidney patients were dying for lack of dialysis machines. Power, water, health, and communications systems are collapsing, and there are acute shortages of staple foods and gasoline. Unemployment is around 80 percent.
As 11 million people in Zimbabwe descend into destitution, a tiny slice of the population is becoming ever more powerful and wealthy at their expense. Zimbabwe, critics charge, is fast becoming a kleptocracy, and the government's seemingly inexplicable policies are in fact preserving and expanding it.
"Their sole interest is in maintaining power by any means," said David Coltart, a white opposition member of Parliament. "I think their calculation is that the rest of Africa is not going to do anything to stop them, and the West is distracted by Iraq and Afghanistan. The platinum mines can keep the core of the elite living in the manner they're accustomed to -- just in a sea of poverty."
According to New York Times correspondent Michael Wines, writing from Zimbabwe:
In interviews, Mr. Coltart's view was widely shared by blacks and whites alike, many with no political ax to grind. Even a governing party politician allowed that whatever the aims of Mr. Mugabe's policies, their execution had gone terribly awry. Zimbabwe's farm seizures destroyed the nation's rich agriculture industry, and, as a form of patronage, vast tracts of land were handed over to party elites with little experience or interest in farming. The looming takeover of business is expected to produce the same result.
Zimbabwe's falling currency -- 200,000 Zimbabwe dollars now buy a single American dollar on the black market -- has rendered the salaries of working Zimbabweans all but worthless. Yet the official exchange rate is not 200,000 to 1, but 250 to 1. Those with connections to the government's reserve bank are widely said to buy American dollars cheap and sell them dear -- and reap an 800-fold profit on currency transactions.
Things have become so bad that the Archbishop of Bulawayo, Pius Ncube, has called on Britain to invade his country and engage in regime change against Robert Mugabe. Archbishop Ncube warns that millions of his countrymen face starvation without outside intervention. There is "massive risk to life," said the leader of Zimbabwe's one million Roman Catholics. "People in our mission hospitals are dying of malnutrition," says Archbishop Ncube.
We had the best education in Africa and now our schools are closing. Most people are earning less than their bus fares. There's no water or power. Is the world just going to let everything collapse in on us?
Thus far, sadly, the answer appears to be yes. And those in the U.S., Britain, and elsewhere who so eagerly embraced Robert Mugabe in 1980, and bear some responsibility for his ascent to power, seem totally indifferent to the results of their efforts. *
"Liberty cannot be caged into a charter or handed on ready-made to the next generation. Each generation must recreate liberty for its own times. Whether or not we establish freedom rests with ourselves." --Florence Ellinwood Allen
In the 1960s, Senator Daniel Patrick Moynihan (D-NY), then Assistant Secretary of Labor, produced a report entitled "The Negro Family: The Case for National Action." He found that a quarter of all black children were born to unmarried women and the percentage was rising. The tangle of poverty and despair was bleak, and Moynihan predicted that it would get worse. It has. Today, among non-Hispanic blacks, the out-of-wedlock birth rate has reached 69.5 percent. Beyond this, trends he found troubling within the black community are now to be found in the larger society as well. The illegitimacy rate for Hispanics has reached 47.9 percent and the rate for non-Hispanic whites now exceeds 25 percent.
For his efforts, Moynihan was sharply criticized by liberals for "blaming the victim," a catchphrase that was coined by his critics and has now entered the lexicon. Now, however, that notion is being increasingly discredited. In an important new book, Ghetto Nation (Doubleday), Cora Daniels, an award-winning journalist who is now a contributing writer for Essence and has served as a commentator on CNN, BET, and NPR, and is the author of the widely praised book, Black Power, Inc., argues that it is the "ghetto mindset" that is harming the future of residents of the nation's inner cities, and that corporate America bears a share of the responsibility for promoting this destructive mindset. Daniels joins a long list of thoughtful black critics of the prevailing ghetto culture, among them comedian Bill Cosby and author Juan Williams.
For Daniels, "ghetto" is a condition -- an addiction, even -- that has spread through American popular culture. It is an impoverished mindset defined by conspicuous consumption and irresponsibility. She writes that:
Ghetto no longer refers to where you live; it is how you live. It is a mind-set. . . . The jump from an impoverished physical landscape to an impoverished mental one is harder to trace. . . . there is no denying that these days ghetto, as it is used, has indeed made that leap. I did not reposition ghetto from noun to adjective -- we all did that. . . . As a black woman surviving, and drowning, in Ghettonation, I am defining ghetto as a mindset. . . . A mindset that thinks it is fine to bounce, baby, bounce, in some video, as if that makes it any different from performing such a display on a table, on a pole, on some john's lap, or on the corner. And a mindset that thinks a record deal and a phat beat in the background makes it okay to say . . . well I do know what bad language is, so I won't say. Most of all, ghetto is a mindset that embraces the worst. It is the embodiment of expectations that have gotten dangerously too low.
A number of academic studies show how this mindset has had harmful ramifications. Professors from M.I.T. and the University of Chicago wanted to see what was in a name. They answered thousands of real newspaper want ads, sending in identical resumes except for the names. One group of resumes carried names like Brad and Kristen; a second group, sent to the same companies, carried names like Latoya, Shamika, and LaShawn. The Brads and Kristens were 50 percent more likely to be called back for an interview than the Shamikas and LaShawns with the exact same resumes. Daniels writes:
The researchers thought with the results of their LaShawn and Shamika resumes they were making a statement about race [but in reality] they really uncovered the side effects of the ghetto.
Several years ago, anthropologist John Ogbu, who coined the term "acting white" to explain why some black students seemed to shun doing well in school, released an even more explosive study about black middle-class students in suburban Shaker Heights, Ohio, outside of Cleveland. In pointing the finger for poor performance in school back at parents instead of at "the system," the late scholar drew criticism from both his colleagues and the community. It is Daniels' view that there is much to learn from Ogbu's efforts.
She notes that
Ogbu was invited to Shaker Heights not by school district administrators or teachers but by black parents themselves. They wanted to know why their children -- the sons and daughters of doctors, lawyers, judges, business execs -- were doing so much worse in school than their white classmates. Although black students in the middle-class district were some of the best students in the state, they were still lagging behind whites in the district in terms of grade point averages, standardized test scores, and enrollment in advanced-placement courses. . . . The situation sounded like my own high school experience. While my good school was approximately 60 percent black, I would typically find myself among the same small bunch of black students in all of my AP and honors classes. By the time I was a senior, when my schedule was filled with nothing but demanding classes, I had virtually no black classmates, even though I was attending a black school.
In his 2003 book, Black American Students in an Affluent Suburb: A Study of Academic Disengagement, John Ogbu concluded that black students in Shaker Heights weren't doing as well as their white counterparts because their parents didn't stress education enough. It was disengagement of the worst kind. While not discounting other factors such as low teacher expectations and prejudiced personnel, Ogbu faulted the efforts of the black students and their parents. Despite parents' obvious concerns, illustrated by the fact that they invited the anthropologist to come and have a look to begin with, Ogbu concluded that many of the same black parents did not stress homework, attend teacher conferences, or push their children to enroll in the most challenging classes as much as their white counterparts did. In addition, he suggested that the black students suffered from what he termed "low effort syndrome," meaning that they didn't work as hard even though they knew how much work was needed to succeed in the Shaker Heights schools.
What amazed me is that these kids who come from homes of doctors and lawyers are not thinking like their parents; they don't know how their parents made it. . . . They are looking at rappers in ghettos as their role models; they are looking at entertainers. The parents work two jobs, three jobs, to give their children everything, but they are not guiding their children.
A 2004 study by the Civil Rights Project at Harvard University and the Urban Institute concluded that only 50 percent of black students and only 53 percent of Hispanic students graduate from high school. For male students, the figures are even worse. Only 43 percent of black males and 48 percent of Hispanic males graduated. The reality, Daniels shows, may be even worse:
Researchers concluded that the official data on dropout rates were misleading, arguing that in reality there are actually more dropouts than schools report. The study charged that school districts and states routinely try to depress their dropout numbers by pushing out and eliminating problem students from school rosters, especially right before state tests, because higher scores translate into extra funding and certification credentials. In a study of the 100 biggest school districts in the country, in almost half the schools sampled, the size of the senior class had shrunk by more than half compared to the class size back in the ninth grade, four years earlier. In one year in Chicago, 15,653 students graduated from high school while 17,404 dropped out.
Sadly, corporate America has devoted a great deal of its resources promoting the ghetto mindset. Daniels notes that:
Madison Avenue has certainly put its cash behind the tomorrow-doesn't-matter message. . . . The "I am What I am" billboard that I saw most often . . . featured 50 Cent with his stale . . . frown. His quote, displayed against a police fingerprint sheet, read: "Where I'm from there is no Plan B. So take advantage of today because tomorrow is not promised.". . . In Ghettonation, living within your means just isn't done. There is no need to when you think tomorrow doesn't matter.
Middle-class blacks often claim to be from the inner city to achieve success. Rapper Russell Jones -- known by a variety of names including ODB (Ol' Dirty Bastard) -- died at the age of 35. In its multi-day coverage of his death, The New York Times reported that: "As ODB, he was also uncomfortable spinning a public mythology, saying, for example that he had grown up on welfare, or that he had not known his father." Neither was true. "Our brother looked at things as selling records," said his sister Monique Jones. "So he dismissed whatever lies he told as just a way of getting publicity."
Cora Daniels points out that:
Hip-hop is the music that thumps through black neighborhoods and encompasses the black images that are spit across the world. It is what our children are bouncing their heads to. So as a black woman living within earshot of what is coming out of the mouths of young black folks over the radio these days, I can't afford not to say something. Especially when the ho-ing of black women has become big business.
In addition to hip-hop music there is "street fiction," which has been a constant strand in black literature for decades. The first such writer was probably Iceberg Slim, who after being released from ten months of solitary confinement in Cook County Jail penned Pimp: The Story of My Life, published in 1969. Graphic in both language and subject matter, the book broke narrative ground by capturing Slim's life as a pimp in Chicago in the 1950s. In 2003, Pimp briefly graced U.P.I.'s top ten mass-market paperback list alongside To Kill A Mockingbird, The Hobbit, and Fahrenheit 451. Donald Goines, also a pioneer in street lit, first read Iceberg Slim while doing his last stint behind bars. The result for Goines, a heroin addict who pimped and robbed to support his habit, was the birth of his first book, Dopefiend, an instant ghetto classic published in 1970 that still sells 200,000 copies a year.
James Fugate, owner of Eso Won Books, a black bookstore in South Central Los Angeles, has one word for ghetto lit: disturbing. "I'm sick of talking about it," he says.
To me people can read what they want to read. I've never been opposed to books by Donald Goines and Iceberg Slim. But those books were bridges to other literature.
The ghetto lit being written today, he says, is mostly "mindless garbage about murder, killing, thuggery."
Dr. Todd Boyd, a member of the faculty at the School of Cinema and Television at the University of Southern California, was asked why ghetto lit is the fantasy so many readers are choosing. "The ghetto is drama," he said. "The ills of poverty are far more dramatic than the angst of middle-class life."
Daniels writes that:
It struck me just how universal Boyd's truism was when author James Frey's credibility began to shatter into a million pieces in the winter of 2006, when his best-selling memoir, A Million Little Pieces, about his drug addiction and rehab struggles, was found to be soaked with untruths. The interesting thing about Frey's embellishments is that he didn't lie to make his life seem better but to make it seem worse. "I am an alcoholic and I am a drug addict and I am a criminal," he wrote repeatedly. One of the most blatant lies that Frey wrote about was the criminal part. He claimed he did a three-month stint in jail for beating up a cop. It never happened. . . . Remember when folks used to lie their way up? Like claiming we made more money than we did, had better jobs than we had, were the hero instead of the sidekick, raised extraordinary beings instead of average ones? Now folks are lying their way downward. And why not? Frey's book was the second biggest seller of 2005 behind only the new Harry Potter. Being a "criminal" sells. Ghetto!
All too often, the black establishment has embraced those who promote this ghetto mentality. The NAACP, for example, nominated rapper R. Kelly for an Image award after the singer had already been charged with child pornography. In 2005, one of the most celebrated independent films was "Hustle & Flow," a movie about a pimp turned rapper in Memphis. The film earned Terrence Howard an Oscar nomination for playing the pimp. Daniels writes that:
In one of the clearest signs of pimp praise, "Hustle & Flow's" title song, "It's Hard Out There for a Pimp," by Three 6 Mafia, won an Academy Award in 2006 for best song. Upon receiving their Oscar, Three 6 Mafia had to be bleeped by network censors several times during their acceptance speech.
Fortunately, Daniels reports, more and more prominent black figures are beginning to speak out against the ghetto mindset. Professor Orlando Patterson, a sociologist at Harvard, says that it is a culture of self-destructiveness that is holding black men back. According to Patterson, a so-called "cool-pose culture" that includes "hanging out on the street, dressing sharply, sexual conquests, party drugs, hip-hop music" was just too gratifying to give up. Culture was making the pull of the ghetto attractive. "Not only was living this subculture immensely fulfilling," wrote Patterson. "It also bought them a great respect from white youths."
To those who decry black spokesmen challenging ghetto mindset as washing the community's "dirty laundry" in public, comedian Bill Cosby responds:
Let me tell you something, your dirty laundry gets out of school at 2:30 every day, it's cursing and calling each other nigger as they're walking up and down the street. They think they're hip. They can't read; they can't write. They're laughing and giggling and they're going nowhere.
Like Bill Cosby's initial, and much-discussed comments about the problems within the black community today, Cora Daniels' book should trigger widespread interest and heated debate. And it is not only the black community that is involved. She laments that, "Ghetto is also packaged in the form of music, T.V., books, and movies, and then sold around the world . . . ghetto is contagious and no one is immune."
Germany today boasts the fastest growing population of Jews in Europe. The streets of Berlin abound with signs of a revival of Jewish culture. In September, 2006, Germany ordained its first rabbis since World War II, an event hailed as a milestone in the rebirth of Jewish life in the country where the Holocaust began. German President Horst Koehler declared:
After the Holocaust many people could never have imagined that Jewish life in Germany could blossom again. That is why the first ordination of rabbis in Germany is a very special event indeed.
In an important new book, Being Jewish in the New Germany (Rutgers University Press), Professor Jeffrey Peck of Georgetown University, who is also a senior fellow in residence at the American Institute for Contemporary German Studies, explores the diversity of contemporary Jewish life and the complex struggles within the community over history, responsibility, culture, and identity. He provides a glimpse of an emerging, if conflicted, multicultural country and examines how the development of the European Community, globalization, and the post-9/11 political climate play out in this context.
Today there are more than 100,000 registered members of the official German Jewish community and many more Jews who are not affiliated. While the Jewish population is still relatively small in relation to the total German population, its moral and political significance outweighs its size. In 1933, it was estimated that Germany had about 500,000 Jews. At the end of World War II, Germany's and Europe's Jewish population was decimated to a mere remnant of survivors.
"Now in the new millennium," writes Professor Peck:
. . . there is all the more reason to celebrate the triumph, as many see it, over Hitler's Final Solution. Germany's Jewish population has gained prominence as the fastest-growing Jewish community in Europe and the third largest overall. Jewish Berlin has become a popular tourist site and home of major international and Jewish organizations. The leaders of institutions, like the American Jewish Committee (AJC), understand the importance of a sustained Jewish life in Germany. Far ahead of the Jewish public, they have established positive relations with Germany for decades. Yet, American Jews are often unable to overcome old stereotypes and prejudices. Many American Jews still feel uncomfortable traveling to Germany or even buying German products. In their minds, all Germans, even those born after the war, are tainted by genocide.
Today, Peck argues:
Being "German" and being "Jewish," separated identities that have a long history, especially since the Holocaust, are no longer mutually exclusive. Before the Second World War, most German Jews thought of themselves as Germans. . . . After the war, the terminology separating Germans and Jews connoted the alienation and separation for those Jews remaining, most of whom were not "German" but displaced persons from Eastern Europe who came to be known by those ignominious initials as DPs. Then, it was simple, the Germans were the perpetrators and the Jews were the victims. As a postwar Jewish community took shape, albeit until recently very small, the term "Jews in Germany" became the dominant description of a people who were not fully comfortable or integrated into the society around them. My own prognosis looks toward a potentially new categorization, a "new" German Jewry that will represent a different status in both historical and contemporary terms.
At the end of the war, about 6,500 Jews survived through marriage to non-Jews, living underground, and other means. About 2,000 returned to Berlin from the concentration camps. Most importantly, approximately 200,000 Jews came to Germany as DPs. Housed in camps established by the U.S. military authority, most of these people did not want to stay in Germany. They were only in transit on their way to Israel, the U.S., or Canada. The few thousand who remained formed the basis of the postwar Jewish community.
In January 1991, after German reunification, Soviet Jews were admitted under the quota refugee law, granting them rights that, ironically, had previously been accorded only to the so-called ethnic Germans, whose relationship to contemporary Germany after many generations of living in remote areas of the Soviet Union was more imagined than real. This legislation was a turning point for Soviet Jewish immigration since it allowed masses of Jews to enter Germany as immigrants rather than on tourist visas, as had previously been the informal practice.
Most of the newcomers, an estimated 80 percent, are not "Jewish" by Orthodox law, meaning they do not have Jewish mothers or do not fulfill other requirements, such as Orthodox conversion. The intolerance of Germany's organized Jewish community is something that Professor Peck laments and finds counterproductive:
Religion, which defines the Jewish Community, allows little diversity; Orthodoxy dominates, some Liberal (Conservative) synagogues fill out the picture, and no official Reform offerings are available, except for the controversial World Union of Progressive Judaism (WUPJ) still fighting for recognition. . . . I know of one story of an esteemed young Jew who was dismissed from a religious Jewish institution in Berlin when it was discovered that his mother's conversion was not properly Orthodox. Whatever the reasons . . . halakhic (Orthodox) regulation remains . . . powerful in a community that some might say can ill afford to be so conservative.
Peck is clearly optimistic about the future of Jewish life in Germany. He writes of the symbolism involved in the official opening of the Jewish Museum of Berlin in September 2001, under the directorship of Michael Blumenthal, an American Jew who escaped Germany through Shanghai as a child, and served as Secretary of the Treasury in the Carter Administration. In his "Welcome" printed in the book Stories of an Exhibition: Two Millennia of German Jewish History, the official documentation of the museum, Blumenthal states:
The Jewish Museum of Berlin is no ordinary museum. As a major institution supported by the Federal Government, the State of Berlin, all political parties, and a broad cross-section of the public, its mission has sociopolitical meaning that far transcends the story it tells of the 2000-year history of German Jewry. It symbolizes, in fact, a widely-shared determination to confront the past and to apply its lessons to societal problems of today and tomorrow.
By presenting a chronological history of the Jews from their earliest settlement in the areas that became the German-speaking lands through the present, the exhibition reminds the visitor, especially non-Jewish Germans, Peck points out:
. . . that the Jew is not an "other," an exoticized foreigner who does not belong, but an integral part of a historical German identity. While the urge to harmonize German-Jewish identity is understandable, I would suggest that this exhibition is also a contentious site for definitions of what it means to be German as well as Jewish.
Throughout the exhibition, Jews in their many historical and religious incarnations are presented as an integral part of the German tradition, one that stretches back thousands of years to the time, where the exhibition begins, when the "Children of Israel were expelled from the Holy Land and first came to the Germanic lands." One sees how Jews not only tried to participate in German society that would create the so-called German-Jewish symbiosis that was destroyed by the Nazis, but also that this interrelationship was longstanding. The exhibit declares that, "From the beginning, the history of what is now Germany was a German-Jewish history" and "Jewish merchants were among the first inhabitants of medieval German cities. Often they were among their founders."
Germany's Jewish community, Peck believes,
. . . has a symbolic value beyond the numbers. Although the new community may not be enough to guarantee that Jews will never be the targets of prejudice or attack . . . its mere presence carries weight and makes a powerful statement. It represents the defeat of Hitler's Final Solution, and hope for acceptance of diversity in a country that, unlike the U.S., defined itself for a long time as only white and Christian.
On a January 1996 visit to Germany, Israeli President Ezer Weizman declared that he "cannot understand how 40,000 Jews can live in Germany," and asserted that "The only place for Jews is in Israel. Only in Israel can Jews live full Jewish lives."
Professor Peck rejects such a notion, and shows how German Jews reacted to it. Ignatz Bubis, then head of Germany's main Jewish organization, responded, "I have lived here since 1945 and have met two new generations who simply do not identify with the Nazis. This is a new generation."
Arguing that a Jewish presence in Germany prevents Hitler from achieving his posthumous victory of a "Judenrein" Germany, Bubis declared:
The full revival of the Jewish community in post-war Germany is important. . . . There is no reason to say that Jews cannot live in Germany.
While a proud Jew, he was also proud to be, as the title of his memoir emphasizes, "A German citizen of the Jewish faith." Peck writes:
As Israelis like Weizman tried to isolate German Jews, Bubis often had to remind non-Jewish Germans that he did not want to be turned from a German into an Israeli simply because he was Jewish.
Alice Brauner, one of a group called "The Young Jews of Berlin," exemplifies this positive attitude: "I won't emigrate. On the contrary, our roots in this country cannot be broken." The daughter of film producer Arthur Brauner, who returned to Germany after the war, she calls herself a "Jewish Berliner with German citizenship." Brauner calls Germany home: "We stay because we are at home here and feel at home here."
The optimism of those who are working to enhance Jewish life in today's Germany is reported through the many people with whom Jeffrey Peck spoke in the preparation of this book. The Lauder Foundation, now housed in its own Lehrhaus (learning center) -- named after an institution founded by Franz Rosenzweig in the 1920s -- is headed by a young rabbi, Josh Spinner, who was born in Baltimore and grew up in Hamilton, Ontario. Peck reports that:
He was lively, engaged, energetic, and clearly committed to the future of Jewish life in Germany. He felt that God had brought the Jews to Germany, and it was his job to do what he could to make a Jewish life possible, no matter where it might be.
Another rabbi active in Berlin is Yehuda Teichtel, a native of Brooklyn who heads Chabad Lubavitch. Teichtel's plans for an enlarged Chabad Center in a new building testify to the future he sees for a flourishing Jewish life in Germany. He believes that the greatest service to the six million killed is the establishment of Jewish life on German soil to prove Hitler wrong.
Interviews with prominent leaders of the Jewish community in Germany and with American directors of important Jewish institutions now located in Berlin convince Peck that:
Jewish life, even with its many problems . . . offers optimism and potentially a vehicle for improving transatlantic relations. The sheer presence of a Jewish population with a variety of religious orientations . . . paints a picture of a thriving and vibrant community.
Transatlantic relations might be improved if more Americans, especially Jewish Americans, knew more about the complexities of Jewish life in Germany and the role of German politics and culture in European-wide relations. During the Organization for Security and Cooperation in Europe (OSCE) 2004 conference in Berlin on anti-Semitism, the fact that Berlin was the capital of Nazi Germany and now the location of such a meeting was cited by almost all of the prominent participants. German Foreign Minister Joschka Fischer welcomed the guests with these words:
The German government has invited you all to this conference in Berlin -- in our capital, in the city where, almost seventy years ago, not far from here, the destruction of European Jewry was decided, planned and instituted. We, as hosts, want to acknowledge the historical and moral responsibility of Germany for the Shoah. The memory of this monstrous crime against humanity will also influence German politics in the future.
Dr. Peck laments that:
Unfortunately, Germany is still often identified in the U.S. exclusively with past Nazi horrors rather than with its postwar democratic and liberal successes. The site of the OSCE Conference was to demonstrate dramatically Germany's commitment to combating anti-Semitism.
In his description of the rebirth of the German Jewish community, Jeffrey Peck argues that there is, indeed, a vibrant and significant future for Jews in Germany. He also speculates that contemporary European Jewry can transform Judaism to be more inclusive, which he feels would be an important step forward. It is essential, Peck declares, that Americans in general and American Jews in particular see today's Germans and contemporary Germany as they really are, and not as reflected in memories of the Nazi era. It is also essential, he believes, that Israelis abandon the idea that Jews living outside of its borders are, somehow, in "exile," and that genuine Jewish life cannot exist in the larger world and, in particular, in a country as burdened with history as Germany.
Sharing the idea that a thriving and growing Jewish community in Germany is, indeed, a final defeat for Hitler, Jeffrey Peck is optimistic about the future. His book deserves widespread attention. *
"Many a time I have wanted to stop talking and find out what I really believed." --Walter Lippmann
We would like to thank the following people for their generous support of this journal (from 5/11/07 to 6/9/07): Nancy M. Bannick, Reid S. Barker, James M. Broz, David G. Budinger, D. J. Cahill, Robert T. Cristensen, Michael D. Detmer, Nicholas Falco, John B. Gardner, Robert W. Garhwait, Joyce Griffin, Daniel J. Haley, Weston N. Hammel, Mrs. Thomas E. Heatley, Mr. & Mrs. Ken E. Kampfe, Gloria Knoblauch, Daniel Maher, Paul Maxwell, Thomas J. McGreevy, Delbert H. Meyer, Clark Palmer, David Renkert, Patrick L. Risch, Heidi Schumache, Frank T. Street, Thomas H. Webster, Gaylord T. Willett.
The surge in violent crime that began in 2005 accelerated in the first half of 2006, the FBI reported in December, providing the clearest signal yet that the historic drop in the U.S. crime rate had ended and was being reversed.
Reports of homicides, assaults, and other violent offenses surged by nearly 4 percent in the first six months of 2006 compared with the same period in 2005, according to the FBI's Uniform Crime Report. The numbers included an increase of nearly 10 percent for robberies, which many criminologists consider a leading indicator of coming trends. The results follow a 2.5 percent jump in violent crime for 2005, which at the time represented the largest increase in 15 years.
Homicides increased 20 percent or more in cities including Boston, Cincinnati, Cleveland, Hartford, Memphis, and Orlando. Robberies went up more than 30 percent in places including Detroit, Fort Wayne, and Milwaukee. Aggravated assaults with guns were up more than 30 percent in cities like Boston, Sacramento, St. Louis, and Rochester. In fact, 71 percent of the cities surveyed had an increase in homicides, 80 percent had an increase in robberies, and 67 percent reported an increase in aggravated assaults with guns.
According to criminal justice experts, many communities, particularly those in urbanized areas, may be headed into a period of sustained crime increases. David A. Harris, a law professor who studies crime trends at the University of Toledo, says:
This confirms what law enforcement has been seeing and saying on a more anecdotal level: that crime is on the way up. While it is still too early to be sure, you've certainly got things pointing in one direction.
Police officials say that arguments that 20 years ago would have led to fistfights now lead to guns. "There's really no rhyme or reason with these homicides," said Edward Davis, the police commissioner in Boston.
An incident will occur involving disrespect, a fight over a girl. Then there's a retaliation aspect where if someone shoots someone else, their friends will come back and shoot at the people who did it.
Chris Magnus, the police chief in Richmond, California, north of San Francisco, said he would often go to the scene of a crime and discover that 30 to 75 rounds had been fired. "It speaks to the level of anger, the indiscriminate nature of the violence," he said.
I go to meetings, and start talking to some of the people in the neighborhoods about who's been a victim of violence, and people can start reciting: "One of my sons was killed, one of my nephews. . . ." It's hard to find people who haven't been touched by this kind of violence.
These numbers come amid heightened criticism of the federal government from many police chiefs and state law enforcement officials who complain that the federal government has retreated from fighting traditional crime in favor of combating terrorism and protecting homeland security. Justice Department officials dispute those contentions and point to an ongoing study designed to identify solutions to the rise in violent crime.
The Justice Department inspector general's office has reported sharp declines in the number of FBI agents and investigators dedicated to traditional crime since the Sept. 11, 2001, terrorist attacks. In addition, the International Association of Chiefs of Police says that law enforcement programs at the Justice Department have been cut by more than $2 billion since 2002 and that overall funding for such programs has been reduced to levels of a decade ago.
James Alan Fox, a criminologist at Northeastern University in Boston who has been critical of the Bush administration's crime-fighting strategies, said the overall rise in violent crime should be expected given dramatic cuts in assistance to local police and simultaneous increases in the population of males in their teens and 20s. He said:
We have many high-crime areas where gangs have made a comeback, where police resources are down, and where whatever resources there have been have shifted to anti-terrorism activity.
Justice Department officials have repeatedly rejected such criticism, arguing that the causes and trajectory of the crime increase are still unclear. Still, Attorney General Alberto Gonzales has launched a series of anti-drug and anti-gang initiatives at Justice, and he acknowledged at a crime conference in Boston in December that local police are struggling with "increased responsibilities" since Sept. 11, 2001.
One factor in increasing crime rates may well be the decline of traditional family life, particularly in inner city urban neighborhoods. Married couples with children now occupy fewer than one in every four households--a share that has been cut in half since 1960 and is the lowest ever recorded by the census. As marriage with children becomes an exception rather than the norm, social scientists say it is also becoming the self-selected province of the college-educated and the affluent. Isabel V. Sawhill, an expert on marriage and a senior fellow at the Brookings Institution said:
The culture is shifting, and marriage has almost become a luxury item, one that only the well-educated and well-paid are interested in.
Out-of-wedlock births exceeded 1.5 million in 2005 for the first time ever, representing 36.8 percent of all births in the U.S. Among non-Hispanic blacks, the out-of-wedlock birth rate reached a staggering 69.5 percent. For non-Hispanic whites, it reached a new milestone, exceeding 25 percent. The illegitimacy rate for Hispanics reached 47.9 percent. The rise in unwed births is "disastrous, about as big a leap as we've ever had," said Robert Rector, welfare analyst at the Heritage Foundation. He noted that unwed birth figures leveled off and seemed to stabilize for a time after Congress passed welfare reform in 1996. However, recent increases in these numbers "clearly show that the impact of welfare reform is now virtually zero, and we are going back to the way things were before welfare reform."
The percentage of children in households with two parents continues to fall. The National Marriage Project's "State of Our Unions 2006" shows that the share of children living in two-parent households has dropped to 67 percent. In minority communities, the majority of children live in one-parent households. There is, it seems clear, a correlation between the decline in family life and the rise in crime. By 2004, federal data showed that black Americans--13 percent of the population--accounted for 37 percent of the violent crimes, 54 percent of arrests for robbery, and 51 percent of murders. Most of the victims of these violent criminals were their fellow black Americans.
In an article in Philadelphia Magazine (November 2006), reporter Gregory Gilderman rode along with police officer Dennis Stephens in crime-ridden North Philadelphia's 22nd district. Gilderman writes:
Everything you need to know about Philadelphia's current murder wave--the out-of-control nature of it, the futility of our response to it--may be encapsulated in this fact: within 24 hours of Mayor Street's emergency televised address last July about the city's surging homicide rate, in which he urged the city's youth to put down their guns, five murders were committed. . . . It's been that kind of year in Philadelphia.
One thing that makes fighting crime difficult in North Philadelphia, as well as in the nation's other inner cities, is the hostility felt toward the police and the lack of citizen cooperation. According to Gilderman:
This is one of the more demoralizing aspects of policing this city: the culture of the street hates the cops. Never mind that most of the officers are African-American, and that more than a few of them grew up in this neighborhood. Perfectly law-abiding teenagers wear "STOP SNITCHIN" t-shirts, cops are taunted for being sellouts or "trying to be white," and witnesses and victims won't talk at crime scenes, let alone show up at court. Because of this, a vast swath of the criminal element--muggers, rapists, even murderers--see charges dropped or reduced to the one crime for which a police officer's testimony alone just might provide leverage for plea-bargained prison time: possession of a firearm. This is especially frustrating for veteran officers. A dangerous police district is like a small town: Very few new faces show up, and the same career criminals are arrested over and over. They are returned to the street over and over.
Professor Pamela Smock of the University of Michigan, co-author of a recent review of marriage patterns, points out that class is a better tool than race for predicting whether Americans marry. "The poor aren't entering into marriage very much at all," said Smock. She reports that young people from these backgrounds often do not think they can afford marriage. Arguments that marriage can mean stability do not seem to change their attitudes, she said, noting that many of them have parents with troubled marriages.
To reverse the latest trends in crime we must consider not only the role of law enforcement agencies but also the culture out of which such crime emerges. One key element, particularly in minority communities, is the breakdown of the family and the increasing out-of-wedlock birth rate. Unless current trends are reversed, our increase in crime is likely to continue.
Bowing to a national outcry and internal protest, CBS Radio decided to end Don Imus' morning program after he called the Rutgers University women's basketball team "nappy-headed hos."
The Imus case reflects the coarsening of American culture that is all around us. It reflects as well an extraordinary level of hypocrisy, both on the part of CBS and such black spokesmen as Al Sharpton and Jesse Jackson.
Columnist Jay Ambrose provided this assessment:
The president of CBS has explained that he fired Don Imus because of how "deeply upset and repulsed" the network has been by the vile, on-air statements of the radio host about a group of fine young women, and you will excuse me while I roll on the floor laughing at the hypocrisy. . . . Don Imus has been saying mean, bigoted things about women, gays, Jews, blacks and others for years now, and CBS didn't give a hoot. Neither did advertisers or guests. The advertisers were getting a good return on their investment. CBS was pulling in an estimated $20 million a year in revenue and the guests were having their names, causes, and agendas promoted in front of 10 million listeners. Imus' latest words . . . are not newly offensive. It is not as if he had previously been Mr. Decorum. . . . He has long been a tasteless shock jock engaged in mass nastiness for the sake of attracting mass audiences, and he has been abetted by advertisers, networks and dozens of radio stations for the very simple reason that it meant money in the bank.
What changed, argues Ambrose, is not that the Imus content:
. . . led a conscience-stricken CBS to contemplate "the effect language like this has on our young people," in words of the CBS president, but that the comment began to get lots of attention, public anger began to grow, lots of people began to complain--and sponsors saw that their Imus ads could do them more harm than good. A show without advertisers is not what keeps network executives employed or gets them bonuses.
Washington Post columnist David Broder noted that:
CBS Radio and MSNBC fired the millionaire talk-show host only after criticism of his foul-mouthed assault on the Rutgers women's basketball team mounted and advertisers canceled their contracts. It showed no courage on the part of those organizations, which have put up with similar slurs for years and counted themselves lucky to have such a moneymaking act in their stable.
The hypocrisy, however, does not end with CBS and MSNBC. Among the loudest critics of Don Imus were Jesse Jackson and Al Sharpton. It was Al Sharpton's radio program that Imus chose as the place to extend his apologies for his remarks. Yet, Sharpton, Jackson, and too many other black leaders seem offended by racist and sexist remarks only when they come from white spokesmen. When they emanate from blacks, silence has been their response.
Rap and hip-hop music, promoted by black radio stations around the country, repeatedly describes black women in a manner as offensive as, if not more offensive than, that used by Don Imus. Women are degraded as "whores" and "bitches." Violence, murder, and self-hatred are marketed as true blackness--authentic black identity.
It is rare indeed to hear criticism of this music. Comedian Bill Cosby, speaking to a Milwaukee audience about rap music, asked how many of the women in the audience considered themselves "bitches and hos." When no one raised a hand, Cosby asked, "If you're not a bitch or a ho, why do you dance to that music?" Cosby said that the result is rap filling radio and television with distorted images of black people that have nothing to do with a history of self-determination and pride:
In order for hip-hop, with all that misogyny and gangster violence and "don't study" to exist, you've got to know nothing about history, struggle, what it takes to get ahead.
In his book Enough, the respected black journalist Juan Williams writes that:
The consequence of black leaders failing to speak out against the corruption of rap . . . resulted in real damage to the most vulnerable of black America: poor children, boys and girls, often from broken homes. As a group, they were desperately searching for black pride in the sea of images being thrown at them on TV, on the radio, on the Internet, and in advertising. What those children found was a larger-than-life rapper who was materialistic, sexist, and violent, and used the word nigger as a casual description of all black people. It was a musical minstrel show that would have been a familiar delight to 19th century slave owners. In fact, there are similarities between the economics of slavery and the modern rap industry. Cheap labor, slaves, made it possible for the Southern plantation to make money. All that was required was silent assent to a hellish compromise with the obvious immorality of slavery by the politicians, the religious leaders, the bankers, and the newspaper editors. Cosby is particularly critical of The New York Times for a "liberal, patronizing attitude" toward black culture in which they promote hip-hop to show "they are so cool" but fail to write about its negative impact on the black community.
Occasionally, black critics of rap and hip-hop have emerged. One of the most outspoken was the late C. Delores Tucker, who crusaded for a decade against "gangster rap" pollution, including buying stock in major record companies in order to protest at stockholders meetings. Students at Spelman College, an historically black women's liberal arts college, forced the rapper Nelly to cancel a charity fund-raising visit to the school a few years ago in protest over one of his sexist music videos. Dr. William Banfield, head of the American Cultural Studies program at the University of St. Thomas, said of rappers:
They are the biggest sellouts of all time because they allow the white media structure to lessen the potential of a balanced picture of black people in contemporary American cultural projection.
Harry Belafonte, the singer, said much the same when he described rappers as "caught in a trick bag because it's a way to make unconscionable sums of money and a way to absent yourself from any sense of moral responsibility."
These, sadly, are isolated voices. The black churches have been largely silent, as have the major civil rights organizations, and the Congressional Black Caucus. Critics such as Al Sharpton and Jesse Jackson, eager to pounce on Don Imus, who, needless to say, has earned the criticism he has received, are indifferent to the racism and sexism of black rappers and show business personalities.
Journalist Michelle Malkin posted on her web site several videos from artists currently on the Billboard Hot Rap Tracks chart, including Mims, R. Kelly, and Bow Wow. Language resembling that used by Don Imus figured in each clip, prompting Malkin to ask whether Imus critics such as Al Sharpton and Jesse Jackson are "truly committed to cleaning up cultural pollution that demeans women and perpetuates racial epithets."
For Cynthia Neal Spence, an associate professor of sociology at Spelman College, the Imus controversy has been a teachable moment. "My students have been so hurt by all of this--and particularly their own role in it," she says.
Professor Spence was hardly shocked to learn that all of her female students recall having been called by the same noun that Mr. Imus used. What surprised her is that many of them acknowledged having themselves used the word. She said:
There's been a desensitization process that's had a profound effect on our choices of language, especially for our young people, who are so influenced by media culture. . . . These young people are growing up in a generation where everything goes.
The "shock jocks" and the rappers and hip-hop artists are hardly alone. Hollywood, television networks, and other media outlets have long provided Americans with a steady diet of sex and violence. Our culture has become increasingly coarsened, to the detriment of all of us. In this sense, Don Imus has ignited a useful national debate. It is time for the leaders of the black community to enter that debate and use the same standard in assessing the lyrics of rap and hip-hop songs as they do the words of white "shock jocks." And it is time for all of us to speak out against the coarse language, racism, sexism, promotion of promiscuity and drug use that, all too often, is to be found in the media. We must ask ourselves what kind of society we want to live in. We have received many wake-up calls but, thus far, have failed to take heed. *
"There is but one straight course, and that is to seek truth and pursue it steadily" --George Washington
The Republican defeat in the November election, and the decision of voters to give Democrats a majority in both the House and Senate, has been described by some as a defeat for conservatism. Nothing could be further from the truth.
There is, in fact, nothing conservative about the policies of the Bush administration and the Republican Congress that was rejected by the voters. In his book Buck Wild: How Republicans Broke the Bank and Became the Party of Big Government, Stephen Slivinski of the Cato Institute shows how earmarks--or pork-barrel projects--multiply for each home district or state. In the last Democratic Congress, earmarks numbered 1,549. The Republican Party in its first year got the number down to 958. But in 2005 and again in 2006 the yearly total zoomed to over 15,000, or an annual average of some 30 earmarks per member of Congress.
Such earmarks are also a gateway to corruption. Mr. Slivinski notes that indicted Republican lobbyist Jack Abramoff once called the Appropriations Committee, birthplace of most earmarks, a "favor factory." Rep. Jeff Flake (R-AZ) refers to earmarks as "the currency of corruption." Former Rep. Randy Cunningham, a California Republican who confessed to taking bribes in exchange for promises of earmarks, received a jail sentence of eight years and four months.
At the National Review summit of conservatives, held in Washington in January, former Florida Governor Jeb Bush told his audience that Republicans lost the 2006 elections because they abandoned their principles of limited government and fiscal responsibility. David Boaz, executive vice president of the Cato Institute, points out that:
The Republican Congress came to power in 1994 promising "the end of government that is too big, too intrusive, and too easy with the public's money." But for the past six years, with Republicans controlling both the White House and Congress, they have instead delivered the biggest spending increases and the biggest expansion of entitlements since Lyndon Johnson, the federalization of education, the McCain-Feingold restrictions of political speech, and the Sarbanes-Oxley regulatory burden. When you combine that with a misguided war and a series of scandals that remind voters why no party should stay in power too long, is it any wonder that conservatives were dispirited in the 2006 elections?
David Keene, chairman of the American Conservative Union, notes that:
The historic core of the (conservative) movement has revolved around the relationship of the citizen to the state with conservatives of most, if not all, stripes arguing that a small government that is minimally involved in running the economy and the way people live their lives is superior to a larger government that wants to do more and more "for the people." In power, however, conservative politicians have tried to retain the rhetoric of small government while governing in a way barely distinguishable from their Democratic opponents.
Discussing the decline of conservatism and conservative ideas, Paul M. Weyrich, chairman of the Free Congress Research and Education Foundation, and William S. Lind, director of its Center for Cultural Conservatism, write in The American Conservative:
Conservatism has become so weak in ideas that during the presidency of George W. Bush, the word "conservative" could be and was applied with scant objection to policies that were starkly anti-conservative. Americans witnessed "conservative" Wilsonianism, if not Jacobinism, in foreign policy and an unnecessary foreign war; record "conservative" de-industrialization and dispossession of the middle class in the name of Ricardian free trade and Benthamite utilitarianism. No wonder the American people are confused and disillusioned by conservatism if these are its actions when in power. . . . If conservatism is to be re-established as an intellectual force, and not merely a label for whatever the establishment does to its own benefit, it must first reawaken intellectually.
A supreme irony of today's Big Spending-Big Government Republicans, argues William H. Peterson, an adjunct scholar at the Heritage Foundation and the Ludwig von Mises Institute:
. . . is their run-in with the anti-Big Government thinking of the American voters themselves. For according to polls such as ABC News/Washington Post and CBS/New York Times, American voters prefer a smaller state.
Indeed, poll numbers for the last 28 years on Americans opting for smaller government trend upward--from 44 percent for smaller government against 41 percent for larger government in 1978 to 64 percent for smaller government against only 22 percent for larger government in 2004.
In Stephen Slivinski's view:
It seems there is a large constituency that would respond favorably to a political party that can enunciate a clear program to make the federal government smaller, less powerful and less intrusive. It's those sorts of voters--Republicans, Democrats and independents alike--who catapulted Reagan to the White House. Those voters are still up for grabs. The Republican Party cannot take them for granted anymore.
Before the 2006 election, conservative commentators Kate O'Beirne and Rich Lowry, writing in National Review, had one word for the Republican Congress' approach to spending and the big deficits it produced: "Incontinence." They argued that the relevant question for conservatives was not "Can this Congress be saved?" but "Is it worth saving?"
Not long before his death, Nobel Prize-winning economist Milton Friedman said that the previous four years of Republican spending increases were "disgraceful" and a betrayal of the party's principles. "I'm disgusted by it," he declared:
For the first time in many years, the Republicans have control of Congress. But once in power, the spending limits were off and it's disgraceful because they went against their principles.
Federal spending as a share of the entire economy was 18.4 percent when Mr. Bush took office in 2001. Since then, the government's annual spending levels have grown by $610 billion or to 20.2 percent of the economy, according to figures compiled by the Heritage Foundation. "This is not a happy time for fiscal conservatives. We have had way too much spending," said John F. Cogan, an economist at the Hoover Institution who has been a frequent adviser to the Bush White House.
The critiques of the Bush administration and the Republican Congress have been increasingly harsh--and perhaps the harshest of these have come from conservatives. Columnist George Will, discussing the administration's Iraq policy, wrote: "This administration cannot be trusted to govern if it cannot be counted on to think and, having thought, to have second thoughts." Robert Kagan, a neoconservative supporter of the Iraq war, wrote:
All but the most blindly devoted Bush supporters can see that Bush administration officials have no clue about what to do in Iraq tomorrow, much less a month from now.
In a book published in 2004, former Bush Treasury secretary Paul H. O'Neill described Bush as "a blind man in a room full of deaf people" and said that policymakers put politics before sound policy judgments. O'Neill said that "the biggest difference" between his time in government in the 1970s and in the Bush administration:
. . . is that our group was mostly about evidence and analysis, and Karl (Rove), Dick (Cheney), (Bush communications strategist) Karen Hughes and the gang seemed to be mostly about politics.
The growth of government power, the diminution of individual freedom, and a soaring deficit are not the policies one would expect from a self-proclaimed conservative president and Congress. Still, perhaps we should not be too surprised.
The Founding Fathers understood very well that freedom was not man's natural state. Their entire political philosophy was based on a fear of government power and the need to limit and control that power very strictly. It was their fear of total government that initially caused them to rebel against the arbitrary rule of King George III. In the Constitution they tried their best to construct a form of government that through a series of checks and balances and a clear division of powers, would protect the individual. They believed that government was a necessary evil, not a positive good.
Yet, the Founding Fathers would not be surprised to see the many limitations upon individual freedom that have come into existence. In a letter to Edward Carrington, Thomas Jefferson wrote: "The natural progress of things is for liberty to yield and government to gain ground." He noted that:
One of the most profound preferences in human nature is for satisfying one's needs and desires with the least possible exertion; for appropriating wealth produced by the labor of others, rather than producing it by one's own labor . . . the stronger and more centralized the government the safer would be the guarantee of such monopolies; in other words, the stronger the government, the weaker the producer, the less consideration need be given him and the more might be taken away from him.
The written and spoken words of the men who led the Revolution give us numerous examples of their fear and suspicion of power and the men who held it. Samuel Adams asserted that:
There is a degree of watchfulness over all men possessed of power or influence upon which the liberties of mankind much depend. It is necessary to guard against the infirmities of the best as well as the wickedness of the worst of men.
Therefore, "Jealousy is the best security of public liberty."
Conservatives, if they are sincere in their advocacy of limited government and fiscal responsibility, must be as vigilant when Republicans are in power as when Democrats are in control. There is a tendency for the party in power--whichever party it may be--to expand that power and build upon it. We have seen this tendency in full bloom with the Bush administration. Finally, perhaps too late, conservatives have now come to understand that reality.
In October, 2006, according to the U.S. Census Bureau, the population of the United States reached 300 million, behind only that of China and India.
One key ingredient in this population growth has been immigration. Over the past four decades, immigrants, primarily from Mexico and Latin America, have reshaped the country's ethnic makeup. Of the newest 100 million Americans, according to the Pew Hispanic Center, 53 percent are either immigrants or their descendants.
Nearly half of the nation's children under 5 belong to a racial or ethnic minority. The face of the future is clear in our schools. Writing in the Smithsonian Magazine, Joel Garreau notes that:
Our kindergartens now prefigure the country as a whole, circa 2050-a place where non-Hispanic whites are a slight majority. . . . The numerical study of who we are and how we got that way does have a refreshing habit of focusing our attention on what's important, long-term, about our culture and values-where we're headed and what makes us tick.
Many believe that the changing racial and ethnic makeup of the nation signals a fundamental change for our society. Political historian Michael Barone disagrees. In his important book, The New Americans, Barone, a senior writer at U.S. News and World Report, reminds us that the U.S. has never been a homogeneous, monoethnic nation:
The American colonies, as historian David Hackett Fischer teaches in "Albion's Seed," were settled by distinctive groups from different parts of the British Isles, with distinctive folkways, distinctive behaviors in everything from politics to sexual behavior. And this is not to mention the German immigrants who formed 40 percent of Pennsylvania's population in the Revolutionary years and who, Benjamin Franklin feared, would never be assimilated. Many different religious groups-Catholics and Mennonites, Shakers and Jews-established communities and congregations, making the thirteen colonies and the new nation more religiously diverse than any place in Europe. We were already, in John F. Kennedy's phrase, a nation of immigrants.
Barone shows how the new Americans of today can be interwoven into the fabric of American life just as immigrants have been interwoven throughout history. He believes, however, that it is essential we heed the lessons of America's past, and avoid misguided policies and programs-such as bilingual education-that hinder rather than help assimilation. The Melting Pot, he believes, can work as well today as it has in the past.
"The minority groups of 2000," writes Barone:
. . . resemble in important ways immigrant groups of 1900. . . . America, in the future, will be multiracial and multiethnic, but it will not-or should not-be multicultural in the sense of containing ethnic communities marked off from and adversarial to the larger society, any more than today's America consists of unassimilated and adversarial communities of Irish, Italians, or Jews. . . . We are not in a wholly new place in American history. We've been here before.
While the American society of a century ago sought to assimilate immigrants and make sure that they were taught the English language and the history, culture, and values of their new country, many in today's society, particularly among the nation's elites, have abandoned that goal.
In Barone's view,
In the last third of the twentieth century . . . elite Americans have not been preoccupied with immigration and have tended to regard "Americanization" as an uncouth expression of nationalistic pride or a form of bigotry. . . . Elites came to see Americanization as the unfair subjection of members of other races and cultures. They came to celebrate . . . an America that would be made up of separate and disparate "multicultural" groups, fenced off in their own communities, entitled to make demands on the larger society, but without any responsibility to assimilate to American mores.
Programs that have been adopted in recent years, Barone argues, have hindered the integration of newer immigrants into the American society:
By stepping back from the prevalent view of the immigrant and minority groups, we see how misguided some of our policies and programs are. It is absurd, for instance, to grant immigrants quotas and preferences that are based on past discrimination because, as John Miller points out, "foreign-born newcomers almost by definition cannot have experienced a past history of discrimination in the United States." Even more absurd and counterproductive have been the so-called bilingual education programs, which have kept Latino immigrants' children in Spanish-language instruction and denied them knowledge of English that they need to advance in American society. What these immigrants need is what Americanization supplied the immigrants a hundred years ago-a knowledge of English and basic reading and mathematics skills, an appreciation of the American civic culture, a fair chance of moving ahead as far as their abilities will take them. We need to learn the good lessons our forebears taught, even as we strive to avoid their mistakes.
Most immigrants, Barone shows, are hard-working and are committed to making better lives for themselves in the American society. They are not the problem. He believes that:
The greatest obstacle to the interweaving of blacks, Latinos, and Asians into the fabric of American life is not so much the immigrants themselves or the great masses of the American people; it is the American elite. The American elite of a century ago may have looked on immigrants with distaste. . . . But it also championed the cause of Americanization and promoted assimilation of immigrants into the mainstream. . . . What is important now is to discard the notion that we are at a totally new place in American history, that we are about to change from a white-bread nation to a collection of peoples of color. On the contrary, the new Americans of today, like the new Americans of the past, can be interwoven into the fabric of American life. In many ways, that is already happening, and rapidly. It can happen even more rapidly if all of us realize that interweaving is part of the basic character of the country and that the descendants of the new Americans of today can be as much an integral part of their country, and as capable of working their way into its highest levels, as the descendants of the new Americans of a hundred years ago.
Clearly, the time has come to fire up the melting pot. Former Colorado Governor Richard Lamm makes this point:
The U.S. is at a crossroads. If it does not consciously move toward greater integration, it will inevitably drift toward more fragmentation. Cultural divisiveness is not a bedrock upon which a nation can be built. It is inherently unstable. . . . America can accept additional immigrants, but must be sure that they become Americans. We can be Joseph's coat of many nations, but we must be united. One of the common glues that hold us together is language-the English language. We should be color-blind but linguistically cohesive. We should be a rainbow but not a cacophony. We should welcome different peoples but not adopt different languages. We can teach English through bilingual education, but we should take great care not to become a bilingual society.
Professor Seymour Martin Lipset points out that:
The histories of bilingual and bicultural societies that do not assimilate are histories of turmoil, tension and tragedy. Canada, Belgium, Malaysia, Lebanon-all face crises of national existence in which minorities press for autonomy, if not independence. Pakistan and Cyprus have divided. Nigeria suppressed an ethnic rebellion. France faces difficulties with its Basques, Bretons and Corsicans.
Remembering the way American public schools once served to bring children of immigrants into the mainstream, Fotine Z. Nicholas, who taught for 30 years in the New York City schools and for many years wrote an education column for a Greek-American weekly, notes:
I recall with nostalgia the way things used to be. At P.S. 82 in Manhattan, 90 percent of the students had European-born parents. Our teachers were mostly of Irish origin, and they tried hard to homogenize us. We might refer to ourselves as Czech or Hungarian or Greek but we developed a sense of pride in being American. . . . There were two unifying factors: the attitude of our teachers and the English language. . . . After we started school, we spoke only English to our siblings, our classmates and our friends. We studied and wrote in English, we played in English, we thought in English.
Discussing recent bilingual education programs, Mrs. Nicholas declares that:
It was a simple concept at first: Why not teach children English by means of the home language? A decade later, "disadvantaged" children were still being taught in their parents' language. As federal money poured into the program, it gradually became self-perpetuating. . . . Bilingual education seems to be developing into a permanent means of ethnic compartmentalization. Cultural pluralism may be the norm for a multi-ethnic nation, but it is the family's role to build a cultural identity in children. The school's role is to help them enter the mainstream of school life, and eventually, the mainstream of the United States of America.
America has been a nation much beloved. Germans have loved Germany. Frenchmen have loved France. Swedes have loved Sweden. This, of course, is only natural. Yet, America is not simply another country. To think so is to miss the point of our history. America has been believed in not only by native-born Americans, but by men and women throughout the world who have yearned for freedom.
America dreamed a bigger dream than any nation in the history of man. It was a dream of a free society in which a man's race, or religion or ethnic origin would be completely beside the point. It was a dream of common nationality in which the only price to be paid was a commitment to fulfill the responsibilities of citizenship.
In the 1840s, Herman Melville wrote that "We are the heirs of all time and with all nations we divide our inheritance." If you kill an American, he said, you shed the blood of the entire world.
At a celebration in New York several years ago of the 150th anniversary of Norwegian immigration, news commentator Eric Sevareid, whose grandfather emigrated from Norway, addressed the group-in the form of a letter to his grandfather. He said:
You knew that freedom and equality are not found but created. . . . This grandson believes this is what you did. I have seen much of the world. Were I now asked to name some region on earth where men and women lived in a surer climate of freedom and equality than that Northwest region where you settled--were I so asked I could not answer. I know of none.
In 1866, Lord Acton, the British Liberal leader, said that America was becoming the "distant magnet." Apart from the "millions who have crossed the ocean, who shall reckon the millions whose hearts and hopes are in the United States, to whom the rising sun is in the West?"
Our new immigrants must be taught our history and must understand that what drew them to America will be lost if it is replaced by an ethnic and racial Balkanization, which some appear to seek. The melting pot worked well in the past. It will work well in the future if we will permit it to do so. *
"Our major obligation is not to mistake slogans for solution." -Edward R. Murrow
The death of Milton Friedman in November has been mourned around the world by all who value freedom. Few men and women in our time have had as much influence in advancing free societies as did the Nobel Laureate economist.
Dr. Friedman insisted that largely unimpeded private competition produced better results than government systems. "Try talking French with someone who studied it in public school," he once said, "then with a Berlitz graduate." In the area of education, he was an early advocate of vouchers which would provide freedom of choice to the poor, a freedom already possessed by the affluent.
One of his most famous arguments dealt with the causes of the Depression. It was prompted not by changes in tariff laws or by the stock market crash, he said, but by the Federal Reserve Board's decisions to shrink the money supply for fear of inflation in 1929 and again in 1936. Those choices choked the life out of the economy and exacerbated a bad situation, he stated in A Monetary History of the United States, 1867-1960 (1963), co-written with Anna J. Schwartz. The book is considered the definitive history of the nation's money supply.
It was Milton Friedman's belief that free enterprise was the only form of economic organization consistent with other freedoms. In his important book, Capitalism and Freedom, he points out that:
The kind of economic organization that provides economic freedom directly, namely, competitive capitalism, also promotes political freedom because it separates economic power from political power and in this way enables one to offset the other.
He declares that:
Political freedom means the absence of coercion of a man by his fellow men. The fundamental threat to freedom is power to coerce, be it in the hands of a monarch, a dictator, an oligarchy, or a momentary majority. The preservation of freedom requires the elimination of such concentration of power to the fullest possible extent and the dispersal and distribution of whatever power cannot be eliminated-a system of checks and balances. By removing the organization of economic activity from the control of political authority, the market eliminates this source of coercive power. It enables economic strength to be a check to political power rather than a reinforcement.
Even those who disagreed with some of Friedman's ideas had great respect for him, and for his influence on America and the world. Former Harvard President and Secretary of the Treasury Lawrence Summers recalls that:
From what I've heard, Milton Friedman's participation on a government commission on the volunteer military in the late 1960s was a kind of intellectual version of the play "Twelve Angry Men." Gradually, through force of persistent argument and marshaling of evidence, he brought his fellow commission members around to the previously unthinkable view that both our national security and our broader interest would be served by a volunteer military.
Beyond Milton Friedman the economist, writes Summers:
. . . there was Milton Friedman the public philosopher. Ask reformers in any one of the countries behind what we used to call the Iron Curtain where they learned to contemplate alternatives to Communism during the closed era before the Berlin Wall fell and they will often tell you about reading Milton Friedman and realizing how different their world would be. . . . Milton Friedman and I probably never voted the same way in any election. . . . I have my list of areas where I believe Mr. Friedman oversimplified or was simply wrong. Nonetheless, like many others, I feel that I have lost a hero-a man whose success demonstrates that great ideas convincingly advanced can change the lives of people around the world.
In an era when most of his fellow economists were advocating one or another form of government control or regulation of the economy, Friedman became an ardent crusader for capitalism and economic freedom. He did not advocate capitalism because of sympathy for the rich. Friedman himself was born in New York, the son of poor Jewish immigrants. His father died when Milton was 15, leaving his mother with very little money to pay for her son's education, which became a struggle. His advocacy of capitalism came because in a society based upon free enterprise all citizens-both the rich and the poor, and the majority, who were middle class-would prosper. More important, freedom could exist only when the state did not control the economic lives of its citizens.
Assessing Friedman's influence, The Economist declared that:
When Mr. Friedman was attacking the growth of the state and trumpeting freedom of choice 50 years ago, few listened; now many do. Ideas that once seemed daft-ending peacetime military conscription, deregulating industries from transport to banking, the negative income tax, school vouchers-have become either reality or part of mainstream political discourse. And his impact was probably greatest in places where non-economists might not spot it; largely thanks to him, governments no longer believe they can buy permanently lower unemployment at the price of a little more inflation. You could even be forgiven for thinking that the whole world had been remade in Mr. Friedman's image. Communism no longer rules half of Europe. Even in China and Vietnam capitalism has taken hold. Politicians of left and right speak of the power, and sometimes the virtues, of market forces. No wonder those forces are so often held to be untrammeled, unfettered, or merely triumphant from Seattle to Shanghai.
Still, there is much in current trends that disturbed Milton Friedman. The size of the state-particularly in our own country-has been growing, which can be seen in the ratio of government spending to GDP. Since 1989, the year Ronald Reagan, the president most in tune with Friedman's ideas, left office, and the Berlin Wall came down, the U.S. government has grown just as fast as its economy. The state's portion of GDP is forecast to be 36.6 percent in 2006, up from 36.1 percent seventeen years ago. The public sector has grown as well in Europe's three largest economies-Britain, France, and Germany. Governments, whether in the hands of Republicans or Democrats, liberals or conservatives, seem as convinced as ever that they know best how to spend their citizens' money.
"Judged by practice," wrote Friedman and his wife Rose in their memoirs published eight years ago, "we have been, despite some successes, mostly on the losing side. Judged by ideas we have been on the winning side."
This may have been too modest an appraisal. In Capitalism and Freedom, published in 1962, Friedman lists 14 activities then undertaken by the U.S. government, "that cannot . . . validly be justified" by the principles he sets forth. These include price supports for farming; tariffs and import quotas; rent control; minimum wages; "detailed regulation of industries," including banks; forcing retirees to buy annuities; military conscription in time of peace; national parks; and the ban on carrying mail for profit.
Although government still does a lot of this-and many, even those who call themselves conservative, disagree with some of Friedman's objections-it does much less than it did, and little goes unquestioned.
Because of his great skills as a communicator, Friedman's views reached far beyond his fellow economists. For eighteen years he wrote a column in Newsweek magazine. He and his wife Rose wrote a best-selling book Free to Choose, which led to a television series by the same name. During the past decade, the Friedmans dedicated themselves to advancing school choice. It was their view that failing schools produced failing students, depriving young people of the tools they would need to attain economic independence. Friedman first proposed school vouchers in 1955, but it wasn't until 1996 that he and Rose started their foundation to take advantage of the growing interest in school choice.
Giving economically deprived young people a choice of schools, Friedman believed, would offer them the best opportunity to escape the cycle of poverty. He pointed to a 1999 National Opinion Poll for the Joint Center of Political and Economic Studies in which 60 percent of minorities supported vouchers, as did 87 percent of black parents ages 26 to 35.
Columnist Cal Thomas reports that:
The Friedman Foundation's web site answers virtually every objection to school choice. First, it really is a choice. Universal vouchers would allow all parents to direct funds set aside by the government for education to the school they believe will best serve their child, whether the school is public or private, religious or secular. This separates the government operation of schools from the government financing of them. Only those who could demonstrate economic need would be eligible for the vouchers, except for parents whose children attend public schools identified as failing. In such circumstances all parents would be offered vouchers. . . . If school choice becomes the U.S. norm, it will be Milton Friedman's real legacy. Every poor child liberated from a failed government school will owe him a debt of gratitude.
Would school choice hurt public schools by depriving them of needed funds? No, says Friedman: "Public schools pay attention when school choice is on the table." He cites Florida as an example, noting that after a school choice program began:
. . . schools identified as failing are already publicizing their efforts to improve by hiring more teachers, increasing funds for after-school tutoring and lowering class sizes. One superintendent, Earl Lennard, even vowed to take a 5 percent pay cut if his county's schools received a failing grade.
In Friedman's view, competition works in free markets as well as in school choice. In Florida, Cleveland, and Milwaukee, public schools have received more state and federal aid since voucher programs were set up.
Ben Stein, the writer and son of economist Herbert Stein, a long-time friend of Milton Friedman, writes that:
He was a friend and mentor and inspiration all my life. . . . When I was a Columbia undergrad in the early 1960s, Friedman taught there for a year and was a good friend to me. He even used applied statistics to save me from romantic desperation when I was worried about replacing a girlfriend. If there were only one right woman for every right man, he advised, they would never find each other. . . . Friedman, as much as anyone, stood athwart history and cried "Stop" as it seemed headed towards collectivism-only he did it with a masterly, genius-level grasp of mathematics, history, and statistics. He proved, inasmuch as it can be proved, that free markets would not impoverish the poor but enrich them, would not ride roughshod over the downtrodden but would empower them. . . . When I learned he had died, I was despondent, but I also realized you cannot kill Friedman's exaltation of human liberty-not with a gun, not with a tank, not with terrorism, not even with heart disease. His ideas and faith in the human spirit are as implanted in civilization as those of any benevolent economist and social revolutionary since his idol Adam Smith, whom he so worthily followed.
Milton Friedman's was a life fully lived and his influence for good will continue into the future as men and women around the world come to understand the intrinsic link between freedom of speech, religious freedom, the freedom to govern oneself-and economic freedom which, as Friedman often pointed out, is simply democracy applied to the marketplace.
The groundbreaking ceremony for a memorial honoring the millions of people killed by Communist regimes was held in September near the U.S. Capitol. Elected officials and representatives of the Victims of Communism Memorial Foundation, founded in 1994, took shovels in hand to officially begin the project, a joint effort by dozens of organizations and individuals.
"This is a historic day," said Lee Edwards, chairman of the nonprofit foundation that spearheaded the project. "The memorial will serve to remind all of us that never again must nations and peoples permit so evil a tyranny to terrorize the world."
Paula J. Dobriansky, undersecretary of state for democracy and global affairs, said Communism "corroded the human experience in the 20th century."
Mrs. Dobriansky's father, Lev Dobriansky, a former ambassador to the Bahamas, was instrumental in the push for the memorial. She said the groundbreaking essentially signifies the end of the Cold War. "The memorial built here will stand, after we no longer do," she said.
It will educate future generations about the misery caused by Communism, the massive resistance efforts and the fortitude of those who were victimized by it and ultimately overcame it.
Rep. Dana Rohrabacher (R-CA), who sponsored the legislation that authorized the memorial, said: "Today we proclaim that Communism is indeed dead, but we will never forget those who Communism murdered during its brief life on this planet."
The memorial is a replica of the "Goddess of Democracy" statue created by student activists in China's Tiananmen Square that was demolished by Communist tanks during the historic uprising in 1989. Modeled after the Statue of Liberty, a 10-foot bronze copy of the statue will now be erected in downtown Washington as a permanent tribute to the estimated 100 million people killed by various Communist regimes.
"There is no memorial to all the victims of Communism," said Lee Edwards, an historian and Heritage Foundation fellow.
We want to focus attention on the crimes of Communism and therefore educate people about why we fought and won the Cold War. We are still in a confrontation with Communist China. That's the reason we think we need to be here.
The dedication of the memorial is scheduled for June to coincide with the 20th anniversary of President Reagan's famed "tear down this wall" speech at the Brandenburg Gate in Berlin.
For many years a large number of liberals, intellectuals, and journalists in the U.S. and other Western countries expressed sympathy for Communism and what they believed was a noble "experiment" taking place in the Soviet Union and other Marxist regimes. With the end of the Cold War, and Western access to Soviet archives, the truth about Communism has slowly come to be known to all.
In 1999, for example, The Black Book of Communism, an 846-page academic study that blames Communism for the deaths of between 85 million and 100 million people world-wide, became a best-seller. Billed as the first global balance sheet of Communism, the Black Book estimates that the ideology claimed 45 million to 72 million in China, 20 million in the Soviet Union, between 1.3 million and 2.3 million in Cambodia, 2 million in North Korea, 1.7 million in Africa, 1.5 million in Afghanistan, 1 million in Vietnam, 1 million in Eastern Europe, and 150,000 in Latin America.
Editorially, The Wall Street Journal notes that:
Through all those years, leftist intellectuals insisted on disassociating Communism from the crimes committed in its name. They did not want to sully Communism's utopian notion of egalitarianism, a concept quite different from the equal opportunity practiced by democratic nations. What the intellectuals failed to see was that egalitarianism was merely an advertising slogan for a political movement whose leaders would settle for nothing less than absolute power. Both Lenin and Mao regarded themselves as gods, entitled to hammer human nature into the mold of the New Socialist Man. But the subjects of their experiments saw themselves as individuals, and after a long and bloody history, Communist parties began to give up on this enterprise.
Historian Richard Pipes observes: "The language of Communism is better than Nazism, but the basic philosophy is the same."
For many years Communism in the U.S. and other Western countries was portrayed as simply a political philosophy and movement, unconnected with the Soviet Union. With the end of the Cold War, American researchers were able to carefully examine Soviet archives and found clear proof that the Communist Party of the U.S. did the bidding of Soviet spymasters before, during, and after World War II. Among their discoveries is a previously unknown network of American Communists, answering to Soviet officials, who were assigned to penetrate the Manhattan Project which built the atomic bomb. The researchers also found documents supporting Whittaker Chambers, the government's key witness against Alger Hiss, who was convicted of perjury in his denials of espionage activities.
When Hiss died in 1996, many prominent journalists and academics continued to view him in heroic terms. ABC anchorman Peter Jennings, for example, ended his effusive comments about Hiss by reporting that Boris Yeltsin had declared that nothing in KGB files branded Hiss a Soviet espionage agent. The Russian president, of course, never made such a statement. The man who did say that he found nothing about Hiss in his files was Gen. Dmitri Volkogonov, who later recanted and admitted that he had not inspected the files of Soviet military intelligence, the agency where Chambers said he and Hiss were employed. CNN also quoted Gen. Volkogonov as exonerating Hiss of any guilt at the conclusion of its report of Hiss' death.
President Clinton's national security adviser Anthony Lake said on "Meet the Press" that the evidence against Hiss was "not conclusive." On National Public Radio's "All Things Considered," listeners heard only that Hiss had been "accused" of spying for the Soviets and that a few years earlier his innocence had been vindicated by Gen. Volkogonov. An Associated Press story claimed that Volkogonov had described Hiss as "a victim of Cold War hysteria and the McCarthy Red-hunting era"-something Volkogonov never said.
In their book The Secret World of American Communism, published by the Yale University Press, Professor Harvey Klehr of Emory University and John Earl Haynes, a specialist with the Library of Congress, after reviewing thousands of files in Moscow, found that the documents confirmed much of Whittaker Chambers' account of Soviet espionage. Chambers, for example, had testified that J. Peters, a foreign Communist, had headed the Communist underground that Chambers joined in the early 1930s, and that Peters had persuaded Alger Hiss, then a State Department official, to cooperate with Soviet intelligence. Because of Peters' key role in Chambers' story, write the authors:
. . . revisionist historians and defenders of Hiss have often denied that Peters was involved in any part of the underground or even that there was such a thing as a Communist underground.
Victor Navasky of The Nation, for one, denigrated the notion that Peters worked for the underground.
Journalist Ralph De Toledano, who wrote an early book on the Hiss case, states that:
All efforts at exposing Stalin's great network in the U.S. found those who attempted to expose it being denounced or smeared as "fascists" even after Soviet Foreign Minister V. M. Molotov-during the period of the Hitler-Stalin pact when Communist unions were striking U.S. defense plants-said blandly that "fascism is a matter of taste." When smear did not work, ridicule was substituted. That the U.S. Communist party was financed by one of the most inhuman and murderous regimes in the world was stoutly denied by professors and pundits, some on the payroll.
When the Klehr-Haynes volume appeared several years ago, The Washington Times editorially stated that:
These revelations are significant for two main reasons. In the first place, they tend to undermine the credibility of those who have always scoffed at any allegations of Communist espionage or subversion. The truth is that, long before these documents came to light, many scholars, intelligence officials, and ex-Communists themselves knew the truth and tried to tell it to a world that refused to listen and often vilified them for their efforts. Now those men emerge in history as heroes. In the second place, these revelations ought to tell us something about the nature of the Soviet government, the Communism that animated it, and the stooges, innocent and not so innocent, who believed in them for so long. . . . Today the Soviet Union is defunct and most of its stooges dead (not a few by the Soviets' own hand), but in the light of the documents discovered by Mr. Klehr and Mr. Haynes, the history of this century will need to be rewritten, and many of its villains and heroes will have to change roles.
Those who have scoffed at any allegations of Communist espionage or subversion, those who defended Alger Hiss, the Rosenbergs, and others during all of these years, have much to answer for. American Communism, Klehr and Haynes make clear:
. . . was a conspiracy financed by a hostile foreign power that recruited members for clandestine work, developed an elaborate underground apparatus and used that apparatus to collaborate with espionage services of that power.
Roger Kimball of The New Criterion said that, "The short word for such activities is treason."
All too often during the Cold War, many American journalists missed the story of Communism's evil and brutality.
The forerunner of the American reporters who took Communists at their word may be Walter Duranty, who served as the correspondent for The New York Times in Moscow in the 1930s.
In the midst of the enforced famine in the Ukraine in the 1930s, Duranty visited the region and denied that starvation and death were rampant. In November, 1932, Duranty reported that "there is no famine or actual starvation nor is there likely to be." When the famine became widely known in the West, and was reported in his own paper and by his own colleagues, minimizing rather than outright denial became his method. Still denying famine, he spoke of "malnutrition," "food shortages," and "lower resistance."
In the Times of August 23, 1933, Duranty wrote: "Any report of a famine in Russia is today an exaggeration or malignant propaganda," and went on to declare:
The food shortage which has affected almost the whole population last year, and particularly the grain-producing provinces--that is, the Ukraine, the North Caucasus, the Lower Volga Region--has, however, caused heavy loss of life.
In his important book about Soviet collectivization and the terror-famine of the 1930s, The Harvest of Sorrow, Robert Conquest declares that Duranty's
. . . admission of two million extra deaths was made to appear regrettable, but not overwhelmingly important and not amounting to "famine." Moreover, he blamed it in part on the "flight of some peasants and the passive resistance of others." . . . Duranty blamed famine stories on emigres, encouraged by the rise of Hitler, and spoke of "the famine stories then current in Berlin, Riga, Vienna, and other places, where elements hostile to the Soviet Union were making an eleventh-hour attempt to avert American recognition by picturing the Soviet Union as a land of ruin and despair."
What Americans got was not the truth--but false reporting. Its influence was widespread. What Walter Duranty got was the highest honor in journalism--the Pulitzer Prize of 1932, complimenting him for "dispassionate, interpretive reporting of the news from Russia." The citation declared that Duranty's dispatches--which the world now knows to have been false--were "marked by scholarship, profundity, impartiality, sound judgment, and exceptional clarity."
Walter Duranty was only one of many correspondents and writers in the 1920s and 1930s who fed their readers in the West a steady diet of disinformation about the Soviet Union. Louis Fischer, who wrote for The Nation, was also reluctant to tell his readers about the flaws in Soviet society. He, too, glossed over the searing famine of 1932-33. He once referred to what we now know as the "Gulags" as "a vast industrial organization and a big educational institution." In 1936, he informed his readers that the new Stalin Constitution showed that the dictatorship was "voluntarily abdicating" in favor of democracy.
So dominant was this type of reporting that it was difficult for the truth about the Soviet Union to penetrate much of the American press. Reporters such as Eugene Lyons and Freda Utley, both of whom started out as Soviet sympathizers, lost their entree into those publications favored by the intelligentsia when they tried to tell the truth about what was happening in Russia. Eugene Lyons has pointed out that writers who tried to portray the Soviet Union realistically during the 1930s were turned away by editors "with platitudes about not wishing to 'attack Russia.' "
Now, of course, the truth about Communism and its mass murder and depravity is well known. It is fitting that a monument to its victims should be erected in view of the U.S. Capitol. Those such as Lee Edwards who have spent so many years working for such a memorial deserve our thanks. *
"Gratitude is the sign of noble souls." --Aesop
Fifty years after the leaders of the civil rights movement raised the bar of opportunity for all races, too many black Americans are in crisis-having babies out of wedlock, dropping out of school and caught in a denigrating hip-hop culture.
In 2006, just after the January holiday honoring Dr. Martin Luther King, Jr., an animated Dr. King came to life on the Cartoon Network's new show "The Boondocks." Animator Aaron McGruder had created an older version of King who stood at the pulpit of a black church, looking out at gangster rappers in a fistfight, high school drop-outs calling each other "the N-word," and unmarried black teenage mothers dressed like prostitutes. "Is this it? Is this what I got all those ass-whippings for? I had a dream once," he said, referring to the sacrifices he made during the civil rights struggles of the '50s and '60s. King's face twisted with disappointment. His voice dripped with disdain for what had become of his dream.
The words in this cartoon were rooted in a speech that actor and comedian Bill Cosby had given in 2004 on the 50th anniversary of the U.S. Supreme Court's decision in Brown v. Board of Education of Topeka. "Ladies and gentlemen, these people-they opened doors, they gave us the rights," he said, praising the lawyers and educators present.
But today, ladies and gentlemen, in our cities we have a 50 percent drop out rate (rates among young black men) in our neighborhoods. We have (the highest percentage of any American racial group with) men in prison. No longer is a person embarrassed because (she is) pregnant without a husband. No longer is a boy considered an embarrassment if he tries to run away from being the father . . . . Ladies and gentlemen, the lower and lower-middle class people are not holding up their end in this deal.
The problem weighing down black America 50 years after Brown, Cosby argued, had nothing to do with white people or the racism that clamped chains on slaves. "We can't blame white people," he said. Then he added, "Brown v. Board is no longer the white person's problem." He noted that:
. . . according to the National Center for Health Statistics, in 2004, 69.2 percent of black children were born to unwed mothers. That contrasts with 24.5 percent for white children and approximately 45 percent for Hispanic children.
He proclaimed, "Thank God that people who spent their lives breaking down segregation so that black people could have a chance for success don't know what is going on today."
In an important new book, Enough (subtitled The Phony Leaders, Dead-End Movements, and Culture of Failure That Are Undermining Black America--and What We Can Do About It), Juan Williams, a senior correspondent for National Public Radio, a political analyst for the Fox News Channel, and the author of, among other books, Thurgood Marshall: American Revolutionary, was inspired by Bill Cosby's speech to make the case that, while there is still racism, it is past time for black Americans to open their eyes to the "culture of failure" that exists within their community. He points to the proud traditional black values--self-help, strong families, and belief in God--that sustained black people through generations of oppression, and takes aim at prominent black leaders, from Al Sharpton to Jesse Jackson to Marion Barry.
The negative behavior Cosby was railing against, writes Williams,
. . . was behavior that the NAACP, the black church, the Jesse Jackson activists, and the black intellectuals had long ago decided not to address. Not one civil rights group took up Cosby's call for marches and protests against drug dealers, pregnant teens, deadbeat dads, and hate-filled rap music that celebrates violence.
In March 2006, The New York Times reported on its front page that
. . . a huge pool of poorly educated black men is becoming ever more disconnected from mainstream society and to a far greater degree than comparable white or Hispanic men.
Williams laments that:
Since the days of Dr. King, no prominent black American had dared to stand apart from the civil rights groupthink and ask, "Where do we go from here?" (this was the title of King's last book). That self-imposed censorship shows in the stagnant pool of ideas from which we black people draw when looking for solutions. It shows in the tired arguments rehearsed from the same predictable ideological positions . . . . Hard-won victories seem in danger of being squandered.
Those who proclaim themselves leaders in the black community, in Williams' view, refuse to articulate established truths about what it takes to get ahead: strong families, education and hard work. He declares:
Every American has reason to ask about the seeming absence of strong black leadership. Where is strong black leadership to speak hard truth to those looking for direction? . . . Who will tell you that if you want to get a job you have to stay in school and spend more money on education than on disposable consumer goods? Where are the black leaders who are willing to stand tall and say that any black man who wants to be a success has to speak proper English? . . . The Jesse Jacksons and Julian Bonds, people who made a name for themselves in the 1960s . . . are still fighting the battles of the 1960s. Then there are the latecomers, such as Al Sharpton, whose contribution is to mimic the aging leaders. Neither the old timers nor their pale imitators recognize that national politics has changed and black people have changed.
Historically, Williams points out,
A streak of self-determination rises at every turn in the history of black American leadership. But since the stunning success of the modern civil-rights movement . . . the strong focus on self-determination has faded, at the moment when its impact could have been the most powerful. In its place is a tired rant by civil rights leaders about the power of white people-what white people have done wrong, what white people didn't do, and what white people should do. This rant puts black people in the role of hapless victims waiting for only one thing-white guilt to bail them out. The roots of this blacks-as-beggars approach from black leaders are planted in an old debate that is now too often distorted.
The most prominent voice for black liberation before the Civil War, Williams points out, belonged to Frederick Douglass, a former slave who secretly taught himself to read, then became a skilled worker in Baltimore's shipyards before escaping to freedom in the North:
It was Douglass who first called on black people to do for themselves when he wrote an editorial titled "Learn Trades or Starve." By the end of the 19th century, as the government's many promises to help former slaves turned out to be mostly empty words, a new black leader emerged. Booker T. Washington picked up on Douglass' legacy by proposing defiant black self-determination as the best strategy for black advancement. . . . The basis of Washington's strategy at Tuskegee had direct links to Douglass' theory of black self-reliance. His idea was that black people should capitalize on the skills and knowledge they had gained as slaves. People who had worked the land for others now had the chance to own that land and take the profits of their work for themselves.
Some black leaders, Williams believes, misunderstood the later disagreement between Washington and W. E. B. Du Bois, who called for a "talented tenth" of black Americans to pursue higher education and the professions rather than the skilled labor advanced by Washington. He writes,
It is Du Bois' respectful criticism of Washington that has misled some black leaders to this day to lose sight of the mainstream of agreement in the foundational black leadership tradition. That devotion to self-determination was established by Douglass, Washington, and Du Bois. . . . Du Bois, in later writing about Washington, gave him credit by accurately describing him as "the greatest Negro leader since Frederick Douglass and the most distinguished man, white or black, who has come out of the South since the Civil War."
The largest political movements of black people, before the Brown decision sparked the civil rights movement of the 1950s and 1960s, Williams declares,
. . . had self-determination as their hallmarks. Marcus Garvey's Universal Negro Improvement Association, with its call for black economic power, worship of a black God, and even a return to Africa to be free of oppression, was an effort to move away from white domination and allow black people to take control of their lives. The Niagara Movement, which led to the creation of the NAACP, focused on strategies for defeating white political control of black people so that blacks could be free to determine their own future. . . . The Brown decision itself is an example of black American leadership focused on self-determination, in this case the right to get an equal share of tax dollars to educate their children. . . . The Montgomery Bus Boycott, featuring Rosa Parks' famous refusal to move to the back of the bus, is another example of black people organizing for self-determination. . . . It was a classic case of self-determination and it ended with a Supreme Court victory over racial discrimination in local public transit that increased the rights of all black Americans. Black pride in taking control of their own fate was a defiant rejection of the image of blacks as victims, ignorant and lazy. It was driven by a raw faith in the power of black people to compete and thrive in a democratic, capitalist nation if given the chance to be equals.
The areas where men such as Booker T. Washington and W. E. B. Du Bois agreed are crucial to black Americans today:
They both stood, in the end, for black self-help and dignity-and ultimately for full citizenship rights. . . . Black leaders of all ideological stripes agree that the key to racial progress was black people helping themselves. King, for example, said he wanted above all else to get black people to shed the idea that they did not control their destiny, an idea he attributed to the power of racists to infect black people with self-defeating doubts about inferiority and create a psychological need to rely on whites for their well-being.
Rather than confronting the genuine problems facing the black community today, Williams charges, such self-appointed leaders as Jesse Jackson and Al Sharpton have seen fit to help themselves:
So far, the "blood of martyrs" strategy has had tragic results for the progress of poor black people, but it has worked magnificently for a few national black politicians. Prominent leaders like Al Sharpton and Jesse Jackson, neither of whom has ever won an election or held political office, have--through the force of their personalities and rhetoric, and the limitations of their ideas and strategies--slowed the emergence of any new model of national black leadership. Jackson did enrich his family. He got the country's top beer company, Budweiser, which he had boycotted in the 1980s with the slogan "Bud's a Dud," to sell a multimillion dollar beer distribution center to two of his sons. . . . Sharpton did take the Jackson model of black politics to a new low, however. . . . Sharpton took money from a company called LoanMax, in exchange for appearing in ads designed to lure poor black people into their financial web. . . . Sharpton took money from one white-owned company that wanted to force another white-owned firm, a cable television company called Charter Communications, to carry their programming. Having failed to negotiate a deal with Charter, the Detroit firm Adell Broadcasting Corp. hired Sharpton to stage a phony civil rights protest march in front of Charter Communications' offices. Sharpton got out of a limousine in March 2002 . . . to lead three busloads of protest marchers in chants of "No Justice, No Peace." . . . One of the protest organizers working with Sharpton said he got people to join the protest by pulling them out of homeless shelters, giving them a meal and $50. He told the Wall Street Journal, "I like to refer to it as a 'rent-a-demonstration.'" His usual fee . . . was at least $10,000.
One of the current crusades launched by some civil rights groups, calling for "reparations" for slavery, is, Williams states, "a divisive, dead-end idea." It is particularly erroneous, he points out, to attach:
. . . the impact of slavery to the years beyond 1954 and the Brown decision. In the half-century since Brown, the levels of black education, income, and political power have all grown, evidence that most black people are taking advantage of newly opened doors. Today, half of all black families are middle-class, earning at least twice as much as the poverty line. Only one percent of African-American families made that claim in 1940. . . . To make the argument that slavery is responsible for today's social and economic problems facing poor black people is to take away all of their personal will, diminish their independence, and dismiss their intellect. And how can he (Randall Robinson, author of The Debt: What America Owes to Blacks) explain the fact that at the start of the 20th century black people had higher marriage rates than whites? In 1940 the out-of-wedlock birth rate for blacks was 19 percent. Today it is close to 70 percent. If slavery is the cause of today's social problems in the black community, why did black people in closer historical proximity to it do better than today's black community with regard to keeping families together?
When it comes to crime, Williams reports, black people make up 13 percent of the nation's population, but in 2003 the nation's prison population was 44 percent black, according to the U.S. Department of Justice. One of every ten black men between the ages of 25 and 29 is in prison. In fact, more black men were in jail or prison than in college. Where, he asks, is the black leadership?
Instead of speaking out against gangs, drug-dealers, and pimps-and the clothes and hip-hop music that celebrated these outlaws as black heroes-left-wing intellectuals preached against the sins of the white racist American establishment. . . . Never a word was spoken about the need for black Americans to take up their own war on drugs and on crime as a matter of personal responsibility. . . . By 2004 federal data showed that black Americans-13 percent of the population-accounted for 37 percent of the violent crimes, 54 percent of arrests for robbery, and 51 percent of murders. Most of the victims of these violent criminals were their fellow black people.
It was because of all these negative trends that Bill Cosby felt the need to apologize to the early civil rights leaders. Williams notes that:
A generation dropping out of school and celebrating the gangster life is a shocking turn of events, a repudiation of hundreds of years of civil rights struggle. It is a rejection of the gift of opportunity. It is a collective act of contempt for the true black American identity-a strong, creative, loving people with deep faith in God, seeking a better life for the next generation.
Black success in the future, Williams writes, depends upon young people finishing high school and college, taking a job and holding it, and marrying after finishing school and while holding a job. The final step is to have children only after turning 21 and marrying.
In 2005, the Institute for American Values issued a study showing that over the last 50 years, basically the period after the Brown decision, "the percentage of black families headed by married couples declined from 78 percent to 34 percent." In the 30 years from 1950 to 1980, households headed by black women who never married jumped from 3.8 per thousand to 69.7 per thousand. In 1940, 75 percent of black children lived with both parents. By 1990 only 33 percent of black children lived with a mom and dad.
Despite this reality, Williams charges,
There is no trusted source with a pulpit or a microphone telling people in need about the path to a better life. There is no one calling this situation a crisis. . . . The nation's leading civil rights groups are missing in action.
This book is a cry for a new generation of black leadership to fill the vacuum left by those who have rejected the tradition of pride and self-determination. Juan Williams has performed a notable service with this volume, which deserves as wide an audience as possible. *
"People unfit for freedom-who cannot do much with it-are hungry for power." -Eric Hoffer
[Editor's note: This was written before the latest war begun by Hizballah, but as the article points to historical realities, hope remains that present-day hatreds need not be perpetual.]
Throughout the U.S., dialogue between American Jews and Muslims is increasing. According to The Jerusalem Report,
Both 9/11 and four years of intifada chilled relations between American Jews and Muslims, which had warmed notably during the Oslo period. Now dialogue is showing new signs of life. "And as the situation in the Middle East improves-which I think it will do now, please God," says Rabbi David Rosen, director of Inter-religious Affairs for the American Jewish Committee, "there will be greater willingness on the part of the Jewish community to take more risks."
Dialogue has resumed, often sparked by individuals or groups not in leadership positions in either community, according to the Report:
After Wall Street Journal reporter Daniel Pearl was killed by terrorists in Pakistan in 2002, his father, Judea Pearl, began a series of unscripted public dialogues with Akbar Ahmed, a professor of Islamic studies at American University in Washington, DC. Since an initial dialogue in Pittsburgh in October 2003, the two have appeared at gatherings around the U.S. and the United Kingdom, with Canada and several U.S. cities on the schedule for this year. Audiences are typically one-third Muslim and most of the rest Jewish, according to Pearl, an Israel-born professor of computer science at UCLA.
My main reason is to convince Muslims that we are not their enemies. We try to stress the commonalities, though we don't shy away from friction.
Another example of dialogue is the Children of Abraham organization, co-founded by a Jewish man and a Muslim woman in 2004 in New York and London to offer "internships" to Jewish and Muslim young people around the world. The interns' task is to photograph Jewish and Muslim life in their communities and then dialogue with each other via the Internet. The first group of 60 interns from 27 countries took about 2,000 photographs last summer and posted 3,000 messages on the organization's web site in discussions that continued after the internships ended.
Other groups include the American Islamic Forum for Democracy, started by Zhudi Jasser, a Phoenix-area physician, which believes in the compatibility of Islamic and American values. In the Boston area, Judith Obermayer, a retired mathematician, hosted the first meeting of a Jewish-Muslim dialogue group about two years ago. From a handful of organizers brought together by the head of the local branch of the American Jewish Committee, the group--which includes academics, doctors, businesspeople, and ordinary Muslims and Jews--has grown to the point where 75 people attended a recent dinner.
The American Jewish Committee's Rabbi Rosen declares:
The Talmud asks, "Who is a hero?" and answers: "He who makes his enemy into a friend."
Last September Jordan's King Abdullah told a gathering of American rabbis in Washington, DC, that Jews and Muslims are irrevocably "tied together by culture and history" and that he is willing to take radical measures to combat Muslim extremists. He declared:
We face a common threat: extremist distortions of religion and the wanton acts of violence that derive therefrom. Such abominations have already divided us from without for far too long.
Rabbi Marc Gopin of the Center for World Religions, Diplomacy and Conflict Resolution at George Mason University presented the king with a copy of the Hebrew Bible in both English and Hebrew. Secular leaders, he said, "need to learn from your [the king's] example, learn from true heroism of one who confronts his adversaries."
While some have argued that Muslim-Jewish enmity is a long-standing phenomenon, the historic record tells a far different story. Indeed, when Jews were being harshly persecuted in Christian Europe, they often found a Golden Age in Muslim lands.
In her book The Ornament of the World, Prof. Maria Rosa Menocal of Yale University explores the history of Jews under Muslim rule in Spain:
Throughout most of the invigorated peninsula, Arabic was adopted as the ultimate in classiness and distinction by the communities of the other two faiths. The new Islamic polity not only allowed Jews and Christians to survive but, following Qur'anic mandate, by and large protected them, and both the Jewish and Christian communities in al-Andalus became thoroughly Arabized within relatively few years of Abd al-Rahman's arrival in Cordoba. . . . In principle, all Islamic polities were (and are) required by Qur'anic injunction . . . to tolerate Christians and Jews living in their midst. But beyond that fundamental prescribed posture, al-Andalus was, from these beginnings, the site of memorable and distinctive interfaith relations. Here the Jewish community rose from the ashes of an abysmal existence under the Visigoths to the point that the emir who proclaimed himself caliph in the 10th century had a Jew as his foreign minister.
Living in the heart of the Arab world, Jews first served their apprenticeship in the sciences under Islamic intellectual masters and in time became their collaborators in developing the general culture of the region. A striking example of this breadth of interest was Maimonides (Rabbi Moses ben Maimon, 1135-1204), a native of Cordoba. What chiefly characterized Jewish thought in this period was its search for unity--the attempt to reconcile faith with reason, theology, and philosophy, the acceptance of authority with freedom of inquiry. In Arab countries in the Near East and North Africa, where there existed this free intermingling of cultures, there blossomed a rich and unique Jewish intellectuality in Arabic. Beginning with the 10th century, especially in the kingdom of Cordoba under the enlightened Omayyad caliphs Abd al-Rahman and his son, Al-Hakam, there appeared a galaxy of Jewish scholars, historians, philologists, grammarians, religious philosophers, mathematicians, astronomers, doctors, and poets. During the 11th century Ibn Usaibia, a Muslim scholar, listed 50 Jewish authors writing in Arabic on medical subjects alone.
As Karen Armstrong notes in A History of God:
The destruction of Muslim Spain was fatal for the Jews. In March 1492, a few weeks after the conquest of Granada, the Christian monarchs gave Spanish Jews the choice of baptism or expulsion. Many of the Spanish Jews were so attached to their home that they became Christians, though some continued to practice their faith in secret. . . . Some 150,000 Jews refused baptism, however, and were forcibly deported from Spain; they took refuge in Turkey, the Balkans, and North Africa. The Muslims of Spain had given Jews the best home they ever had in the diaspora, so the annihilation of Spanish Jewry was mourned by Jews throughout the world as the greatest disaster to have befallen their people since the destruction of the Temple in CE 70.
Jane S. Gerber, in her book The Jews of Spain, points out that:
In the 15th and 16th centuries . . . it was the Ottoman Empire, then at the zenith of her power, that alone afforded exiles a place where "their weary feet could find rest.". . . Her sultans-Bayezid II, Mehmet II, Suleiman the Magnificent-were dynamic, farsighted rulers who were delighted to receive the talented, skilled Jewish outcasts of Europe. . . . Bayezid II, responding to the expulsion from Spain, reportedly exclaimed, "You call Ferdinand a wise king, who impoverishes his country and enriches our own." He not only welcomed Sephardic exiles but ordered his provincial government to assist the wanderers by opening the borders. Indeed, the refugees would find the Ottoman state to be powerful, generous and tolerant.
On a recent visit to Andalusia--Cordoba, Seville, and Granada, among other places--this writer observed the many remaining reminders of this Golden Age of Muslim-Jewish cooperation and amity. They serve to illustrate the lack of historic understanding of those who present the current impasse over the Israeli-Palestinian conflict as the latest in a long history of strife and conflict. The real story is far different--and far more hopeful. It may provide us with a genuine road map for the future.
More and more, the very term "Congressional ethics" appears to be an oxymoron. Recent examples of corruption are mounting in number and excess. Both Republicans and Democrats are involved.
In May a former aide to Rep. Bob Ney (R-Ohio), who later went to work with disgraced lobbyist Jack Abramoff, pleaded guilty to conspiring to illegally influence Ney. A corporate executive pleaded guilty to bribing Rep. William Jefferson (D-LA). Federal prosecutors say they found a $90,000 payoff in Jefferson's freezer. Late last year Rep. Randy ''Duke" Cunningham (R-CA) resigned after confessing to taking $2.4 million in bribes, including a Rolls-Royce. In March Cunningham was sentenced to more than eight years in prison.
The F.B.I. is now investigating the possibility that members of Congress were "steering" (influencing) contractors to hire a member's friends, family, or staff, or soliciting campaign contributions from them, in exchange for placing special benefits, or "earmarks," in legislation. "The potential for earmarks being abused is great," says a federal law enforcement official. Sources report that one of the members being examined is House Appropriations Committee Chairman Jerry Lewis (R-CA). Rep. Alan Mollohan (D-WV) stepped down as the ranking Democrat on the House Ethics Committee after it was revealed he directed millions in federal grants to groups set up by him and staffed by his friends. Those friends, in turn, contributed to his campaigns.
A study released in June by the Center for Public Integrity, American Public Media, and Northwestern University journalism students found that private sponsors paid nearly $50 million over five and a half years to send members of Congress and their staffs on at least 23,000 trips. The study marks the first time that researchers have pinpointed the full cost of privately funded Congressional travel.
The researchers kept tabs on which offices filed incomplete, incorrect or late reports disclosing details of their travels. They singled out two lawmakers, Reps. Charles Rangel (D-NY) and Marcy Kaptur (D-Ohio), for failing to disclose until six weeks before the report was issued that the Cuban government and a New York grocery mogul paid for their April 2002 trip to Havana to meet President Fidel Castro. Earlier disclosure reports had specified only a Minneapolis-based conservation group as the sponsor.
Golf trips to Scotland are at the center of an expansive federal investigation of Congressional corruption that has resulted in plea agreements from lobbyists Jack Abramoff and Michael Scanlon. Scanlon was once a senior aide to former House Majority Leader Tom DeLay (R-Texas). During the five and a half years ending in 2005, DeLay's office spent about $500,000 of other people's money on travel, topping the report's list. That total is nearly three times the annual salary of a party leader in the House.
House Speaker J. Dennis Hastert (R-IL) made a $2 million profit last year on the sale of land five and a half miles from a highway project that he helped to finance with targeted federal funds. A House member from California received nearly double what he paid for a four-acre parcel near an Air Force base after securing $8 million for a planned freeway interchange 16 miles away. Another California Congressman obtained funding in last year's highway bill for street improvements near a planned residential and commercial development that he co-owns.
In all three cases, Hastert and Reps. Ken Calvert (R-CA) and Gary Miller (R-CA) say that they were securing funds their home districts wanted badly, and that in no way did the earmarks have any impact on the land values of their investments. But for watchdog groups, the cases--which involve home-district projects funded through narrowly written legislative language--represent a growing problem. Keith Ashdown, vice president of the group Taxpayers for Common Sense, said:
The sound bites from politicians have always been that they're doing what's best for their districts, but we're starting to see a pattern that looks like they might be doing what's best for their pocketbooks.
The Los Angeles Times reported that Rep. John Murtha (D-PA), the ranking member on the defense appropriations subcommittee, has a brother, Robert Murtha, whose lobbying firm represents ten companies that received more than $20 million from last year's defense spending bill. The L.A. Times reported:
Clients of the lobbying firm KSA Consulting--whose top officials also include former Congressional aide Carmen V. Scialabba, who worked for Rep. Murtha as a Congressional aide for 27 years--received a total of $20.8 million from the bill.
In early 2004, according to Roll Call, Mr. Murtha "reportedly leaned on U.S. Navy officials to sign a contract to transfer the Hunters Point shipyard to the city of San Francisco." Laurence Pelosi, nephew of House Minority Leader Nancy Pelosi, at the time was an executive of the company that owned the rights to the land. The same article also reported how Murtha has been behind millions of dollars worth of earmarks in defense appropriations bills that went to companies owned by the children of fellow Pennsylvania Democrat Rep. Paul Kanjorski.
Lobbyists have given more than $103 million to members of Congress since 1998, according to a new report released by Public Citizen, a public interest group. The $103 million total is "nearly double" the previous estimates. Influence peddling, more and more, dominates Washington. The Economist notes that,
Individual lawmakers have immense power to take money out of the public purse for the narrowest of purposes. Any one of them can slip an extra paragraph into a bill to secure funding for a project that may have nothing to do with the bill's stated purpose. Such "earmarks" are often inserted at the last moment and pass without scrutiny. . . . Earmarks are an open invitation to corruption, since you only have to incentivise one Congressman to win a fat slice of federal cash, and there are lots of legal ways to do it. So long as the contribution conforms with campaign-finance laws and no legislative favor is explicitly traded, you are probably in the clear.
What is increasingly clear is that as government grows larger, corruption increases accordingly. In The Economist's view,
Lobbyists are not the disease, merely the symptom. Their numbers have doubled in the past five years, to 35,000, because federal spending has grown larger and more wasteful. Earmarks have proliferated . . . from 1,439 in 1995 to 13,997 last year. Politicians of both parties love them, because they allow an individual lawmaker to take credit for delivering a specific goody to his constituents.
If government did not have the power to bestow a variety of benefits and subsidies to particular interest groups, there would be little incentive to purchase influence in Washington. As government has grown larger and larger, the incentive to curry favor with politicians has grown along with it. Editorially, The Washington Examiner points out that:
The federal budget consumes a fifth or more of the nation's annual economic activity, with the bulk of that spending directly influenced by members of Congress and indirectly by their top aides. So why is anybody surprised that the beneficiaries of federal largesse spend millions of dollars skating right up to and sometimes past the letter of the law in order to influence the decision-makers who hold the purse strings?
In the Examiner's view:
The solution is not more regulations and rules that require teams of lawyers to understand and which crafty lobbyists, Congressional aides and other Washington insiders eventually will find new ways to evade. The solution is to reduce the size and scale of government. Only then will there be significantly fewer special interests buying plane and hotel tickets for members of Congress and their staffs.
Traditionally, conservatives have been wary and suspicious of government power. In his book Conservatism in America, Clinton Rossiter declared that:
Government, in the conservative view, is something like fire. Under control, it is the most useful of servants; out of control, it is a ravaging tyrant.
In Capitalism and Freedom, Milton Friedman wrote that:
Government is necessary to preserve our freedom, it is an instrument through which we can exercise our freedom; yet by concentrating power in political hands, it is also a threat to freedom . . .
The Bush administration, rather than cutting back government power and spending, has expanded both, to the dismay of many traditional conservatives. David Boaz, executive vice president of the Cato Institute, laments that under this administration we have seen "a 48 percent increase in spending in just six years," a "federalization of public schools" and "the biggest entitlement since LBJ." In fact, federal spending is outstripping economic growth at a rate unseen in more than half a century. The federal government is currently spending 20.8 cents of every $1 the economy generates, up from 18.5 cents in 2001. That is the most rapid growth during one administration since Franklin Roosevelt, who served 1933-45, during the Depression and World War II.
Economist Milton Friedman says that the past four years of spending increases are "disgraceful" and a betrayal of Republican Party principles. "I'm disgusted by it," the winner of the 1976 Nobel Memorial Prize in Economic Sciences told the Washington Times:
For the first time in many years, the Republicans have control of Congress. But once in power, the spending limits were off and it's disgraceful because it went against their principles.
As government continues to grow, ethical lapses in Congress are likely to increase, and there is little inclination in Congress, on the part of either party, to reform its own lax ethical rules. Ten Ethics Committee members and their aides have enjoyed 400 privately financed trips worth $1 million in a recent five-year period, according to a study by the Center for Public Integrity. Leading the beneficiaries was Rep. Howard Berman (D-CA), now the ranking Democrat on the committee. The "ethics reform" which has thus far been approved by Congress leaves in place the current system of permissive gift and travel rules, inadequate disclosure, and lax enforcement.
Several members of Congress have offered proposals that would represent genuine reform. Reps. Christopher Shays (R-CT) and Martin Meehan (D-MA) have introduced legislation that would stop lawmakers from borrowing corporate jets at cut-rate prices. It would bar corporations and other private interests that lobby Congress from footing the bill for Congressional travel. It would slow the revolving door by increasing the waiting period for lobbying former colleagues from one year to two. It would create an independent Office of Public Integrity to put some teeth into the enforcement of all these rules. Sadly, there is almost no prospect for the passage of this legislation.
While Congress seems incapable of policing its own ethical standards, it is unlikely to permit anyone else to do so. In March, a Senate committee rejected a bipartisan proposal to establish an independent office to oversee the enforcement of Congressional ethics and lobbying laws, signaling a reluctance to do anything to beef up the enforcement of its rules.
The Senate Committee on Homeland Security and Government Affairs voted 11 to 5 in March to defeat a proposal by its chairman, Senator Susan Collins (R-ME), and its ranking Democrat, Senator Joseph Lieberman (D-CT), that would have created an office of public integrity to toughen enforcement and repair the loss of reputation Congress has suffered since the guilty plea in January of former lobbyist Jack Abramoff. Democrats joined Republicans in killing the measure. The vote was described by government watchdog groups as the latest example of Congress's indifference to real reform.
Norman Ornstein, a resident scholar at the American Enterprise Institute, states that:
Congress's leaders have shown they really don't care if their colleagues were taking bribes or using hookers, much less that the oversight-deprived contracting process is broken. They are happy that there's no ethics process to hold people accountable. If that's not a culture of corruption, I'd like a better definition.
Satirist Mark Russell once said that in Washington we do not really have a conservative party or a liberal party but only "a fund-raising party." As government grows larger, and the benefits to be had by purchasing influence increase, the ethical decline we now observe is likely to continue, and probably to escalate. Thus far, voters have not held members of Congress accountable for these excesses. Until they do, little is likely to change. *
"The laws of man may bind him in chains or may put him to death, but they never can make him wise, virtuous, or happy." --John Quincy Adams
In May, the commemoration of the 400th anniversary of the first permanent British settlement in America, at Jamestown, Virginia, got under way.
The Godspeed, a $2.6 million replica of one of the three ships that carried the first settlers to Jamestown in 1607, sailed to six East Coast ports to generate interest in the "America's 400th Anniversary" commemoration. Its first stop was along the Potomac River in Alexandria, Virginia, only several blocks from this writer's home. In fact, it has special meaning for me because I was a freshman at the College of William and Mary--located only several miles from Jamestown--in 1957 when Jamestown's 350th anniversary was celebrated. That celebration marked Queen Elizabeth II's first visit to the U.S. as queen.
Governor Timothy Kaine of Virginia said, as the Godspeed set sail, that:
Today is the beginning of 18 months of commemoration of a moment not just critical to the history of Jamestown or Virginia or even America, but we begin to mark a moment that altered the path of the entire world and of human history.
He noted that American traditions of free enterprise, representative democracy and cultural diversity began at Jamestown.
Thirteen years before the Pilgrims landed in Massachusetts, a group of 104 English men and boys made the four-and-a-half month voyage to the banks of the James River to form a settlement in Virginia. Their goal of making a profit from the resources of the New World for the Virginia Company's shareholders in London quickly took a back seat to pure survival as they confronted the harsh realities of their life in their new home.
The new Godspeed is 88 feet long with a 7-foot draft and a 71-foot mainmast. The ship has three masts with six square-shaped sails made of Oceanus cloth. The mainmast flies the historic British flag from the era, which combined the English Cross of St. George with the Scottish cross of St. Andrew. The hull has been decorated in a red and white diamond pattern, with a red and white half-diamond pattern on the beakhead. Eric Speth, master of the Godspeed, says, "The hull shape requires a greater sail area and the sails can be operated together or individually. It very closely recreates the Godspeed."
The original Godspeed set sail from London on December 20, 1606, for a four-month journey across the Atlantic Ocean. The operation was financed by the Virginia Company of London, a start-up venture with a business model based on extracting profits from the New World. With an initial stock offering of 10 pounds and 12 shillings, investors knew that the company was a high-risk proposition. The memory of Sir Walter Raleigh's disastrous "Lost Colony" was still fresh, and England's record of failure served to bolster its image as a third-rate colonial power. Yet the promise of gold and silver--instant wealth--proved to be an almost irresistible force.
The Susan Constant was the flagship of the Virginia Company's expedition, carrying 71 people. It was armed with cannons for protection against pirates, leading the way for the other two ships, the Godspeed, which carried 52, and the Discovery, which carried a mere 21. Unlike Raleigh's expedition, this voyage would include no women. About half of the passengers were gentlemen, members of the upper class who were seeking adventure and riches.
"The men had come to the enterprise with a range of motives, and their hopes and fantasies would have run likewise," writes historian David Price. "Most of the travelers were on board because they--like the Virginia Company itself--expected quick treasure."
Historian Samuel Eliot Morison writes that,
The colonists owned no property; they were working for stockholders overseas. Twice a day the men were marched to the fields or woods by beat of drum, twice marched back and into church. They led an almost hopeless existence, for there seemed to be no future. . . . No empire could have developed from a colony of this sort. . . . The first factor in the transition was tobacco. Its value for export was discovered in 1613 when John Rolfe, who married the Indian princess Pocahontas, imported seed from the West Indies, crossed it with the local Indian-grown tobacco, and produced a smooth smoke which captured the English market. Virginia then went tobacco-mad; it was even grown in the streets of Jamestown.
Beyond this, reports Morison,
. . . the institution of private property was the second factor that saved Virginia. When, after seven years, the terms of the Company's hired men expired, those who chose to stay became tenant farmers and later were given their land outright. This made a tremendous difference. As Captain John Smith put it, "When our people were fed out of the common store, and laboured jointly together, glad was he who could slip from his labour, or slumber over his taske, he cared not how; nay, the most honest among them would hardly take so much true paines in a week, as now for themselves they will doe in a day." By 1617, a majority of the hardy, acclimated survivors were tenants. Within ten years tenant plantations extended 20 miles along the James River, and total European population of Virginia was about a thousand.
A third factor that ensured the success of Virginia was political, in the broadest sense. Captain John Smith put it in one sentence: "No man will go from hence to have lesse freedome there than here." In the English conception of freedom, the first and most important element was "a government of laws, not men." The Company ordered Governor Sir George Yeardley to abolish arbitrary rule, introduce English common law and due process, encourage private property, and summon a representative assembly. This assembly would have power, with the appointed council, to pass local laws, subject to the Company's veto.
Former Supreme Court Justice Sandra Day O'Connor, who is serving as honorary chair of the national Jamestown commemoration, declares that,
The system of government that we have today was an outgrowth of those early settlements, and so I thought the anniversary was a worthy reason to try to remind citizens of our history. In the United States today, public schools have pretty much stopped teaching government, civics and American history. It gets tossed in occasionally, but it's no longer a major focus for children. That's a great concern to me because I truly don't know how long we can survive as a strong nation if our younger citizens don't understand the nature of our government, why it was formed that way, and how they can participate and should participate as citizens. That's something you have to learn. It just isn't handed down in the genetic pool.
The present effort to replace the teaching of our traditional culture and literature with "multiculturalism," and the attacks upon the work of "dead white males" implicit in this assault upon the so-called traditional "Eurocentric" curriculum, overlook one important fact: the United States has a culture of its own, and it is this American culture that has attracted men and women of every race, nationality, and religion. They have come to our shores for something we had and they did not. Few have been disappointed.
What is this American culture that has been so appealing? In his thoughtful book America's British Culture, Russell Kirk, one of our foremost men of letters, points out that contemporary America is a product of the long evolution of law, governmental structure, religion, philosophy and literature of the large Western world and, more particularly, Great Britain, through which this Western culture in its British form reached the New World.
Dr. Kirk notes that,
So dominant has the British culture been in America . . . from the 17th century to the present, that if somehow the British elements could be eliminated from all the cultural patterns of the United States, why, Americans would be left with no coherent culture in public or private life.
In four major fashions, Kirk points out, the British experience, for more than a dozen generations, has shaped the United States.
In Kirk's view,
The first of these . . . is the English language and the wealth of great literature in that language. . . . The second . . . is the rule of law, American common law and positive law being derived chiefly from English law. This body of law gives fuller protection to the individual person than does the legal system of any other country. The third of these ways is representative government, patterned upon British institutions that began to develop in medieval times, and patterned especially upon "the mother of Parliaments" at Westminster. The fourth . . . is a body of mores, or moral habits and beliefs and conventions and customs, joined to certain intellectual disciplines. These compose an ethical heritage . . .
The very language of our current discussions about the law--the "rights" of the accused, the "right" to privacy, the presumption of innocence, "equality" under the law--all are derived very specifically from the British experience, and can be found in no other legal tradition.
The English common law, Kirk writes,
. . . gives to those who come within its jurisdiction privileges unknown in civil or Roman law, where generally the interest of the State looms first. Under the common law, for instance, a defendant cannot be compelled to testify if he chooses to remain silent; he is saved from self-incrimination. A complex series of writs, under common law, has made access to justice relatively easy for the individual. No person may be imprisoned without a warrant, and the accused must be tried speedily. . . . In European civil law . . . the accused person was presumed to be guilty as charged by a prosecutor; the judge determined the issue to be settled in a case at law. . . . But under the common law of England, the plaintiff and the defendant, or the prosecutor and the defendant, are regarded as adversaries, on an equal footing . . . the judge remains neutral. A defendant in a criminal case is presumed to be innocent unless the evidence proves him to be guilty beyond a reasonable doubt.
The English common law is founded upon the assertion of the supremacy of law. As Bracton and other medieval scholars in the law expressed it, even the king himself was "under law." And when the colonists declared independence, it was not to be free of English law, but, quite to the contrary, because the government in London had denied them their traditional rights as Englishmen.
Thus, the First Continental Congress' Declaration and Resolves (Oct. 14, 1774) declared that
. . . the respective colonies are entitled to the common law of England, and more especially the great and estimable privilege of being tried by their peers of that vicinage, according to the course of that law.
The patriots were asserting their claim to enjoy what Edmund Burke called "the chartered rights of Englishmen"--not the abstract claims of perfect liberty that would be asserted 15 years later in France.
The chartered rights went back to the Magna Carta. In his Commentaries on the Laws of England--an American edition of which was published in Philadelphia in 1771-72--William Blackstone found in the Magna Carta the expression of three absolute rights: life, liberty and property. He traced back to the Great Charter the doctrine of due process of law. The ancient right to trial by a jury of one's peers was closely examined by Blackstone. American colonists cited Blackstone for authority that no tax might be imposed upon them, constitutionally, without the act and consent of their own legislature.
The fact that the majority of present-day Americans cannot trace their individual ancestry to England bears little relationship to the British nature of American culture. Russell Kirk argues that:
Two centuries after the first U.S. census was taken, nearly every race and nationality in the world had contributed to the American population, but the culture of America remains British. . . . The many millions of newcomers to the U.S. have accepted integration into the British-descended American culture with little protest, and often with great willingness.
Sadly, our schools have moved away from teaching our history. In 1991, for example, the Social Studies Syllabus Review Committee of the State of New York issued a report embracing the notion of "multicultural education" and rejecting "previous ideals of assimilation to an Anglo-American model." Professor Arthur M. Schlesinger, Jr., a member of the Syllabus Review Committee, strongly dissented. He declared:
The underlying philosophy of that report, as I read it, is that ethnicity is the defining experience for most Americans, that ethnic ties are permanent and indelible, and that the division into ethnic groups establishes the basic structure of American society and that a main objective of public education should be the protection, strengthening, celebration and perpetuation of ethnic origins and identities. Implicit in the report is the classification of all Americans according to ethnic and racial criteria.
Professor Schlesinger points out that America's language and political purposes and institutions are derived from Britain: "To pretend otherwise is to falsify history. To teach otherwise is to mislead our students." He adds that:
. . . The British legacy has been modified, enriched and reconstituted by the absorption of non-Anglo cultures and traditions as well as the distinctive experiences of American life.
Dr. Schlesinger concluded by asking his colleagues:
. . . to consider what kind of nation we will have if we press further down the road of cultural separatism and ethnic fragmentation, if we institutionalize the classification of our citizens by ethnic and racial criteria and if we abandon our historic commitment to an American identity. What will hold our people together then?
Since those days, things have accelerated in this negative direction, as much of our current debate over immigration illustrates. In many schools, bilingual education has replaced the teaching of English to newcomers. In most schools, the teaching of our history has been downgraded. If we do not transmit our own history, culture, and values to our students--particularly to those who have come from other places with other traditions and values--what future can we foresee for the American society?
The 400th anniversary of America's beginnings at Jamestown should provide us with a much-needed impetus to review our teaching of history and our transmission of American culture and values to the next generation. After all, it all really began at Jamestown in 1607.
Marjorie Meyer Arsht, now 91, has led an extraordinary life. Her new memoir, All the Way from Yoakum (Texas A & M Press), tells the story of a remarkable woman who became a leading light in Houston and Texas politics as one of the founders of the modern Republican Party of Texas. In a long life filled with both tragedy and joy, she remained steadfast in her determination to make a contribution.
Former President George H. W. Bush said this of Marjorie Arsht:
As President of the United States, I was privileged to meet kings and queens, presidents and prime ministers, and even a few dictators and despots along the way. But I'm not sure any of them compared to Marjorie Arsht, a life force in Texas politics for as long as I can remember. Marjorie was a true pioneer, opening doors for women, and for Republicans--a rare breed in Texas before she came along. However, until I read All the Way from Yoakum, even I did not fully understand what Marjorie was about. In this revealing, funny and poignant memoir, Marjorie shares all the joys and heartaches of her remarkable life, proving what Barbara and I already knew: this girl from Yoakum is truly a Texas legend.
In 1914, Marjorie Meyer was born in the small town of Yoakum, Texas. She writes that:
Yoakum today remains much as it was when my parents arrived, an unpretentious little South Texas town, located 35 miles south of Interstate 10 between Houston and San Antonio. Over the last half-century, the population of approximately 4,000 souls has remained relatively constant. The town reflects that same measured pace of living it enjoyed through its past.
Marjorie's was one of the few Jewish families in Yoakum:
No Jewish house of worship ever existed there, and only two or three Jewish families ever lived in Yoakum at one time. On the holy days of Yom Kippur and Rosh Hashana, my father took me to Houston or San Antonio. . . . The tradition of both my parents' families . . . was distinctly Reform. . . . Many of the old traditions, such as dietary laws and ritualistic circumcision, were no longer considered mandatory . . .
Growing up in Yoakum, Arsht recalls,
For a couple of reasons, I think it was possible for me to remain largely unaware of prejudice against Jews. One is that our practice of Reform Judaism did not appear markedly different from the variations of religious practice common to Christian denominations. The other is that I often went to church services with my friends. If I were spending the night at someone's house on Saturday night, I would go to church with her and her family on Sunday morning. I frequently accompanied my friend Nina Vance, a Presbyterian, to her Wednesday night prayer meeting. (Later Nina founded and for many years directed Houston's distinguished Alley Theater.) Also, while in high school, I went to the Catholic convent for music and French lessons, often attending a Catholic Mass. The diversity of religious experiences helped me develop the tolerance necessary to understand that the basis for all religions is essentially the same.
A prodigy at school, Marjorie, in a much more flexible educational era, graduated from high school at fourteen and entered Rice Institute, as Rice University was then called. She arrived in Houston, where family members, who owned Foley Brothers department store, welcomed her. "On my first day at Rice," she reports,
I was scared to death. I knew the names of the buildings from a map in the catalogue. Besides Lovett Hall, where I stood, there were only a few structures--the biology building, the chemistry building, a dormitory for boys, and Cohen House, which looked like a home. Uncle George and Aunt Esther Cohen, my father's sister, together with Uncle George's sister, Gladys, had given Cohen House to Rice Institute in honor of their mother and father, Agnes and Robert I. Cohen of Galveston. It still serves as the faculty club.
A Phi Beta Kappa graduate of Rice at 18, Marjorie was on her way to graduate study at the Sorbonne in Paris. Before leaving Texas, she had been given strict orders to visit her paternal grandfather's relatives in Strasbourg:
When my grandfather, Achille Meyer, was born in Wolfsheim, a suburb of Strasbourg, the province was German. . . . More observant of Jewish ritual than my family, they had prayers and candles and the "breaking of the bread" on Friday night. . . . We visited Wolfsheim as promised and discussed the war clouds looming over Europe. I knew that trouble was brewing because clairvoyant Jews were already seeking refuge in Paris. But, for me, war seemed so remote as to be almost unimaginable.
When Germany marched into Poland on September 1, 1939, a date that was also the first anniversary of Marjorie's marriage to Raymond Arsht, she remembers that,
Our thoughts, of course, were for my relatives, who were in grave danger. . . . Very early in the war, the Germans overran Strasbourg, that charming city of joy and laughter. Poor, sturdy Gustave was shot in the street as a hostage; his wife, Sarah, and two daughters went to concentration camps. Storm troopers invaded Arthur's home and shot him, Francine and their daughter in cold blood. Their son was swimming and escaped, but was later killed in a railway accident. Mathieu Dreyfus, a diabetic, died in the south of France from a lack of insulin. We didn't know then that Andrew and Paulette, with their infant daughter Danielle, had been hidden by a French farm family in their barn. At the time, I felt we would never see them or Strasbourg again.
After returning from Paris, Marjorie entered a master's degree program at Columbia University. She taught school in Yoakum and, after her marriage, moved to Houston. She was active as a member of Temple Beth Israel, the oldest Reform congregation in Texas, and was a firm opponent of Jewish nationalism, arguing that Judaism was a religion of universal values, not a nationality, and that Americans of the Jewish faith were Americans by nationality and Jews by religion, just as their fellow citizens were Presbyterians or Methodists or Catholics. Her views led her to a leadership position in the American Council for Judaism, which promoted her classical Reform philosophy.
It was as president of Temple Beth Israel's Sisterhood that Marjorie was led to her role in Texas politics. Each year, one of the programs was reserved for public affairs and featured a speaker. In 1960, two years before her tenure, Marjorie
. . . noticed in the temple bulletin that the next program would present a Republican, Bob Overstreet, and a Democrat, Wally Miller, both of whom were candidates for the Texas legislature. At the time, the Republican Party in Texas was so small that a state convention could have been held in almost anyone's living room. I had been voting Republican for many years, ever since President Roosevelt took the U.S. off the gold standard when I was in school in France. I had watched then, with horror, as the value of my money declined by more than a third overnight. Before I went to Europe, I had been too young to vote, but when I returned, my first presidential vote was cast for Wendell Willkie. . . . My father had been a Democrat's Democrat. He fed me the Constitution at breakfast, lunch and dinner. Arguments and debate were in my blood, along with my father's philosophy that no opinion is worth having if it can't be defended or promulgated. By the time I became president of the Sisterhood, my father's conservative Democrat philosophy had become the Republican Party platform.
After hearing Republican Overstreet and Democrat Miller, Marjorie made her first political contribution, a check to Overstreet for five dollars. "That one check put me on what few lists existed at the time," she states.
Bob Overstreet lost his election, as did every other GOP candidate at the local level. Texans had become accustomed to voting for Republicans nationally, but they remained Democrats at the state level. They contended that a two-party state wasn't needed because Texas actually had two parties, one liberal and one conservative, all within the Democratic Party, of course. . . . I received call after call to help with one project or another. I became a Republican activist at the local level.
The formation of the John Birch Society in 1958 and the controversy it engendered caused Marjorie to contribute an article to The Houston Press. Her thesis was:
Concern over the Birch Society should force everyone to examine the reasons for its birth. People who support it are reacting to the sharply leftward trend of government policies. . . . Extremism breeds extremism. Such a problem existed after the War Between the States, when Southerners, having no recourse against the abuses of Reconstruction, formed the Ku Klux Klan. All such organizations eventually fall into disrepute because they are extra-judicial. We should address ourselves to reclaiming balance in our institutions and public policies in order to negate the impetus for creating groups such as the John Birch Society.
After her article appeared, Marjorie was called by Bob Overstreet, who said, "John Tower saw the editorial you wrote and asked me to bring you to Austin next week to meet him." Tower, then in the midst of his run for the U.S. Senate, began using Marjorie's arguments in addressing all questions posed to him about the Birch Society. Writes Marjorie:
That association with John Tower was the beginning of a lifelong friendship, and I became an integral part of every one of the Tower campaigns, holding high-level volunteer positions. In 1961, Tower, the little professor from Wichita Falls, defeated the very conservative, crusty West Texas rancher William Blakley, with the help of liberal Democrats who wanted to purge their party of conservatives.
In 1962, Marjorie ran as a Republican for the Texas state legislature. She received the endorsements of all three Houston newspapers, the Press, the Post and the Chronicle, as well as The Informer, then Houston's leading black newspaper. The Informer told its readers that "Mrs. Arsht is a Republican and a conservative, but not a squinty-eyed reactionary. . . . She stands for sound, responsible two party government in Texas."
Of the election results, Marjorie states,
I received 48.9 per cent of the vote countywide, which was amazing. West of Main Street, I ran ahead of the governor. Part of my platform had been a plea for single-member districts. Had such lines been drawn at the time, my life might well have taken an entirely different turn.
Recalling the political atmosphere in Texas at that time, Marjorie points out that Republicans were traditionally less conservative than many Democrats, and the party was opposed to segregation, which many Democrats embraced. Slowly, right-wing Democrats began to join the Republican Party, creating conflict between "old," more moderate and racially inclusive Republicans, and "new," far more conservative members.
In 1963, Jimmy Bertron, chairman of the Harris County Republican Party--whom Marjorie calls one of "us"--called:
"Marjorie, I don't want it generally known just now, but I'm moving to Florida. I think I've found a good replacement for us, so we could get a head start in warding off a takeover of the chairmanship in a special election. Would you bring some of our precinct people together to meet him?" Marjorie asked, "Who is it?" "His name is George Bush. His wife's name is Barbara. They're newcomers to Houston, and you're going to love them." An evening reception at the Arsht home represented the beginning of George Bush's campaign for county chairman.
It also was, Marjorie declares,
. . . the beginning of a long and rewarding friendship. Since then I have repeatedly been asked, "Did you have any idea at that time that you were dealing with someone who would eventually become President of the United States?" My answer has always been the same: "The Bushes were charming. We were very pleased, but the fact is we were merely looking for a county chairman. And we were delighted that we felt we had a winner."
When George Bush ran for the U.S. Senate in 1964 against Ralph Yarborough, Marjorie was asked to host a dinner for black Republicans at her home, something unprecedented in Texas at that time. Her description of that event paints a picture of a society undergoing dramatic change, with Marjorie herself at the forefront:
On the evening of the dinner, Craig Peper said to Raymond Arsht, "I want to tell you who is coming this evening" (a few years later I completely identified with the movie "Guess Who's Coming to Dinner") "George Bush has invited a black lawyer from Washington and some of his professional black friends in Houston for dinner tonight. It should be an interesting evening." Ray looked at him as though he were speaking a foreign language, and then he turned to me, "Have you lost your mind?" Then he looked toward the front of the house, "Do you realize the front of this house is all glass?" Without waiting for me to answer, he moved to the front to draw the draperies, and then realized that wouldn't really do any good since it was summer and still light outside. I turned to explain to Ray who the other guests were so he would know there were others of our friends also involved. "Do they know who's invited?" When I nodded, he said with a pained smile, "So I'm the only one surprised?" I nodded again.
Around that time the doorbell rang, and the Bushes came in with their elegant, tall, handsome guest, Grant Reynolds. Ray's innate good manners prevented our guest from knowing he was in a state of shock. As the others entered, another unforeseen circumstance caused Ray to shed his bewilderment. Although I had decided on a buffet dinner, I had not anticipated the response of Harold Brown, my black chauffeur and bartender, and Annie Turner, my cook. They were totally baffled. They had never served a black person as a special guest in a white family's home before and both suffered a kind of "brain paralysis." One guest asked for a Bloody Mary, and when Ray saw Harold reach for a small wine glass, he knew that he had to help behind the bar, which took his mind off the guests milling around the lanai.
There is, of course, much in the memoir about Marjorie's private life, her happy marriage to Ray, who was active in the oil business, and the tragic loss of her oldest daughter, Margot, to a virulent form of ALS (Lou Gehrig's disease), which also took the lives of two grandchildren. Her two surviving children, Alan and Leslye, have been the joy of her life, along with her grandchildren.
After her husband's sudden and untimely death, Marjorie discovered that a series of business reverses had placed her in a fragile economic situation. She went back to teaching, then into real estate. At the same time, Texas Governor Bill Clements named her to the board of Texas Southern University, an historically black institution in Houston. She made friends with another member of the board, Maurice Barksdale, a black Republican from Ft. Worth who was an authority on public housing. He was named Deputy Assistant Secretary of Housing and Urban Development by President Reagan, and in 1983 Marjorie, then a 69-year-old grandmother, went to Washington as his speechwriter.
In 1988, Marjorie was selected as a delegate from Texas to the Republican National Convention that nominated George Bush for president. Bob Schieffer on CBS interviewed Marjorie as the oldest delegate but, she notes, "I heard later that another woman was really older than I but had not given her correct age on the form."
Writing in The New York Times (Nov. 10, 1988), Maureen Dowd points out that:
Mrs. Arsht first introduced George and Barbara Bush to local political leaders in her living room twenty-five years ago. . . . Marjorie Arsht talked approvingly of the next Secretary of State . . . "Jimmy Baker grew up here. . . . I knew him when he was in short pants."
This book is a moving account of Marjorie's life. This writer has known Marjorie Arsht for nearly fifty years and much of the material reveals sides of her life previously known to few outside her immediate family. It is, beyond this, a document of historical significance, portraying an era of tremendous change and transformation, particularly in the South. All of us can truly draw inspiration from this story. *
"Good intentions will always be pleaded for every assumption of authority. It is hardly too strong to say that the Constitution was made to guard the people against the dangers of good intentions. There are men in all ages who mean to govern well, but they mean to govern. They promise to be good masters, but they mean to be masters." --Daniel Webster