Emmett Till's "Then" Isn't Our "Now"*

Sixty-one years ago this month, a Tallahatchie County, Mississippi, jury acquitted Roy Bryant and J. W. Milam despite a pile of damning evidence that the two had abducted, beaten, and then shot Emmett Till, a young Chicagoan scarcely a month past his fourteenth birthday, for violating a strict racial taboo by whistling at Bryant's wife. Since that day, the Emmett Till case has often been cited both as a catalyst for the Civil Rights movement and, more recently, as a trigger for black mobilization on a scale comparable to the "Black Lives Matter" movement. Understandable as they might be, these leaps to conclusions and connections simply don't square with either historical or contemporary reality.

After Emmett's body was fished from the Tallahatchie River, his neck bound by barbed wire to a cotton gin fan, his mother, Mamie Till Bradley, was determined to let "the world see what they had done to my boy" and insisted on an open-casket funeral, which revealed a corpse so horribly bloated and bludgeoned that it was barely recognizable as human. Not only did tens of thousands of mourners in Chicago file past the horrific sight in the casket, but when Jet magazine ran an exclusive photograph of Emmett Till's battered and bloated head, the issue sold out so quickly that more copies were printed, and the photo ran the following week as well. The trial itself drew coverage from the major television networks and more than seventy reporters and photographers, representing such major print outlets as the New York Times, Life, Look, and Time, along with several black newspapers and magazines and a sprinkling of foreign publications.

The trial proceedings offered a real-life template for a fictional southern courtroom drama straight out of central casting, complete with a big-bellied sheriff spewing racial epithets and a defense attorney who exhorted an all-white jury in a 63 percent black county to do their "Anglo-Saxon" duty by freeing the defendants. Compounding the affront to justice, in January 1956, Look Magazine published a story in which Bryant and Milam, their tongues loosened by a nice paycheck and the shield of double jeopardy, admitted to the crime, with Milam explaining coldly he had decided to "make an example" of young Emmett "just so everbody [sic] can know how me and my folks stand."

Yet, even after singer Nat Cole was brutally beaten onstage by Klansmen in Birmingham a few months later, neither the television networks nor major U.S. papers like the New York Times chose to make southern racial violence a focal point, and with Cold War anxieties rendering social agitation seriously suspect, a year after Emmett's slaying scarcely 6 percent of Gallup Poll respondents outside the South thought civil rights was the nation's most pressing issue. Meanwhile, fearful of a savage backlash from white southerners, television executives forced award-winning screenwriter and "Twilight Zone" creator Rod Serling to eviscerate not one but two screenplays (the first in 1956 and another in 1958) based on the Emmett Till story, cutting out any reference to or suggestion of southern settings, characters, or racial practices before they aired.

 If John Egerton was correct in observing that while the Till case "stirred the nation's conscience momentarily, the attention span was short and the South slipped back into the shadows," what of its supposed catalytic effect on the crusade to end racial injustice? It was true enough that when questioned about it, Rosa Parks admitted that thinking of poor Emmett had strengthened her resolve in the three-month interval between his death and her act of defiance that sparked the Montgomery Bus Boycott in December 1955. Yet plans for such an action had not only been in the works well before Till's slaying, but were modeled on a similar boycott by blacks in Baton Rouge two years earlier. NAACP leaders in Mississippi and elsewhere were also filing petitions for compliance with the Brown decision before Till's murder.

While the adult generation of black activists was vividly familiar with the South's long history of racial atrocities, their children, especially those in Emmett's age group, were much more vulnerable to the horrors of the Till affair, especially the terrifying casket photo, so jealously guarded by Jet that few of their white peers ever saw it. Young Cleveland Sellers could not shake the feeling that the ghastly figure in the casket "could have been me or any other black kid around that same age." They were only teenagers in 1955, but it was surely no coincidence that, when a new decade dawned with less than 1 percent of school-age southern black children in integrated classrooms and southern black voter registration only 4 percent higher than it had been in 1956, it was four frustrated members of the Till generation who boldly took seats at an all-white lunch counter in Greensboro in 1960. This action, in turn, helped to spawn the Student Non-Violent Coordinating Committee, whose more aggressive and confrontational approach attracted a number of Till's peers, including Sellers and Joyce Ladner, who could easily identify "ten SNCC workers who saw that picture [of Till's body] in Jet magazine, [and] remember it as the key thing about their youth that was emblazoned in their minds."

Instead of an immediate and dramatic spark for black activism, the Emmett Till tragedy proved more akin to a seed pod, which, at maturity, released a deferred but timely burst of pent-up energy and anger from a young adult generation whose adolescence had been taken hostage by fear. Current concerns about an extended spate of controversial killings of black Americans have stirred several well-known filmmakers to revisit the Till story. This is welcome news, especially if they resist such facile comparisons as likening the impact of the 1955 murder of Emmett Till to that of the 2014 Ferguson, Missouri, slaying of Michael Brown, which is seen as triggering the "Black Lives Matter" movement, on the grounds that "both events galvanized a black community that had been unheard and spawned movements around what many saw as particularly egregious racial incidents." In reality, the Black Lives Matter campaign testifies to nothing so much as the hard-won advances in black political, social, economic, and technological empowerment that have marked the last three generations. Such a rapid, aggressive, coordinated, and broad-based response would have been unthinkable to black leaders struggling in the 1950s to mobilize their impoverished, disfranchised, uneducated, and historically brutalized constituencies in an era of virtually unchecked racial terrorism when, by any valid measure, black lives mattered far less than they do today.

*A modified version of this essay appeared under a different title on TIME.COM.




The Atlanta Journal-Constitution

September 12, 2001

Americans left to fear unseen enemy



On Jan. 6, 1941, President Franklin D. Roosevelt promised to forge "a world founded upon four essential freedoms," including "freedom from fear."
But our victory in World War II soon dissolved into a nuclear arms race fueled by the Cold War. The generation that spent portions of their childhoods practicing for direct nuclear hits on their elementary schools can hardly look back with much nostalgia on that era.

Yet, even as the Cold War ended and we breathed a collective sigh of relief at the diminished likelihood of a global nuclear holocaust, we were already slipping into a new era of fear and uncertainty, one in which the enemy could be internal, as well as external, and essentially invisible, one in which extravagant defense budgets and massive missile stockpiles count for less than the ruthless and calculated fanaticism of relatively small numbers of unseen and often unknown enemies.
Our inability to protect even the Pentagon and perhaps even the White House or the Capitol served chilling notice that, when all is said and done, a terrorist can get closer to President Bush than the latter, for all his resources, can get to him. An unseen enemy can make not just the residents of New York or Washington afraid but can implant that fear in the hearts of the rest of America as well.
This reality came through to me in a number of ways, including the cancellation of classes at the University of Georgia and the anxious investigation of a "suspicious" van parked near the federal building in Athens. However, it was local reaction here in Hart County that I found most enlightening. The mayor of Hartwell, a woman of Lebanese extraction and Episcopal faith, urged citizens to offer their prayers for the victims and their families "in their own tradition." To that end, churches in town and throughout the county opened their doors to the prayerful.
Yet, for all the sincere expressions of grief and compassion, I feel certain that explicitly or not, those prayers also embodied a personal plea for the freedom from fear that, despite our victories in World War II and the Cold War, seems more elusive now than it did when Roosevelt promised to pursue it 60 years ago.

In their "Open Letter to the American People," released last week, a group called "Historians Against Trump" declared that "the lessons of history compel us to speak out against [Donald] Trump." Their motives, they insisted, were not partisan in the least; rather, they were simply a collection of scholars, teachers, public historians, and graduate students united by their common conviction "that the candidacy of Donald J. Trump poses a threat to American democracy." There followed an indictment whose list of particulars gave no hint of academic expertise but could have been assembled by anyone who owns a television or computer, much less reads a newspaper now and then. Yet the statement suggested that a well-defined professional skill set left its historian-signatories well equipped to topple the Trump campaign and build "an inclusive civil society in its place."

As is frequently the case with letters or other statements drafted by a committee whose members are passionate about the rightness and importance of their cause, this one occasionally waxed a bit grandiose in some of its language and imagery. Between this and the exposure it received, the historians' impassioned missive amounted to a big, fat, hanging curveball tossed squarely into the wheelhouse of none other than the switch-hitting, language-bending, career-contrarian critic of practically everything, Stanley Fish. Once tagged, ironically enough, as "the Donald Trump of American academia" in his early incarnation as a literary theorist and campus wheeler-dealer, this "brash, noisy entrepreneur of the intellect" seemed to stoke much the same public outrage against the Academy that the shape-shifting Fish now undertakes to exploit himself, courtesy of the bully platform afforded him by The New York Times.

At any rate, the historians' "open letter" afforded an irresistible opportunity for Fish to do precisely what he loves best, i.e., play word games, preferably, as in this case, with unsuspecting adversaries. For example, mocking the writers' insistence that "as historians, we consider diverse viewpoints while acknowledging our own limitations and subjectivity," he found "very little acknowledgment of limitations and subjectivity" in their apparent conflation of "political opinions" with "indisputable, impartially arrived at truths," as in: "Donald Trump's presidential campaign is a campaign of violence: violence against individuals and groups; against memory and accountability; against historical analysis and fact." "How's that," Fish asked, "for cool, temperate and disinterested analysis?"

Possibly a bit juiced by his merciless flaying of yet another offending text, Fish went on to boldly declare that historians "are wrong to insert themselves into the political process under the banner of academic expertise." He may have barely worked up a sweat in puncturing the presumptuous rhetoric of writers whose zeal may have occasionally run roughshod over their discretion, but he was not exactly free from presumption himself when he lectured the parties to the document on the actual nature of their job, to wit: "To teach students how to handle archival materials, how to distinguish between reliable and unreliable evidence, how to build a persuasive account of a disputed event, in short, how to perform as historians, not as seers or gurus."

Not surprisingly, like many academics, some historians have taken none too kindly to being told where "their competence lies" or having the parameters of their discipline defined by someone who is neither a fellow practitioner nor much of a fan of parameters himself. At first glance, this little interdisciplinary dustup might seem like little more than another tempest in the faculty lounge teapot, and a largely contrived one at that. I no more believe that the overwhelming majority of the people who signed on with "Historians Against Trump" really meant to suggest that their academic credentials entitle them to speak more authoritatively on current affairs than others--nor do I believe that Stanley Fish actually believes it--than I believe that Fish or anyone else can make a legitimate argument that those credentials should inhibit such activity. Even if, as I suspect, this latter suggestion was offered largely as a deliberate provocation, it requires at least something of a response because, regardless of the trappings in which it might be delivered, we have never been in more urgent need of historically informed social and political commentary than we are right now.

Though they are certain to face accusations of favoritism from one side or the other, if not both, historians who venture into these waters incur no obligation to the candidates themselves. If they have done their dead-level best to offer their readers a balanced, detached view of relevant historical phenomena from which they may reach their own conclusions, scholars are not party to partisanship simply because the implications of their work prove more favorable to one aspirant than the other. The matter of what parts of the past are deemed relevant will inevitably be shaped in large part by the candidates' positions on the most salient issues of the campaign, although the obvious concerns that go largely unaddressed in the partisan sphere are still fair game in the historical arena. For example, the effects of the high tariff policies of the 1920s in fostering and exacerbating economic distress at home and abroad clearly deserve attention in light of Donald Trump's apparent disposition toward protectionism in some form and circumstances. On the other hand, there is the equally critical issue of the already enormous and still widening gaps in wealth and income that were generally blown off by the Republican administrations of the pre-Depression era and, though they loom equally portentous today, still seem closer to the margins of the current campaign than the core.

Clearly, candidates who embrace what are perceived to be extreme positions are inviting the most expansive examination of their historical antecedents, and this year's GOP nominee is no exception. Flipping through the pages of American history, one is hard pressed to find much of an upside to recurrent appeals to xenophobia, which have never ended other than badly, either for the demonized immigrants themselves or for the nation as a whole. When it comes to the politics of fear and guilt by association and innuendo, Donald Trump may still be a dive or two shy of plumbing the depths reached by red-baiting Wisconsin Sen. Joseph McCarthy in the 1950s, but it is hard to imagine McCarthy resisting a knowing wink at Trump's suggestion of a link between Sen. Ted Cruz's father and Lee Harvey Oswald.

            Trump's unfiltered addiction to the spotlight virtually mandates a search for his personal and policy precursors. This does not mean, however, that Hillary Clinton, who has, for obvious reasons, sought aggressively to minimize the exposure of her past, has earned any reprieve from the historical third-degree. Clinton, for example, has been more circumspect in her attitude toward recent controversial free-trade agreements like TPP, but like her husband, she should forever bear the yoke of the hideous NAFTA treaty, which ruined the lives of thousands of U.S. textile and apparel workers, devastated their communities, and left them crippled in their efforts to recover. Though Clinton has tried to distance herself from NAFTA, President Obama was on the mark back when he quipped that she said "great things about NAFTA until she started running for president."  It is also worth noting that Hillary's email fiasco is hardly the first manifestation of an obsession with secrecy and a desire to use it for political protection and aggrandizement. If you don't find this a troubling inclination for a presidential candidate, then you're either too old or too young to remember Watergate and the national trauma it inflicted.

Anyone cognizant of historical processes and the critical importance of the discrete contexts in which particular events and trends have played out also understands that such comparisons and analogies should be advanced as cautiously by scholars as they are received by readers. Proceeding cautiously, however, is not the same as proceeding timidly, and in this case, it is eminently preferable to not proceeding at all. Stanley Fish and others may well be content to give the last word to the old duffer in the New Yorker cartoon who allows that while "Those who do not study history are doomed to repeat it," "Those who do study history are doomed to stand by helplessly while everyone else repeats it." I trust, however, that the great majority of my colleagues will agree that no one who is possessed of a genuine historical consciousness is by any means "helpless," much less "doomed"--or perhaps even entitled--to simply "stand by" and allow whatever lessons the past affords to go not just unheeded, but unheard.

(This piece also appears on The History News Network, albeit under a less forthright and somewhat unrepresentative title.)

It's not as if the Ol' Bloviator and his long-suffering bride needed any further confirmation of Gavin Stevens's famous declaration in Requiem for a Nun that the past is neither "dead" nor "even past," but if we had, we definitely got it a few weeks back when we made the twisty trek from Lexington, Virginia, where the O.B. was teaching at the time, across the mountains to Appomattox Court House National Historical Park.

The exhibits and artifacts were impressive, but our real destination was the McLean House, where the actual surrender took place. (There is a common misconception that it happened in the courthouse itself, because at the time the tiny hamlet was known as "Appomattox Court House.") It had come to pass there because poor Wilmer McLean happened to be the first person Lee's aide, Col. Charles Marshall, encountered upon arriving in the village. When pressed about a suitable site for the surrender, McLean first offered a dusty, unfurnished building nearby that struck Marshall as not quite up to snuff for one of the most critical meetings in the nation's history. McLean then offered an on-the-spot, Medallion-miles-be-damned upgrade, the parlor of his home. Lest ol' Wilmer be seen as churlish and inhospitable, it is important to note that he had pretty good reason for reluctance in handing over his home to the Confederates, having done the same with his previous residence, near Manassas, which served as a hospital and command post for General P. G. T. Beauregard during the first major battle of the war at Bull Run. That house had taken a cannonball to the chimney during the fight, and, after seeing it and his and his wife's 1,200-acre plantation ravaged by war, he removed himself and his family some 120 miles south to the near-obscurity of Appomattox Court House, where he thought surely the war would not find them again. (Even today, soldiers approaching the town from the west might deem this a fair surmise on Wilmer's part.) Yet Wilmer McLean seemed destined to have, as he was later to say, "the war beg[i]n in my front yard and [end] in my parlor."

After the proceedings were concluded, Wilmer's coerced hospitality was rewarded with a locust-like stripping of his furnishings and even pieces of the house itself by Yankee souvenir-seekers, who took most anything not nailed down and tore out a lot that was, especially in the "surrender room," where Lee, accompanied by a single aide, sat at the desk on the left, and Grant, surrounded by several members of his staff, sat at the one on the right.

[Photo: the McLean parlor]

All in all, at 20' x 16', it seemed like a mighty tight space for such a momentous event. The carefully reconstructed courthouse, dwellings, store, etc., definitely took us back and underscored what a tiny, out-of-the-way place the village of Appomattox Court House had been in April 1865.

It had been a satisfying experience and a sobering one, though perhaps not nearly so much as the one that awaited us. As we approached Appomattox, we had at one point found ourselves in the midst of what seemed like a caravan of trucks and SUVs, all of them with humongous Confederate battle flags flapping all over the place. The O.B. remarked at the time that he hoped to hell they weren't headed to the same place we were, but they all whupped into a truck stop, and we headed on. As we neared the park, we noticed four state trooper cars with flashing bubble-gum machines along the road and several park rangers posted as if they were awaiting either a Donald Trump rally or Bonnie and Clyde in a stolen getaway car. All of this somehow told the O.B. that we had not seen the last of that ostentatious band of flaggers, and sure enough, upon exiting, as we came upon a little Confederate cemetery on the edge of the park, there they were, apparently holding some sort of rally, replete with flags whose profusion is not done justice by the photo below, taken by yours truly when we wheeled into the parking lot to get a better look.


As the O.B. stood in the parking lot a hundred feet or so from the proceedings, trying to get the widest-angle image an iPhone can deliver, he noticed the approach of a right good-sized fellow whose grim countenance and purposeful stride said he was less than thrilled by the O.B.'s attempt to capture the event for posterity. Thereupon ensued the following exchange.

He:  "What are you up to, buddy?"

O.B.: "Taking some photos."

He: "I see that." (Slight, but pregnant pause.) "Would you like to join us?"

O.B.:  "Not really. Just been over at the park and wanted to see what was up. This is public property, isn't it?"

Instead of replying, he turned away, doubtless after concluding that it would not say a whole lot for his version of southern honor if he curb-stomped a rickety old geezer six inches shorter and thirty years older than he, especially in plain sight of a couple of park policemen. The incident might have seemed less striking had we not just been hammered with the park service's emphasis on Appomattox as the place where, thanks largely to two reasonable and heroic men, America came together again. Suffice it to say, you certainly could not prove any such thing by the crowd at the cemetery, who gave little indication they were aware of what actually transpired about a half-mile to the east in Wilmer McLean's parlor.

The O.B. regrets not pressing on another 100 miles or so east of Appomattox to take in his family's first North American "home place" near Petersburg, where Ambrose Cobbs, late of Willesborough, in the South East of England, claimed his 350-acre headright grant in 1639. (Each new colonist was granted 50 acres of land for every "head" he brought, including his own. Ambrose arrived with his wife, Ann, children Robert and Margaret, and three men indentured to him in exchange for his paying their passage. Hence Ambrose was credited with seven heads at 50 acres each, for a total of 350 acres.)

Fuming about this missed opportunity to get better in touch with his family's past did spark the O. B.'s curiosity about how his ancestors fared in their early years in Virginia. Turns out that they did pretty well. Ambrose's son, Robert, and grandson Ambrose would both serve as vestrymen of Bruton Parish Church in Williamsburg, and Robert was appointed sheriff of York County in 1682.  According to historian Christine Eisel, however, Robert's rapid social and political ascent did spark some jealousy:

"In October, 1658, Elizabeth Frith Woods, along with Johanna Poynter and Elianor Cooper, plotted to post a libelous document on the Marston parish church door. As recorded by the county court clerk, Elizabeth wrote: 'Gentlemen this is to give you all notice that we have a new fine trade come up amongst us. One of our Vestrymen is turned Mirkin maker. Thomas Bromfield by name, and also his wife and goodwife Cobbs, one of our Churchwarden's wife, they make one very handsome Mirkin amongst them and sent it to ye neighbors.'

The three women maligned [vestrymen] Thomas Bromfield, Robert Cobbs (by implication) and their wives by accusing them of making mirkens. Mirken was a slang term used to describe a "pubic wig" for women.

The device was most often associated with prostitutes and sexually promiscuous women of low standing. A mirken was designed to hide the deformities that could occur from mercury treatment for syphilis and/or gonorrhea, or to temporarily replace pubic hair that was shaved due to body lice. The women did not accuse anyone of wearing mirkens; they accused them of making mirkens, an accusation that carried layers of meaning. They did not imply that the Bromfields and Cobbses engaged in loose sexual activity themselves; rather, they implied that the Bromfields and Cobbses associated with such people, who were beneath the standing of proper vestrymen and their wives. The women also implied that the Bromfields and Cobbses insulted their neighbors by sending mirkens to them. Further, Woods and her conspirators implied that the Bromfields and Cobbses were covering up some improper and ugly activity, just as a mirken was designed to cover or disguise a deformity." *

According to Eisel, the women were eventually dismissed as vicious gossips, and two of their husbands were fined a whopping 10,000 pounds of tobacco for their wives' efforts to defame my 7X great grandpa Bobby, but from the looks of it, things got hairy for a while.

*(From Christine Eisel, "'Several Unhandsome Words': The Politics of Gossip in Early Virginia," PhD dissertation, Bowling Green State University, 2012)



This little piece is a fuller version of an essay, which, in accordance with the Ol' Bloviator's quixotic crusade to better educate the Yankees on matters historical, was posted up yonder at TIME.COM.

Reflecting on recent calls to strip the name of Robert E. Lee, a slave owner who went to war in slavery's defense, from Washington & Lee University, historian Emory Thomas noted that the school's other namesake, George Washington, was also a slaveholder, and raised the awkward possibility that one of the country's most distinguished liberal arts institutions might one day be known simply as "&." Thomas spoke with tongue securely in cheek, but the scenario he posited seemed a logical, if absurd, progression of the current obsession with de-christening institutions, buildings, parks, or thoroughfares named for someone with ties to slavery. However well-intentioned such efforts may be, recent explorations by several historians suggest how truly monumental the task of rooting out connections with such an indisputably powerful, intricately pervasive, and ultimately integral institution would be.

African slave labor had been introduced on the tobacco plantations of the seventeenth century Chesapeake, but slavery's emergence as a truly dominant force in national and international commerce and finance awaited the arrival in 1793 of Eli Whitney's fabulous cotton gin, which spurred the explosive spread of cotton-growing and slavery across the southern interior and into the new southwestern states of Alabama and Mississippi. The booming southwestern cotton frontier proved an irresistible magnet for both people, free and unfree, and financial investment. Some struggling Upper South planters opted to relocate with their slaves in tow. With slave prices rising meteorically in response to soaring demand, and stoked as well by a congressional ban on further importation after 1808, many others simply consigned their increasingly valuable human property to a massive stream of bound labor destined first for the lucrative slave markets of the Southwest. Cotton accounted for nearly one-third of the value of U.S. exported merchandise by 1820, and closer to two-thirds by 1860, more than three-fourths of it going to Great Britain.

Maintaining this fibrous connection between southern slave plantations and the voracious looms of Lancashire required myriad supporting ventures in production, trade, services, and financing on both sides of the Atlantic. With the American banking system still wracked with growing pains in the early nineteenth century, English firms like Baring Brothers marketed high-yield bonds backed by the slaveholdings of planters in Louisiana and elsewhere, while profits extracted from the slave trade supplied vital capital for the nascent Barclays Bank. As the American financial system matured, a wide range of domestic banks got in on this act. Two of these, Citizens' Bank and Canal Bank of Louisiana, which accepted roughly 13,000 slaves as collateral and came to own well over a thousand slaves outright, became cogs in the great financial wheel that evolved into J. P. Morgan Chase. Likewise, Moses Taylor, director of the City Bank of New York, the forerunner of Citibank, managed the fruits of the tireless exertions of slaves on large sugar plantations and was also deeply involved in the illicit importation of slaves into Cuba.

Northern shippers also profited handsomely after 1808 from the brisk interstate trade in slaves, which saw some one million bondsmen transported by sea as well as by land from the Upper to the Lower South between 1810 and 1860. Thus it was not in New Orleans but Providence that some of the state's most prosperous and influential citizens gathered at what the local newspaper described as "a very numerous and respectable" meeting on November 2, 1835, to unanimously endorse several resolutions condemning the actions of recently formed anti-slavery societies in the free states, declaring "coercive measures for the abolition of slavery" a "violation of the sacred rights of property" and "dangerous to the existing [relations of] friendship and of business between different sections of our country." This proclamation was altogether fitting. Rhode Island had sent more than twice as many ships to Africa for slaves as all of the other colonies or states combined, many of them as part of the infamous Triangular Trade in New England rum, African slaves, and southern or Caribbean molasses and sugar. Across the region, a sizable workforce was also employed in building the vessels requisite to these activities. Although slavery was said to be the "peculiar institution" of the South, so pervasive were Boston's entanglements with it that one wonders whether, when the Lowells spoke only to the Cabots, the subject of their common ties to the slave trade ever came up.

As for New York, surely there are few cities, North or South, where so many prominent physical fixtures are tied to slavery, even down to key sports venues like Madison Square Garden, Citi Field, and the Barclays Center. These disturbing reminders are actually less incongruous than they seem. Even though the international slave trade had been illegal for more than half a century, this illicit commerce was being conducted so brazenly in the city that in 1860 the London Times dubbed New York "the greatest slave trading market in the world." This appellation seemed to trouble the city's Episcopalians less than their Anglican brethren across the water, however. More than once the convention of the Diocese of New York declined by an "overwhelming majority" even to discuss resolutions asking the Bishop and clergy of the Diocese to speak out against a practice so blatantly contrary to "the teachings of the Church" and "the laws of God."

Ironically, in an era when so much wealth was derived from pursuits directly related to slavery, the two institutions seemingly most deserving of philanthropy were churches and colleges. Surely no institution of higher learning has confronted its historical indebtedness to slavery and the slave trade more forthrightly than Brown University, whose principal early benefactors included the Brown brothers, who, operating under the name of Nicholas Brown and Company, raked in hefty profits from trading and transporting slaves. All told, at least thirty members of Brown's early governing board at one time owned or captained slave ships. Meanwhile, Tench Francis, who wrote the insurance for some of the Brown Company's slaving voyages, became one of the founding trustees of the University of Pennsylvania, whose ranks included a virtual who's who of Philadelphia's high-profile slave traders. And so it goes, from Rutgers, to Columbia, to Yale and Harvard, all of which, along with others detailed in Craig Steven Wilder's Ebony and Ivy, benefited significantly at some point from the largesse of men who owned or trafficked in human beings.

Although we might quibble about matters of degree, there is no escaping the critical role of slavery in facilitating our development as a nation. Historian Calvin Schermerhorn has it right when he calls enslaved Africans laboring in southern cotton fields "the strengths and sinews of a robust capitalist system." By maximizing the output of labor-intensive cotton agriculture in order to keep pace with the demands of mechanized textile production abroad, slavery established a vital and timely reciprocity with the Industrial Revolution that would first stabilize and then position this country for its remarkably swift journey from the periphery to the core of the world economy. Lest they exaggerate what can be achieved by simply scouring the taint of slavery from the faces of a variety of American institutions and edifices, those who propose to do so would do well to heed the words of a former bondsman featured in the title of Edward Baptist's recent book on slavery and American capitalism, for they are truly reacting to a story whose "half has never been told."


(This image may well get the job done better than the 1,500 words that follow.)

The Ol' Bloviator has never been loath to mouth off about any and all matters political, and he considers it quite the triumph of self-restraint that he is only now breaking silence on the cascading lunacy that is the 2016 presidential race. The O.B. has always considered American politics the finest comedic spectacle out there, and thus the almost ideal target for his normally irrepressible impulses to mock and ridicule. However, where earlier presidential contests have offered at least a modest challenge to those impulses, this one offers such an unbroken stream of profound ignorance, reckless stupidity, and over-the-top meanness that no one who has even walked by a TV set or a newsstand needs any help in understanding that what we are witnessing has the earmarks of a potential tragedy masquerading as epic farce.

With sincere apologies to his esteemed colleagues in political science, the O.B. can tell you without so much as a glance at exit polls that, in primary elections especially, people are more motivated not just to vote but to vote a certain way when they are angry than when they are reasonably content. This, of course, explains why a lot of folks outside the South voted for George Wallace in the 1968 presidential primaries only to drop ol' George like a hot sweet potato before heading to the polls that November. It was easy enough to interpret surging support in the polls for both Donald Trump and Bernie Sanders as indicative of just such a "blowing-off-steam-before-coming-to-my-senses" reaction among primary voters of both parties. In fact, Bernie already trails Hillary 503-70 in the scramble for the 2,383 delegates needed for the Democratic nomination. With the Super Tuesday slog through Dixie looming large and menacing, the dedicated dreamer of the impossible and impractical dream and the sturdy band of zealots who have fallen under his spell may well be looking at their last dance among the sugar plums.

Not so for the Donald, however. Indeed, not only "No," but "Hell No!" The guy whose very entry into the Republican field was lustily hooted at by every professional and amateur pundit--not to mention several hair stylists--from Harvard to Hahira is not only still standing but looking at excellent odds of being the last one doing so. Since he was edged out in the quadrennial Iowa contest to see who can cram the most "Bevs" and "Berts" into a middle-school cafeteria by the equally scary Ted Cruz, Trump, whom the Wall Street Journal can only bring itself to refer to as "the businessman," has kicked some serious booty among the wishfully disbelieving.

For months, we waited expectantly for the next in an almost daily progression of Trumpisms, each aggressively insensitive enough in its own right to make Archie Bunker seem like the Dalai Lama, to finally take him down. Meanwhile, the imperturbable Mr. T. proceeded merrily along, verbally curb-stomping his opponents while besting them first in--then largely at--the polls. Although the Republican establishment finally seems ready to act forcefully against yon Donald's threat to their party, it appears that they may have stuck with their Nero act a Virginia Reel or two too long. At this too-late date, barring a groundswell of folks desperate enough to cross the Rubio-con with Marco, or indisputable revelations of ol' Donnie's excessive fondness for farm animals--and even this is no sure thing--he stands somewhere between "quite likely" and "all but certain" to show up at the July GOP confab in Cleveland (That desperate to win back Ohio, are we?) with enough delegates in his pocket to collect the nomination on the first ballot. Since any proportionate expression of how improbable this seemed just a few months ago is impossible, the O.B. can only call upon one of his Mama's favorite maxims to suggest that somewhere, surely, the band is tuning up to play "Who'da' Thought It?"

It is tempting simply to conclude that Republicans brought Donald Trump on themselves through the tolerance, even deference, that they have increasingly shown to a polarizing array of reckless, loud-mouthed spewers of meanness and vitriol in recent years. (True to form, John Kasich, in all likelihood the most electable aspirant still in the Republican race, has been unable to get the fatal monkey of moderation off his back and is struggling simply to stay in the race until the March 15 Ohio primary, where, ironically enough, he represents one of the few feeble hopes for slowing down the Trump juggernaut.) What pleasure may be taken in seeing the Republicans being force-fed the bitter fruits of their own venality, however, must be tempered by the fact that their unscrupulousness has taken the rest of us and, for that matter, the rest of the world to the threshold of an era where rage Trumps (Sorry!) reason not just frequently but consistently and thoroughly.

Unfortunately, joining ol' Pilate at the washbowl is not an option in this case, nor is a self-righteous recusal to the moral high ground, because few of us can escape some measure of responsibility for the currently appalling state of American politics. For example, how many self-professed God-fearing Christians apparently didn't fear Him enough to step up and cry "Enough!" when His name was mocked and exploited by self-serving posturers like Jerry Falwell, Sr., and, more recently, Jr., whose endorsement of Donald Trump as a man who "lives a life of loving and helping others as Jesus taught in the great commandment" truly sickened even so hardened a cynic as the O.B. in its brazen hypocrisy. We Bible-Belters who have been hit with more than one hypothetical such as "How would Jesus be received if he visited your community tomorrow?" certainly have good reason to retaliate by asking how He might fare with today's power-brokering preacher/politicos if he came back determined to run for office with the Sermon on the Mount as his platform. How much does "blessed are the poor in spirit" or "the meek" or "the merciful" or "the peacemakers" resemble anything that ever came out of the mouth of Falwell's man Trump, or the self-styled uber-evangelical, Ted Cruz, for that matter?

Finally, there is also more than a whiff of culpability among many in the Democratic camp currently watching the Republican Rocky Horror Show with the smug, self-satisfied amusement afforded only by the miseries of an adversary. Each confident but failed prophecy that Trump's latest shot at the canon of political correctness would ultimately cost him his big toe should simply underscore the depth of the Democratic left's disconnect from prevailing popular attitudes. The casual presumption that everyone with at least a modicum of intelligence should share their views on transgender issues, any and all attempts to curb illegal immigration, buildings named for racists, sexists, imperialists, and pretty much everything else that might offend anyone but conservatives has finally grown so stultifying that many liberals in the media and (gasp!) academia have cried out for relief.

There is no doubt that Donald Trump benefits inordinately and even proudly from the support of the people whom he affectionately (for now, at least) calls "the poorly educated," as well as the folks who, as Lewis Grizzard put it, "think the moonshot's fake and wrestlin's real." (Note here surveys suggesting that nearly one in five Trump supporters remains unpersuaded that the Emancipation Proclamation was such a hot idea, and in South Carolina, nearly one in four still wishes the South had come out of the Recent Unpleasantness on top.) For all that, however, Trump's troops are actually drawn from a reasonably broad demographic, and polls consistently show him stronger among self-identified moderates than Tea Partiers or rock-ribbed conservative regulars. Some of this may be written off to Mr. T's lack of ideological consistency--his extremism is more of a selective, or even knee-jerk sort. But the point here is not simply that he is pushing a lot of the right anger buttons across a broad spectrum of Republicans, including those still registered as Democrats, but also that there are so many "hot" buttons that work in his favor.

It hardly seems necessary even to suggest that Bernie's ranks are heavily populated not only by those who are salivating for a piece of his pie-in-the-sky but by those who simply cannot stomach Hillary. Yet, even the sharpest of Mr. Sanders's jabs at his opponent seems like the thrust of a butter knife compared to the chainsaw approach Trump has thus far wielded so effectively against his rivals. Whatever happens from here on out, the fact that D.T.'s ostentatious contempt for his fellow Republicans has played so well for this long with so many of the rank-and-file cannot portend well for the GOP. To a lesser but still notable extent, the protracted dalliance with Bernie suggests that a lot of Dems don't particularly care for their party establishment either. The larger, more portentous question looming over this election, however, is not simply which party's levers get the most pulls in November but how many voters will pull either one with their other hands clamped over their noses and, beyond that, how much longer they will tolerate such a necessity.

Deutschland Meet Southland (in 1600 words)

  [Some of the Ol' Bloviator's friends in Germany asked him to boil the essence of southern history and identity down to the tidy sum of 1600 words, for the benefit of public school teachers who will be devoting an instructional unit to the American South.  This exercise in hypercompression (some might call it "bliviting") took the O.B. a lot longer than he expected and hammered home Kenny Rogers's wisdom about the importance of knowing "what to keep, and what to throw away." Needless to say, the O.B. had to do a great deal of throwing away, so please keep that in mind if you are troubled by what you don't read below. If that doesn't work, by all means, take your own shot at being a southern-fried oracle in 1600 words.]        

In the United States, the "South" can be defined in many ways, including its geography (roughly the same latitude as Spain, Portugal, and southern Italy), its relatively warm, humid climate (average high temperature above 22°C), its racial population mix (black-to-white ratio: South 30%; United States 18%), and its strong religious commitment (the highest church attendance rates in the U.S.). The most unifying characteristic of the South, however, remains its history, and the most important factors in that history are African slavery and the Civil War, 1861-1865, to which slavery was the major contributing cause. In this sense, the eleven states that went to war in defense of slavery present the most cohesive representation of the South.

Slavery flourished first in Virginia in response to the labor requirements of growing tobacco, the colony's principal crop, and spread to the rice plantations farther down the Atlantic Coast as well as the sugar plantations of Louisiana. Slaves were also employed in growing cotton, which was first confined to the warm, moist coastal areas of Georgia and the Carolinas suitable for growing the finer, long-stranded variety whose fibers could be separated from the seeds by hand.  Even with slaves doing the work, this was still a slow and arduous process until 1793, when, in the face of mounting demand from British textile manufacturers, inventor Eli Whitney perfected his cotton "engine" or "gin," a machine that could efficiently extract the seeds from the fibers, even in hardier, shorter-stranded cotton that could be grown across much of the South. In response to Whitney's invention, southern cotton production exploded from 3,000 bales (227 kg each) in 1790 to the more than 3.8 million bales that by 1860 accounted for 58 percent of the total value of U.S. exports and 75 percent of the world's cotton supply. High demand for cotton meant higher slave prices as well, and by 1860, with slaves accounting for roughly two-thirds of their wealth, southern planters were among the richest people not only in the United States, but in the entire world.

With mounting national opposition to slavery threatening their wealth and status by the end of the 1850s, slaveholders came increasingly to advocate withdrawal from the federal union even if it meant taking up arms against it. Barely one-third of southern white families owned slaves in 1861, but the ensuing death and destruction of the Civil War brought economic devastation to the entire South. Destroying slavery also meant destroying the $4 billion value attached to the slave population, leaving the region sorely lacking in the capital needed not only to rebuild southern agriculture but to finance the South's industrial development, which by the end of the Civil War lagged even farther behind that of the northern states than it had at the beginning. The lack of capital or skilled labor condemned the South to a pattern of slow industrial growth dominated by manufacturers looking to take advantage of its vast pool of cheap, unskilled labor.

Meanwhile, with actual cash so hard to come by, southern cotton production slipped into a system in which larger landholdings were divided into separate plots, each farmed by a family of "sharecroppers." Instead of wages, sharecroppers received a designated share of the proceeds from the crops they produced after charges for the supplies and food, advanced to them on credit at extremely high interest rates, had been deducted. In combination with a general decline in cotton prices, this very inefficient and often exploitive way of farming caused millions of southerners, black and white, to sink deeper and deeper into unrelenting debt and poverty. It was small wonder that per capita income in the South was barely half the national average in 1900 or that malnutrition and chronic disease were also widespread.

             Most white southerners blamed the Republican Party for the Civil War and the destruction of slavery and fiercely resisted its efforts to assist newly freed blacks. The overthrow of the last Republican state governments in 1877 not only marked the end of the "Reconstruction" era, but set the stage for the South to become a fortress of Democratic Party support for more than three-quarters of a century.  With both the region's industrial and agricultural economies heavily dependent on cheap and easily controlled labor, restoring white supremacy over the former slaves became a priority. The resulting system of economic and social repression included not only rigid racial segregation, but a variety of discriminatory restrictions that prevented the great majority of southern blacks and quite a few poorer whites from continuing to vote.

These tightly interconnected economic, racial, and political arrangements survived largely intact until the Great Depression of the 1930s brought federal incentives to reduce farm production, which, in turn, led to massive evictions of sharecroppers. World War II drew even more southerners away from farming and spurred the development of a mechanical cotton picker that reduced the need for farm labor even further. Rapidly declining agricultural employment dictated a much more aggressive campaign to bring industry to the South, and the sharp wartime increase in personal income set the stage for an influx of faster-growing, sometimes better-paying manufacturers attracted by an expanding base of more affluent metropolitan consumers.

Meanwhile, black veterans returning from World War II after fighting for democracy overseas were determined to have it for themselves back home. They played a key role in rallying support for the National Association for the Advancement of Colored People (NAACP) and its push for racial equality that led the U.S. Supreme Court to outlaw public school segregation in 1954 (Brown v. Board of Education). The ensuing campaign of public protests and civil disobedience headed by the Rev. Martin Luther King, Jr., generated the pressures necessary to prompt Congress, with considerable prodding by President Lyndon B. Johnson, to pass legislation prohibiting racial discrimination by employers or public businesses and aggressively guaranteeing the voting rights of southern blacks. The Voting Rights Act of 1965 led to widespread black voting (for the Democratic Party), and the South soon led the nation in the number of blacks holding elected office. On the other hand, after liberal northern Democrats joined President Johnson in calling for the new civil rights measures, a majority of white southerners abruptly switched their allegiance to the more conservative Republican Party, although in recent elections Democratic presidential candidates have regained strength in states like Virginia and Florida, which have attracted many new residents from outside the region.

Some parts of the South have enjoyed remarkable economic progress since the 1960s, as rising global competition encouraged more northern industrialists to move their production facilities to the South, which still offered the lower labor and other operating costs that also spurred investments by a number of international manufacturers, including German automakers BMW and Mercedes. The South now boasts twenty metropolitan areas with populations of 1 million or more. Yet there are many pockets of enduring poverty, especially in rural areas with heavily black populations. Eight of the ten poorest states are in the South, which also lags behind most of the rest of the U.S. in categories like support for public education and public health and leads in the incidence of health problems like obesity, diabetes, and susceptibility to strokes and heart disease.

As poor as it might be in certain respects, the South is undeniably rich in culture. Although its original white settlers came from Great Britain and Western Europe, its cultural heritage was also shaped by Native Americans and enslaved Africans, who brought with them a rich bounty of foods, spices, and cooking techniques. The South's famed barbecue derives from "barbacoa," a technique for slow-cooking and roasting meat likely adopted from the native peoples of the Caribbean and West Indies by slaves, and the whites who held them there, before they were brought on to the southern colonies. Slaves were largely responsible for the pepper and vinegar sauces spread across the barbecued meat, although later German immigrants to the Carolinas insisted on a mustard-based sauce. Finally, "grits" became a fundamental staple of the southern diet after Native Americans were observed soaking ground corn in a mixture of water and ashes prior to boiling in order to unlock its full nutrient content.

No element of the South's culture has had more influence on the culture of the U.S. and other nations than its music. While the ballads and fiddle tunes brought by British settlers provided the foundation for what would become country music, the work songs and field hollers that were a vital part of the slaves' African heritage formed the basis of the blues. These musical forms did not always respect the South's racial divisions. There was more interaction than many realized as both the blues and country music grew more commercialized and, as members of both races left the farm in droves, more urbanized as well. When local radio stations and recording studios in cities like Memphis and New Orleans began to feature the work of both black and white performers after World War II, the closer contact and familiarity bred the revolutionary new sound that would become "rock 'n roll." Elvis Presley quickly won an enormous youthful following as a white singer who sounded "black," but if he succeeded by borrowing heavily from black stylings, he also helped to open the door to white audiences much wider for a host of black performers ranging from Little Richard to Chuck Berry.

            If the South gave America its most characteristic music, its writers also contributed some of its greatest literature. The region's striking racial and economic disparities and injustices and its stark extremes of religious piety and violent cruelty helped to fuel the creative instincts of a host of brilliant writers, white and black, from William Faulkner, Thomas Wolfe, and Carson McCullers to Langston Hughes, Richard Wright, and Alice Walker.  In the end, however, like the southern people themselves, more than anything, the works of these writers reveal a common struggle with the enduring presence of a past that, for them, as Faulkner writes, is "never dead" and "not even past."


A Note from the Ol' Bloviator: This is an updated version of a piece that appeared last week up yonder at TIME.COM.

A court hearing in New Orleans last week focused on one of today's thorniest issues, the role and representation of the past in everyday life: Should officials go ahead with last month's city council decision to relocate three statues of Confederate leaders and a memorial to an uprising against Louisiana's Reconstruction-Era Republican government? Or are heritage and preservation groups justified in their claim that moving the monuments would not only violate both the constitution and state law, but also destroy the "integrity of the historic landscape of New Orleans"? At this stage, the presiding judge seemed skeptical that those trying to block the move could show compelling legal grounds for their position, but it is all but certain that this tilt is far from over.

The impulse behind the decision to move these offending symbols, like similar efforts elsewhere, is certainly understandable, especially in the wake of last summer's horror in Charleston, S.C. Yet, however problematic public monuments to the defenders and beneficiaries of slavery and Jim Crow might appear today, we would do well to consider an enduring human reality easily as troubling to historians as it is reassuring to politicians: People forget--quickly, and with minimal encouragement--especially when it makes them feel better and divests them of obligations they are eager to shed. It is more tempting to hide what's painful than to confront it, but rarely does this path of lesser resistance take a nation or society where it really needs to go.

For example, as recently as half a century ago, U.S. history textbooks attempted to soften and rationalize the subjugation of African Americans, Native Americans, immigrants and other minorities. Meanwhile, Soviet Premier Nikita Khrushchev was relentlessly toppling statues and renaming buildings and even cities in hopes of purging the public memory of the brutal and embarrassing horrors visited on the populace by former premier Josef Stalin. In these and any number of similar cases stretching back to antiquity, efforts to make the past less troubling or more inspiring have left untold generations uninformed about the realities of their history or its ramifications for the present.

As a case in point, the history behind the controversial monuments in New Orleans is far more complicated and far-reaching in its consequences than many Americans realize. Most memorials of that type were not constructed immediately after the Civil War, but at least a generation after the final overthrow of Reconstruction in 1877, and they served not simply to reaffirm the rightness of the South's "Lost Cause," but to rally white southerners to a new campaign to restore their racial supremacy. As the 19th century drew to a close, the move to monumentalize the Lost Cause went hand-in-hand with campaigns for segregation and disfranchisement that, replete with incendiary rhetoric, more than once fueled outbreaks of mass violence against blacks, including the infamous Wilmington, N.C., riot of 1898. The principal instigator of this racial pogrom, which left at least two dozen blacks dead, was ex-Confederate and former congressman Alfred Moore Waddell, a vigorous proponent of erecting monuments to the state's "fallen sons," who also warned that the only means of fully securing their heroic legacy was stripping black men of the vote by any means necessary even if "we have to choke the Cape Fear [River] with carcasses."

To some, this additional layer of interpretation might seem only to make these already painful symbols even more so. To others, however, a fuller understanding of the breadth and complexity of their implications could make them seem less like taunting reminders of victimization and more like powerful testaments to the enormous adversity black Americans have been forced to surmount. Properly annotated and situated away from official government buildings and grounds where citizens would be forced to encounter them, some of these monuments might ultimately serve a valid educational and even socially redemptive purpose. Some historians and preservationists argue that, rather than concealing them, we should simply do a better job of explaining them. The National Trust for Historic Preservation's Stephanie Meeks has noted that the organization believes "we actually need more historic sites properly interpreted, to help us contextualize and come to terms with this difficult past." Atlanta History Center President Sheffield Hale and his colleagues have actually drafted a template for supplementing the original information on a Confederate monument with language linking the Lost Cause to efforts to restore white supremacy, and pointing out that "celebrations of the Lost Cause often went hand-in-hand with campaigns to enact laws mandating 'Jim Crow' segregation and disenfranchising African American voters which also sparked racial violence, including lynching, well into the twentieth century."

The mirage of a post-racial America has long since evaporated, but it is nonetheless important to recognize that the contemporary grievances of today's minorities are deeply rooted in a long history of discrimination and imposed disadvantage. Presented in broader historical perspective, monuments like those in New Orleans could contribute to that recognition. As historian David Blight noted in response to those who find it upsetting that Yale's Calhoun College bears the name of a prominent defender of slavery and advocate for slaveholding interests, "The past should really trouble us. I don't want the past to ever make us feel good." Ta-Nehisi Coates seemed to feel much the same when he explained, "I don't know if I want to forget that, at some point, somebody was crazy enough to have a monument to Nathan Bedford Forrest. That's a statement about what society was. That shouldn't be forgotten." Yet, with black anger and frustration clearly approaching the boiling point and the Supreme Court seemingly poised to deliver the final blow to race-based college admissions policies crafted initially to counter the effects of centuries of systematic racial oppression, instead of acknowledging the enduring consequences of an unjust past, we seem increasingly intent on sweeping aside many of the most vivid reminders of why there is still so much to overcome.

Blasts From Christmases Past

(This post from 2009 actually originated in 2005 and has passed through several iterations, which, taken together, offer a bittersweet chronicle of how much things have changed--and how little.)

Although his ode to "Trees" was the first piece of verse committed to memory by several generations of American schoolchildren, Alfred Joyce Kilmer had a lot to overcome, including the fact that his parents chose to identify him by his middle name. After surviving what, one presumes, were dozens of playground brawls about his moniker, Kilmer had the further misfortune to become a poet whose work not only made sense but actually rhymed. This, of course, amounted to the kiss of death among literary critics, so much so that the effete highbrows at his alma mater, Columbia, now pay homage to him with an annual "Bad Poetry Contest."

As I first did some four years ago, I beg to offer Joyce Kilmer's "Kings," which might not be great poetry but still strikes me as damn good and ironic insight, worthy both of the immediate season and the times in which we live:


The Kings of the earth are men of might,
And cities are burned for their delight,
And the skies rain death in the silent night,
And the hills belch death all day!
But the King of Heaven, Who made them all,
Is fair and gentle, and very small;
He lies in the straw, by the oxen's stall 


I posted this verse in 2005 as part of a critique of a warrior president who seemed to believe he had been elected king. Now, here it is again, even as the winner of the Nobel Peace Prize who is our new commander-in-chief orders the escalation of American involvement in Afghanistan on the premise that this is the best way to achieve peace in that region. To invoke an old analogy, fighting for peace strikes me as about as efficacious as fornicating for chastity, and any "peace" achieved by wielding the proverbial big stick is likely to last only until the other guy finds a bigger stick.

(The following are excerpts, first from the 2010 follow-up, after the birth in May of our precious grandson, Barrett, and then from 2011, anticipating the impending arrival of our equally precious granddaughter, Virginia.)

2010: When I read constantly about our courageous young men who are being killed or horribly maimed every day in Afghanistan, I can't help but question the reasons behind such sacrificial slaughter and remember that many of these young heroes are not even two decades removed from being the warm, cuddly, infinitely curious and wide-eyed little boy I can't wait to hold as close as I can for as long as I can....

2011: Barrett remains all those things, although he is now fully ambulatory and picking up new words ("Careful, Grandpa OB!") at the rate of about one per minute. He has no idea, of course, that, God willing, at the tender age of twenty months, he will soon be assuming the awesome responsibility of being big brother to a newly arrived little sister. Thus, bless his heart, this stands to be his last year as the only star in the Christmas firmament for his doting and utterly devoted parents and grandparents.

(The OB is bustin' proud to report that Barrett assumed the mantle of Big Bro-hood like a champ, and that Virginia only made that firmament even more dazzling. Ten years after the initial posting, however, the senseless slaughter in the Middle East continues.)


[This would have saddened but doubtless not surprised] Joyce Kilmer, who knew about these things, after all, for he served in the vaunted "War to End All Wars" until the summer of 1918, about a year after he wrote "Kings," when he was killed by a sniper at the Second Battle of the Marne.

Obviously, the Ol' Bloviator is in a bit of a somber mood right now, but he hasn't forgotten that this is supposed to be a season of hope and good cheer, and it is in that spirit that he presents the second [now eighth] annual Redneck Festival of Lights (mash below), as may be witnessed any evening these days in front of the humble abode that he shares with the long-suffering Ms. OB, who, needless to say, both enjoys and deserves the deepest sympathies of the neighbors. If you can't come by to admire the Ol' Bloviator's artistry firsthand, let me wish you the happiest and safest of holidays. In other words, as they used to say in the country, "Have a good'un," or as they still say over at Ga. Tech, "Felice Bobby Dodd!"

[Video: truck 2015.MOV]



A lot has happened since the Ol' Bloviator was called away from these cyber-pages to attend to numerous loose ends, pertaining both to his research and to the rules and regs governing his imminent retirement from his long-suffering employer and alma mater (3x), as well as to the utterly chaotic foreplay leading up to getting laid low by the USG's default on its health care promises to its employees. (More to follow on this soon.) Among the things that have been crying out for his unsolicited commentary is the raging controversy within the academic realm over whether freedom of expression in practically any form must be restricted to avoid giving offense to various racial, ethnic, cultural, and sexual-orientation groupings within the student population. The following is the O.B.'s take on this matter as it might be viewed through the eyes of one of the nation's leading champions of free speech in the twentieth century, the late historian C. Vann Woodward. It is a slightly updated version of a piece posted on the History News Network.


When it appeared in 1975, Yale's Free Speech Policy made such a forceful case for the absolute necessity of protecting free expression on campus that it was quickly adopted as a model for a number of other universities. Forty years later, however, events at Yale and elsewhere demonstrate that many of the old certainties about the nature and primary importance of free speech in the academic arena are anything but settled. Yale's policy was once better known as "The Woodward Report," in reference to C. Vann Woodward, the distinguished historian and public intellectual who chaired the committee charged with drafting the report. The association was indisputably fitting, for in addition to his role as an early and ardent crusader against racial injustice and exclusion, Woodward had been a champion of free speech and dissent since his young adulthood. In 1930, at age 21, fresh out of Emory University and teaching English at Georgia Tech, he became one of 62 signatories to a petition protesting arbitrary arrests and police harassment of communist spokespersons in Atlanta and demanding they "should be protected in their constitutional rights of free speech and assemblage." Two years later, he helped to mount a defense effort for Angelo Herndon, a young black communist organizer who was arrested and imprisoned on charges of "inciting insurrection" under an obscure Georgia law dredged up from the Reconstruction era. Woodward would again risk his job and reputation by stoutly affirming the loyalty of embattled German-born faculty at Scripps College, where he was teaching at the beginning of World War II.

These and other such activities presaged Woodward's prominent role in supporting his Johns Hopkins colleague, international affairs expert Owen Lattimore, whose tolerant views on the Soviet Union led Senator Joseph R. McCarthy to condemn him as "the top Soviet espionage agent in the United States" in 1950. Though McCarthy and countless others urged Johns Hopkins administrators to fire Lattimore, Woodward was in the front ranks of a faculty cohort who succeeded in persuading the Hopkins higher-ups to retain him even after he was indicted by the Justice Department in 1952 and up until he was finally cleared of all charges in 1955.

The Lattimore case, like many others born of the Cold War anxieties of the 1950s, fell into a general pattern stretching back to the Early National Era, in which individuals or groups who challenged the prevailing verities or the practices of the reigning political majority provoked determined efforts to silence them, either through the resources of government or by any other coercive means available. The next decade, however, would bring a striking new twist to free-speech debates and conflicts, especially in the academic realm, where it would frequently be those bent on fundamental alteration of the system who sought to silence its increasingly outmanned defenders.

Even as a champion of free speech and dissent, Woodward drew the line at what he saw as the unreasonable demands and bullying tactics of militant black students and anti-Vietnam War activists who succeeded in shutting down some universities for days at a time in the 1960s and early 1970s. On the other hand, he thought periodic disagreements, even potentially volatile ones, sparked by the expression of controversial or unpopular views were vital rather than damaging to maintaining a vibrant, energized campus intellectual environment. He was more than a little dismayed, then, in September 1963, when then-provost and acting president Kingman Brewster persuaded a student organization to rescind its invitation to Alabama Governor George Wallace to speak at Yale. Brewster would later be named president in his own right, but he was clearly chastened by the backlash against his use of his office to restrict free speech on a campus where it was supposed to have been such a hallowed tradition. Though Woodward had been at Yale barely a year at that point, he had not hesitated to let Brewster know of his disapproval. Nearly a decade later, he would be, if anything, even more upset when student protestors were allowed to physically prevent General William Westmoreland from taking the podium in 1972, and again two years later when they succeeded in shouting down a debate featuring physicist and black-inferiority theorist William A. Shockley. In the wake of the Shockley debacle, Brewster asked Woodward to chair a committee to draft a policy that would reaffirm "the principles of free speech" at Yale.

Unlike many documents constructed by committee, the Woodward Report would have both an immediate and a lasting impact, owing in no small measure to its eloquent and compelling argument for free speech as the absolute and inviolable principle by which all universities worthy of the name must abide: "The history of intellectual growth and discovery clearly demonstrates the need for unfettered freedom, the right to think the unthinkable, discuss the unmentionable, and challenge the unchallengeable. . . . We value freedom of expression precisely because it provides a forum for the new, the provocative, the disturbing, and the unorthodox."

Such assertions seemed very much in tune with the spirit of an era of zealously composed and just as zealously ripped down bulletin-board treatises and competing bullhorns echoing across college campuses. Although the report's authors conceded that "if a university is a place for knowledge, it is also a special kind of small society," they ultimately concluded that "it cannot make its primary and dominant value the fostering of friendship, solidarity, harmony, civility, or mutual respect" and remain true to "its central purpose." Indeed, they added for good measure, "It may sometimes be necessary in a university for civility and mutual respect to be superseded by the need to guarantee free expression."

            Woodward remained true to this principle in 1986, when he took up the cause of Wayne Dick, a student who had been placed on probation for posting flyers that mocked "Gay and Lesbian Awareness Days" by announcing "Bestiality Awareness Days." Dick had cited the Woodward Report in his defense, and his new champion explained that "certainly I don't agree with his ideas, but they all come under the protection of free speech." Yale's executive committee agreed to a re-hearing of his case and cleared Dick after Woodward recruited several influential witnesses to testify in his behalf, including Yale's Law School dean, who conceded that Dick's actions were "tasteless, even disgusting" but allowed "that's beside the point. Free expression is more important than civility in a university."

This point of view did not go unchallenged at Yale or elsewhere, even in 1986, and, needless to say, it can hardly be said to hold sway today, when protestors at Amherst are demanding "extensive training for racial and cultural competency" and possible disciplinary action against fellow students who had posted placards upholding "Free Speech" and declaring "All Lives Matter." Likewise, it is difficult to imagine anything farther from the ideals expressed in the Woodward Report than the recent viral video from Yale itself, in which a student berates Professor Nicholas Christakis, master of Silliman residential college--and implicitly, his wife, the associate master--for failing to endorse the campus intercultural affairs committee's call for students to avoid potentially offensive Halloween costumes. Their job, she insists, is not to create "intellectual space" but "a place of comfort and home." Student demands for the couple's ouster at Silliman have yet to bear fruit, but it seems a fair bet that Dr. Christakis's decision to take a sabbatical next term and Ms. Christakis's plan to step away from her role as a lecturer at Yale represent something more than your old everyday, garden-variety coincidence.

            Woodward himself seemed to anticipate some of the current conflicts as early as 1989, when he observed that, while it was "majority opinion" that had been offended by Shockley's appearance at Yale in 1974, fifteen years later it was "mainly minority groups that fe[lt] offended by unrestricted free speech." In condemning "opinions and speech held repugnant or offensive" as "harassment," Woodward thought, minority spokesmen were resorting to "much the same rhetoric of shock and anger" once leveled at "the public sentiments of Professor Shockley and General Westmoreland."

The problem with Woodward's ironic observation was that he was comparing the feelings of an undifferentiated campus majority in the Westmoreland-Shockley cases of the early 1970s to the feelings of a sharply defined racial minority at the end of the 1980s. This discrepancy reveals much about the aims and expectations of the dedicated liberal crusaders of Woodward's generation. Speaking out forcefully against racial injustice at a time when it really meant something to do so, their goal was integration rather than racial or cultural diversity, which, rather than an end in itself, was for them more a stage in a larger process of assimilation. Intellectual diversity was another matter entirely, however, for they had dedicated much of their lives to toppling the tyranny of majority opinion and defending its victims. It was hardly surprising, then, that their ideal campus was one where the free expression of ideas mattered above all and racial or cultural distinctions and the attendant sensitivities mattered progressively less.

To say the least, Woodward and others of his cohort failed to account for such possibilities as racial, ethnic, cultural, and sexual-preference minorities actually embracing and demanding respect for identities that they had been expected to lose to the swift currents of the social mainstream. At any rate, the Woodward Report's insistence that freedom of speech on college campuses is not a debatable proposition rings true these days only because, as Professor Shockley learned at Yale, it is impossible to debate anything in the midst of a shouting match, in this case, between those seeking to bolster old protections for free expression, and those (currently enjoying the decibel advantage) demanding new protections from it.



"To orate verbosely and windily."

Bloviate is most closely associated with President Warren G. Harding, who used it frequently and was given to long-winded speeches. H. L. Mencken said of Harding:

"He writes the worst English that I've ever encountered. It reminds me of a string of wet sponges; it reminds me of tattered washing on the line; it reminds me of stale bean soup, of college yells, of dogs barking idiotically through endless nights. It is so bad that a sort of grandeur creeps into it. It drags itself out of the dark abysm of pish, and crawls insanely up the top most pinnacle of posh. It is rumble and bumble. It is flap and doodle. It is balder and dash."

Cobbloviate dedicates itself to maintaining the high standards established by President Harding and described so eloquently by Mr. Mencken. However, the bloviations recorded here do not necessarily reflect the opinions of the management of Flagpole.com, nor, for that matter, are they very likely to be in accord with those of any sane, right-thinking individual or group anywhere in the known universe.
