THIS APRIL SHOWERED US WITH HISTORY

(April may not always be "the cruelest month" for historians, but with the wind-down of the Civil War Sesquicentennial observance, this surely has been one of the busiest on record.  What follows is a modified version of a piece that the Ol' Bloviator did for TIME.COM on April 9, the 150th anniversary of Robert E. Lee's surrender at Appomattox.)

Confederate leaders may have believed they had built a unified nation when they framed a new government and sent their troops off to war with hearty assurances of a quick and glorious victory in 1861. Amid the centennial observance of these events, however, Robert Penn Warren would suggest that a sense of common southern identity had actually been "born" only on April 9, 1865, when "Lee handed Grant his sword" at Appomattox. Indeed, even in the wake of Fort Sumter, many enlistees had vowed to fight only "in defense of Virginia" or "my home state," and some even restricted their allegiances to "the loved ones who call upon me to defend their homes from pillage."

            The challenge of instilling new national loyalties in a population whose regional loyalties were in many cases still suspect loomed even more daunting because Confederate identity would have to be constructed on the fly. The delegates gathered in Montgomery in early February 1861 managed to draw up the constitutional and governing framework of the Confederate States of America in only five weeks, but in scarcely five more weeks, their brand-new nation-state would be plunged into a war that many of them had persuaded their constituents--and perhaps even themselves--would never come.

Reluctant to acknowledge the hard truth of their own Vice-President Alexander Stephens's declaration that slavery was the fundamental "cornerstone" of their new nation's existence, Confederate propagandists were reduced to the none-too-compelling rationale that they were not actually repudiating the Union but seeking simply to restore what they saw as its founding principles of state sovereignty and federal restraint. The Confederacy's identity and persona would thus be assembled from recycled components, largely appropriated from the nation its people had just abandoned and would soon be fighting. Not the least of these was a constitution that, save for a wrinkle or two, was basically a replica of the one Confederates had, until quite recently, been swearing to honor and protect since 1789. Having co-opted the founding document of the nation they were leaving, they also seized on its founding father by placing a likeness of George Washington on the Great Seal of the Confederacy as well as on its currency, bonds, and postage stamps. Finally, in the "Stars and Bars," with its circle of white stars on a blue field set alongside broad bars of red and white, the Confederates adopted a national flag whose pronounced resemblance to "the Stars and Stripes" quickly proved it unsuitable for the battlefield. In view of all this symbolic copycatting, the number of southerners who actually continued to celebrate July 4th well into the hostilities seems a bit less surprising.

Although secessionist firebrands like Henry Lewis Benning had led their fellow southerners out of the Union under the banner of "state rights," their new government was actually no more a "confederacy" (and perhaps even less so) than their old one, but rather the "consolidated Republic" dominated by slaveholders that Benning had envisioned from the start. Even early on, when the military effort was going well, the exigencies of wartime demanded centralized control of production and distribution, leading quickly to complaints over shortages, inefficiency, and corruption. Further inflamed by obstructionist politicians like Georgia's Joseph E. Brown, the Confederacy's internal critics would grow exponentially more strident and intemperate as the tide of war turned.

            One enduring constant, however, was widespread public affection for the outmanned, under-supplied Confederate fighting men, whose valor and resilience quickly commanded on the home front the enduring loyalty that the Confederacy as a political entity had failed to elicit. This shift in allegiance came through in the popular habit of displaying, not the national flag, but the starred St. Andrew's Cross that had supplanted it on the battlefield, where, General P. G. T. Beauregard noted, it had been "consecrated by the best blood of our country." For all its inspirational value, however, the battle flag itself conveyed little sense of attachment, either to an unpopular government or to any cause other than military success.

Within days of Lee's surrender, poet-priest Father Abram Ryan immortalized that now "Conquered Banner," which, though furled at Appomattox, was "wreathed around with glory" and destined to "live on in song and story." Like General John B. Gordon's equally sentimental first-hand account of the striking of the colors that day and, for that matter, like the colors themselves, Ryan's weepy ode would be pressed into service repeatedly in the years to come in order to rally white southerners yet again to the defense of their racial institutions. This time, however, instead of the decidedly parochial, localist constituency that had confronted them in 1861, postbellum southern nationalists could draw on the experience and legacy of men who had not only fought shoulder to shoulder with comrades drawn from distant states but in many cases traversed a vast region inhabited by people whose lives and values seemed strikingly similar to their own. As a consequence, these men and their descendants, noted W. J. Cash in his 1941 classic, The Mind of the South, were now more likely to respond to the word "southern" with an emotion once reserved solely for "Virginia, or Carolina, or Georgia." If, strictly speaking, the birth of the South as what Cash called "an object of patriotism" had actually come somewhere in the course of a fierce, four-year "conflict with the Yankee," Warren would surely have been justified nonetheless in dating its confirmation to the ceremonial acknowledgement of the Confederacy's bitterly painful but ultimately unifying failure that, 150 years ago, marked this month at Appomattox.

Five days later, of course, Grant's commander-in-chief would be dead, a fact that further boosts the historic import of this particular April because it seemingly impinges so heavily on the long- and short-term meaning and consequences of what happened at Appomattox, though not nearly so much, the OB submits, as the counterfactual scenario offered by James Thurber in "If Grant Had Been Drinking at Appomattox":

The soft thudding sound of horses' hooves came through the open window. Shultz hurriedly walked over and looked out. "Hoof steps," said Grant, with a curious chortle. "It is General Lee and his staff," said Shultz. "Show him in," said the General, taking another drink. "And see what the boys in the back room will have." Shultz walked smartly over to the door, opened it, saluted, and stood aside.

General Lee, dignified against the blue of the April sky, magnificent in his dress uniform, stood for a moment framed in the doorway. He walked in, followed by his staff. They bowed, and stood silent. General Grant stared at them. He only had one boot on and his jacket was unbuttoned.

"I know who you are," said Grant. "You're Robert Browning, the poet." "This is General Robert E. Lee," said one of his staff, coldly. "Oh," said Grant. "I thought he was Robert Browning. He certainly looks like Robert Browning. There was a poet for you. Lee: Browning. Did ya ever read 'How They Brought the Good News from Ghent to Aix'? 'Up Derek, to saddle, up Derek, away; up Dunder, up Blitzen, up, Prancer, up Dancer, up Bouncer, up Vixen, up -'".

"Shall we proceed at once to the matter in hand?" asked General Lee, his eyes disdainfully taking in the disordered room. "Some of the boys was wrassling here last night," explained Grant. "I threw Sherman, or some general a whole lot like Sherman. It was pretty dark." He handed a bottle of Scotch to the commanding officer of the Southern armies, who stood holding it, in amazement and discomfiture. "Get a glass, somebody," said Grant, .looking straight at General Longstreet. "Didn't I meet you at Cold Harbor?" he asked. General Longstreet did not answer.

"I should like to have this over with as soon as possible," said Lee. Grant looked vaguely at Shultz, who walked up close to him, frowning. "The surrender, sir, the surrender," said Corporal Shultz in a whisper. "Oh sure, sure," said Grant. He took another drink. "All right," he said. "Here we go." Slowly, sadly, he unbuckled his sword. Then he handed it to the astonished Lee. "There you are. General," said Grant. "We dam' near licked you. If I'd been feeling better we would of licked you."

NB: John Wilkes Booth was not exactly a teetotaler himself, as befits the only presidential assassin whose name adorns a cocktail. Makes you wonder what might have happened had he knocked back a few extra before proceeding to Ford's Theater on the fateful evening of April 14, 1865.

THE NAMING GAME

One of the many undertakings that have kept the Ol' Bloviator away from these cozy confines of late was an effort to hack his way through a veritable genealogical jungle, an undertaking sometimes rendered all the more frustrating by the misleading trail markers left by genealogists intent on loading their family trees with as much high-end fruit as possible. This problem was particularly pertinent to the O.B.'s enterprise because it involved sorting out two strands of his own family. One of these was the prosperous and influential Cobbs of Athens, Georgia (whose most notable figures, Howell and T.R.R. Cobb, played key roles in leading Georgia out of the Union in 1861), and the other, his branch, who settled about forty miles up the road in Hart County, where they embodied that class of whites that Populist Tom Watson called "the horny-handed sons of toil." In addition to dramatic disparities in wealth and lifestyle, one of the things the O.B. found most striking was that while his male forebears specialized in taking brides from the Smiths and the Sullivans, the Athens Cobb line offered union after union with the wealthy, powerful, near-dynastic clans of Ol' Virginny. Well before Howell Cobb and his younger brother T.R.R., a.k.a. Tom, ventured onto the sea of matrimony in the 1830s and 1840s, from their great-grandfather down through their father and uncles before them, their male forebears had made in-laws of some of the oldest and foremost families of the Old Dominion. Marrying only into blood at least as good as your own, if not better, pretty much obligated you to advertise all those bona fides surging through your bloodline by giving your offspring first and middle names that flaunted these prestigious monikers. If you accompany the O.B. on a brief genealogical saunter, he will show you why the Athens Cobbs are a primary case in point. George Reade, who came to Virginia in 1640 and served on the Royal Council for the colony, begat Thomas Reade, whose daughter Mildred married wealthy land- and slave-holder Maj. Phillip Rootes. Their grandson, Thomas Reade Rootes, became a prominent jurist and politico, whose spawn included Martha Jacquelin Rootes, who married Howell and Tom Cobb's uncle, Howell, and Sarah Robinson Rootes, who married their father, John Addison Cobb. Needless to say, Howell and Tom eagerly embraced this "marry-up" legacy. As an aspiring jurist, Tom Cobb didn't exactly injure his prospects by his union with the daughter of soon-to-be state Supreme Court Justice Joseph Henry Lumpkin. Ditto and more so for Howell, who, whether by design or good fortune or most likely both, managed to win the affections of Mary Ann Lamar. Miss Lamar was not only strikingly attractive but co-heir, with her brother and Howell's college chum, John Basil Lamar, to her father's estate, which, at his death in 1832, three years before she plighted her troth to Howell, included 220 slaves and more than 15,000 acres down in Baldwin and several adjacent Georgia counties.

Since, then as now, money also means never having to explain how your kid got his or her name, in this case the deadly combination of inbred wackiness and great wealth led to the grandiose likes of Lucius Quintus Cincinnatus Lamar (kind of makes Thomas Reade Rootes seem to roll easily off your tongue, doesn't it?), Mirabeau Bonaparte Lamar, Lavoisier LeGrand Lamar (at least it's alliterative), and my personal favorite, Gazaway Bugg Lamar, which is actually at least in part a mash-up of family names, for the Virginia Buggs were entwined with both the Lamars (through the union of Basil Lamar and Patience Bugg) and the Cobbs (through the union of the aforesaid T. R. R.'s uncle, Henry Willis Cobb, and Obedience Dutiful Bugg). Given their surnames, it is probably safe to assume that the Bugg girls grew up with minimal exposure to feminist doctrine. (Preachy and stultifying as the naming practices of the Buggs may seem, they seem positively light-hearted compared to those of the Puritans. There was ol' "Praise-God" Barebone, who hung "If-Christ-had-not-died-for-thee-thou-hadst-been-damned" Barebone on his son. Not without cause, the O.B. maintains, was H. L. Mencken so fond of saying, "Show me a Puritan, and I'll show you a son-of-a-bitch.")

   We rednecks have taken our unfair share of ridicule for our supposedly incestuous proclivities. Jeff Foxworthy to the contrary notwithstanding, however, it was the squirearchy's penchant for continually re-fortifying the family creds at the altar that actually made their reunions prime opportunities to check out cousins by the dozens, undeterred by the very real prospect that the flesh you craved might already be your own. Margaret Mitchell was not just 'whuffin' about Ashley and Melanie; the best calculations we have suggest that, among the South's wealthiest planter families, slightly more than one in ten marriages made husband and wife of a pair of cousins. Such unions could produce circumstances that were not infrequently awkward and sometimes just plain weird. In the case of Howell and Tom, consider that their mother's sister, Martha Jacquelin Rootes, was first married to their Uncle Howell and then to Henry Jackson. This latter union produced Henry Rootes Jackson, who once again kept it in the family by marrying his cousin, Tom Cobb's daughter, Sarah Addison, whose great-aunt suddenly became her mother-in-law as well. Elsewhere, my Athens Cobb kinsman, Milton Leathers, recounts an incident stemming from the marrying and, for want of a better term, inbred naming habits of two historically prominent local families, the Billupses and the Phinizys. It seems a young Phinizy descendant was taking his date to visit his grandmother. As they approached the house, "Cudd'n" Milton reports, the boy cautioned his date that "my grandmother has a real funny name. She's Mrs. Billups Phinizy." Clearly startled, the young lady responded, "That is really amazing. Because MY grandmother is Mrs. Phinizy Billups!"

Although the pattern of intermarriage among the South's self-styled aristocratic families might seem to evoke that Homer and Jethro classic, "I'm My Own Grandpa," these entwinements served to consolidate and concentrate wealth by assuring that, insofar as possible, capital flowed within a tightly interconnected circle. In our day and time, some of these folks may even have married themselves into an anti-trust suit. The same could be said of social and political pull, of course. Every time Thomas Reade Rootes Cobb affixed his signature to a document or letter, he was flashing his ancestral bling, courtesy of a name that evoked not one but two powerhouse Virginia families and thus conveyed an almost hereditary right to rule. The problem here was that born-on-third-base types like Tom actually internalized the aura they sought to project, rendering them infallible and invincible in their own minds as well. Brigadier General Thomas Reade Rootes Cobb would be terminally disabused of this notion at Fredericksburg, in December 1862, by a minié ball or piece of shrapnel that didn't give a damn about his bloodline but played hell with his blood flow. Having assured his fellow Georgians that secession would not even lead to war, much less to desolation and defeat, Tom Cobb had paid for his arrogance and presumption with his life. Unfortunately, however, by April 1865, the same could be said for roughly 750,000 of his fellow Americans.


(A somewhat briefer version of this piece appeared last week at TIME.COM on MLK Day, on which date its author was actually giving a talk on Robert E. Lee to a wonderful audience at Washington & Lee. Go figure the odds on that. Confirmed masochists may view the video of the talk here, beginning about 19 minutes in.)


The Auburn Avenue neighborhood where Dr. Martin Luther King, Jr., was born in January 1929 was a spatial and human embodiment of Atlanta's paradoxical reputation for both strict racial segregation and black economic success. Noted journalist and renowned apostle of the "New South," Henry W. Grady may have strained the credulity of his New York audience in 1886 when he insisted that he bore no resentment toward his beloved Atlanta's arch-nemesis, General William Tecumseh Sherman, but Grady's claim that "from the ashes he left us... we have raised a brave and beautiful city" was more than the idle boast of a shameless booster. Atlanta's speedily restored railroad connections and postbellum emergence as the Southeast's principal trade and transportation hub all but assured its magnetic allure. By 1900 it was home to 90,000 people, more than a third of whom were black. A bloody race riot in 1906 left at least a dozen and quite likely more black Atlantans dead, yet--with the city's "Forward Atlanta" crusade for economic growth proceeding apace--the city's black population continued to swell. It stood at 90,000 by the time King was born into a well-established black middle class of merchants, lawyers, educators (the city boasted six private black colleges well before 1900) and ministers, concentrated on the city's West Side and on and around Auburn Avenue, which a prominent resident once called "the richest Negro street in the world."

            If Atlanta had established a reputation as a relative mecca of upward mobility for black Georgians looking to better themselves materially, it had proven no less a font of opportunity for those of a more spiritual bent, including the infant King's father and maternal grandfather, both of whom had been born into sharecropping families in nearby rural counties. Martin (né Michael) Luther King, Sr., had arrived in Atlanta as an aspiring, though scarcely literate, young minister in 1918. His determined efforts to improve himself and his circumstances did not suffer in the least from his fortuitous marriage to Alberta Williams, whose own father's meager rural origins had not prevented him from building his small congregation into the powerful Ebenezer Baptist Church, where, upon his death in 1931, he would be succeeded in the pulpit by his son-in-law.

As the younger Martin grew up, his solidly middle-class background offered some insulation from the brutalities of the Jim Crow system, but there were no guarantees. Scarcely a year after King was born, Dennis Hubert, a sophomore at Morehouse College and also the son of a prominent black minister, was brutally murdered for allegedly insulting two young white women. For all this atrocity said about the limitations of middle-class standing for the city's blacks, the young man's white killers were arrested, convicted and sentenced to prison, an outcome highly unlikely, to say the least, in any rural county anywhere in the state at that point.

            It was not surprising that a historian of that era found Atlanta "quite evidently not proud of Georgia" or that, across the state, all but a very few whites heartily reciprocated the sentiment. Indeed, this was the primary reason that Georgia's overwhelming rural legislative majority had taken formal action in 1917 to quarantine the capital city's insidious racial and political moderation. This was accomplished through the brazenly anti-urban artifice of the "county-unit" electoral system, which effectively guaranteed that the preferences of voters in Atlanta, population 270,000 in 1930, could be neutralized completely by those of voters in the state's three smallest counties, which had a combined population of scarcely 10,000.
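
(For readers who want the arithmetic behind that claim, here is a rough sketch. It assumes the unit-vote allocation generally attributed to the 1917 Neill Primary Act, which the paragraph above does not spell out: the eight most populous counties received six unit votes apiece, the next thirty received four, and every remaining county received two, with statewide primaries decided by unit votes rather than popular ballots. On those assumptions:

\[ \underbrace{6\ \text{unit votes}}_{\text{Fulton (Atlanta), pop.}\ \approx\ 270{,}000} \;=\; \underbrace{2 + 2 + 2\ \text{unit votes}}_{\text{three smallest counties, pop.}\ \approx\ 10{,}000} \]

so a ballot cast in one of those tiny counties carried roughly \(270{,}000 / 10{,}000 = 27\) times the weight of one cast in Atlanta.)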

This was a situation tailor-made for a rustic, race-baiting demagogue like Eugene Talmadge. Peppering his speeches with the "n-word," stonewalling efforts to improve the schools, and reveling in the impotent rebukes of "them lying Atlanta newspapers," Talmadge claimed the governorship for the first of four times in 1932. For all he might have done to impede progress across the state as a whole, however, Talmadge's impact on Atlanta itself was notably less severe. Despite the economic reversals of the Great Depression, the infusions of cash from a variety of New Deal programs had already paid off for Atlantans by the end of the 1930s, with a greatly expanded and modernized infrastructure and dramatic improvement in schools, hospitals and other public institutions.

The overpowering urge to show the world that Atlanta was back and better than ever was more than apparent in December 1939 when the film version of Margaret Mitchell's Gone With The Wind premiered at the Loew's Grand Theater. In keeping with the city's now well-known penchant for self-promotion, PR-savvy Mayor William B. Hartsfield spared no exertion to assure a glittery Hollywood presence for the event, including, of course, Clark Gable, Vivien Leigh and the film's other white actors. Fearing repercussions from local whites, however, he extended no such hospitality to Hattie McDaniel, Butterfly McQueen or other black cast members. In the end, the only black participants of note in the entire affair were the members of the choir at the Ebenezer Baptist Church, including the son of its pastor. Just shy of his 11th birthday, Martin King sang along as, in keeping with the film's blatant racial stereotyping, the group, dressed as slaves, performed spirituals for an all-white audience at a Junior League charity gala.

King and Hartsfield would cross paths frequently in the years to come. Under Hartsfield's leadership, Atlanta would leave a racially fraught Birmingham, Ala., in its dust as it rode the crest of World War II economic expansion to undisputed preeminence as the South's most dynamic city. Steadily changing with the times, the popular and uber-connected Hartsfield would draw on his gift for orchestration again and again as he presided over the desegregation of downtown businesses and the city's tiny but notably uneventful first steps toward integrating its public schools. Meanwhile, having returned to share Ebenezer's bully pulpit with his father, the younger Rev. King began to cast doubt on the mayor's vaunted claim that his city was "too busy to hate" by consistently pushing the envelope of social change further and faster than Hartsfield had envisioned. This not only made King a sometimes troublesome presence for the image-obsessed Hartsfield, but vice versa, as the mayor's moderating interventions in conflicts over King's protests may have forestalled some of the uglier racial confrontations that might ultimately have served King's purposes best.

Atlanta had found its breezy, boosterist persona in the artful and charming Hartsfield. It would be slower, however, to acknowledge as its conscience the 1964 Nobel Laureate who, the day after returning from Oslo, immediately antagonized the local business establishment by venturing scarcely two blocks from his church to join workers picketing for better wages at the city's Scripto Pen Company. Not surprisingly, Hartsfield joined his mayoral successor, Ivan Allen, Jr., in a frantic effort to persuade key white business leaders whose feathers King had just ruffled that, lest the world see their city as reluctant to embrace its globally acclaimed native son, they must, however grudgingly, lend their high-profile presence to an upcoming gala celebrating his achievement. Sure enough, among the 1,500 people in attendance on the appointed evening were several members of the local business elite, including none other than James V. Carmichael, the president of Scripto Pen. Ironically, but surely fittingly as well, some 30 years later, his plant's remains would be bulldozed to provide parking for visitors to the city's Martin Luther King, Jr., Historic District.


"Gone With the Wind": It's Not Too Late to Read the Book!

            The excitement and acclaim that greeted both the Peachtree and the Broadway premieres of producer David O. Selznick's adaptation of Gone With the Wind just before Christmas seventy-five years ago seem genuinely cringe-worthy today, after multiple indictments in recent years of Margaret Mitchell's novel as racist and historically distorted. Mitchell is clearly culpable on the first count, although by no means uniquely so, but latter-day critics who charge her with distorting history would be well advised to consider the history she had to work with and, in some respects, even undertook to revise.

Released in mid-summer 1936, Mitchell's book had already sold more than a million copies in the U.S. alone by January 1937. Rather than disappoint a multitude of adoring readers poring obsessively over their favorite lines, the screenwriters ultimately opted for scrupulous fidelity to Mitchell's text. Yet the film's opening credits, introducing it as "Margaret Mitchell's Story of the Old South," were more applicable to its dialogue than to some of the actual meanings Mitchell intended to convey. This much was clear to Mitchell and her more thoughtful readers--even before the first scene--in the scrolled lines setting the story in "a land of Cavaliers and cotton" where "the Age of Chivalry took its last bow." Mitchell took great exception to this spin on a story that, she consistently maintained, was actually intended to insert some historical realism into an Old South narrative long shrouded in fluttery romanticism. "I certainly had no intention of writing about Cavaliers," she insisted, pointing out that "practically all my characters, except the Virginia Wilkeses, were of sturdy yeoman stock."

Mitchell's words certainly rang true in her depiction of prominent planter Gerald O'Hara as a semi-literate "bogtrotter" who fled his native Ireland under suspicion for the murder of an English rent collector. "Loud-mouthed and blustering," Mitchell's Gerald proceeds to parlay his facility at poker and his "steady head for whiskey" into ownership of a run-down plantation, and after marrying well above his own social station, he ultimately satisfies his "ruthless longing" for a respected place in planter society.

In the film, by contrast, the means of Gerald's socioeconomic ascent is never addressed, much less the more questionable aspects of his Irish background. Mitchell had also presented Tara as a "clumsy, sprawling" structure with a simple whitewashed brick exterior. The filmmakers, however, remained deaf to her several pleas for an "ugly, sprawling and columnless" O'Hara residence in keeping with typical plantation houses in a Georgia upcountry still not long removed from the frontier. Despite Mitchell's attempts to revise key aspects of both popular and scholarly myth, producer Selznick made it clear that he had no intention of poking holes in what remained a delightfully marketable plantation legend. Thus, Mitchell was left to conclude that she and a tiny cadre of southern historical realists might "write the truth about the antebellum South . . . until Gabriel blows his trump, and everyone would go on believing the Hollywood version."

In truth, the film did a little better in capturing Mitchell's disdain for the legend of the white South's heroic "Redemption" from Reconstruction by a resurgent planter aristocracy. After the war, her high-minded, genteel families like the Wilkeses flounder and fail, especially Ashley, who seems wonderfully grand in the Old South but proves woefully inept in the New. Scarlett, meanwhile, summons the grit and gall that is her patrimony from the low-born Gerald, rising above her despair in the garden at Twelve Oaks and heading off to a rebuilding Atlanta, where there was "still plenty of money to be made by anyone who isn't afraid to work--or to grab."

Scarlett quickly proves herself afraid to do neither. Her "harsh contact with the red earth of Tara" has transformed her into a thoroughgoing economic realist who grimly concedes that the Yankees were right about at least one thing: "It took money to be a lady." Ironically, her only means of feeling like a lady again was to "make money for herself, as men made money."

Suffice it to say, Mitchell's black characters reveal no such complexity or depth but remain steadfastly and stereotypically one-dimensional. Hence, the widespread perception today of her novel as nothing more than what one critic called "a racist, revisionist Southern apologetic" written by a wealthy white Atlanta debutante still embittered about the outcome of the Civil War. This facile exercise in regional stereotyping is unfortunate, to say the least, especially given the current anger and division nationwide over what appears to be a pattern of undifferentiated racial profiling by law enforcement, the courts, and let's face it, a lot of white citizens as well. Accordingly, Americans would do well to reconsider such conveniently narrow sectional pigeonholing of a book that was actually quite compatible with white racial attitudes, both popular and scholarly, prevailing nationally at the end of the 1930s and well beyond. Such a reconsideration might even mean that the next time an Eric Garner is killed by police outside the South, we could at least be spared the long since predictable, almost willfully naive reaction registered by a recent "Justice for All" protester who exclaimed, "This isn't the Deep South. This isn't Mississippi in the 1960s. This is New York City in 2014."

Novelist Pat Conroy has suggested that, for still-angry and defiant white southerners, Gone With the Wind amounted to "a clenched fist raised to the North." This is doubtless correct, but there is little evidence that many white northerners interpreted it this way at the time. Nor did an early 1939 Gallup survey--which suggested that some 14 million Americans had read her book in its first 30 months in print and posited a likely national audience of some 56.5 million viewers for the eagerly anticipated film based on it--offer much indication that Mitchell's racist language and depictions were particularly offensive to whites outside the South.

 If neither Mitchell nor the great balance of her national readership appeared to give much thought to the disturbing racial realities behind the seductive southern legend, the same could just as easily be said of a great many white academic historians, North and South. Mitchell was thoroughly conversant with the relevant (white) scholarship at her disposal, and her airbrushed portrait of slavery and casual indulgence in racial stereotypes are hardly at odds with the perceptions offered by two distinguished Ivy League historians in the most widely used collegiate U.S. history textbook of the day. "Sambo," they assured students, did not fare badly in bondage because, despite the horror stories served up by the uptight abolitionists, "the majority of the slaves were adequately fed, well cared for, and apparently happy."

Likewise, Scarlett's charge that emancipation "just ruined the darkies" fairly echoed the sentiments of Columbia University's profoundly influential historian of Reconstruction, William A. Dunning, who insisted that "the freedmen . . . could not for generations be on the same social, moral and intellectual plane with the whites." The sole aim of Dunning and his many students and disciples, charged W. E. B. Du Bois, was "to prove that the South was right in Reconstruction, the North vengeful or deceived and the Negro stupid."

 Such biased and offensive treatments had already passed for scholarship far too long when they finally came under concentrated assault by black activists and educators during World War II. The blatant hypocrisy of a Jim Crow army fighting in defense of freedom and democracy abroad, as well as the greater economic and political empowerment that the war engendered, had borne fruit in a more insistent, unremitting resolve that African Americans must at last be granted the full measure of both their rights as citizens and the dignity and respect those rights conferred. Still, although white and black scholars alike would soon be undertaking dramatic revisions of historical interpretations of slavery as well as Reconstruction, not until the 1960s would either the now-notorious "Sambo" passage be excised from the still-popular textbook or the racist and inaccurate Dunningite portrayal of Reconstruction meet with full-blown refutation.

Although Gone With The Wind consistently ranks second only to The Holy Bible as Americans' favorite book, a new Economist poll shows that only 20 percent of Americans have actually read it, while less than 30 percent of those under thirty have even seen the movie. These figures might strike some as positive rather than negative indicators, but there is a real sense in which all Americans, regardless of age, race, or region, would benefit from reading Mitchell's book for what it is, not simply as a white southerner's distorted defense of her region's uniquely horrific racial past, but as a strikingly clear window into a national past whose burdens confront them even today. Although it may fall short of being a great one, Gone With The Wind is--and always was--a thoroughly American novel.


P.S. This bloviation is a streamlined version of a piece posted over at likethedew.com.

P.P.S. The Ol' Bloviator knows "Cobbloviate Heads" near and far will not feel as though Christmas is really here until they receive the traditional greetings of the season, courtesy of his faithful ol' pickup, which is still flashing away after 20 years and 100k+ miles. Merry Christmas to you all, and, as always, to the Techsters, who may still be celebrating their once-in-a-blue-moon victory with a "Blue Moon" (ugh!) or several about now, "Felice Bobby Dodd!"


            The Ol' Bloviator has not gotten so old that he doesn't recall ranting about the "get-drunk-party-till-you-puke-or-pass-out-or-both" culture that dominates the student scene at far too many of our universities these days. Since this comprehensive report on the pathological potential of booze-fueled fraternity life ran in The Atlantic a while back, outrageous accounts of massive alcohol abuse linked to deaths, physical injury and especially to sexual assault have become standard fare in major newspapers and magazines. Despite individual and programmatic efforts by campus administrators to curb it, binge drinking appears to be a regular activity for four in ten of today's students. Recent data shows roughly 1,800 college students die each year from some sort of alcohol-related injury, and some 97,000 annually report sexual assaults in which alcohol was a contributing factor.

Escalating concerns about rapes committed on and around campus took on even greater urgency after Rolling Stone's recent piece about this problem at no less storied an institution than "Mr. Jefferson's University" in Charlottesville, which was already under serious federal scrutiny for its inadequate handling of previous sexual assault charges. RS's report centered on "Jackie," a female student who claimed that she had been brutally gang-raped as a freshman after attending a party at the Phi Kappa Psi house in 2012 and that, while apparently sympathetic, university officials discouraged her from pursuing her claim or discussing the incident publicly and took no action against her accused assailants.

Skeptical of some of the details of Jackie's account, the Washington Post and other media outlets opted for a little fact-checking of their own and are now reporting that certain of her claims about the identity of her alleged assailant and the place and date of the alleged assault could not be corroborated. Rolling Stone's representatives admit that they may have given Jackie too much benefit of the doubt and that they ran the story without securing comment from those she accused. Jackie continues to stand by her account, however, and her supporters point out that confusion about details is not uncommon among deeply traumatized victims of sexual assault. Still, this sorry and reckless excuse for journalism is certain to bolster the skepticism of those who think the prevalence of sexual victimization on campus is overblown.

            For their part, however, UVA administrators, who responded to the initial RS article by clamping down hard on Phi Kappa Psi and other campus fraternities, have not leapt forward to claim vindication merely by virtue of the holes poked in Jackie's story as it was reported. Rather, in what may be a classic case of better late than never, they have reaffirmed their awareness that the university has some serious 'splainin' to do where handling sexual assault charges is concerned. Thus quoth UVA prez Teresa Sullivan: "Over the past two weeks, our community has been more focused than ever on one of the most difficult and critical issues facing higher education today: sexual violence on college campuses. Today's news must not alter this focus. Here at U.Va., the safety of our students must continue to be our top priority, for all students, and especially for survivors of sexual assault."

This stance is, to say the least, prudent, not only because of the federal investigators who continue to hover about, but because UVA's history in this area demands it. The university's "honor code," which not only forbids acts of academic dishonesty but demands that students report such acts by others, is a genuine point of pride among students, faculty, and alums. The thing is, however, although 183 students have been expelled for honor-code violations since 1998, there is no record of a single matriculant having been expelled for sexual assault, including those who have admitted to it. Given the revelations of countless investigations and surveys of the incidence of sexual assaults on campus, a ratio of 183-0 would seem pretty hard to justify.

For all the questions about the details of Jackie's personal account, the RS piece nonetheless provides credible evidence of an entrenched social hierarchy whose exclusiveness not only discourages female students from filing claims of sexual assault but aggressively stigmatizes and marginalizes those who do. The OB has always wished that his own university could achieve a greater semblance of the powerful sense of academic purpose that pervades the UVA campus, and he still does. Secretly at least, he has also been taken with the notion that, as one student put it, "the most impressive person at UVA is the person who gets straight A's and goes to all the parties." The more he ponders the significance of such a student role model, however, the more the O.B. is forced to consider its full implications. What happens to all the kids bent on establishing their bona fides as both budding scholars and big-time drinkers when the two goals prove mutually exclusive? Outfit yourself with the emotional maturity of an eighteen-year-old, even a very bright one, and venture a guess as to which aim is most likely to be compromised.

            All of the dangerous and potentially disastrous possibilities that arise when young people are put in a situation where they are free to choose beers over books (and most anything else) are brought home quite literally in this Chronicle of Higher Education story that shows our beloved Classic City virtually Dawg-paddling in "a river of booze." As these things go, this piece seems reasonably balanced, notably more so than the RS expose on UVA. There are concerned people, like UGA Police Chief Jimmy Williamson and alcohol counseling specialist Liz Prince, who seem to be doing what they can to reduce underage or excessive drinking in a downtown that offers 50 bars within a quarter of a mile of campus, as well as roughly that many more restaurants that also serve alcohol.

            Ironically, legend has it that Athens was chosen over nearby Watkinsville as the site for the nation's first state-chartered university because the latter was already home to a prospering tavern likely to corrupt the college lads. The writers trace Athens's history as "a big booze town" to the 1980s, when, with downtown businesses closing or migrating out to "mall-ville" and only a relatively few bars downtown, UGA officials began trying to cut down on drinking at frat houses, even issuing a ban on keggers. Fearful that downtown would continue hemorrhaging businesses to the 'burbs and eager to accommodate thirsty young collegians, municipal officials did not limit the number of bars or restaurants that started to pop up, especially after the city's music scene exploded. Despite credible efforts to make bar owners and bouncers more accountable, however, for local officials it all came down to what one tavern-keeper summed up as "they hate we're here, but they love the money." One reckons so, since Athens-Clarke County reportedly collects seven cents on the dollar for every mixed drink, in addition to a three-cent excise tax and a twenty-two-cent levy for every bottle of booze emptied. Needless to say, the proprietors of Athens's drinking establishments are not particularly opposed to making money either, and they scramble mightily to keep their places packed into the wee hours. To remain competitive, some bars resort to unannounced "specials" involving one-cent beers, free drinks for women, etc., all of which are spread instantly across a vast network of texters and tweeters, leading, practically in the blinking of an already bloodshot eye, to the wholesale migration of committed young boozers from one watering hole to another. And so it goes, until mandatory closing hours force the bars to disgorge their drunken denizens onto the streets of Athens, where the scene can easily turn from celebratory to scary in the drooping of an eyelid.

For example, a Chronicle writer looks on as UGA police discover a young man "lying on a public bench, at the end of a trail of vomit. He is unconscious; his front pocket gapes, a wallet falling partway out. An officer shakes him, and again, finally rousing him. 'How much,' the officer demands, 'have you had to drink?'" The kid's response of "Zero, Zero?" is, needless to say, undermined by his present condition and circumstances; the trusty Breathalyzer simply confirms the obvious, and he is off to jail. "I can't just leave him on a bench with a citation in his pocket," Chief Williamson explains. "A citation's not going to sober him up."

There is also the student who "has tripped and fallen after a night out and hit her head. Officers arrive to find Jacqueline, a nineteen-year-old with long, honey-colored hair, stretched out on the cold slab of a bus stop, surrounded by concerned friends. After falling she was unresponsive, for maybe thirty seconds, maybe a minute or two--no one seems quite clear--but long enough to prompt a call to 911. Now an egg-shaped welt has begun to swell next to her right eye, and her speech is slurred. Asked who is the president of the United States, she names her sorority president." (This is no laughing matter, of course, but the image of Barack Obama trying to bring a meeting of chatty Tri-Delts to order might well serve as a metaphor for his efforts with the Senate.) In this case, Jacqueline is bundled off to the ER, but UGA's campus cops are reportedly making 900-1,000 underage drinking arrests a year, and although they have caught considerable flak for being too aggressive on this front, even casual observers of the early morning scene downtown will surely see this figure as indicative of a restrained approach.

It is hardly news that college students drink a lot and always have, but if you are using this to persuade yourself that there is nothing to be bothered about here, your head is buried not in sand, but concrete. As the writers note, "Average blood-alcohol levels in students stopped by the police have risen steadily--this year one blew a 0.33, more than four times the legal limit. With heavier drinking, the police now make drunk-driving arrests in midmorning, pulling over students on their way to class still intoxicated from the night before."

The O.B. has no reason to doubt this based on the number of students he has encountered in morning classes who show up smelling as if they just crawled out of a vat of Natty Light and proceed immediately to surrender themselves to the clutches of Morpheus in a head-thrown-back, mouth-wide-open pose that seems de rigueur when sleeping off a world-class bender. It is hard to think of a more underweighted or unrepresentative stat than the 25 percent of college students who admit to academic difficulties brought on by alcohol abuse. If you could throw in those who don't even recognize this has happened and those who do but simply won't admit it, that number would doubtless shoot up dramatically.

We might well yammer back and forth forever about whether universities or law enforcement officials have done enough to try to curb student alcohol abuse without realizing that we are letting one critical group of culpables off scot-free. Chief Williamson notes that the mother of the aforementioned young "Zero, Zero," who was found virtually insensate on a public bench, practically begging to be robbed and/or assaulted, did not take kindly to his arresting her innocent little boy. He is quick--and correct--to point out that, thanks to this kind of indulgent excuse for parenting, too many freshmen show up in Athens with a firmly established drinking habit as part of their baggage. Though he speaks to thousands of students a year about the dangers of excessive drinking, "How can I do something in five minutes," he asks, "that their parents couldn't do in 18 years?" The Chronicle writer adds that "too many parents have failed to talk to their children about responsible alcohol use. They've looked the other way. They've dismissed binge drinking and other risky behavior with, 'Kids will be kids.'" In reality, so thinks the O.B. anyway, they have actually done worse than that by trying to be kids along with their kids, succumbing to some nostalgia-blinded notion that it's OK to relive their own collegiate years through their children, as if the perils and pressures awaiting their college-bound offspring are no different than they were thirty years ago. The O.B. was around back then, and, in outright defiance of fate, gravity, and public opinion, he is still around today. He knows better, and if the Moms and Dads of today's collegians would drop the Peter Pan fantasy and face up to reality, they would, too. It's much easier, though, to abandon any pretense of trying seriously to discourage underage and/or excessive drinking, wink at fake I.D.s and reports of prodigious alcohol ingestion, and chuckle about Tara and Trey simply being chips off the old one-time champion chugger block. This may be a sure-fire way to endear yourselves to your kids, but it's also a no less certain means of putting them at greater risk. The O.B. has never been too keen about universities operating in loco parentis, but by golly, when the parents abdicate their responsibilities and go plumb loco themselves, a poor substitute seems better than none.

Electile Dysfunction

           Now that, for the time being at least, the last mud pie has been flung and the last stink bomb hurled, the Ol' Bloviator deems it safe to emerge from his bunker, where he was fully prepared to slurp down a cyanide capsule the very next time ol' Zig-Zag Zell talked up Michelle Nunn in one ad only to endorse Nathan (Double) Deal-er in the following one. In fact, the O.B. even dares at this point to toss out a few little "drive-by" observations about this most recent demonstration of our state's chronic electile dysfunction.

The first is that a bunch of blindly optimistic liberals high on polling data churned out by everybody and his first cousin who happens to have a telephone and a calculator is a recipe for a resurrection that turns out to be a wake. We can go a long ways toward explaining how so many pollsters could be wrong about what unfolded in Georgia by allowing for the fact that their ranks are so swollen that they were probably surveying each other half the time. It seems that one presumed shortcut to institutional legitimacy these days is opening up a brand-new polling center. (Ask yourself if there was really ever any reason to suspect that a Quinnipiac University existed before there was a Quinnipiac poll. Didn't think so.) As a result, you've got a bunch of pollsters who have so little experience and training in survey research that they not only don't know what they're doing, they don't even know why they are doing it. Things only get worse when you throw in a bunch of political media slugs who are no less addicted to "momentum shifts" than their counterparts who call football games. Recall how many times you have heard sportscasters seize on the fact that Vandy actually made two consecutive first downs at the end of the first half as evidence that Bama will have a fight on their hands in the second, and you can better understand why some of the liberal persuasion in these parts were all prepared to really whoop it up when the Democratic governor- and senator-elects rode down Peachtree through a blizzard of tickertape in an open Mustang ragtop on loan from Barack Obama. Beloved, as ol' Brother Dave Gardner would likely say, clear your heads of such foolishness. The demographers and survey researchers and assorted sunshine pumpers leaping to absurd conclusions may shout all they please that Georgia is getting "bluer" by the minute, but they would be a lot more accurate--and get a lot less attention, of course--if they described it as gradually "purpling" instead.

(Map: Georgia counties carried by Barack Obama in 2008 and by Michelle Nunn and Jason Carter in 2014.)

As a testament to that gradualism, that map yonder shows the 34 Georgia counties (in blue) carried by Barack Obama in 2008. These also account for all the counties carried in 2014 by Democratic senatorial candidate Michelle Nunn and gubernatorial aspirant Jason Carter, except for two (in lighter blue): Henry (carried by both) and Wilkinson (carried by Carter only). In many of the old Obama counties, the margin was razor-thin to non-existent. Nunn battled to a flat-footed tie down in Baker, which Carter lost by 13 votes. The counties captured by Nunn and Carter include all those with black majorities, and, save for the little hotbed of sedition and free love that we Athenians call home, none of their remaining counties are less than 40% black. Black ballots were clearly very much a factor in about the only good news to come out of this otherwise disastrous election for the Democrats, the breakthrough in Henry County, where the black population share has now grown to 40%. Mitt Romney managed a 3,000-vote win there two years ago, but both Nunn and Carter squoze by this time with about 400 votes to spare.

However pleased Democrats may be to see some apparent movement in their direction in Henry, things were at an almost dead calm in the six additional metro counties that have gone Democratic in the last two presidential elections. Nunn and Carter ran within a point of Obama's percentages in 2012 in all of them. Obama gobbled up about 98% of the black vote statewide in that contest, compared to 92% for Nunn and 89% for Carter this year. As always for Democrats in these parts, however, the problem was not with the black support. Exit polls show Nunn and Carter receiving but 23% of the total white vote, precisely the share apparently claimed by Obama in 2012.

It might be worth noting that there was something of a departure from recent precedent along gender lines among whites this time out. In recent years, the gap between the voting preferences of white women and white men in the South has been negligible, and, if anything, enthusiasm for the Repubs was slightly higher among the former. The eight-point advantage Nunn enjoyed among white women as opposed to white men in this election might simply be ascribed to gender loyalty, were it not for the nine-point differential along the same lines favoring Carter. As with most such shifts in voter behavior, we won't know what, if anything, this one means until it's election time again.

One thing we definitely know hasn't changed is the rock-hard resistance of working-class white southerners to any and all Democratic entreaties and advances. Five majority-white counties showed average weekly wages below $500 in 2012. Sure enough, that sweat-shoppin' outsourcin' son of a gun, David Perdue, carried all of them resoundingly, three of them by more than 80%. In fact, ol' down-sizin' Dave actually ran a teency bit stronger with whites making less than thirty grand a year than among those knocking down more than a hundred.

It is no less striking, of course, that white Georgians would re-elect a governor who, by all rights, should be stamping out license plates instead of signing bills into law. One thing is clear: both Carter's and Nunn's disappointing showings demonstrate that political coattails go threadbare in a hurry once nobody is actually wearing the coat itself.

            There was a time when moderates could sell themselves as more conservative than they were, as Jason's grandpa did in 1970, when he managed to pull in enough Wallace and even Maddox voters with a bunch of jawboning against busing and government social programs to whup that liberal elitist Carl Sanders in the Demo Primary and breeze into the governor's mansion past hapless Hal Suit, the nominee of a bunch of equally hapless Georgia Republicans. Not so today, however. The Republicans are firmly ensconced at the top of the political pyramid, and there is absolutely no chance of their letting you con voters into thinking you are anywhere near as conservative as they are. Still, to varying degrees, both Carter and Nunn were ultimately reduced to employing what amounts to the "I'm-more-like-my-opponent-than-you-think" strategy, and their altogether predictable failure simply affirms that if you're running against a Baptist preacher, "What a Friend We Have in Jesus" just doesn't cut it as a campaign theme.

(The data cited above was drawn almost exclusively from CNN Election Central. Any errors you detect are almost certainly theirs. A somewhat briefer version of this rant will show up in honest-to-God ink this week in America's favorite indie, The Flagpole.)

 

WHY THE SOUTH "BALED" OUT AND SCOTLAND DIDN'T

From where the Ol' Bloviator sits, it's fair to say that the South and Scotland go back a ways. For example, the cult of the "Lost Cause" that sprang up in the aftermath of the South's failed fight for independence had something of an antecedent in the fabled "lost cause" of the Scottish Jacobites, whose four-decade struggle to restore the Stuart monarchy of Scotland to its rightful seat on the thrones of England, Scotland, and Ireland was heartily romanticized in the novels of Sir Walter Scott. Scott's glorification of the swashbuckling supporters of the Stuart restoration was so popular with the southern upper classes in the antebellum era that Mark Twain famously cited their affliction with the "Sir Walter Disease" as the principal cause of the Civil War.

Beyond that, the strategically critical Confederate defeat at Gettysburg in 1863 is sometimes compared to the 1746 Battle of Culloden, where Jacobite forces, representing by no means all of the Scots but made up in large measure of wild and exceedingly hairy (not to mention altogether ungovernable) Highlanders, were crushed by Hanoverian forces representing George II. Unlike Gettysburg, the matter in dispute at Culloden was not separation from Great Britain, but actually reunification under a Scottish monarch. On the other hand, there are similarities in the fact that the Confederate forces at Gettysburg were there largely at the behest of an aggressive slave-holding minority who saw their interests being better served in an independent southern nation, while the Highlanders saw returning the Stuarts to the British throne as their best bet for retaining their cherished independence to rut, drink, brawl, and pillage as they damn well pleased.

            There are similar parallels with Thursday's vote on Scotland's secession, although the "nays" had it in this case, and with 85 percent of those eligible showing up to weigh in on the matter directly, it was far more democratic than the process by which the South left the United States 153 years earlier, when only Virginia and Tennessee required a popular referendum to certify their respective legislatures' votes for secession. Perhaps the most striking parallel lies in the economic centerpiece of the secessionist appeal in both cases.

            Sorry [Scottish nationalist] Alex Salmond, but compared to the South's position in the global economy in 1860, today's Scotland is something of a bit player. By the 1820s, the southern states had already become the world's premier supplier of cotton, and by the late antebellum period, more than three-fourths of their cotton was being exported. Not only was cotton the leading American export in the antebellum period, but when cotton was combined with the two other leading southern staples, tobacco and rice, the South, with just over a third of the nation's population (free and slave), accounted for well over half of the value of all American exports during the 1850s. With cotton prices rising by more than 11 cents a pound over the decade, the value of slave property alone soared to an estimated $3-4 billion by 1861, making the Confederacy, by aggregate measurement at least, one of the wealthiest nations in the world.

The South's prominent position in the world economy not only encouraged southern leaders to oppose the protective tariff and other measures disadvantageous to the cotton export trade, but it reinforced their predispositions toward a belief in southern superiority or even invincibility. When he proclaimed in 1858 that "Cotton is King" and "No power on earth dares to make war on it," South Carolina's James Henry Hammond actually sounded fairly moderate in comparison to another southerner who insisted that without southern cotton "England would topple headlong and carry the whole civilized world with her, save the South." Such assertions may seem altogether ludicrous in retrospect, but at the peak of England's textile expansion, the loss of southern cotton, which then accounted for nearly 80 percent of its cotton imports, would obviously have smarted quite a bit.

 Alas, however, unbeknownst to the overheated southern orators who were lustily proclaiming the perpetual reign of King Cotton, the great British textile boom of the nineteenth century had already begun to recede. Britain's leaders could hardly have recognized it at the time, but thanks to a prudent policy of limited cotton stockpiling in recent years, a bumper 1860 crop already on hand, and reasonable prospects for relying on alternative sources such as Egypt and India if need be, they would soon feel less need for the vaunted southern staple than those across the Atlantic who had invested such faith in it could ever imagine.

Even if British textile magnates had presumed their desire for that staple would survive the Civil War undiminished, they had little reason to doubt that, as in the past, northern agents, factors, and shippers would still be critical to whatever postbellum commerce in southern cotton they might conduct. There was also industrial England's continuing need for northern wheat (which accounted for, on average, about 25 percent of its wheat imports in the 1850s), and the annual volume of pre-war commerce between England and the northern states in general had to be considered as well. (Suffice it to say, nobody in the King Cotton camp seemed to have pondered the effects of having to forego the substantial supply of Midwestern wheat and northern manufactured goods also purchased by southerners at that point.)

            In addition, had blustering, cotton-drunk southerners sobered up even briefly, they might also have picked up on signals from a reconfigured and rapidly modernizing North Atlantic trade network that industry, not agriculture, was to be the new dynamo of world capitalism. While the South, with only 11 percent of America's manufacturing investment in 1860, had shown neither the capacity nor the inclination to adjust to this transformation-in-progress, the emerging entrepreneurial culture and stellar economic prospects of the mid-Atlantic and northeastern states had already attracted the attention and investments of their on-the-make counterparts in Britain who had good reason to believe that the two could be looking at a bountiful future together. Beyond such dollars and cents calculations, it is fair to say that the 100 percent cotton blinders favored by southern leaders apparently obscured the size and growing strength of the abolitionist movement, not only in England but elsewhere in Europe.

If the foregoing raises doubt about the savvy of the southern secessionist contingent, it should at least be noted that most of the unheeded signals not to leave the Union are far more obvious in retrospect than they could possibly have been at the time. That is not the case, however, with the economic pitch served up by Scotland's contemporary campaigners for independence. Southern secessionists' faith in the long-term power and viability of King Cotton may have been overly optimistic, even wildly so, but their claims did not fly in the face of any such massively contradictory body of evidence and analysis as confronted the assertions of Scottish secessionist leaders like Alex Salmond that an independent Scotland stood to reap a veritable bounty from what he estimated to be 24 billion barrels of remaining North Sea oil deposits, which, in turn, could be used to fund the expanded welfare state most independence advocates seemed to desire.

Respected petroleum experts not only suspect that Salmond has overshot the mark here by 40 to 60 percent, but point out as well that current North Sea oil production is down two-thirds from its 1990s peak. Beyond that, there is no guarantee that the U.K. will actually hand over all the tax revenue generated by North Sea oil production, and even if it does, last year's tax take of 5 billion pounds is equivalent to only about 3 percent of Scotland's economy. Meanwhile, major international petroleum companies largely seem more inclined to cut back on their North Sea operations than to expand them, given the current uncertainty both over oil prices and the cost of new production facilities. All of this is to say that putting all the South's eggs in the cotton basket in 1860 seems almost conservative compared to the efforts of Scottish secessionists to downplay the astonishing risk attached to a "King Petroleum" secession strategy.

There were, of course, additional related concerns that may have undermined the efforts of the Scottish "Secesh." While the currency question was less troubling at the outset for the Confederates, the matter of whether an independent Scotland could continue to pound the pound or, if not, could even count on being able to jump immediately to the Euro clearly loomed large in Thursday's vote.

Beyond the concrete issues on which the Scottish secession movement was ultimately splattered, from Charleston in 1861 to Edinburgh in 2014, sentiment for disunion was fueled in no small part by pure emotion, be it festering resentment or wounded pride or a combination thereof. In this respect, back-to-back screenings of "Braveheart" and "Gone With The Wind" might give us the best comparative perspective on two secession movements separated by more than 150 years. Failing that, we might simply note that ol' 007 himself, Sir Sean Connery, saw Scottish independence offering a glorious opportunity to toot his homeland's horn and maybe even make a pound/euro in the bargain through "international promotion of Scotland as an iconic location." Whatever comes next, Sir Sean will surely be as eager as the Ol' Bloviator to see whether the resurgent independence movement, which has clearly stirred Scotland, will leave it thoroughly shaken as well.

(An earlier version of this humble offering was posted at www.likethedew.com)

            The older he gets and the worse things get, the less satisfaction the Ol' Bloviator finds in yelling, "I told you so!" when one of his rants about our ever-madder dash toward doom comes true. This is certainly true in the case of two recent and remarkably similar incidents that amount to textbook examples of the ongoing devaluation of education in the face of a comparably blind, but increasingly overpowering, obsession with industrial development as a "bargain at any cost" panacea for all our ills.

Witness the crisis over in Alabama, where subsidies to new plants over the last two decades have long since run well into the billions, but it seems the state is running short of cash just now to shower on the next corporate candidate for a humongous public payout. Not to worry, however: Alabama governor Robert Bentley has come up with an inspired, yet simple solution for this dilemma; he wants to shift funds from education in order to shore up the stash he draws on in his role as the state's official bagman to new companies. The rationale for this switcheroo seems clear enough to Bentley: "Who pays for the incentives? It's not education, but they benefit from it totally . . . you ought to eat what you kill." (If the Guv really practices what he preaches here, he better pray he never hits a feasting buzzard while traveling on Alabama's excellent highway network.) Although some Alabama legislators expressed reservations about a special legislative session geared to making the governor's enlightened proposal a reality, it was not clear whether they objected to the move so much as to Bentley's failure to consult with them before releasing what he later insisted was merely a "trial balloon" that had simply been "misconstrued." Yeah, right. Alabama governors are known for their exceedingly complex thinking and rhetoric. For example, it took the O.B. forever to figure out what "Segregation Now! Segregation Tomorrow! Segregation Forever!" meant. Unfortunately, former governor Fob James's intellectual firepower went largely unappreciated, as became apparent when his industry-hunting trip to Israel was billed as "Our Yahoo Meets Their Netanyahu." Not coincidentally, perhaps, back in the nineties, it was Fob who tried to raid the school fund to pay off part of the state's subsidy obligation to Mercedes.

            Meanwhile, Mississippi politicians are seldom accused of subtlety, and when they are, as in this case, it is almost always in comparison to their counterparts in Alabama. According to this report, the state of Mississippi has been in violation of its own laws since 2008 by failing to provide its legally mandated share of public school funding. It is currently spending $648 less per pupil than it did in 2008, and since then, it has racked up an illegal deficit in public education of at least $1.3 billion. In what must surely rank as the great-grandmother of all coincidences, that is precisely the figure arrived at by researchers in 2013 as the total value of the tax breaks promised to Nissan in exchange for locating a production facility at Canton, Mississippi.

            Over thirty years, the tax abatements offered Nissan will cost Madison County an estimated $210 million in revenue that might otherwise have been spent on schools. Beyond that, in a program truly reminiscent of the old sweatshop days when workers' paychecks were docked for a "subscription fee" used to defray the cost of building their employer's plant, Nissan is also allowed to keep what would normally be state income tax deductions from its employees' wages. Over twenty-five years, this nifty little palm greaser could ultimately top off at $160 million.

            Reports of such extravagances in two states not exactly known for their heroic sacrifices in the war against ignorance simply underscore the hypocrisy of current pious calls for "austerity," the fallout from which continues to fall heavily on public education. Perhaps the O.B. might be forgiven for nearly going a tad bit postal upon reading a New York Times account of Ranger Rick Perry's efforts to oust the current president of the University of Texas, where, instead of just teaching the great gobs of info they already know, the faculty are apparently wasting their time and the public's money in trying to find out even more stuff (God knows what) to teach. In their discussion of the many difficulties facing public university presidents these days, the reporters twice refer to declining state "subsidies" for public higher education. Instead of reaching for his twelve-gauge, however, the O.B. ultimately opted for a high-minded remonstrance, dispatched with dispatch to the nation's number one publishing platform:

In an otherwise excellent account of efforts to oust the president of the University of Texas, the writers twice refer to recent cuts in legislative appropriations for public higher education as "declining state subsidies." In the current political climate that this story so vividly reflects, this is heavily freighted language. However inadvertently, it reinforces the popular notion that state funding for an institution created by the state to function as a duly constituted obligation of the state is instead some sort of voluntary dispensation or indulgence. In the operative sense of the word, the funding received by the University of Texas from the State of Texas is no more a "subsidy" than is Governor Rick Perry's salary.

The O.B. assumed at the outset, correctly, as it turned out, that his missive would almost certainly never make it into print, but he also assumed that sending it would at least make him feel a little better. Maybe it did, but the futility and meaninglessness of his gesture were quickly hammered home not only by the foregoing accounts from Dixie but by comparable ones from elsewhere in the country, like New Jersey, where Gov. Chris[py Kreme] Christie has already signed off on $2 billion in corporate subsidies while state funding for higher education has fallen by more than 20 percent over the last six years. It was once almost a given that politicians were obliged to at least pay lip service to the notion that education is a powerful engine of economic progress. These days, a growing number of them seem to want us to see it as merely a cumbersome caboose.

THE MOST AMERICAN PLACE ON EARTH

(Photo: the Normandy American Cemetery at Colleville-sur-Mer.)
(courtesy isamiga76 @flickr.com)

I once wrote a book about the Mississippi Delta called The Most Southern Place on Earth. Were I to undertake a comparable tome about the most "American" place on earth, believe it or not, the focal point would not only lie outside the United States, but, of all places, in France--specifically, the Normandy American Cemetery at Colleville-sur-Mer, where lie the remains of 9,387 of the U.S. troops who died during the June 1944 Allied invasion. More than any other historical site or monument that I have ever visited--in fact, more than all of them put together--this place engulfs me in a wave of teary, tingly emotions. Set atop a bluff overlooking the English Channel and Omaha Beach against a stunning backdrop of lush, unimaginably green grass and perpetually wind-bent trees, even with the surf pounding rhythmically just below, the iconic, seemingly endless rows of perfectly aligned white crosses convey a palpable sense of peace and order that belies the chaos and wholesale slaughter that raged down on the beach 70 years ago. There is some irony in the fact that the Normandy American Cemetery provides entree and closure to the epic 1998 film Saving Private Ryan, whose opening scenes reflect an unprecedented cinematic effort to depict the D-Day landing as the nightmare of bloody, headless, legless, disemboweled carnage and confusion that it actually was.


In truth, this placid and pristine setting seems far better suited to serve as the final resting place of characters slain in earlier, less graphic World War II movies like The Longest Day, who died neatly and, so it would seem, painlessly, shot down as they stood just inches from an apparently bulletproof John Wayne or Robert Mitchum. After all, in the popular mind at least, this was a war in which men died bravely and stoically, repeating the Lord's Prayer or receiving the last rites or saying the Kaddish, not one where agonized screaming or crying was punctuated by horrible blasphemies alternating with piteous, little-boy pleas for "Mama."


There is no record of the final minutes of Technical Specialist Five Joseph G. Hardy, the only World War II soldier who entered the service in Clarke County to be memorialized at the cemetery. In reality, Hardy (who actually hailed from the tiny hamlet of Good Hope, pop. 219, in nearby Walton County) never even set foot on the sands of Normandy, because he was among the 39 members of Battery B of the 4th Infantry Division's 29th Field Artillery Battalion who were killed when their landing craft struck a mine on its approach to Utah Beach on June 6. Like those of most of his fallen battery mates, Hardy's remains were never recovered, and thus his name is among the 1,557 inscribed on "The Walls of the Missing," which encircle a beautifully maintained garden.


Idly perusing the names and accompanying states etched on the gravestones, I found myself wondering how many of the small-town boys like Joseph Hardy had even been out of Georgia, Alabama, Mississippi, or South Carolina prior to the war. It is frankly difficult for me to grasp how so vast an abstraction as national allegiance or patriotic duty could motivate thousands of such men to come thousands of miles from home to step off landing craft and wade into an unrelenting volley of lethal lead. In anointing this place with their blood and sacrifice, they made it both an enduring shrine to American national identity and a source of gnawing self-doubt for succeeding generations destined to remain forever in their debt.


Those buried here secured their hallowed place in history by giving their all in an epochal encounter that effectively sealed victory in what seemed an indisputably righteous crusade against a correspondingly monstrous evil. In contrast, Vietnam veterans of my generation, who no less heroically risked or sacrificed their lives, have been caught in a historical backlash against a conflict that, unlike World War II, did not unify us in defense of our longstanding ideals but instead divided the nation and called those ideals into question. In what seems an era largely lacking in courage and commitment, some of today's visitors to the Normandy American Cemetery are likely to leave inspired but also perhaps a bit saddened by a sense that those interred in this magnificent setting died in defense of a nation far worthier of their sacrifices than the one we live in today.


In reality, of course, this perception requires some degree of selective historical amnesia. For example, despite serving in a bloody struggle to defend democracy against a racist, totalitarian onslaught, African American soldiers in World War II found themselves fighting not just the Germans and the Japanese but the hostility of white civilians living in the vicinity of their stateside postings and, worse yet, the resentment and distrust manifested within the ranks by their own white comrades and commanders. D-Day operations reflected these racial realities quite clearly, as only a single battalion of black troops actually landed on Omaha Beach on June 6, 1944. Soldiers of the 320th Barrage Balloon Battalion came in on the third wave to set up anti-aircraft barrage balloons aimed at preventing German pilots from strafing the beach. Three members of the 320th are buried here, including Cpl. Brooks Stith from Virginia and Pfc. James McLean from North Carolina. Had the two survived, both would have returned to essentially the same segregated, discriminatory and disfranchised existence they had left behind, although black soldiers who came back from the war would go on to play a pivotal role in laying the groundwork for yet another all-out offensive that ultimately toppled Jim Crow. Those who stand in awe of this fearless band of postwar civil rights crusaders might well harbor certain sentiments common to American visitors to the Normandy Cemetery, including the fictional Pvt. James Francis Ryan, who, kneeling at the end of the film amid the graves of the comrades who gave their lives to save his, wonders aloud whether "in your eyes I've earned what all of you have done for me."
