The Ol' Bloviator began the entry below as an attempt to respond to a query from an old and dear friend about the causes of the Civil War. He had no intention at the time of penning such a lengthy disquisition on the matter. Yet any hope for brevity vanished when it occurred to him that, well-worn as the topic may seem, there might be some outside the Sacred Circle of Pointy Heads who would benefit from a more precise explanation of why the election of a Republican president in 1860 triggered such an immediate outcry for secession in most of the major slaveholding states. The aforementioned pointy-heads may well find much reason to quibble with what follows, but this was not written for their edification, and their slings and snarks stand no chance of penetrating the OB's battle-thickened hide.

We historians may find some satisfaction in a recent survey suggesting that, at long last, a solid majority of Americans chose slavery as the primary cause of the Civil War (Psst: Don't tell Nikki Haley.) as opposed to the old "states' rights" dodge that was the top response just a dozen years ago. This acknowledgement of what has been an unpleasant and long-denied historical truth for many whites is heartening, to be sure. Still, identifying slavery as the dominant factor behind such an epochal historical event is not the same as grasping the particular aspect of the slavery question most immediately responsible for the South's exit from the Union in 1860, which in turn became the critical precipitant for the four years of savage bloodletting that ensued.

Georgia was the largest, most centrally located Deep South state, with the largest enslaved population and the most slaveholders, and its decision on secession seemed critical to any hopes of establishing a viable southern nation. Less than two weeks after Republican Abraham Lincoln captured the presidency in 1860, Judge Henry L. Benning stood before the Georgia General Assembly to make an impassioned case for secession, his "first proposition" being that the new president meant to abolish slavery "as soon as the party which elected him shall acquire the power to do the deed." Yet with Lincoln polling only 40 percent of the popular vote and the more slavery-amenable Democrats still in the majority in the Senate at that point, it stood to be quite a while before the Republicans acquired the power to wipe out slavery, much less demonstrated the will. The 1860 Republican platform expressly acknowledged "the right of each state to order and control its own domestic institutions according to its own judgment exclusively." Lincoln himself had made no secret of his personal distaste for slavery, but he had more than once forsworn any attempt to interfere with it "in the states where it exists, because the constitution forbids it, and the general welfare does not require us to do so."

These were hardly the words of a wild-eyed radical looking to destroy slavery on the spot, and on their face, they raise the obvious question of why Lincoln's election and the ascent of the Republican Party provoked such an outcry for secession from Benning and his ilk. Allowing for certain complexities, the key to answering this question lies in understanding that for many of the most dedicated guardians of slavery, the gnawing concern was not simply whether it would continue to exist within the Union, but where.

The near warp-speed of slavery's westward progression after the invention of the cotton gin had led to the admission of three new slave states between 1800 and 1819. Protracted wrangling over congressional representation between free- and slave-state delegates to the constitutional convention in 1787 had led to the inclusion of a provision that three-fifths of a state's enslaved population would be added to its free population in determining how many seats it would be allocated in the House of Representatives. Free-state leaders were soon complaining that this supposed "compromise" gave an unfair advantage to the slave states, although the three free states that had recently joined the Union made for at least a numerical free-state/slave-state balance of eleven each in 1819. The Missouri Compromise of 1820 seemed to offer a blueprint for maintaining that equivalency by pairing the admission of Missouri as a slave state with the entry of Maine as a free state, though it imposed geographic restrictions on the spread of slavery by banning it in any new territory north of the 36°30′ line.

In the face of continuing western territorial expansion over the next three decades, congressional leaders took pains to preserve free-state/slave-state parity by essentially granting statehood to individual slave and free territories in pairs. This studiously maintained equilibrium was shattered in 1850, however, when, under the terms of the Compromise of 1850, California entered the Union as a free state without a slave state to offset it.

Faster population growth had already allowed the free states to overcome the old "three-fifths" advantage enjoyed by the slave states in the House of Representatives by that point, but southern slaveholding interests had counted on a numerical stalemate in the Senate to forestall any move to eradicate or weaken slavery. Now, with the entry of California, the chances of re-leveling the political terrain in the Senate seemed next to nil because any territory likely to join the Union in the foreseeable future was situated north of the old Missouri Compromise line where slavery was forbidden. That changed abruptly in 1854, thanks to the machinations of Illinois Senator Stephen A. Douglas, a Democrat with presidential ambitions who was also courting southern support for building a transcontinental railroad that would originate in Chicago. To that end, Douglas pushed through legislation effectively removing the old Missouri Compromise line as the northern limit for the extension of slavery and allowing settlers of the Kansas and Nebraska territories to settle the matter among themselves by popular vote.

The intense backlash against the measure in the free states quickly gave rise to the Republican Party, which began in 1854 as a loose coalition composed primarily of former Whigs and veterans of the "Free Soil" movement. There was also a smattering of abolitionists, but in the main the Republican Party was united by little more than a shared determination, not to end slavery immediately, but to keep it out of the new western territories at all costs. (Otherwise, the white "free labor" drawn to these areas could find itself competing directly with labor that was anything but.) Though the Republicans were a new, effectively single-issue party with a decidedly regional base, strong support in New England and the fast-growing Upper Midwest enabled their first presidential candidate, John C. Fremont of California, to capture nearly 40 percent of the electoral votes in 1856. Fremont's surprisingly strong showing foretold the nightmare scenario awaiting southern slaveholding interests four years later with the election of a president whose party's central unifying purpose lay in blocking any further extension of slavery. Although Lincoln had offered numerous assurances that he meant to leave slavery intact where it already existed, he had likewise made it clear that he would accept "no compromise which assists or permits the extension of the institution on soil owned by the nation."

Despite Lincoln's promises to leave them alone, in a future when slavery could no longer expand, slaveholders foresaw not only increasingly pronounced political isolation but the certainty of financial ruin. The soaring demand for labor accompanying the rapid spread of cotton cultivation into the southern interior after 1800 had led to consistent increases in slave prices for over half a century. As early as 1848, a Georgia farmer who owned just 10 prime field hands was wealthier than all but 1 percent of the citizens of Boston. By 1860, the average slaveholder in Georgia was five times wealthier than the average northerner, and a Georgian who owned just two slaves and nothing else was at least as well off as that average northerner.

Georgia's largest slaveholders were some of the richest people not only in the nation but in the world. Serendipitously or otherwise, Howell Cobb married one of the richest women in Georgia in Mary Ann Lamar. Despite his free-spending ways, at the end of the 1850s he and Mary Ann still laid claim to large plantations in four counties. With slave prices averaging roughly $1,000 in 1860 (and those in their prime working years often going for considerably more), the enslaved labor forces on each plantation accounted for the lion's share of their worth. In this case, the combined value of just two of the Cobbs' holdings likely surpassed $400,000 in 1860, a sum with the purchasing power of roughly $15,000,000 today.

The Cobbs' holdings were eclipsed by those of Joseph Bond, who owned eight plantations (six in Dougherty County and two in Lee) and five hundred slaves and whose estimated net worth exceeded a million dollars in 1859. In 1858 his plantations had also yielded a cotton crop worth $100,000. Still, as Howell Cobb understood full well, for a large slaveholder, cotton was but a secondary crop. With every infant born in captivity an addition to his assets as both capital and labor, his "largest source of prosperity [was] in the Negroes he raises." Cobb had once exulted that his own slaves "multiplied like rabbits," and speculated that their numbers might even double every fifteen years or so.

 This would profit him but little, of course, if restrictions on where slaves could be taken or compelled to work were allowed to diminish their value as either labor or capital.  The resulting deflation of slave prices stood to be exacerbated by the same high rate of natural increase within the enslaved population that had once been such a boon to the planter's finances, but now promised only to increase the supply of enslaved labor at a time of slackening demand for it.

Finally, there was the emotionally fraught question of how whites were to maintain control of a relentlessly expanding slave population that in Georgia's case had swelled by more than 100,000 in the 1840s. The 1850 census showed enslaved blacks already outnumbering whites in more than a third of Georgia counties, and by more than two to one in several of them. The racial imbalances ran even higher in older coastal counties such as Glynn, Liberty, and McIntosh. With no outlet for a constantly swelling enslaved population desperate for freedom and bent on revenge, slaveholders would live in constant fear of midnight massacres. Meanwhile, steeped in the age-old mythology that within every adult black male beat the heart of a bestial rapist, their wives and daughters would live in absolute terror of sexual assaults so unimaginably savage that, as Henry Benning depicted them, "they would cry out for the mountains to fall on them."

Incendiary secessionist orators like Benning played shamelessly on these horrific fantasies to a target audience consisting by and large of the wealthy and powerful slaveholders who were strikingly overrepresented in the state legislature, where half the members owned twenty slaves or more. Meanwhile, at the convention where the ordinance of secession was subsequently drafted, roughly three times as many delegates as legislators owned at least that many. Not surprisingly, then, the principal question to be resolved at the assemblage was not whether slavery must be preserved, even at the price of secession, but whether at that juncture some better means of assuring its perpetuity might still be devised within the Union.

The events of the previous decade, culminating in the election of Lincoln and the Republicans, hardly inspired optimism about the latter option, however. Acutely aware that his election had energized the secession movement in Georgia, in December 1860 Abraham Lincoln wrote to assure his old friend and former Whig compatriot Alexander Stephens that slavery would be in no more jeopardy on his watch than it had been "in the days of Washington." Yet he appeared to acknowledge the futility of his effort when he allowed that "You think slavery is right and ought to be extended; while we think it is wrong and ought to be restricted. That I suppose is the rub. It certainly is the only substantial difference between us." Not only did that difference prove substantial enough to tear the Union apart, but in doing so it underscored the profound irony of the conviction among the South's powerful slaveholding interests that their wealth and future welfare depended on ensuring that the "peculiar institution" became less peculiarly southern.

The Ol' Bloviator has always been a sucker for free speech dustups, and the current one is a certified humdinger. So much so that it is impossible to do justice to its nuances and complexities in the space allocated by most publications. What follows is a slightly expanded version of this piece as it ran at Time.com.

Disagreements over whether universities should curb the rhetoric of students protesting Israel's military incursion into Gaza have been striking in their ferocity, and remain heated more than two months after the PR disaster of a congressional hearing in which New York Representative Elise Stefanik pressed the presidents of the University of Pennsylvania, Harvard, and MIT about whether calling for a campaign of "genocide" against Jews would violate their schools' code of conduct policies against "bullying and harassment." Caught between warring factions on campus and beyond and hamstrung by their schools' seemingly contradictory speech and conduct policies, the presidents offered only what appeared to be deliberately evasive, non-committal responses. Widespread dissatisfaction with those responses sparked an uproar in both the public and academic spheres over whether certain types of speech should be forbidden on America's campuses. The ensuing furor prompted the resignation of Penn's Liz Magill and contributed to the demise of Harvard's Claudine Gay as well.

Conflicts over the boundaries of acceptable speech on campus--or whether any such boundaries should even exist--are hardly new. Few could better attest to this or to the lessons they offer than the late C. Vann Woodward, one of America's most distinguished historians, as well as one of its most ardent defenders of free speech. Woodward's abiding conviction that "the results of free expression are to the general benefit in the long run, however unpleasant they may appear at the time," should inform the thinking of  administrators now weighing the intrinsic long-term rewards of guaranteeing free speech on their campuses against demands to protect students from hateful speech in the here and now.

Woodward began to earn his credentials as a champion of free speech in the early 1930s when he spoke out forcefully against police persecution of communist organizers in Atlanta. 

Teaching at Johns Hopkins in the early 1950s, he again weighed in to prevent the firing of his faculty colleague Owen Lattimore, after Senator Joseph McCarthy accused Lattimore of being a Soviet agent. Lattimore's case fell into a general pattern dating back to the early days of the republic, in which people opposing the prevailing conservative majority were silenced through political repression, ostracism, or economic or social coercion.

Yet, by the time Woodward arrived at Yale in 1962, most attempts to restrict speech on campuses were coming from the opposing ideological direction, as left-leaning students and faculty rallied to prevent dissenting voices on the right from being heard. Though he had been at Yale for scarcely a year, Woodward voiced his extreme displeasure in Sept. 1963, when then acting president Kingman Brewster persuaded a student organization to rescind a speaking invitation to segregationist Alabama Governor George Wallace. 

By the end of the decade, the leftist speech police had moved on to muzzling supporters of the Vietnam War. In 1972, Woodward objected vigorously when student protestors formed a physical barrier to prevent former Vietnam commander General William Westmoreland from speaking at Yale. 

Two years later, he protested just as vehemently about students shouting down William B. Shockley, a proponent of black genetic inferiority.

Woodward's outspokenness on such incidents made him a logical choice to chair a committee created by Brewster to craft what both agreed was a much-needed statement affirming Yale's unwavering commitment to free speech.

The result was a new free speech policy, released in 1975, and better known on campus as the "Woodward Report." The document made a forceful case for freedom of speech as an immutable principle by which any university worthy of the designation should abide, stressing "the need for unfettered freedom, the right to think the unthinkable, discuss the unmentionable, and challenge the unchallengeable. . . . We value freedom of expression precisely because it provides a forum for the new, the provocative, the disturbing, and the unorthodox."

A university might well be "a special kind of small society," the report's authors conceded, but its "primary function is to discover and disseminate knowledge.... It cannot make its primary and dominant value the fostering of friendship, solidarity, harmony, civility, or mutual respect," and remain true to its "central purpose." Simply put, when there was a choice to be made, the "need to guarantee free expression" must take precedence over concern for "civility and mutual respect."

Commentators eagerly embraced the Woodward Report as a definitive blueprint for resolving--or at least containing--one of the most perennially divisive issues confronting campus administrators. Some students and faculty were not so sure, including a dissenting member of Woodward's committee who foresaw such an absolutist stance on free speech as giving tacit license for persecution and harassment of "small and powerless minorities" on campus.

His concern seemed to be well-placed in the 1980s when bulletin boards at Yale used by gay student organizations were routinely vandalized. By 1983, the problem had grown severe enough to spur a campuswide research project aimed at collecting "accounts of verbal and physical harassment" of gay and lesbian students. 

Matters seemed to come to a head in 1986 when undergraduate Wayne Dick posted flyers that mocked "Gay and Lesbian Awareness Days" by announcing "Bestiality Awareness Days." University administrators quickly charged Dick with violating Yale's policy against "harassment or intimidation of members of the university community on the basis of their sexual orientation" and a campus executive committee placed him on two years' probation. Dick, however, insisted that his actions fell under the protections guaranteed in the Woodward Report. 

Though Woodward had been retired for 10 years, he drew heavily on the enormous clout he still enjoyed on campus in order to get Dick's probation lifted. In his mind, Dick's actions did not constitute "harassment" because he had not advocated "violence or intimidation" at any point. "Certainly I don't agree with his ideas," Woodward explained, "but they all come under the protection of free speech." 

If anything, Woodward became more adamant on this point as he grew older, but the weight of opinion was already shifting against him at Yale and elsewhere. As administrators made boosting diversity on campus an increasingly urgent institutional priority, efforts to attract and retain more minority students and faculty ushered in policies aimed at safeguarding their sensibilities and making them feel at ease. 

With schools such as Wisconsin and Michigan leading the way at the end of the 1980s, hundreds of colleges and universities implemented speech codes and other provisions aimed at preventing the intimidation and persecution of minorities on campus. The courts would repeatedly strike down or eviscerate speech codes at public universities as violations of the First Amendment, which strictly forbids any abridgment of the freedom of speech. Still, wherever such attempts to limit speech survived, at public and private schools alike, they did so in uneasy coexistence with policies that either explicitly or implicitly invoked the amendment. Even private institutions such as Penn that are not directly subject to its provisions opted nonetheless to "embrace its values" in their own speech policies. Harvard went a step further by stipulating that conflicts between "freedom of expression" and "other rights" must be resolved in a manner consistent with "established First Amendment standards."

With adherence to those standards in adjudicating speech-related complaints guaranteed by official university policy, and with the courts continuing to grant First Amendment protections to even the most egregious forms of so-called hate speech, calling for the genocide of Jews would logically enjoy the same protections on campus. Presidents Magill and Gay were not simply splitting semantic hairs to avoid giving a direct answer to Congresswoman Stefanik's question when they explained that only when "speech" (which was essentially protected by one set of their schools' policies) passed over into "conduct" (which another set of policies empowered them to regulate) did it become punishable as "discrimination, bullying [and] abusive behavior" on their respective campuses.

Proponents of speech codes were looking to shelter minorities from the abuse of free speech protections by others. There was little apparent concern that these protections might one day be weaponized by one student minority against another. 

Yet that is what appears to be happening now on a number of American campuses. Both the pro-Palestinian and pro-Israeli factions constitute minorities within the student bodies at these embattled institutions. Supporters of a Palestinian state have become more vocal and insistent since Hamas's Oct. 7 attack on Israel, with some of their rhetoric featuring the kind of resentment and rage historically associated with religious or cultural nationalism.

In turn, although the share of Americans who support Israel's ongoing military campaign in Gaza has declined some in recent weeks, it still includes a sizable portion of the academic donor class. Their demands to censor critics of Israel's military response have introduced another facet to the free speech debate. The rapidly escalating endowment arms race makes it difficult to limit donors' involvement in university affairs, especially when gifts come not simply with specific strings but powerful emotions attached. As the recent outcry from Penn donors suggests, the Israel-Hamas war has brought a new sense of urgency to the longstanding debate over what universities "owe" their benefactors.

There is another differentiating element to the war over words now engulfing our universities. The assaults on free speech that Woodward sought to repulse largely emanated from one end of the political spectrum or the other. By contrast, the impetus for today's conflicts seems to be coming, both on and off campus, from several directions at the same time. 

These contemporary clashes over campus speech policies reflect the powerful and complex forces that have dramatically altered the landscape of American higher education since Woodward's death in 1999. One consequence has been a growing inclination to challenge the primacy long accorded free speech at our universities.  There is all the more reason, then, to harken back to Woodward's position on such challenges, particularly his warning that succumbing to pressures to restrict speech on campus, regardless of the source, stood to be deeply and enduringly injurious to the intellectual health of a university. Whether today's college administrators can tune out the anger and shouting of the current moment long enough to give his counsel the hearing it deserves remains to be seen, however. 

James C. Cobb is Spalding Distinguished Professor of History Emeritus at the University of Georgia. His most recent book is C. Vann Woodward, America's Historian (2022).

 


Déjà Vu Down in the Swamp

 

C. Vann Woodward would become perhaps the most influential American historian of the twentieth century, but he was in just his second year as a lowly assistant professor of social science at the University of Florida in November 1938. His friend and faculty colleague Bill Carleton was away from Gainesville recuperating from a thyroid illness, and Woodward wrote to update him on recent campus happenings. His missive to Carleton revealed a sentiment all too common among succeeding generations of professors plying their trade at southern universities whose administrators and alumni alike seemed to see the key to enhancing the school's reputation not as investing in its academic standing, but as going all in to juice up its prowess on the gridiron:

"This was homecoming weekend, just over - and I hardly stuck my head out the door.... a succession of football defeats has cleared the air somewhat now, and I am glad of them. There seems to prevail a sane attitude on the campus once more, especially since Oxford, the captain, was dropped from the team for not attending classes. And the "Senate," (i.e., Tigert) upheld the action. [ J.J. Tigert was the president of the University of Florida and a noted advocate of expanding intercollegiate athletic programs who had been instrumental in the formation of the Southeastern Conference.] The Alligator [student newspaper] applauded the action. There seems to be a general sobering up and the realization that football is not the sole concern of the University. But for a while the uproar of the alumni was disgusting. They yelled for [head coach Josh] Cody's blood, for Tigert's, for the team's. The Board of Control met and pondered. Rumors and counter rumors! Streamer headlines in the papers! Gov. Cone - silent on his death bed for months and incapacitated - returns to life to "save the situation." Gives an interview. Will give Florida a winning team etc., etc., ad nauseum administration frantic with thousands of dollars due for unpaid scholarships. And every decision in the University from the B. of Control to the friends of the library turning on whether the team would win the next game. Nauseating really! But now things look better.  All agree the three final games will be defeats and cease to worry about it."

It is frequently difficult to figure out how college football fans develop such overheated expectations, and it was especially so in this case. Prior to 1938, in five seasons of SEC play the Florida Gators had accumulated a combined conference record of 22-26-2 and could claim but five victories against SEC opponents not named Sewanee. After two seasons at Florida, coach Josh Cody had won 8 games and lost 13. His 1938 team would finish 4-6-1, and he would be fired after the Gators went 5-5-1 in 1939. Contrary to the impression Woodward conveyed, Florida had actually beaten Maryland 21-7 in the aforementioned November 12 homecoming contest, though after back-to-back losses to Boston College and Georgia. The team was 3-5 at that point, with its other two victories coming against Tampa and the ever hapless and outmanned Sewanee. The Gators lost their opening game by 2 points to Stetson, and despite a huge sendoff at the rail station featuring the university's vaunted "72-piece band," their trip to take on the then Mississippi State "Maroons" in Starkville ended in a 22-0 thrashing. Deposed team captain Jimmy Oxford of Leesburg, Florida, was neither a quarterback, running back, nor wide receiver, but a center who stood 6' 1" and weighed under 200 pounds. Despite Woodward's smug assurance that the team would tank in its last three games, the Gators managed to tie Georgia Tech and eke out a win over Auburn before falling to Temple in the season finale.

The loss to Temple warrants a brief reminder about historical context. Where Temple would venture into the SEC lair today only with the promise of a payoff commensurate with losing half a dozen starters to shattered collarbones and shredded ACLs, the center of gravity and momentum in college football had yet to complete its southward migration in 1938, a year in which Auburn lost to Villanova, Georgia to Holy Cross, and Kentucky to Xavier. Between 1900 and the opening of SEC play in 1933, teams representing current members of the Ivy League had claimed at least a share of sixteen national championships, while teams now playing in the Southeastern Conference accounted for five. In fact, ironically enough, the leaders of a number of southern universities had seized on achieving gridiron pre-eminence comparable to that of the major northeastern and midwestern schools as a key to gaining ground on them in the scramble for national prestige on all fronts. Although Georgia managed to win six of its eleven encounters with Yale between 1923 and 1934, the latter was favored going into each tilt, and over the entire span, the Bulldogs from New Haven had deigned to meet the Bulldogs from Athens on their own turf only once, in 1929, for the fabled dedication of Sanford Stadium. Though this seemed quite a concession, officials and coaches at Georgia deemed it a fair price for the additional creds that came with regularly rubbing shoulder pads with a football program with twenty-seven national championships on its résumé.

Florida football had appeared to be building up quite a head of steam in the 1920s, and despite having no state funds to dip into as the Great Depression set in, President Tigert had gone out on the flimsiest of financial branches by persuading ten of the team's biggest boosters to join him in taking out personal loans to expedite construction of Florida Field in 1930. (Fittingly enough for a venue ultimately known as "The Swamp," construction had been delayed by drainage problems.) Despite the Herculean effort required to make it a reality, the brand new 22,000-seat facility seemed to foretell a glorious future on the gridiron for the Florida faithful, especially after the university joined a dozen other schools in the states south and west of the Appalachians, including Tulane, Georgia Tech, and Sewanee, in withdrawing from the bloated twenty-three-member Southern Conference and forming the brand-new Southeastern Conference, which officially opened play in 1933.

Alas, instead of the meteoric rise to greatness anticipated in Gainesville, the next twenty years proved to be a generation of sustained deflation for Gator nation. (Not too shabby, huh?) Coach Dutch Stanley's 1934 squad managed to go 6-3-1, despite winning only two conference games. The next time the Gators would notch a winning season overall was 1952 (Yep, that means eighteen consecutive losing seasons), and only in 1954, twenty-one years into their SEC tenure, would they post their first winning mark within the conference. The Gators were hardly off and running at that point, though. They would be stripped of what appeared to be their first conference title in 1984 for recruiting violations. Only in 1991, nearly sixty years after joining the SEC, would they claim their first league title on the up and up. There would be seven more by 2008, not to mention the national championships of 1996, 2006, and 2008. Many more were surely in the offing, and sooner rather than later, assumed the by now thoroughly wild-eyed Gator masses. Alas, however, the curse of high expectations was once again about to take a big ol' Gator chomp out of their hind parts. After the departure of two-time natty-winning coach Urban Meyer in 2010, the jacked-up Florida faithful quickly found themselves grappling with the fearful prospect of an imminent slippage back into irrelevance and mediocrity. Although the Gators won ten or more games four times over the next dozen years, there would be three seasons, including two of the last five, when they finished below .500. Over this span, the university pink-slipped three head coaches while soothing the pangs of separation with payouts sizable enough to slacken the jaws of Putin's oligarchs. At this point there is no shortage of angst around Gainesville town as a fourth would-be restorer of the faith, Billy Napier, late of the University of Louisiana-Lafayette, currently preps for his debut amid loud wailing about a dearth of young men on the roster capable of bench-pressing an ox or outrunning a cheetah, and athletic facilities too lacking in glitter and plush to win the hearts of those who can do either.

Meanwhile, our young Professor Woodward had become disenchanted with the University of Florida almost on arrival, and he was gone after two years. He would go on to spend the bulk of his career, first at Johns Hopkins and then at Yale, where, by the 1960s, the annual gridiron dust-up with the lads from Harvard marked the apogee of football fervor thereabouts. Even so, if Woodward were still around, he would not be slow to recognize the irony--or naivete--of his long-ago proclamation that the campus community had finally come to recognize "that football is not the sole concern of the University" at a place where, even when the $118,000 spent to construct Florida Field in 1930 is converted to current dollars ($2.1 million), it comes to less than a third of what the school is paying its brand new, little-tested head football coach for the upcoming season.

 

A Not So Sentimental Journey Into the Past

The Ol' Bloviator scrawled out this piece within a couple of hours of the attack in order to meet the deadline for the next day's paper. The biggest change he can see since then is that our most pressing need as a people today is freedom from fear of each other. 

Atlanta Journal-Constitution, September 12, 2001

James C. Cobb

"Americans Left to Fear Unseen Enemy"

On January 6, 1941, President Franklin D. Roosevelt promised to forge "a world founded upon four essential freedoms." In addition to freedom of speech, freedom of religion, and freedom from want, there was "freedom from fear," which in Roosevelt's view meant "a worldwide reduction of armaments" so that "no nation will be in a position to commit an act of aggression against any neighbor--anywhere in the world." Rather than securing freedom from fear, however, our victory in World War II soon dissolved into a nuclear arms race fueled by the Cold War.

The generation that spent portions of their childhoods practicing for direct nuclear hits on their elementary schools by putting their heads under their desks or had its adolescence punctuated by the sheer terror of the Cuban Missile Crisis can hardly look back with much nostalgia on that era. Yet, even as the Cold War ended and we breathed a collective sigh of relief at the diminished likelihood of a global nuclear holocaust, we were already slipping into a new era of fear and uncertainty, one in which the enemy could be internal, as well as external, and essentially invisible to boot, one in which extravagant defense budgets and massive missile stockpiles count for less than the ruthless and calculated fanaticism of relatively small numbers of unseen and often unknown enemies.

Regardless of whether Tuesday's death total exceeds that of Pearl Harbor, one of these terrifying new enemies made September 11, 2001, a day that would live not only in infamy, but in irony as well. As president, George Bush, Sr., sought to take credit for the end of the Cold War and promised to create a new world order. Yesterday, he saw another president named Bush forced into hiding in an underground bunker. Our inability to protect even the Pentagon and perhaps even the White House or the Capitol served chilling notice that, when all is said and done, Osama Bin Laden can get closer to George W. Bush than the latter, for all his resources, can get to him.

The hysterical reporters and the scenes of genuine public panic in New York seemed more the stuff of B-movies or a TV mini-series than that of live "as-we-speak" reality. Obviously, we are stunned by the apparent ease with which planes at major airports could be hijacked and used to demolish what should have been a tightly secured potential terrorist target. Yet, neither our shock nor our dismay at the paralyzing fallout of this atrocity at all the nation's airports and in its major cities defines the true significance of yesterday's horrors. That significance lies in the capacity of an unseen enemy to make not just the residents of New York or Washington, D.C., afraid, but to implant that fear into the hearts of millions of Americans who have never been (and probably never intend to be) near either New York City or a major airport.

This reality came through to me in a number of ways, including the cancellation of classes at the University of Georgia and the anxious investigation of a "suspicious" van parked near the federal building in Athens. However, it was local reaction here in Hart County to yesterday's horrors that I found most enlightening. Our local radio station, WKLY, "The Voice of the Upper Savannah River," largely suspended its regular programming (save, of course, for the obituaries and mid-day devotional) and broadcast the programming of WGST and the Georgia News Network. The mayor of Hartwell, a woman of Lebanese extraction and Episcopal faith, urged citizens to offer their prayers for the victims and their families "in their own tradition." To that end, churches in town and throughout the county opened their doors to the prayerful. Yet, for all the sincere expressions of grief and compassion for the victims and their families that were uttered in Hart County yesterday, I feel certain that, explicitly or not, those prayers also embodied a personal plea for the freedom from fear that, despite our victories in World War II and the Cold War, seems more elusive now than it did when Roosevelt promised to pursue it sixty years ago.


Into the Blue out of the Blue: Who's Responsible for the Democratic "Surprise" in Georgia, and What Does It Mean?

The initial 'hot take" on Democrat Joe Biden's victory on November 3, emphasized the critical importance of traditionally Republican suburban white voters who crossed over to support Biden out of revulsion with President Donald Trump while remaining loyal to GOP candidates farther down the ticket. Subsequent examination suggests that this narrative hardly squared with actually voting patterns in the critical battleground state of Georgia, however. It is true that, along with Fulton, the state's three most populous suburban counties, Dekalb, Cobb, and Gwinnett accounted for over half of Biden's vote gains over Hillary Clinton four years ago, but it is also true that Cobb, by the margin of but a single point, is the only one of the trio where minorities are not in the majority.  Beyond that, the Republicans also failed to hold the line in local races in these counties, which saw both Cobb and Gwinnett elect their first black sheriffs. The same patterns in voting and turnout seemed to hold in the state's Senate runoff races. On the other hand, while the emerging  counter-narrative holding that, like President-elect Biden, "the Democratic senators-elect owe their wins to Black voters" is truer to the facts, it undervalues the significance of the share of the white vote claimed by the Democrats in both the presidential and senatorial races.

Republican strategists who opted to go all in with Trumpism in the Georgia senatorial runoffs did not anticipate the contradictory appeals that would soon be emanating from various quarters in the GOP camp, leaving voters to choose between "Turn out big to preserve President Trump's legacy" and "This election will be rigged just like the other one, so don't bother." Suffice it to say, neither the farcical attempt by Trump and his kamikaze henchmen to discredit the November 3 results nor his brazen try at coercing Georgia election officials into helping him steal the state did much to cement his legacy as something to be preserved. Meanwhile, calls to boycott the balloting in the Senate runoffs because the fix was already in resonated with enough Trump diehards, particularly in counties where they were most concentrated, to put Republican candidates Kelly Loeffler and David Perdue at a definite disadvantage. At the same time, minority voters defied the traditional wisdom by turning out in proportions unheard of for runoff elections. In fact, at 90% of the general election total, the overall turnout in the runoff races was actually much higher than historical precedent would have suggested. But, where Republican voting was down roughly 10% in both races, the slippage for the Democrats ranged from roughly 5% for Jon Ossoff to 3% for Raphael Warnock.

Altogether, blacks accounted for roughly 32% of the electorate in the runoffs as compared to 29% in the general election. The already suspect narrative that ticket splitting by better educated suburban whites was the key to Biden's success on November 3 was even shakier after the runoffs, as both Democratic candidates ran slightly behind him with this demographic on January 5. At the same time, both ran ahead of him among black voters in Georgia's most heavily black counties, who actually showed up in greater numbers than in the general election.

What we can discern thus far about voter behavior in the Georgia Senate runoff elections makes it clear that the primary constant in both the presidential and senatorial races was the overwhelming turnout and corresponding Democratic loyalty at the ballot box among minority voters. The Democrats would have stood little chance of winning any of these contests without this show of fidelity, to be sure. Yet simply concluding that they owe their victories here (and elsewhere, perhaps) solely to energized minority voters, rather than to any real change in white voting patterns, does not do full justice to the complexity or potential significance of what enabled the Democrats to win these elections. Since the civil rights initiatives of the mid-1960s led southern whites to flee the Democratic ranks in droves, the principal difficulty for Democrats in the region has been the success of their Republican adversaries in painting them as a party made up overwhelmingly of, by, and for minorities. Accordingly, they have long struggled in vain to win back an elusive, perhaps even mythical, contingent of working-class whites who are more attuned to economic than racial concerns. Finally, urged on in 2020 by Stacey Abrams and other minority leaders, Georgia Democrats redirected their energies and resources to an all-out, unvarnished effort to register and turn out their historically loyal nonwhite base.

The success of this minority mobilization initiative by Abrams and her cohort was clearly the most significant contribution to the party's improved fortunes in this state, but the ultimate promise of their accomplishment might still have gone unrealized had they not managed to pull it off without simultaneously losing ground with white voters. In both 2008 and 2012, Barack Obama picked up 20-23% of the white vote in Georgia, almost the same share as John Kerry in 2004 and slightly more than Hillary Clinton in 2016. In her 2018 gubernatorial bid to become Georgia's first black (or woman) governor, Stacey Abrams nudged the Democratic share of the white vote up to 25%. This year, Biden, Ossoff, and Warnock all upped that share to roughly 30%. These are hardly astronomical figures, but neither were the victory margins of any of the Democratic candidates. With demographic trends likely to remain favorable to her prospects, Abrams seems well-positioned to secure the Georgia governorship in 2022. The long- or even medium-term damage to Republican fortunes in state or national politics incurred in these last tragic days of the Trump presidency is impossible to gauge at this juncture. Still, it's fair to speculate that she will still need to at least hold on, in large part, to her party's admittedly modest, slow-to-come-by gains with white voters to succeed two years from now where she fell just short two years ago. Should she actually manage to build on those gains, the import of what we have witnessed over the last two months will be even greater than we can appreciate just now.

 

Jim Cobb is Emeritus Spalding Distinguished Professor of History at the University of Georgia.

 

An earlier version of this offering appeared at

 https://likethedew.com/2021/01/14/into-the-blue-out-of-the-blue-whos-responsible-for-the-democratic-surprise-in-georgia-and-what-does-it-mean/#.YA2hW3ZKiMQ

HISTORY IS WHISPERING TO US AGAIN, SO LISTEN UP!

Unemployment stood at 25%, and 7,000 banks had folded in the last three years  when FDR delivered his first inaugural address on March 4, 1933. Even as he cautioned his fellow Americans that "the only thing we have to fear is fear itself," he also conceded that "only a foolish optimist can deny the dark realities of the moment." The realities of that moment still appear at this instant to be grimmer than those of the current one. Yet with a staggering 22 million Americans filing for unemployment over the last four weeks, it is difficult to dismiss projections of jobless rates reaching or even eclipsing the Depression-era peak that confronted Franklin Roosevelt on that very first day of his presidency. The desperate, decade-long struggle to keep their families fed, clothed and under the same roof left an indelible imprint on the mindset of many of the adults who survived the Great Depression. If that historic effect is any indication, we may emerge from our own global crisis to find our habits and lifestyles significantly altered as well--a prospect that runs counter to the blithe assumptions of some politicians and presidential advisers that the U.S. economy will quickly return to "normal" once it's re-opened.

 Most Americans continued to trust Franklin Roosevelt even when his New Deal recovery efforts faltered: Even after unemployment climbed back into the vicinity of 20% in 1937-38 and his Democratic Party suffered major losses in the 1938 mid-term elections, F.D.R.'s presidential approval rating still stood at 54%. Still, a great many of his supporters never managed to divest themselves entirely of the fear that he warned could undermine efforts to promote recovery. In fact, for many of them, that fear endured long after the recovery arrived.

 A vital part of FDR's recovery plan was not simply getting Americans back to work but persuading them, insofar as possible, to resume their normal spending patterns. For example, the Civilian Conservation Corps was expected to preserve and beautify the natural landscape, but its guidelines stipulated not only that the young men it employed were to come from families on relief, but that they must send all but $5 of the $30 they earned each month back to those families, who presumably would have no choice but to spend it. The upshot was supposed to be thousands of mini-stimuli injected into a near-comatose consumer economy, raising prices and encouraging businesses to reopen.

As some oral histories reveal, however, in many cases much of the money sent home to parents wound up stuffed in Mason jars or sewn into the corners of their sheets. The New Deal's unprecedented--if inconsistent--level of federal spending was aimed at boosting employment, and thereby consumer demand; this should have had a net inflationary effect. And yet the years 1930-1940 registered a cumulative 19% decline in prices--reflecting, at least in some measure, the extreme reluctance of frightened Americans to spend on anything but absolute necessities.

Painfully aware of how abruptly the rhetoric of "permanent prosperity" had turned to ashes in the mouths of the leading economic experts of the 1920s, a great many Americans clung to the parsimonious habits ingrained by their long ordeal of deprivation, even after World War II finally brought the long-awaited economic recovery. Their children and grandchildren will surely recall their steadfast resistance to purchasing any but the cheapest consumer items for themselves. Nor was there any excuse for tossing perfectly good aluminum foil or wax paper after a single use or any reason to be in a hurry to dispose of paper bags or any jar, bottle or box that might come in handy for storing something one day.

While there is certainly no basis at this point for anticipating that the economic consequences of the Coronavirus onslaught will ultimately compare in any purely apples-to-apples computation to the enormous losses registered between 1929 and 1933, those who actually find themselves caught up in such crises seldom assess the severity of their situation in purely objective terms. We may have the means to calculate that, adjusted for inflation, the approximate real-time $41 billion price tag for the entire New Deal amounts to roughly 38% of the cost of the initial Coronavirus economic stimulus package, but there is no formula for determining how the psychological effects of spending several years trying to stave off absolute impoverishment might compare to those of what is hopefully a much shorter span fearing that you or someone you care about might suddenly fall ill and die while stressing out about your current or possible future economic jeopardy. If the normally stolid editorial pages of the Wall Street Journal depict the COVID-19 outbreak as "a once-a-century threat to American life and livelihood," it's a fair guess that popular anxiety is currently no less intense.

We are unlikely to witness anything approaching the enduring parsimony of the Depression generation once the immediate crisis has passed, but an explosive release of pent-up purchasing power may not be in the cards either. Apart from questions of job security, millions of parents who just weeks ago seemed comfortably middle class will be facing a dramatic shrinkage in the portfolios they were counting on to educate their kids, anchor their 401(k) plans, or provide collateral for a loan to buy their dream house. For them, whatever its duration, a period of austerity may be in the offing. Throw in the health risks possibly associated with venturing out to malls, restaurants, bars, theaters, etc., and you hardly have the recipe for an immediate post-pandemic surge, much less one while the pandemic is still very much with us. This study suggests as much for the premature "re-opening" of state economies, most notably here in GA.

And, much as the economic devastation of the Great Depression brought lasting changes to the consuming habits of an entire generation, FDR's approach to combating it left its own enduring mark. Federal involvement in certain public-service initiatives such as disease prevention and social education had expanded notably during the "Progressive Era" of the early 20th century, but taking action to relieve economic distress was another matter. President Grover Cleveland declared in 1887 that "though the people support the Government, the Government should not support the people." Cleveland's dictum largely held sway until the sudden profusion of breadlines, soup kitchens and homeless encampments marking the onset of the Depression put the lie to his insistence that "the friendliness and charity of our countrymen can always be relied upon to relieve their fellow-citizens in misfortune." The severity of the situation finally moved even the hidebound free-marketeer Herbert Hoover to agree in 1932 to set aside $2 billion for loans to banks, credit agencies and other businesses. Though more akin to a drop in an ocean than a bucket, President Hoover's modest rescue initiative nonetheless amounted to a new precedent upon which his successor would expand and improvise repeatedly as the Depression defiantly dragged on.

 After FDR took office, in a five-year whirlwind of legislation, ranging from the Agricultural Adjustment Administration to the Social Security Act to the Fair Labor Standards Act, his New Deal established a federal presence in practically every aspect of national economic life. However, Roosevelt had proceeded in such disjointed and piecemeal fashion that, ultimately, even his economic advisor, Alvin Hansen, was forced to admit that "I really do not know what the basic principle of the New Deal is." FDR had effectively, if unwittingly, as the historian Paul Conkin observed, created what amounted to a virtual "welfare state." Yet because he had  given his fellow Americans little reason to see it that way  or view it holistically in any sense, succeeding generations were inclined to consider the expanded federal presence in national life largely in terms of how its component programs benefited them. How quickly and completely they had come to take those benefits for granted was apparent  in President Dwight D. Eisenhower's blunt warning in 1954 to his brother and his fellow Republicans in general that setting out to abolish Social Security, farm subsidy programs or other vestiges of the New Deal would be suicidal "for any political party." Grudgingly acknowledging as much, even latter-day Republican critics of the post-New Deal welfare state have largely spurned full-blown frontal assaults aimed at ripping it out by its roots, in favor of repeatedly hacking away at its branches.

Yet for all their zeal for slashing federal social and economic welfare programs, they have shown no such ardor for the tax hikes or other political compromises (such as expanding Medicaid) that would allow the states to begin to fill the void in public assistance created by federal cutbacks. When the coronavirus crisis began, the initial default reaction saw state and local leaders turn to Washington for assistance in relieving the health care crises confronting their respective constituencies. However, in this case, the response to their demands has been limited and slow in coming, and the extent of the federal government's obligation to fulfill them has been a subject of much debate. Still uncertain of the details and parameters of a constantly shape-shifting federal response, several governors began to forge their own collaborative plan to address the challenges they face. Though he earlier insisted that he was the one with the power to re-open state economies, President Trump himself has more recently declared that decision-making about this facet of the crisis was the responsibility of each individual governor.

 The meaning of the New Deal's expansion of federal responsibility was never spelled out in so many words, but the American people came to understand what it meant for them as individuals. If the coronavirus response continues in its currently contested form, history suggests that today's citizens will discern that the burden of responsibility is shifting back to the states--and that they may reset their long-term expectations accordingly.

Meanwhile, Americans may not emerge from the current coronavirus siege embracing anything approaching the extreme, self-imposed austerity of those directly impacted by the Great Depression, and no reduction in federal responsibilities in the current situation is likely to take the country all the way back to pre-New Deal mode. Still, it would be unwise to assume that the severe jolt to their sense of physical as well as material well-being inflicted by this crisis will leave no mark on their habits and attitudes going forward. Like other era-defining historical trials, the Great Depression finally passed. But on both an individual and a governmental level, the end did not signal a return to the status quo. While the history of crises past seems to assure us that, one way or another, today's will eventually recede, that history just as surely cautions us against assuming we can anticipate what the world may look like when it does.

 Please be advised that a downsized version of this piece, which may be more suitable for framing, recently popped up on TIME.COM (https://time.com/5827348/great-depression-coronavirus-after/)

 

With more than 25 million viewers taking in the NCAA college football championship on January 13 and with staggering television payouts helping to boost gross revenues for the twenty-five most lucrative programs alone to $2.7 billion last year, it might seem ludicrous to imply that the sport is anything other than a pink-cheeked picture of health. Yet, the intoxicating financial benefits of what amounts to a hell-for-pigskin pursuit of Nielsen ratings have not come without some sobering side-effects for thousands of college football's most ardent fans, for whom it has traditionally been not simply a spectator sport, but a participatory ritual.

 According to a recent report, attendance fell by 7.6 percent between 2014 and 2018 at games involving the 130 big-time programs in the Football Bowl Subdivision, and the average turnout in 2018 was the lowest since 1996. Not only do major powers like Alabama and Clemson struggle to sell out their home games, but a 2018 Wall Street Journal investigation revealed that, on average, only 71 percent of those holding tickets for FBS games in 2017 ever made it through the turnstiles.

 

Some of the income derived from billions in TV payouts has gone to support non-revenue-producing sports--from field hockey to track and field. Yet, that money also seems to have ignited an orgy of spending on new and upgraded football facilities and super-sized coaching salaries. (According to Knight Commission data, between 2010 and 2018, total spending on football at the University of Georgia increased by 142 percent, while the coaching salary budget swelled by 167 percent. These numbers seem positively puny, however, in comparison to Tennessee's 207 percent bump in total spending and Texas A&M's whopping 333 percent boost in salary payouts over the same period.) With such expenditures now at altitudes too stratospheric to be kept aloft by TV royalties alone, major programs seem more dependent than ever on bigger donations, not only from traditional high-dollar private benefactors but from less affluent ticketholders as well.

 

Getting season tickets has long required ponying up, not simply for the face price, but for an additional "donation," which assures you the privilege of making the purchase.  These buy-ins can vary radically from year to year and school to school, but in 2018, first-time season-ticket purchasers at the University of Georgia were expected to show nearly $24,000 worth of "school spirit" on the front end before spending a minimum of $275 each for some of the worst seats in Sanford Stadium. These fans may be able to improve their views of the game at some point, so long as they maintain the appropriate level of annual "giving" as well.

 

The effective costs for donor-buyers under this arrangement began to bite a bit harder after 2018, when Congress finally blew the whistle on the egregious scam of allowing up to 80 percent of the amount of these coerced contributions to be written off as charitable donations. As a result, more fans are opting to discontinue their purchases of standard season-ticket packages--and thereby their donations as well--in favor of a cafeteria-style, pick-and-choose strategy through secondary ticket distributors like Stub Hub.

 

Consider the University of Texas, where, last year, season tickets for midfield seats would cost a fan $550 each, plus $3,500 in additional support.  At current price projections, by availing himself of Stub Hub, the same Texas fan could get decent seats at all four of the Longhorns' most attractive home games in 2020--against West Virginia, Baylor, Iowa State, and TCU--for a total of roughly $700. The savings would leave more than enough to acquire an enormous state-of-the-art HDTV for watching the remaining Texas games (and many others beside) from the climate-controlled, beer-enhanced comfort of his man-cave.

 

Ironically, that is precisely what the people who are really calling the shots in college football these days--that is to say, television executives--would prefer that fans do. With their respective corporations shelling out so much for the rights to broadcast games, these executives understand full well that their bread gets buttered by enticing college football fans to keep their gaze fixed on their flatscreens and spurn the stadium for the sofa.

 

Having outbid CBS for future broadcast rights to Southeastern Conference football games, ESPN will soon be showering $63 million in annual TV revenue on every SEC school, thereby making relative pikers of those Big Ten Yankees currently scraping by on $52 million per member university.  It's a fair surmisal, then, that viewership totals weigh more urgently on the minds of network officials than do the comfort and convenience of fans who still like to consume their college football in person.

 Accordingly, athletic departments addicted to those big TV checks are also taking their marching orders from the folks who sign them instead of the fans, many of whose families have been faithfully buying tickets and attending games for generations.  ESPN/ABC's telecasts already account for 54 percent of the college football viewing audience, and its acquisition of the phenomenally popular SEC package stands to make it an even greater factor in the lives of fans looking to be on the scene when toe meets leather. A single network conglomerate holding sway over so many games means its execs will be juggling kickoff times even more frantically, hoping to plug the most attractive matchups into the most desirable time slots without pitting them against each other. As a result, more fans, especially those who live at some distance from campus, will be forced to weigh their enthusiasm for actually being there against the inconvenience of arising in the wee hours for games at noon or even earlier or getting home from those played at 8 pm or after at roughly the same time.

 

Because all that network money has to come from somewhere, we can anticipate more and longer commercials in games that already subject fans' patience, bladders, and backsides to what amounts to a four-hour stress test. Those who head from the stadium to the local motel instead of fighting traffic and fatigue on the long drive home are almost certainly looking at two-night minimums on rooms at grossly inflated rates. Throw in gas, food, and tickets for a family of four, and your credit card tally will scream of a weekend in Paris, not Clemson.

 

In reality, it's not alums but students who have become college football's most visible no-shows. Some of this has to do with the attractiveness of watching games on high-res TVs in close proximity to kegs and coolers at the frat houses, but there's another technological troublemaker afoot here as well. Slavish addiction to cyber-fixes has led to sustained whining about poor cell coverage inside stadiums, prompting massive investments in more bandwidth for young people who don't find the spectacle of live college football sufficiently captivating.

 

The ineffectiveness of such efforts to date has so frustrated college football's most famous coach, Nick Saban of the University of Alabama, that he resorted to using location-tracking technology and a rewards system to discourage students from leaving games too soon or arriving too late.

 

Some slippage in student attendance may be attributable, indirectly at least, to a larger problem. The sharp downturn in legislative funding for public higher education since 2008 has forced America's cash-strapped public universities to boost tuition revenue by recruiting large numbers of out-of-state students. Representing a reported 59 percent of Alabama's student body in 2018, out-of-staters may be passing on games in Bryant-Denny Stadium because they want to watch their real "home team," which is playing elsewhere, on TV.

 

Failing to fill steeply discounted seats with students may not seem like a problem at this point if others are still willing to pay full price (and more) for the privilege of occupying the same spots. But growing student disinterest in attending games has ominous implications for the future of the college football enterprise. In the years to come, alumni who couldn't be induced to darken the stadium gates as students will be far less likely to be on board with the annual football ticket shakedown cruise than their elders have been.

 

Slumping alumni turnout stands to inflict some collateral damage on a university's academic endeavors as well. Development officers charged with bringing in private contributions to support academic programs and research rely heavily on home football games as propitious occasions for rounding up well-heeled old grads for a weekend of nostalgia-tripping pursuant to a flurry of check-writing.

 

Needless to say, over time, declining attendance within this demographic could end up depriving colleges of much-needed academic support. This support is so precious because, contrary to what some university administrators maintain, solid evidence that athletic success boosts giving to the academic side is practically nil.  

 

In a sad reflection of the times in which we live, an embarrassment of riches has bred an all-enveloping uber-competitiveness in which the prowess of a college football program is registered not simply in its on-field exploits but in the priciness of its construction projects, the luxuriousness of its locker rooms, the exorbitance of its coaches' salaries, the extravagance of its recruiting budgets, the vastness of its stadium video screen, and, more than anything, the grandeur of its own expectations. These days, programs aspiring merely to a reasonably improved record topped off with a solid trouncing of the old archrival are mocked as "losers," not so much for a lack of achievement as for a want of ambition. The year-long obsession with making it into the hyper-hyped college football playoffs has reduced once-prestigious bids to major bowls to little more than consolation prizes. For the most part, once any self-respecting FBS team has incurred two regular season losses, its year is effectively "over." Many dispirited fans have shown so little interest in what now seem wholly inconsequential postseason contests that tickets can be had for a mere fraction of face value, and judging from an average 35 percent decline in the viewership of the recent Sugar, Cotton, and Orange Bowls, a number of them can't even be bothered to take them in on the tube.

Much the same can be said for many of the players. The hoary nostrum that it's all about getting a degree hardly gets even lip service anymore. What coaches are unabashedly selling recruits now is not a degree but a pedal-to-the-metal autobahn dash to the NFL. Never mind that only 1.5 percent of eligible NCAA players will hear their names called on Draft Day. Coaches who bombard recruits from their very first minutes on campus with LED scrolls flashing the staggering salaries their former players now command as pros surely have little reason to be shocked when their standout upperclassmen announce that they will not be jeopardizing their own prospects for pulling down such paychecks by risking an injury in a superfluous bowl game.

Only recently has there been any relaxation of the once-ironclad restriction that effectively bound players for four years to choices they made as wide-eyed seventeen-year-olds while their coaches were free to bolt whenever they spotted an even more remunerative opportunity elsewhere. It is no mere slip of the tongue when high school stars these days proclaim their excitement about spending the "next three years" at the school of their choice, for they mean to do precisely that and no more. They see themselves in an NFL uniform by the time what would have been their senior collegiate season rolls around. Eager to escape their own economic serfdom, and with their often sorely disadvantaged families looking to them for relief, they have scant time to wait their turn while learning from the more experienced upperclassmen playing ahead of them, let alone stick around to earn their diplomas. Desperate for enough playing time somewhere to showcase their skills for the pro scouts, they are heading to the transfer portals in substantial numbers before taking the earliest opportunity to declare for the NFL Draft, where, history shows, roughly one-third of them stand to be passed over. Beyond shattering any remaining pretense of amateurism in college football, the contagion of blatant commercialism now afflicting it at every level has helped to make exiles of the people who long saw it not simply as a welcome diversion but as the central element of a deeply personal, culturally affirming ritual. It seems unlikely that the rich streams of revenue currently disgorged by this prodigiously lactating cash cow will begin to dry up anytime soon. Yet the future of college football is not foretold solely in exuberant estimates of forthcoming profits, but in the ever-tightening camera angles required to spare millions of TV viewers the visually jarring and steadily encroaching swaths of emptiness within the stadiums themselves.

 

WHEN SOUTHERN HISTORIANS MADE HISTORY THEMSELVES

It pained the Ol' Bloviator greatly last week to miss only his second annual meeting of the Southern Historical Association since 1973. Those who made it to this year's confab in Louisville enjoyed not only a lively and engaging program but also the warm, inviting atmosphere for catching up with old friends and former students that is the trademark of these SHA gatherings. This particular meeting also marked the 70th anniversary of one of the most momentous events in the group's history and a notable development in the history of race relations in the South as well.

No apparent thought had been given to the possibility of black membership, let alone attendance at the annual meeting, when the Southern Historical Association was formed in 1934. Bethany L. Johnson notes that SHA president Benjamin Kendrick was unaware that there were any black members in 1941, when he received a letter from one of them asking to what extent he and his colleagues would be allowed to participate in the organization's meeting at Atlanta's Biltmore Hotel. Kendrick had essentially created a de facto policy when he responded that black members enjoyed "all the rights and privileges of any other member, subject only to local city ordinances, state laws, or practices of the hotel in which the meeting is held."

This effective default to prevailing local Jim Crow practices meant that, despite some variations, black members who chose to attend the annual meeting were well-advised to expect treatment little better than that accorded the black hotel employees, with whom they would both share toilets and eat in the kitchen while white members partook of the luncheons or dinners that were a part of the annual meeting. They might be allowed to enter the main dining area in order to hear a speaker once the white diners were finished, but, that done, they were expected to make for the door, for there was surely no place for them to sleep in a whites-only hotel.

Needless to say, such practices hardly suggested much enthusiasm for inviting a black scholar to actually present a paper in a formal program session. Yet with the 1949 meeting set for Williamsburg, Va., seemingly one of the South's less racially volatile cities, program chair C. Vann Woodward (who would soon become a towering presence in the field of southern history) proposed doing precisely that. When he suggested asking his friend John Hope Franklin to participate in the program, his carefully selected program committee--composed of white historians who shared his embarrassment at how black members were treated--seemed not only amenable, but downright enthusiastic. The sledding proved a little bumpier in dealing with representatives of the host institution, William & Mary, but with considerable prodding and not a little guile, Woodward eventually got them to sign off on his bold experiment as well.

Thinking that a presentation on some less racially fraught topic would make Franklin more comfortable, Woodward suggested a paper drawn from Franklin's ongoing research on the martial tradition in the South. To put Franklin further at ease, Woodward bolstered the session's creds by asking Columbia's Henry Steele Commager, one of the most distinguished American historians of his day, to chair it, and Emory University historian Bell Irvin Wiley, a white southerner and, like Woodward, a rising star in the profession, to read the paper accompanying Franklin's.

Thus, it came to pass that, at 3 p.m. on Thursday, November 10, 1949, a crowd well in excess of what the ground-floor room in William & Mary's Phi Beta Kappa Hall could accommodate gathered to see southern historians actually making history themselves. "You couldn't get in that place," Franklin recalled, noting that a number of curious William & Mary faculty as well as townspeople had gathered outside to peer in the windows. Others in attendance offered much the same account and confirmed Franklin's impression that his paper "got a good reception." The only potentially awkward moment, Franklin remembered many years later, came during the discussion from the floor when a stately white woman in the audience asked "how we can sit here and hear him use the term, 'Civil War,' when he should call it the 'War Between the States'?" Doubtless to Franklin's great relief at that point, "everyone broke into laughter." He may have been uncertain at that precise moment whether to make more of the question itself as an indication of the past flexing its muscles or of the less than deferential response to the questioner as a sign it was losing its grip. He would likely have been more inclined to the latter interpretation had he known at the time that the questioner in question was Susan Tyler, the widow of one Lyon G. Tyler, son of President John Tyler and a historian and dedicated apologist for the Old South, not to mention president of William & Mary from 1888 to 1919. Franklin might even have felt the earth tilt ever so slightly had he also realized that, before her marriage, Susan Tyler had been Susan Ruffin.

If the great-granddaughter of the unrepentant secessionist Edmund Ruffin (who wrapped himself in a Confederate flag before blowing his brains out after the war) was deeply disturbed by what she witnessed that day, it's hard to imagine her reaction had she lived to see the black interloper she had risen to rebuke more than twenty years earlier step to the podium in 1971 as President of the Southern Historical Association. As Franklin noted, he had been elected to this post "long before the other two predominantly white national historical associations made a similar move." Yet only in the last ten years had SHA officials been able to find enough southern hotels where the old racial protocols had been abandoned to assure that the annual meetings were finally free of Jim Crow's tenacious clutch. In the interim between becoming the group's first black program participant and serving as its first black president, Franklin had suffered what he called a largely unbroken repetition of "insults and the indignities" at a variety of southern convention hotels, like the Peabody in Memphis, where he could neither eat nor sleep but might find himself "literally stranded" because white cabbies refused to pick him up. These memories were still too painful, even in 1970, to allow him to savor the current moment, let alone encourage anyone to make too much of the progress it signified. Instead, rather than conclude his presidential address on a note of satisfaction or accomplishment, he simply reiterated what had long been his sustaining hope, "that a region whose experience and talents had proved to be so ample in so many ways...might yet be able to confront fundamental changes in its social order."

 

Bloviate:

"To orate verbosely and windily."

Bloviate is most closely associated with President Warren G. Harding, who used it frequently and was given to long-winded speeches. H.L. Mencken said of Harding:

"He writes the worst English that I've ever encountered. It reminds me of a string of wet sponges; it reminds me of tattered washing on the line; it reminds me of stale bean soup, of college yells, of dogs barking idiotically through endless nights. It is so bad that a sort of grandeur creeps into it. It drags itself out of the dark abysm of pish, and crawls insanely up the top most pinnacle of posh. It is rumble and bumble. It is flap and doodle. It is balder and dash."

Cobbloviate dedicates itself to maintaining the high standards established by President Harding and described so eloquently by Mr. Mencken. However, the bloviations recorded here do not necessarily reflect the opinions of the management of Flagpole.com, nor, for that matter, are they very likely to be in accord with those of any sane, right-thinking individual or group anywhere in the known universe.
