Introduction
IT IS HARD TO SAY exactly when we first noticed the pattern. Just before we hit the outskirts of a Cotton Belt town, the fields would give way to a string of gleaming white antebellum homes with large lawns, old-growth trees, and grand entrances framed by columns reaching two or three stories high.
Merging onto the majestic arterial boulevards leading into town, we would see more imposing homes presiding over meticulously manicured grounds.
In Sparta, a rural hamlet near Augusta, Georgia, it appears as though someone has invested millions to restore an elegant Greek Revival home.
New windows and shutters gleam. Yet just across the street lies a dilapidated shack, one room deep, with a sagging roof. Over in Demopolis, Alabama, sits the venerable Gaineswood, a massive structure known for its elaborate interior suites, including domed ceilings, remarkable decorative arts, and original antebellum furnishings. Left out of the photos on Gaineswood’s website and tourist brochures are the aging wood cottages in varying states of disrepair, the tumbledown trailers, and the sagging modular houses that flank the historic home.
County seats in the region typically feature a leafy square where a courthouse of sometimes massive proportions—and stunning architectural beauty—stands over streets that are hollowed-out carcasses of a more prosperous time. There is often an imposing Confederate monument on the courthouse lawn honoring soldiers of the war waged a century and a half ago. Nearly without fail, there is at least one upscale business—a home goods store, clothing boutique, high-end restaurant, art gallery, or gourmet coffee shop—catering not to the average resident but to the remnants of the white gentry and to tourists who come to see the grand homes.
What becomes abundantly clear as we travel across the country to see America’s most deeply disadvantaged places firsthand is that they are often home not only to desperate poverty but also to considerable wealth. In the fall of 2017, a program officer named Andrea Ducas at the Robert Wood Johnson Foundation wrote to us out of the blue. Two of us (Kathy and Luke) had written a book that combined insights drawn from national poverty data and ethnography to paint a picture of the unseen lives of some of America’s poorest families—those living on cash incomes of less than $2 per person per day. Would we, she asked, be interested in collaborating on a project to do the same thing for America’s poorest places?
Immediately, we were intrigued by the idea of studying places instead of people. The social sciences had a rich tradition of community studies from about the turn of the twentieth century onward. Yet broadly speaking, these days the proud tradition of the community study seems a bit out of vogue. More recently, prominent work in the social sciences conducted by towering figures such as William Julius Wilson, who wrote the landmark book The Truly Disadvantaged, theorized that place is key to understanding how people’s lives unfold. But most of this work has focused on neighborhoods in big cities. Matthew Desmond has made the case that poverty is not just the experience of not having enough, but is the byproduct of relationships between the actual people—tenant and landlord, worker and employer—within a place. Following this idea to its logical end means studying the relationships among a whole variety of actors in a given community. We wondered: Why were so few of our colleagues studying whole communities? Why weren’t we?
Ducas raised another question: Could we understand poverty more holistically if we included not only income as a measure but health as well?
When President Lyndon B. Johnson declared an “unconditional war on poverty” on January 8, 1964, the nation lacked any method of counting the poor, or even a firm notion of how poverty should be defined. As Johnson led the nation into this “war,” his administration scrambled to come up with a measure that could be used to chart progress. The gauge, it was decided, would be the minimum amount needed to put food on the table multiplied by three (at the time, food constituted a third of the average family budget).
Ever since, poverty researchers have been locked in endless debate about how poverty should be measured. Nonetheless, in virtually all cases, poverty has continued to be defined as a lack of income.
While there is no doubt that income is a vital indicator of well-being, it had become clear to us that it was just one part of a bigger picture. Thus, we decided to harness the immense growth in the nation’s data infrastructure to build a more nuanced way to measure community disadvantage than had ever before been possible.
To assess the level of disadvantage in a community, such as a county or a city, we combined traditional income-based measures with other markers, including health. Especially in the United States, health outcomes vary tremendously by race, ethnicity, and income. In 2008, life expectancy for highly educated white males was eighty years, but only sixty-six for low-educated Black men, whose average life span resembled numbers seen in Pakistan and Mongolia. In 2011, the infant mortality rate for Black mothers in the United States was comparable to that in Grenada and just a bit better than that in Tonga. The rate for non-Hispanic whites was much closer to that in Germany and the Netherlands. Meanwhile, a tidal wave of new research was showing that a person’s health is shaped more by their context—their income, family circumstances, and community characteristics, for example—than by their genetic profile or the medical care they receive.
Ultimately, as the scope of our study of place-based disadvantage grew, we chose to incorporate two well-measured health outcomes, one that captured conditions at the start of life and the other at the end. In a particular community, what were a baby’s chances of being born with low birth weight, which is closely associated with infant mortality and other threats to children’s health? In that community, how long could the average person expect to live?
We also recognized the importance of measuring whether disadvantage in a particular place persisted for children growing up there. Especially in the American context, it is almost an article of faith that kids should have the opportunity to do better than their parents. Recently, a team of economists employed confidential IRS data to create a measure of intergenerational mobility (the chance that children born low-income could rise up the economic ladder) for every city and county in the nation. These researchers used tax records to follow children born in the 1980s through adulthood to see where they stood on the income ladder compared to their parents. It was already understood that there were big differences in intergenerational mobility by parental income, ethnicity, and race, but the most stunning revelation of this new research was how much variation there was by place. In some communities, a child born into poverty would probably stay low-income as an adult. Yet in others, they had a much better chance of reaching the middle class. It seemed clear to us that to measure the depth of disadvantage in a community, it would be important to include the rate of mobility from one generation to the next.
We chose to bring the term “deep disadvantage” into the conversation about measuring poverty in order to capture the complexity of the problem when a person’s life chances are hindered by multiple conditions or circumstances, including by the community in which they live. Given our aims, “disadvantage” is more accurate than simply “poverty” because it implies an injustice. The term is moral. People are being held back—unfairly.
We incorporated our multidimensional measures of well-being into the Index of Deep Disadvantage, which reflects two traditional income poverty indicators (the official poverty rate and the rate of deep poverty, meaning those with incomes below half the poverty line), two markers of health (low birth weight and life expectancy), and the rate of intergenerational mobility for children who grow up low-income. Since there was no obvious way to decide which factor was more important than another, we used a sophisticated machine learning technique called “principal component analysis” to rank the roughly 3,000 counties in the United States along with the 500 most populous cities on a continuum of disadvantage that accounted for income, health, and intergenerational mobility. We were a long way from limited War on Poverty–era metrics. Immediately, we could see from the rankings that the geographical pattern was stark. The first surprise—especially for three professors who had spent our careers studying urban poverty—was that the “most disadvantaged” places on our index were mostly rural. There is considerable poverty in cities like Chicago, Los Angeles, and New York.
But in our apples-to-apples comparison, none of those cities ranked even among the 600 most disadvantaged places in the nation. For the most part, the only cities and urban counties to find themselves among the most disadvantaged were a relatively small number of industrial municipalities in the Northeast and Midwest, such as Cleveland, Detroit, and Rochester.
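The ranking procedure described above can be sketched in miniature. This is an illustration only, not the authors’ actual pipeline: the county names and indicator values below are invented, and scikit-learn’s off-the-shelf PCA stands in for whatever implementation the research team used.

```python
# Minimal sketch: rank places on a composite "disadvantage" score using
# principal component analysis (PCA). Data are invented for illustration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

places = ["County A", "County B", "County C", "County D"]

# Five indicators per place, mirroring the index: poverty rate, deep
# poverty rate, low-birth-weight rate, life expectancy (years), and
# upward-mobility rate for children raised low-income.
X = np.array([
    [0.35, 0.18, 0.14, 71.0, 0.05],  # high poverty, poor health, low mobility
    [0.10, 0.04, 0.07, 80.0, 0.15],  # low poverty, good health, high mobility
    [0.22, 0.10, 0.10, 75.0, 0.09],
    [0.28, 0.14, 0.12, 73.0, 0.07],
])

# Standardize so no single indicator dominates, then project onto the
# first principal component, which serves as the composite score.
Z = StandardScaler().fit_transform(X)
scores = PCA(n_components=1).fit_transform(Z).ravel()

# PCA's sign is arbitrary; orient the score so that higher = more
# disadvantaged (i.e., positively correlated with the poverty rate).
if np.corrcoef(scores, X[:, 0])[0, 1] < 0:
    scores = -scores

ranking = [p for _, p in sorted(zip(scores, places), reverse=True)]
print(ranking)  # most to least disadvantaged
```

The key design point the passage describes: rather than hand-picking weights for the five indicators, PCA lets the data determine the single axis along which places vary most, sidestepping the question of which factor matters more.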
Among the rural counties at the top of the list, what we found didn’t fit what most people think of as “rural.” While some of these were majority-white, many, indeed most, were communities of Black and Hispanic Americans. We could see, too, that many places with large Native American populations ranked among the most disadvantaged in the nation (19 of the top 200). Beyond these, though, not one community in the western part of the United States registered among the “most disadvantaged” (those in the top fifth). While some might say we ought to have considered the impact of the high cost of living on poverty—those costs are higher in some places—there are trade-offs. Although people pay more for housing in those places, there are at the same time structural advantages in those areas of the country, such as good health care systems, a more generous safety net, public transportation, and higher-quality schools. This, we think, is why some high-cost big cities like San Francisco and Seattle fall further down our index than expected. We also found that those living in the 200 most disadvantaged places on our index were just as prone to have major difficulties paying for housing as those in America’s 500 largest cities.
Apart from predominantly Native American communities, the places that our index identified as “most disadvantaged” most often are found in three regions—Appalachia, South Texas, and the vast southern Cotton Belt running across seven states.
To learn more about these places, we sent teams of researchers to specific locales in the summer of 2019. The Mississippi Delta, also known as the Yazoo-Mississippi Delta, is a region that encompasses portions of Mississippi, Louisiana, Arkansas, and Tennessee, including Leflore County, Mississippi. That’s where Ryan Parsons was living while conducting his dissertation research when he joined our team. Jasmine Simington and Meg Duffy spent that summer in Marion County, in the Pee Dee region of South Carolina, which switched from cotton to tobacco production in the late nineteenth century. Both communities ranked among the most disadvantaged majority-Black places in the nation. Liv Mann and Emily Miller spent the summer in Clay County, in eastern Kentucky, one of the most disadvantaged majority-white places in the country. Lanora Johnson divided her time between the Pee Dee region and eastern Kentucky. In each location, we engaged ordinary residents and community leaders in lengthy conversations. We participated in community events such as parades, festivals, and fundraisers. We volunteered at local charities. On multiple visits to each of these places, we—Kathy, Luke, and Tim—met with a subset of families and leaders, conducting follow-up interviews to pursue additional themes.
The following fall, Emily and Meg made initial forays into two counties that are among the most disadvantaged majority-Hispanic places in America: Zavala and Brooks Counties, in South Texas. Several months later, we visited the area, striking up impromptu conversations in parking lots and cafés and holding a focus group with community leaders who were gracious enough to show up on a Friday morning at the local parish hall.
The plan was to send a team of researchers to these South Texas communities in the summer of 2020.
Then the COVID-19 pandemic intervened, interrupting our ability to travel and to have face-to-face conversations. Locked down in our homes, we turned to history books, government reports, old ethnographic accounts, and first-person narratives—some penned more than a century ago—to learn more. We conducted interviews virtually. As the pandemic ebbed in the early summer of 2021, we, along with researchers Maricruz Moya, Karen Kling, and Christine Jang-Trettien, descended on South Texas.
Throughout, our team met online to exchange stories. We soon learned that every place we were studying was a “first in the nation” or a “capital of the world” of something. Did you know that Marion County’s largest city, Mullins, was once known as the “Tobacco Capital of South Carolina,” the very place where the coveted “bright leaf” strain was first introduced to the state, earning some of the planters who grew it—but not their tenant labor—small fortunes? Or that before Kentucky’s Clay County had coal mines, it was once the salt-mining epicenter of the eastern United States, its purveyors deploying the labor of enslaved people to generate wealth along its creek beds? Is it news to you that it was in Leflore County, Mississippi, and the surrounding area that the antebellum cotton plantation economy was most faithfully reproduced after the Civil War, with vast profits drawn from the labor of Black tenant farmers? Did you know that Brooks County, Texas, once boasted the largest Jersey dairy cattle herd in the world, known far and wide for the delicious butter carrying the county seat’s name, Falfurrias? Did you realize that the seat of Zavala County, Crystal City, once laid claim to the title “Spinach Capital of the World”?
The pandemic-inspired immersion in local history was a crucial turning point for our understanding of America’s most deeply disadvantaged places.
As we pored over our regions’ pasts, we began to realize that what they shared was a history of intensive resource extraction and profound human exploitation not seen to the same degree elsewhere in the United States. In these places, it was not enough to be comfortably profiting from one’s enterprise. The goal of the landowning class was to build vast wealth on the backs of those laboring on the land. In each place, this economic pattern emerged (or, in the case of the Cotton Belt, fully flourished) in the late nineteenth or early twentieth century. In each place, one industry linked to national and global markets came to dominate the economy, a pattern that held, broadly, into the 1960s, when King Cotton, King Coal, and the others would bow to the twin forces of automation and competition from global markets.
When we began sending teams to specific locales, we did not know that these patterns would be shared to such a profound degree. Yet we would have had to be exceedingly dull or stubborn to have missed the fact that these places resembled, well, colonies. Internal colonies within the borders of the United States. Using terminology such as “nation within a nation” or “colony” to describe the exploitation of communities of color within the United States has a long history among Black scholars and activists; notable proponents have included Frederick Douglass, Kenneth Clark, Stokely Carmichael, and Malcolm X, among others. We set out to build on this work.
For the places we identified as the most deeply disadvantaged, as the fallout from systems of historic inequality started to become clear, specific themes began to emerge.
In the antebellum era, Clay County, Kentucky, was home to both the mighty salt barons, whose works lined the banks of the creeks, and a tapestry of subsistence farms. Big Timber and Big Coal took over after the Civil War. Today, the opioid crisis is ravaging the region. Locals lament the decline of the local movie theater—now a Pentecostal church—and the loss of the bowling alley; numerous bars, cafés, and beauty salons; and a park that has been plowed under for a highway construction project. The social infrastructure that draws a community together—and that, when strong, creates a safety net to catch people when they fall—has grown weak. People blame the rise of opioid use on the fact that in this place, there is “nothing to do but drugs.”

In South Texas, spinach and onion fields were once so vast they met the sky at the vanishing point in almost every direction, yielding fabulous profits for those who owned the land. In some areas, the fields still stretch to the horizon today. Yet extreme hardship was the lot of the landless laborers who planted and harvested those crops. Forced to migrate to find work the rest of the year, generations of children were robbed of their right to a decent education. Even today, adult illiteracy rates in these South Texas communities are among the highest in the nation. High school graduation rates among the younger generation have soared, but test scores remain abysmal, especially in reading.
Residents of every place we got to know for this book can recount stories of local government corruption: the FBI storming City Hall to arrest nearly every member of city government; local officials imprisoned for buying votes and collaborating with drug dealers; corporations getting sweetheart deals to bring a new factory to town but never delivering. Yet when leaders are asked to name the biggest challenges facing their communities, government corruption rarely comes up. Instead, they usually focus on the flaws of the poor.
People don’t need to do more than scan the front page of the Greenwood Commonwealth, the newspaper that serves Leflore County, Mississippi, to know that violence is an issue. It is the number one problem facing the community according to Black residents we spoke to, though white folks were largely oblivious to it. As we would soon learn, this county and the larger region it represents—the vast Cotton Belt stretching from the Carolinas to eastern Arkansas and Louisiana—is indeed among the most violent in the nation. Violence has plagued this region for well over a century.
In Marion County, South Carolina, it seemed that every one of our conversations started with a discussion of the flooding that had come in the wake of back-to-back hurricanes. On the white side of the small, hard-hit town of Nichols, spanking-new replacement homes had risen. In contrast, homes in the Black part of town and in Sellers, a nearly all-Black hamlet where the flooding had devastated much of the town, were still moldering.
Many houses were rendered uninhabitable, though some people were nonetheless living in them. As we dug deeper we learned that in myriad ways those who already had been struggling before the floods were struggling even more after. Centuries-old racial inequality in the Pee Dee region deepened in the wake of the disasters, due to systemic racism encoded in the very government programs that were supposed to help people recover.
We initially thought that these themes—unequal schooling, the collapse of social infrastructure (the places where people build social bonds), violence, entrenched public corruption, and structural racism embedded in government programs—were at least somewhat unique to each place. Yet what turned out to be most remarkable was the degree to which they were shared.
Soon after we mapped our index for the first time back in 2018, one of our researchers, Ryan Parsons, showed us another map, from 1860, depicting enslavement in the South on the eve of the Civil War. The correspondence between this map and the map of our index showing contemporary regions of deepest disadvantage was chilling. Along with a growing number of scholars of history and race, we began asking this question: Could it be that the social and economic relations in a place a century and a half ago continue to shape conditions in that place today? The message was clear: To understand the challenges facing a place of deep disadvantage, the first step is to learn about its past.
In each of these internal colonies, the past appeared to be prologue to the present. But how? We began by exploring the history of the Cotton Belt.
We supplemented the many volumes of systematic, rich description authored by prominent Black sociologist Charles S. Johnson of Fisk University with essays by anti-lynching activist Ida B. Wells. We studied the meticulous ethnographic work of Black anthropologist William Boyd “Allison” Davis and his white coauthors, and that of anthropologist Hortense Powdermaker and psychologist John Dollard, both white scholars hailing from Yale University. Each had studied a portion of the Cotton Belt in the late 1920s and early 1930s when the cotton economy reached its apex. Noted historical works like James C. Cobb’s The Most Southern Place on Earth also offered valuable insights. There were vital first-person accounts like that of Ned Cobb, a Black Alabama tenant farmer who spent a decade in prison for standing up to a white landlord in support of a neighboring tenant’s rights, and whose story is captured in Theodore Rosengarten’s award-winning All God’s Dangers. These and other works chronicled life in the Cotton Belt from the time when the first white settlers arrived. Fortunately, for each of the other regions, there were similar gems.
From these texts, we began to stitch together a sense of how the history of these places has shaped the present.
Tribal lands are an essential chapter in the story of America’s internal colonies. After enslavement and disease had decimated the Native American population, reservations served as the holding ground for a people ruthlessly removed or pushed back to make way for the vast economic and human exploitation that would take place across the Cotton Belt, Appalachia, and South Texas. For this book, we wanted to include one of the Native American tribal lands that appeared high on our index.
Building relationships to conduct research on tribal lands, however, takes time. While we tried to make connections with tribal researchers who had the vital expertise and relationships we lacked, the way forward became fraught, to say the least, as the COVID-19 pandemic claimed a disproportionate share of Native lives. Several communities that ranked highest on our index—including number-two-ranked Oglala Lakota County in South Dakota, home to the Pine Ridge reservation—made national news when tribal governments imposed lockdowns and installed checkpoints on state and interstate highways to protect their citizens from outsiders who might be carriers of the disease. Actions such as these sparked a power struggle with South Dakota governor Kristi Noem, who declared the checkpoints illegal. We acknowledge that this critical chapter remains unwritten in this book.
[Figure: Our Map of Deep Disadvantage Compared to a Map of Enslavement from 1860]

Undoubtedly, if we had chosen different communities within the regions we focus on here, additional themes would have emerged. The insights we share in these pages are just the tip of the iceberg. As much as anything, this book is meant to illustrate an approach: starting with the nation’s rich big-data infrastructure to identify pockets of deep disadvantage, listening carefully to community members through ethnographic research in specific locales within these regions, attending carefully to the role of history, and, when possible, returning to big data to test the hypotheses generated on the ground.
After all the interviews had been conducted in each of these regions, after all the histories, ethnographies, government reports, and first-person accounts had been read, an additional step was required. In the summer of 2021, two of us (Kathy and Tim) got into a car and began a fourteen-state road trip. We wanted to visit as many of the top 200 places on the index as we could. We felt that, before characterizing them, it was imperative that we drive around each community, take a walk around the county courthouse, stop at a diner or coffee shop, and strike up informal conversations with locals.
Despite the considerable distances involved, we made it to 132 of the top 200 places of deepest disadvantage. Adding in 23 places that we had been to for other research, we visited a total of 155 locations, more than 75 percent of the top 200. Our objective was to ensure that the index reflected truth on the ground. Were there observable differences between the top 200 and those a little lower down the index? Could we sense a difference once we crossed the county line separating an advantaged from a disadvantaged place? The answer to both questions in nearly all cases was a resounding yes.
But there were exceptions. Our data were drawn from reputable sources, yet errors are always possible. This became clear when we traveled to the town of Radford, Virginia, which purportedly ranked 45th on the index.
Instead of the vacant buildings and unkempt lots observed elsewhere on our journey, the place resembled a stage set for Thornton Wilder’s play Our Town. Digging deeper, we learned that Radford and several other Virginia cities ranking high on the index purportedly had life span averages so short as to be downright implausible. We concluded that, due to complex reasons involving administrative jurisdictions in these places, the Centers for Disease Control and Prevention numbers were wrong. We adjusted.
The trip was a safeguard against these errors. Yet an additional benefit of conducting this visual audit was in the stories it revealed. Driving through the countryside in some of these regions, one can be lulled by the beauty of the landscape, as in the rolling hills of Alabama, with their tidy pecan groves, bucolic grazing lands dotted with cattle, and fields rich with soybeans and corn. But in both the outskirts of the cities and their downtowns, blight is prevalent, and the passerby realizes that each is in fact a very disadvantaged place.
In these places, there are ample reminders of the past. In Montgomery, Alabama, for example, which comes in at number 244 on our index, we toured two sites overseen by the Equal Justice Initiative: the Legacy Museum, colloquially known as the “lynching museum,” and the National Memorial for Peace and Justice, with its display of eight hundred granite slabs resembling caskets—one for each county where lynching crimes took place between 1877 and 1950. At the memorial, we learned that forty-eight lynchings have been documented in Leflore County, Mississippi, alone.
Next we explored the Mississippi Delta, the portion of the Cotton Belt along the Mississippi River. Traveling south on the Mississippi side, we reached Mound Bayou, in Bolivar County. Established in 1887 by former slaves, the settlement was designed to be a self-reliant community of Black Americans, home to Black-owned farms, businesses, schools, hospitals, and banks. Theodore Roosevelt praised Mound Bayou as “the Jewel of the Delta,” while Booker T. Washington called the community a model of “thrift and self-government.” For decades it prospered, shielding residents from Jim Crow. But it experienced a decline in the second half of the twentieth century due to market fluctuations, increasing rates of racial integration in other parts of the nation, and the many pernicious forces that have dispossessed the vast majority of Black farm owners across the nation.
Over in central Appalachia, we drove through the remains of company towns, often only one or two streets wide, lining the banks of the creeks in the hollows—narrow valleys that sometimes stretch for miles between the mountains. We traveled through Benham, Kentucky, in Harlan County, a town built by the International Harvester Company, and to the town of Lynch, also company owned. There is still a handsome brick building there bearing a historical marker titled “Lynch Colored School.” The school was built for the Black high school students of Benham and Lynch by the US Coal and Coke Company in 1923, and it was said to be the finest Black school around. The building now sits vacant, while the former white school has been converted into a bed-and-breakfast. Also in Lynch, next door to the now defunct Mine Portal 31, are the communal baths the company built for the miners, today adapted for other uses. Across the street is a gourmet coffee shop and bakery.
From these clues, one might think that the mine workers were well cared for, but the former company homes, with their crude cookie-cutter architecture and porches lined with mismatched, worn couches and chairs, tell a different story. Today the houses are generally so cramped (given the large families) that the occupants’ possessions spill out onto their lawns, mixing in with the junk cars and broken appliances that are retained for spare parts. Writing in the 1960s, noted local author Harry M. Caudill described how employers like International Harvester anticipated the massive decline in the demand for labor due to impending automation and competition from other fossil fuels. In light of this, the companies were eager to divest from their company-owned homes. Cynically, they offered them for purchase to the miners, ostensibly to improve the workers’ morale, luring them with promises that there would be work for them for years to come. Then came the massive layoffs.
As we turned from studying people to studying places, we came to the task equipped with tools our predecessors could only dream of—data that offer a picture of the conditions of America’s communities that is more comprehensive than ever before. These tools, combined with on-the-ground ethnography and historical data, shine a bright light on where people experience the toughest and most intractable problems across the nation.
We are now armed with new revelations about poverty and a new understanding of how deeply disadvantage is woven into the history and present-day institutional fabric of particular places. In the final chapter of this book, we turn to the idea that our nation must launch a fresh offensive against poverty with a relentless focus on the country’s most deeply disadvantaged regions.
The Injustice of Place tells the stories of America’s internal colonies—where disadvantage has been endemic for generations—and calls us to envision a different future, where no corner of the country is left behind. To quote President Lyndon Johnson, the architect of the War on Poverty, “The Great Society rests on abundance and liberty for all. It demands an end to poverty and racial injustice, to which we are totally committed in our time.”

1
America’s Internal Colonies
TODAY, THE MOST OBVIOUS VESTIGES of the nation’s once vast internal colonies are the myriad symbols that celebrate their past. In Crystal City, Texas, Popeye—with trademark pipe stem clenched in his wide mouth, each forearm the size of a Butterball turkey—stands six feet high on a round pedestal in the middle of downtown. The inscription on the statue’s base—“The Spinach Capital of the World”—illuminates why this South Texas city has honored the cartoon character for so many decades.
Crystal City lies in the heart of what is known as the Winter Garden of Texas, a multicounty region roughly ninety miles southwest of San Antonio that was once an agricultural giant in the cultivation of spinach, onions, and other irrigated vegetables. The industry ballooned in the 1920s as new rail lines were laid and the invention of the refrigerated railcar permitted producers to transport onion bulbs and leafy greens across the nation. As the prominent placement of the Popeye statue attests, Crystal City still celebrates this legacy. Each year the city hosts the Crystal City Spinach Festival. Activities include a spinach-eating contest (medical waiver required) and a cook-off. Young women and girls compete to be part of the Spinach Festival Queen’s Court. Winners ride on a special float festooned with colorful flower arrangements and balloons as the festival parade winds through town. Spinach green and gold are the high school colors.
As you travel along Mississippi Highway 7, at the Greenwood city line, a large sign welcomes you to “The Cotton Capital of the World,” the calligraphy bracketed by stylized cotton bolls and underlined with the circular logos of the town’s civic organizations. In Greenwood, cotton has held pride of place since Reconstruction, despite the threat of the boll weevil and the collapse of the cotton market in the late 1800s and again during the Great Depression. Each year, the Junior Auxiliary of Greenwood hosts the Cotton Ball fundraiser, which, according to the Greenwood Commonwealth, “features the crowning of the King and Queen of Cotton. . . . Guests enjoy a variety of activities, including live and silent auctions, a patron’s party, presentation of the Cotton Ball maids, and live entertainment.”
Almost seven hundred miles east of Greenwood, where the historic Cotton Belt melts into the Tobacco Belt, lies the coastal South Carolina town of Mullins, in the heart of what is known as the Pee Dee. The area is named for the Great Pee Dee River and its many tributaries, which flow south through the region from the North Carolina border before joining the Atlantic Ocean at Winyah Bay. Each fall, Mullins sponsors the Golden Leaf Festival and barbecue cook-off commemorating the strain of tobacco introduced to the Pee Dee in the late 1880s. This “bright leaf” strain smoked so smooth it popularized the cigarette. During the 1880s cotton market slump, Pee Dee planters jilted cotton for this new cash crop. With railroad tracks running right through town, the city was soon home to the largest tobacco market in the state, a position it maintained for much of the early twentieth century, until the industry was restructured during the New Deal and World War II. While Mullins has no “smoke-off” to parallel the spinach cook-off in Crystal City, several of the old tobacco warehouses still stand. Some tobacco conglomerates, including R. J. Reynolds, maintain a presence here. Today, the town’s historic train depot—once crucial for getting the crop to market—is home to the South Carolina Tobacco Museum.
In the bituminous coal fields of central Appalachia, there are two coal museums. The one in eastern Kentucky occupies the former company store of International Harvester’s “model” town of Benham, housed in an imposing four-floor structure complete with a 1950s-era soda fountain and reconstituted diner, though the building is now powered not by coal but by solar panels. A few miles down the road in the former model town of Lynch, built by US Steel, non-claustrophobic visitors can descend into the Portal 31 exhibition mine. The second museum, in Beckley, West Virginia, features a fully reconstructed coal camp. The website promises that “the Coal Company House, Superintendent’s Home, Pemberton Coal Camp Church, and the Helen Coal Camp School, give visitors a true representation of early 20th century coal camp life.”
Across rural America, monuments, celebrations, and museums are markers of local pride. Indeed, Crystal City has vigorously defended its claim to the title “Spinach Capital of the World” against upstart Alma, Arkansas—also a former spinach mecca that has erected multiple statues of Popeye. Yet in South Texas, the vast Cotton Belt, central Appalachia, and the Pee Dee region of South Carolina, these symbols celebrate a past that is fraught, to say the least. They commemorate the very industries that, for a century or more, spelled misery and hardship for thousands, if not millions, while profiting only a few. They memorialize the intensive resource extraction and resulting human exploitation that made these places America’s internal colonies.
How did the identities of these communities become so bound to the economic legacies of the past? A superficial read of the evidence suggests that geology is destiny. The bituminous coal forming over centuries in the mountains of central Appalachia; the alluvial soil of the Yazoo-Mississippi Delta building through thousands of years of annual flooding; a belt of fertile black earth stretching across Alabama and Georgia and into South Carolina left by the waters of an ancient lake; the artesian aquifers bottled up underground southwest of San Antonio: each feature of geology was like a character in a play, waiting in the wings—sometimes for eons—for its moment on history’s stage. But geology is not destiny. Though geological features were certainly necessary, they were not sufficient to produce the vast industries that were built upon them. That process required the acquisition of land, capital, technology, and access to national and global markets. Most important, however, was an ample supply of exploitable people to provide cheap labor.
In each case, there was a moment when it all began, a sudden break between past and future—even if those involved didn’t fully realize it at the time. Someone with the right skills, resources, and connections happened to be at the right place at the right time to help solve a perplexing problem. A chance sighting of an inky rock outcropping in the American wilderness remained lodged in the memory of a determined entrepreneur. An experiment born of desperation catalyzed the resurrection of an old industry, but in a new locale. A rancher late in midlife threw caution to the wind to try his luck with onion seeds imported from Bermuda, with immediate and spectacular success. In each instance, the moment was followed by a boom that transformed an entire region’s economic and social life within a few short years.
THE MOMENT COMES TO THE COTTON BELT OF THE DEEP SOUTH
As motorists travel down I-95, crossing through the Carolinas and Georgia en route to Walt Disney World and other Florida attractions, most have no idea, as they whiz by Savannah, that they are traveling within a mile or so of the spot where one of the most consequential events in American history took place. A neglected historic marker just north of the city on Georgia State Route 21 commemorates the event, though the sign is dwarfed by an IKEA distribution warehouse and an office park. During the colonial era, this was Mulberry Grove, a silk and rice plantation on the Savannah River owned by a British loyalist who forfeited the property to the US government after the Revolutionary War. The new government gave the land to General Nathanael Greene as a reward for his heroic military service (deemed so notable that more than a dozen US counties are named for him).
After his death in 1786, Greene’s widow, Catherine, inherited the plantation and experimented with planting short-staple cotton.
In the spring of 1793, a recent Yale graduate with a gift for tinkering left New England for the South to assume a tutoring post. On the way, the young man, Eli Whitney, learned that the job would pay only half the salary he had anticipated. Fellow traveler Phineas Miller invited Whitney to come work at Mulberry Grove, which he managed for Catherine Greene. She asked Whitney if he could find a way to solve the aggravating problem of separating the cotton fibers from the seeds.
Whitney had never laid eyes on a cotton boll in his life. Yet in a short time, according to one version of the story, he had devised a simple machine featuring a rotating drum with wire teeth that pulled the fibers away from the seeds. According to the National Archives, however, others deserve at least as much credit for the invention, including Catherine Greene herself and several of the plantation’s enslaved people, whose names are not known. Realizing that the cotton “gin” (short for “engine”) could generate a fortune, Whitney and Miller raced to Philadelphia, then the nation’s capital, to patent the device with Secretary of State Thomas Jefferson. They never saw a profit; despite the patent, bootlegged models spread across the country like wildfire. Yet the gin changed the course of history by doubling the market value of the crop. The great cotton rush was on.
By 1800, the United States was producing 36.5 million pounds of cotton—two-thirds of it grown in South Carolina and Georgia. Over the next two decades, the output grew more than 450 percent. By that time, cotton cultivation had moved westward, as the greedy crop depleted centuries’ worth of nutrients from the soil. For planters thirsty for fresh acreage, the federal government proved to be an indispensable ally, acquiring vast tracts from foreign powers via the Louisiana Purchase and through forced concessions from Native American tribes, and in turn opening millions of acres across the Deep South and the Delta to cotton cultivation. As cotton plantations spread like a rash across the South, so did slavery. According to historian Sven Beckert, “All the way to the Civil War, cotton and slavery would expand in lockstep, as Great Britain and the United States had become the twin hubs of the emerging empire of cotton.”
Mississippi became the nation’s twentieth state in 1817. At that time, much of the land was still in the possession of the Choctaw and Chickasaw Nations, and only became US territory after the 1830 Treaty of Dancing Rabbit Creek. The leader of the Choctaw who signed the treaty was a young man of mixed French Canadian and Native heritage named Greenwood LeFlore. LeFlore stayed on after the Choctaw were forcibly removed to Oklahoma and became one of the largest landowners in the state. Just east of where Greenwood, Mississippi, now stands, LeFlore built the palatial mansion he christened “Malmaison” in homage to his hero, Napoléon Bonaparte, where he lived in splendor in the midst of imported French furnishings while four hundred enslaved people toiled on his 15,000 acres planted in cotton.
Much of the land to the west of his plantation, referred to simply as “the Wilderness,” was empty of people, “a wild territory where panthers, bears, and snakes lived in the dense cane jungle.” This land—soon to be known as the Delta—“was destined from the beginning to be the domain of substantial planters . . . who possessed both the financial resources and the slaves required to clear and drain the land and take full advantage of its exceptional fertility,” writes historian James C. Cobb. Yet it wasn’t until after the Civil War that the alluvial soil would be fully exploited.
If it could be said that there is a place where human destiny unfolded from the very earth, that place would be the Mississippi Delta. Within its bounds lie almost 7,000 square miles of the richest and deepest topsoil ever cultivated. According to historian John Willis, “With each spring thaw for thousands of years, the Mississippi River carried off the rich topsoil of the Midwest. Then, just south of Memphis, the river predictably bulged over its banks . . . deposit[ing] a thick, rock-free and fecund soil upon the Yazoo-Mississippi Delta—conditions fit for a king. King Cotton that is.” On the eve of the Civil War, “as many as sixty thousand Delta slaves produced a staggering 66 million pounds of cotton” per year, transforming the Delta into the most important producer of the world’s most important commodity. In this region, cotton absorbed everything—the land, the capital, the labor. To secure property, clear it for planting, acquire seed and fertilizer, and—in the antebellum period—ensure a sufficient supply of enslaved people all required substantial capital, which planters sought from the Northeast and abroad. After the Civil War, the Cotton Belt’s dependence on outside capital only increased.
As the industrial revolution dawned, cotton fever was so pervasive in the South that the massive industrialization gripping the rest of the nation barely registered there. Economist Jay Mandle notes that of the 5.6 million manufacturing jobs created between 1890 and 1910 across the United States, fewer than 400,000 of them were in the six states where cotton was most dominant. In 1910, Ulrich Phillips, a celebrated southern historian of the early twentieth century, condemned the Cotton Belt as “decadent,” responsible for stymieing all diversification by “[keeping] the whole community in a state of commercial dependence upon the North and Europe.”
The rapid expansion of cotton wealth across the South is exemplified by the story of Greenwood, Mississippi. By the early twentieth century, Greenwood was linked to the transcontinental railroad system, which could transport its only commodity to lucrative markets unreachable before. The sudden rise in the price of cotton at that time prompted planters to clear more of the Delta backcountry to cultivate the “white gold.” In the blink of an eye, the rise birthed a regional elite that flourished, in stunning contrast to the legions of impoverished laborers in its midst. Writing in 1935, Delta native David Cohn hyperbolized, “Cotton is more than a crop in the Delta. It is a form of mysticism. It is a religion and a way of life.” Nowhere were the benefits of this boom reaped more fully than in Greenwood.
In 1892, the Sanborn Fire Insurance Company’s map of the city shows a small settlement with two sawmills, a cotton warehouse, and a few square blocks of modest homes. Its 1905 successor identifies a dozen or so churches, two schools (one white and one “colored”), the Delta Hotel, the Elks Lodge, and even an opera house. In 1906, a handsome new courthouse was built, one of the largest in the state. The following year, the city erected the castle-like Jefferson Davis School, serving white children in grades one through twelve. Large Queen Anne and Classical Revival mansions sprang up on West Washington Avenue during this time. Soon after the turn of the century, Greenwood became the symbolic center of the entire Delta region.
The Works Progress Administration’s 1938 Travel Guide to the Magnolia State describes (with a whiff of condescension) how thoroughly cotton and its attendant wealth permeated the life of the city, which was “completely surrounded by cotton fields and centered about its gins, compresses and warehouses. . . . Cotton built the gins and compresses and the pretentious mansions on the Boulevard.” But while some planters acquired fortunes, the majority labored under crushing debt. In 1936, Thomas J. Woofter published the results of a study of 646 cotton plantations across the cotton-growing states, showing that landowners’ debts had almost quadrupled between 1910 and 1928. At the end of the period, more than half of these landowners’ debts totaled more than 40 percent of the appraised value of their land and its attendant buildings, animals, and machinery. The author offered this diagnosis: “These areas are utterly subject to King Cotton, booming when the King is prosperous and slumping when the King is sick.”
It is hard for the modern reader to grasp what conditions in this region were like fully seventy years after emancipation. As psychologist John Dollard, studying one Delta county around that time, observes, “One sees flat cotton fields, an occasional puff of woodland against the horizon, rain-blackened Negro cabins in great numbers along the road and, in the fields, the cotton crop in some state of its growth or decay . . . and here and there, but less frequently than the sentimental northerner would imagine, a plantation mansion.” The prominent Fisk University sociologist Charles S. Johnson, who conducted hundreds of interviews with tenant farmers in the Cotton Belt in the late 1920s and early 1930s, describes how the “hard white highways of Alabama [have] drawn a ring as distinct as the color line around . . . decaying plantations—each with its little settlement of black peasantry.” If plantation life among Black people before the Civil War can rightly be described as an unending forced labor camp, tenancy after freedom was a perpetual, inescapable form of indentured servitude. In 1930, roughly six in ten Black Americans were tenant farmers, and another three in ten were farm laborers. Landlords often shortchanged their tenants when it came to “settling up” time. They also did everything in their power to block tenants’ access to government relief, no matter how lean the year, on the assumption that it might discourage these laborers from toiling for survival in their fields. These laborers included women and even very young children.
Meanwhile, tenancy ensured material deprivation on an almost unimaginable scale. Any profits doled out at financial settlement time, generally in early December, when crops were sold and debts subtracted from the proceeds, rarely lasted much past Christmas. Yet the “furnish system,” where landlords furnished seed, fertilizer, implements, and other basic provisions on credit, did not operate during the months between settlement and the beginning of planting season, typically in late March or early April. This guaranteed virtual starvation during a significant portion of the year unless seasonal employment could be secured off the farm. Not only was there insufficient food, but diets were deficient in many vitamins and nutrients. Based on his research with tenant farmers, Johnson reported, “The consequences of this were severe, the most direct and spectacular being the wide prevalence of pellagra,” an affliction known to cause mental confusion, diarrhea, abdominal pain, and scaly skin sores. To examine causes of death among Blacks, Johnson obtained data from the health department of Macon County, in the heart of the Alabama Cotton Belt, and found that “heart disease, stillbirths, tuberculosis, influenza, nephritis, cancer, pellagra, and malaria” were among the most common killers, though he cautions that, due to a lack of access to doctors, many of these records relied on folk diagnoses. Nonetheless, he concludes that “the figures for mortality, morbidity . . . , poverty, insufficiency of food and clothing, are barren records for an understanding of the human struggle behind them.”
BOOM IN THE PEE DEE
On a cotton farm near the Atlantic coast, Frank Mandeville Rogers was experimenting with something new. In the late 1870s, Rogers had purchased about fifteen hundred acres of prime farmland near the town of Florence, South Carolina. By the mid-1880s, cotton prices had collapsed, often yielding so little that production costs nearly canceled profits. By chance, Rogers encountered a visiting clergyman who suggested growing bright leaf tobacco instead. Bright leaf, developed in the “Old Belt” tobacco-growing region of North Carolina and Virginia in 1839, was the first American strain smooth enough to allow cigarette smokers to inhale and enjoy the experience—and the nicotine rush that came with it.
Intrigued by the clergyman’s suggestion, Rogers sent inquiries to growers in that region, at first meeting with sharp skepticism that the crop could be grown that far south. Undeterred, he planted the first few acres of bright leaf in South Carolina in 1885. Cigarettes had recently become a craze thanks not only to the smoothness of the bright leaf strain but also to the invention of the cigarette-rolling machine, favorable tax policies, and an economic upturn that put a 5-cent pack of cigarettes within the reach of the masses. Rogers built a curing barn, while gathering advice from experienced tobacco hands recruited from the Old Belt. In 1887, he and a business partner sold a twenty-acre crop for $4,611, earning a net profit of $2,930—a phenomenal $146 per tilled acre, about ten times the yield per acre of cotton in those years.
Rogers became an evangelist for bright leaf. A master proselytizer, he earned the backing of the publisher of Charleston’s leading newspaper, the News and Courier, and the state agricultural commission. One scholar writes that, by the first decade of the twentieth century, “the Pee Dee landscape was literally transformed. Thousands of curing barns, their furnaces winking in the black August night, bore witness to the change, as did scores of warehouses, a dozen new banks, hundreds of new stores and homes, and thousands of new jobs in the fields and towns of the region.” While the population of South Carolina grew by 13 percent between 1900 and 1910, several tobacco towns in the Pee Dee region, including Mullins and nearby Marion, swelled by more than 200 percent. In the Pee Dee, King Cotton had been dethroned.
Much less has been written about the tobacco economy of the Pee Dee than about that of the cotton-rich Delta, but what evidence we have suggests that the patterns of exploding growth and near-complete market domination seen there were repeated in the Pee Dee. During the 1890s alone, the state’s tobacco production increased one hundredfold. The expansion of bright leaf in the vicinity of Marion County was nothing short of phenomenal, a 688-fold increase between 1890 and 1899. Though Pee Dee planters hailed bright leaf tobacco as their savior from the market swings of King Cotton, they soon found themselves subject to a new tyrant: Big Tobacco. As it turned out, the tobacco companies that controlled the industry were no less rapacious than their cotton counterparts, and the crop required close attention throughout the year, with virtually no lay-by period, deepening planters’ dependence on tenant labor.
THE WINTER GARDEN OF SOUTH TEXAS
The moment came to South Texas under the aegis of Thomas Carter (T. C.) Nye, who would become known as “the father of Texas onions.” Nye had had a hard-knock start in life. When he was orphaned as an infant, the county tried to sell him at auction, but there were no takers. He was placed with an elderly spinster who raised him at the county’s expense. Just sixteen when the Civil War broke out, Nye joined the Confederate army and was captured twice. After the war, he spent three decades on a cattle ranch near Cotulla, in South Texas. Then, in the late 1890s, at the ripe old age of sixty-four, he decided to try something different. With what could only be described as wild success, Nye produced a bumper crop of onions on the US-Mexico border near Laredo. Grown from seed imported from Bermuda, they netted a profit of $200 an acre.
Publicity soon followed, and the notion that South Texas was ideal for irrigated farming began to take hold. By the early 1900s, the Winter Garden, as the area north of Laredo would later be called, became known for its rich soil, mild temperatures (by Texas standards), and copious water supply, plus a proven crop with potentially enormous profits. Thus, “the stage was set for the beginning of a series of some of the greatest land-colonization schemes in South Texas history,” according to historian James Tiller. Inspired by Nye’s success, cattle ranchers with massive landholdings—outraged by the cartel-like practices of the meatpacking industry—began feverishly subdividing ranchland into farm plots. They formed land agencies that advertised in newspapers up north during the winter, luring chilly Midwesterners with the promise of abundant land and sunshine, and offering would-be home seekers (or “home suckers,” as the Texans sometimes derided them) special rail excursion fares to inspect the plots.
Migrants also came from Virginia, the Carolinas, and Georgia, fleeing spent plantation land to the east.
In Zavala County, home to Crystal City, a legendary “cattle raiser” from the days of the open range, Colonel Ike T. Pryor, subdivided his 100,000-acre 77 Ranch in the early 1900s. The owners of the Cross-S Ranch, also in Zavala County and one of the largest in the United States, did likewise. In the decades between 1900 and 1930, new towns boomed in quick succession alongside the plots, not only in the Winter Garden but across South Texas.
Anglo ownership of the land had been impossible before the United States negotiated the Treaty of Guadalupe Hidalgo in 1848. That year, the area north of the Rio Grande came under American rule, allowing vast tracts that had been granted by the Spanish and Mexican governments to Mexican families who had lived on the land for generations to be subdivided and sold. Anglo ranchers quickly began amassing large parcels, including the 77 and Cross-S Ranches in the Winter Garden and the mighty King Ranch in the Trans-Nueces region, the largest in the country, spanning 1.25 million acres. By one estimate, more than 80 percent of all the land in South Texas changed hands in the decades after 1848, with some acquiring their property through deception or violence.
This remote landscape had begun to connect to global markets after the Civil War, when cattle ranching became big business. Capital from British and eastern US sources funded much of the development at the time. In the Trans-Nueces region, beginning roughly sixty miles north of McAllen, cattleman Ed Lasater was bankrolled by English and Scottish financiers.
This relationship “typified the manner in which large quantities of foreign money were channeled into Texas between 1880 and 1920 to finance ambitious undertakings,” notes Dale Lasater, the speculator’s grandson. It was a harbinger of things to come.
With a foreign taste for investment already in place, T. C. Nye’s bumper crops spurred a rush to convert ranchland to irrigated vegetable farms. The speed of the transformation was breathtaking. In the 1920s alone, the number of acres farmed in Zavala County grew twentyfold. The population increased by more than 300 percent as laborers from Mexico arrived in large numbers to work the ever-expanding harvests of spinach, onions, and other kitchen staples. Agricultural economist Paul S. Taylor toured the region at that time and observed that the move from ranching to farming in neighboring Dimmit County was similarly dramatic: “From a sparsely settled cattle and sheep range of the southwestern frontier, it has been passing to an irrigated district which watches intently the daily fluctuations in the market price of Bermuda onions in New York.”
Beyond the railroads and the region’s natural artesian wells, the single most important resource South Texas offered was a nearly inexhaustible labor supply just across the border, with workers who could be lured by very meager wages. Right at the time labor demand boomed, the Mexican Revolution—an extended period of regional conflicts between 1910 and 1920 that has been called the “defining event of modern Mexican history”—displaced thousands. This confluence of events fundamentally changed the social order of the region. By 1930, Zavala County was home to 7,660 Mexican-origin people, nearly three-quarters of the population, up from just 239 in 1910. Of South Texas’s fourteen counties, only four had a majority-white population by 1930. While relations between owners and ranch hands had been long-standing and governed by paternalism, “the modern society was characterized by wage laborers, impersonal contracts, and a rational market orientation,” according to sociologist David Montejano.
Laborers were typically paid $1 a day, a wage one Anglo farmer defended this way: “What a Mexican should be paid is just enough to live on, with maybe a dollar or two to spend. That’s all he deserves. If he is paid any more he won’t work so much or when we need him; he’s able to wait around until we have to raise the [pay].” With the arrival of laborers in such large numbers, white landowners began instituting laws and adopting practices that mirrored those used to control Black labor in the Cotton Belt. To ensure that workers would be available for planting and harvest, Anglos relied not only on minimal wages but also, as Montejano notes, on a “web of labor controls,” including “horsewhipping, chains, armed guards, near-starvation diets . . . , vagrancy laws, local pass systems and labor taxes.” Meanwhile, Anglos propagated bigoted beliefs about “dirty” Mexicans’ inherent inferiority. As a numerical minority, Anglos were especially fearful of labor revolts, a paranoia fueled by lurid stories, such as that of the fate of Texas rebels at the hands of the Mexican army at the Alamo, as taught in schools.
The physical appearance of the neighborhoods in towns reflected the social hierarchy: sturdy wood-frame houses, paved streets, and enclosed sewers in the white neighborhoods; shacks, dirt roads, and privies in the Mexican part of town. These shacks were not meant to be year-round dwellings. White landowners increasingly eschewed sharecropping for wage labor as the region developed, employing workers only during peak harvest and planting time—a few weeks or months out of the year. Thus, Mexican-origin families were forced to migrate to find year-round employment, from the cotton fields of Texas’s Coastal Bend to the sugar beet fields of Minnesota and the Great Plains, and on to the cucumber fields of Wisconsin and the fruit orchards of Michigan. By 1941, fully 96 percent of the 5,500 Mexican Americans in Crystal City were migrant workers, a somewhat higher percentage than in neighboring counties.
Meanwhile, Anglos sought to dominate the landless laboring majority even further, including denying them the vote through poll taxes, the white primary (in which only Anglos were allowed to vote), and other nefarious means. Due to the connivance of local capitalists and complicit politicians, the labor laws that governed the rest of the nation, such as the minimum wage, simply didn’t apply. Unemployment insurance or welfare? These laborers were deemed ineligible. In sum, the laboring class in South Texas was exploited and subjugated to an extreme degree.
BIG COAL COMES TO CENTRAL APPALACHIA
Fifty-four years after Eli Whitney first ventured south to take up teaching, another Yankee, nineteen-year-old Jedediah Hotchkiss, left his small village near the foothills of the Catskill Mountains to embark on a walking tour through Appalachia. Ambling into the Shenandoah Valley, Hotchkiss ended up in Staunton, Virginia, nestled between the Blue Ridge and Allegheny Mountains. He became one of the area’s leading educators and founded several schools. Hotchkiss also earned such a reputation as a skilled mapmaker and expert on the landscape of western Virginia that during the Civil War, General Robert E. Lee chose him as his topographical engineer. While on assignment one day during the war, Hotchkiss spied several large coal outcroppings along the eastern base of Virginia’s Flat Top Mountain, far from the established anthracite coal fields of Pennsylvania, Ohio, and Indiana. Though fully occupied with the war, he made note of the sighting.
After the war, Hotchkiss took a teaching post at Washington College (later known as Washington and Lee) in Lexington, Virginia, under the presidency of Lee. Upon Lee’s death in 1870, Hotchkiss returned to Staunton, where he mounted a campaign to convince financiers that the coal outcroppings he had seen could be the foundation of an industry to replace the slave-based agricultural economy destroyed by the war. Hotchkiss hired prospector Isaiah Welch to investigate the claims of a Tazewell County blacksmith that he was mining all the coal for his shop from his own property. What Welch discovered on the blacksmith’s land was a coal seam thirteen feet high—twice that of any then known in the United States.
Though it would take another decade—after a depression and the construction of the Norfolk and Western Railway—the Flat Top–Pocahontas Coalfield was finally developed, turning the little town of Big Lick, Virginia, into the booming city of Roanoke almost overnight. Almost twenty-five years after first noting those outcroppings, Hotchkiss helped usher in the industrial age in western Virginia. It was only a matter of time before coal mining would spread across central Appalachia.
First, however, control over the land had to be wrested from the Native people living there. Daniel Boone had led the first white families through the Cumberland Gap and into Kentucky in 1773, a trek that violated King George III’s promise to Indigenous peoples that Europeans wouldn’t settle beyond the mountains. Thus provoked, the Cherokee attacked, and six members of the band—including Boone’s eldest son—were killed. Within two years, Boone was back in Kentucky working for the Transylvania Company, a private firm formed for the purpose of securing Cherokee land for white settlement—an action promptly condemned by the governors of both Virginia and North Carolina for its illegality. Undeterred, the company met with Cherokee leaders in 1775. The firm walked away from the meeting with roughly half of present-day Kentucky.
Settlement of the mountains was slow, and most of the population growth over the early years was from natural increase, with subsistence farm families moving farther up the creeks and valleys as they sought more arable land. The discovery of salt springs in 1800 in Clay County drew the attention of several prominent Virginia businessmen. The resulting salt mines became a crucial Kentucky industry linking the region to the national economy, a story we tell in more detail in chapter 5.
From the early 1800s onward, Clay County exhibited extreme economic inequality—a characteristic of all our internal colonies. By 1816, just thirty-two residents owned more than three-quarters of the land. In his study of the Beech Creek farming settlement in Clay County, which lies a few miles from the present-day town of Oneida, ethnographer James S. Brown noted that in the 1860s, the territory was “largely owned by a few men.” This situation of the landed few and the landless many would have important repercussions as the region began to industrialize at century’s end.
It was then that agents of big corporations descended on rural settlements throughout Appalachia, scouting for timber and coal. After arriving at a farm on horseback, these “mineral men,” as they became known, would accept an invitation to the family table. Following the meal, the agent would “casually produce a bag of coins and offer to purchase a tract of ‘unused ridge land’ which he had noticed on his journeys or the mineral rights to the property.” It was hard for a cash-poor farmer to resist.
What these mountain farmers didn’t realize—until after the coal companies arrived, in some cases years later—was that they had signed notorious “broad form deeds,” which transferred to the coal companies not only the mineral wealth but also the right to remove it, “by whatever means” the companies deemed necessary. These means would come to include strip mining and even mountaintop removal. With thousands of strokes of individual pens, from western Virginia to West Virginia to Kentucky, the fate of Appalachia was sealed.
It wasn’t just mineral rights that were bought and sold. By 1892, out-of-state corporations had developed “a virtual land monopoly” in eastern Kentucky, holding deeds to more than 80 percent of assessed land in Bell County, which adjoins Clay County on its southern border, and fully 60 percent in other nearby counties. In the period between 1890 and 1930, central Appalachia transitioned from subsistence agriculture to extractive industry, as railroads penetrated ever farther into the region, spurring the clear-cutting of vast stands of timber and the extraction of large seams of coal. In both industries, the prior generation of Appalachian salt capitalists greased the skids for the new investors. It was during this period of industrialization that images of the mountain residents as “hillbillies” began to form in the popular imagination. From 1886, when an Appalachian travelogue was published in Harper’s Magazine, through the 1901 release of novelist John Fox Jr.’s Blue-Grass and Rhododendron, the idea that “a separate and inferior people had settled Appalachia,” throwbacks to more primitive and even “barbarian” humans, was entrenched. In an 1899 Atlantic Monthly article titled “Our Contemporary Ancestors in the Southern Mountains,” Berea College president William Goodell Frost declared, “It is a longer journey from northern Ohio to eastern Kentucky than from America to Europe; for one day’s ride brings us into the eighteenth century!” The idea that mountain whites were of substandard racial stock was particularly in vogue in the eugenics-infused social sciences of the 1920s and carried over into the “culture of poverty” analyses of Appalachia that followed. Indeed, the discounting of the laboring class as of substandard racial stock was part and parcel of—and a necessary precondition for—the pattern of extraction practiced in each of the internal colonies.
As central Appalachian coal mining took off, the companies’ need for labor expanded dramatically. Accordingly, within a year after major industrialization, a county’s population typically doubled. Yet the sharp rise in demand did not lead to an increase in wages, as simple economic theory would suggest, because operators could draw on immigrants from eastern Europe and discouraged sharecroppers from the Deep South, along with those most readily at hand: native Appalachian subsistence farmers.
When investigating the living conditions of miners and their families in the region in the mid-1920s, the US Coal Commission found that roughly 80 percent were living in company-controlled towns, a story we tell in chapter 3. Even more concerning to the commission was the lack of legal protection for those living in a company-owned house. The moment a miner lost his job—for any reason—the family could be evicted. “The knowledge that the two precious eggs—the job and the family shelter—are in one basket and that at any hour of the day the husband might come back with both broken is a constant and grim companion [which] the mine-worker’s wife is powerless to forestall,” the report read. Yet owners’ control of the people who labored in the mines didn’t stop there. One West Virginia attorney general reported that “to ‘maintain their feudal proprietorship’ the operators resorted freely to the use of armed mine guards, blacklists, and martial law, as well as their domination of county governments and courts, and an ‘elaborate espionage and spy system.’” Coal companies controlled not just their towns and the families who lived there, but politics as well: miners regularly received “silent instructions” about how to vote.
Meanwhile, the danger to miners was unvarying. “Falls of coal and rock from the roof, gas and dust explosions, unsafe haulage systems, electrical shock, and other job-related dangers daily exposed the mine worker to the risk of death or disability,” wrote one observer. Accordingly, between 1906 and 1935, nearly 50,000 mine workers lost their lives. One person who grew up in the Blue Diamond mining camp during the 1920s later recalled that “the worst thing about living in the mining camp, one was always in constant dread. . . . Many a time I’d go to bed and I’d lay and worry whether Dad would come back alive or not. . . . Sometimes great tragedies came upon whole families [where several family members were] caught in one section, killed.”
THE LIVING LEGACY OF AMERICA’S INTERNAL COLONIES
Read one way, the stories of America’s internal colonies are ones of American innovation, ingenuity, and entrepreneurship. Great wealth was extracted from these regions in the form of raw materials that fueled not only national but global markets. Yet from the start, these were also the places in the nation with the most inequality, severe poverty, ill health, and limited mobility. They remain so today.
Although each of these places has unique features and the level of exploitation varies, the parallels between America’s internal colonies are inescapable. Rather than trying to retain laborers through competitive wages, the capitalist class strove to close off any competition; consistent with elite interests, very little new industry, if any, came into these regions.
The dominance of labor-intensive industries with their rock-bottom pay schemes meant that these areas had employment structures akin to feudal systems: very few, and sometimes nonresident, owners who often relied on a small cadre of managers to oversee the impoverished many. Owners saw to it that taxes stayed low by denying the laboring class the franchise or, when that failed, by stripping them of suffrage via literacy tests, poll taxes, the white primary, and rampant vote buying, thereby undermining investment in schools and other civic infrastructure.
In the chapters that follow, we illuminate exactly how, in each region, these patterns have yielded a powerful living legacy today. We document the extraordinary manifestations of this legacy: separate and highly unequal schooling; crumbling social infrastructure; violence; generations of entrenched public corruption; systemic racism; elite backlash to civil rights; and the reproduction of highly exploitative economic and social relations into the present day. Each of these themes is richly illustrated in the portraits of the places we paint in the pages of this book.

2
Separate, Unequal
ON FEBRUARY 13, 1968, Senator Robert F. Kennedy and his automobile entourage pulled up outside a one-room schoolhouse in Barwick, Kentucky, a bleak coal camp nearly swallowed up by the surrounding hills.
As a legion of politicians, aides, and reporters piled out of the cars and crowded inside the school, the kids were scared—some too afraid even to look at the great man. Kennedy quickly read the room. Rather than delivering his prepared remarks, he moved quietly up and down the rows of desks arranged around a potbelly stove—solemnly shaking a hand, murmuring a reassurance. In one photo, he is smiling as he bends down to greet two little girls.
Kennedy had come to coal country to learn about poverty, but through images like these, the nation, too, would catch a glimpse of the deplorable condition of many of eastern Kentucky’s schools. Absentee timber and coal executives could safely ignore them. Their own children needn’t attend, and there were few middle-class constituents to please in towns like Barwick.
Many of eastern Kentucky’s schools were one-room affairs like the one Kennedy visited that day, serving students from kindergarten to third grade.
Dropout rates were appalling. Our analysis of census data from 1950 indicates that across the nation that year, roughly a third of adults had completed high school. The same was true of only about 14 percent in eastern Kentucky.
Rates of high school completion in the Cotton Belt regions of Alabama and Mississippi were a bit better, though still well below the national average. Student dropout tended to occur in much earlier grades in the Cotton Belt. In the Winter Garden, less than half of adults had completed middle school.
In eastern Kentucky as elsewhere in the internal colonies, the main problem was a failure to invest in the enterprise of education: coal demanded strong backs, not keen minds. While the state had long underinvested in its schools, devoting about a quarter of the national average to educational funding in the 1960s, spending varied dramatically from school to school. The Washington Post reported that in the late 1980s, while one school district in suburban Louisville spent $3,186 in local revenue per pupil, another, in eastern Kentucky, spent only $118. In 1989, the Kentucky Supreme Court deemed Kentucky’s school system unconstitutional.
Ordering the creation of an entirely new system, the court demanded that “the children of the poor and the children of the rich . . . must be given the same opportunity and access to an adequate education.” Yet still today, high school graduation and college completion rates in Appalachian Kentucky fall far behind comparable figures for the rest of the state.
Meanwhile, in the Cotton Belt and South Texas, school inequality has taken an even more pernicious form: a legacy of separate and highly unequal schools, first by law before the passage of Brown v. Board of Education (1954) and then by other machinations since. Between 1964 and 1975, many Cotton Belt whites effectively upended Brown by standing up all-white, private “segregation academies,” egged on and financially supported by Citizens’ Councils formed for the explicit purpose of maintaining segregated schools. Though many of these would eventually close, those that remained would subsequently provide their communities a vehicle to accomplish an almost complete resegregation of schools.
In South Texas, Anglo elites responded to Brown by pretending to comply. Whites retained control of the school board, students were mostly separated by ethnicity in the lower grades, and the practice of dividing students into “ability” groups—a process that never placed Hispanic children in the top tracks or white kids in the bottom tracks—and establishing other informal quotas ensured that Anglo students would continue to claim most of the perks, including the coveted spots on cheerleading squads.
To fully understand the power of this mechanism of suppression, it’s important to comprehend the system of education that prevailed in both the Cotton Belt and South Texas in the years before Brown: namely, separate and highly unequal schools, sanctioned by law in all fourteen southern states, including Kentucky and West Virginia. We chose as our starting point the 1930s, an era when these regions were fully constituted as internal colonies.
BEFORE BROWN IN THE COTTON BELT
The insidious scheme of separate and drastically unequal was deployed in an especially dramatic way in Cotton Belt counties where Black Americans made up large majorities: the funding earmarked for Black schools was diverted to white schools. This diversion was, according to sociologist Allison Davis and colleagues in the classic work Deep South, “highly profitable to the white schools. In ‘white’ counties, on the other hand, colored people constitute so small a part of the total population that the diversion of funds to white schools would be of little or no value to improving those schools.” Another key difference, one not captured in per-pupil spending, was that county governments paid for the construction and maintenance of white schools, while it was up to Black parents to provide spaces—often in churches—for Black students to learn. Stoves, blackboards, and teaching materials were standard issue in white but not Black schools. Black teachers’ salaries were a fraction of those earned by their white counterparts, as were the length and quality of their training. Finally, while buses carried white rural children to and from school, Black children had to walk.
Even so, according to anthropologist Hortense Powdermaker, the most consequential factor in lower academic achievement among Black students was the short length of the school year. School terms for Black children in the Cotton Belt lasted only a few months, depending on the need for cotton pickers and the whim of the white school boards, which decided whether to keep children in school in the face of that demand. In her 1939 book After Freedom, Powdermaker noted that although Black students were in school for approximately half as much time as white students, “both have the same grade system.” Thus, she wrote, Black children “must be rushed through their work at an excessive pace. It is not possible to give two years to each grade, both because these children least of all could afford so long a course, and because parents, teachers, and pupils feel that a promotion must occur at the end of each year, even though the school year is only four and a half months long. Consequently there are children in the fifth grade of country schools who cannot read.” White attitudes about the education of Black children were rooted in the belief that educated Black people “were less amenable to the caste sanctions, less deferential, submissive, and dependent, and therefore a danger to the efficient working of the caste system,” observed Allison Davis and colleagues. In Caste and Class in a Southern Town, published in 1937, John Dollard wrote that whites were opposed to educating Black people in other than vocational fields because “they tend to become better competitors with middle- and upper-class white people and this potentiality of competition sharpens caste antagonism.” In Sunflower County, Mississippi, for example, Dollard reported “there is a colored high school in the town, not at all a common thing, which offers a three, instead of a four, year course; this means that the graduates from it cannot go directly to college but must spend a pre-college year away from home. For the Negroes who are demonstrably least well-endowed economically, this is a heavy handicap to the educational and status advancement of their children.” Meanwhile, white high schools offered a four-year course, which was required for college admission.
The circumstances of Cotton Belt Black children struggling to learn in separate and highly unequal schools may be familiar to many readers. Perhaps less familiar are the conditions that prevailed in South Texas, which one Mexican American civic leader decried as “Jim Crow in a Sombrero Hat.” These conditions should come as no surprise: Some of the white ranchers and farmers who settled South Texas in the early twentieth century had left depleted Cotton Belt plantations. They applied lessons from their experiences exploiting and repressing Black people back east.
While we don’t have the same level of documentation on the educational experiences of students in South Texas as we do for the Cotton Belt, an extensive report published in 1929 by the Social Science Research Council used Zavala County’s neighbor Dimmit County as a case study for the Winter Garden region. The report found that “probably at no time during the school year” are more than 25 percent of the seven- to seventeen-year-olds in school. Weak state compulsory education laws and the fact that many Mexican Americans were exempt due to their rural locations meant that “we don’t enforce the attendance law,” one school authority told the report’s author, regional economist Paul S. Taylor. Mexican American children were in fact diligently included in the school census, although, as was true in the Cotton Belt, the per-pupil allocation of funding went disproportionately to the white schools.
Until the 1954–55 school year, some districts in Dimmit County had only one K–12 “American” school, which Mexican American children were not usually allowed to attend. Those Mexican American students who did advance through the segregated lower grades were almost never promoted to the English-speaking high schools. Officially, segregation in the lower grades was justified by language differences. The unofficial reason was no doubt more potent: segregation would prevent “race mixing.” According to one Anglo farmer, “A man would rather his daughter was dead than that she should marry a Mexican.” As in the Cotton Belt, segregation in schools was crucial to maintaining a strict division by ethnicity.
Meanwhile, the schools were utterly inadequate to the task: there were no maps, only old books left over from the white school, and no running water. As another Anglo farmer told Taylor, “Some of the people here say, ‘what do you want to educate an old pelado for? He will want 12 cents a row transplanting onions [instead of 10].’” Another offered a slightly more nuanced view: “They should be taught something, yes. But the more ignorant they are the better laborers they are. . . . If these [local Mexicans] get educated, we’ll have to get more from Mexico.” Most Hispanic families in South Texas were not tenant farmers, as in the Cotton Belt, but migrant workers who had to travel hundreds or even thousands of miles to find work in the late spring, summer, and fall before coming back to harvest in the Winter Garden. Almost all the Hispanic adults we interviewed in the South Texas counties of Brooks and Zavala recalled working in the fields and the journey to the upper Midwest for sugar beet season, then across to Michigan for cherries, a migratory cycle that persisted even into the early 2000s for some. Kids in these families were lucky if they got back to Crystal City in time to start school a half month late, only to leave again before the school year ended. The conditions of work varied in South Texas and the Cotton Belt, but the result was the same: the school year for these children was drastically shortened. As was common for schools across Dimmit County, Crystal City High, the so-called American school, graduated any number of white students each year, but precious few Mexican American kids completed even the elementary grades. In fact, in the early 1940s, one government study found that the average Hispanic eighteen-year-old in Crystal City had completed only 2.6 years of school. One in five had never finished first grade.
These conditions persisted for many years and profoundly shaped the educational trajectories of the grandparents (and some of the parents) of the current school-age population.
THE RISE OF SEGREGATION ACADEMIES
On our 2021 road trip, which began in the Pee Dee region of coastal South Carolina, we passed Pee Dee Academy, established in Marion County in 1965. In neighboring Williamsburg County, we encountered Williamsburg Academy, founded in 1969. To the west, in Barnwell County, we found Jefferson Davis Academy, established in 1965. In Lee County, Robert E. Lee Academy opened its doors the same year. Over in Hancock County, Georgia, we discovered John Hancock Academy, founded in 1966. As we drove west through Dougherty County, we noticed Deerfield-Windsor School, founded in 1964, and in Early County, we noted that Southwest Georgia Academy first welcomed students in 1970. Clearly, we were looking at a pattern.
In Alabama, there were many more instances of that pattern: John T. Morgan Academy, Lowndes Academy, Macon East Academy, and Wilcox Academy, to name just a few. All were founded between 1964 and 1975.
Entering Mississippi and traveling over to the Delta, we encountered dozens of private academies established in these same years. Sometimes, we learned, the student population today includes a Black student or two. But overwhelmingly, the faces are white, even in these Cotton Belt counties, where most people are Black.
Robert “Tut” Patterson was booster in chief of these schools. At age thirty-three, he founded the nation’s first Citizens’ Council in Indianola, Mississippi (in the county studied by Powdermaker and Dollard), just two months after what local whites called “Black Monday,” the day the Brown v. Board of Education decision was handed down. That year, 1954, the organization established its national headquarters in neighboring Greenwood, with supporting chapters sprouting up across the South as whites rallied to fight the federal mandate to integrate public schools, which some white locals referred to as the “second reconstruction.” In its stronghold of Mississippi, the Citizens’ Council was initially highly successful in its fight to keep the public schools segregated in the wake of Brown: not a single school desegregated in the decade that followed. As late as 1961, three states—Mississippi, Alabama, and South Carolina—had not one integrated classroom.
But in 1963, with the Civil Rights and Voting Rights Acts in the offing, white segregationists, led by local Citizens’ Councils, shifted their strategy away from resisting public school integration and set about creating an alternative: all-white private schools. In 1964, the Citizens’ Council ran a story in its monthly magazine, The Citizen, offering step-by-step instructions on how to start a “private school.” Tut Patterson then turned his attention to establishing a segregation school in Greenwood, founded in 1966 and named Pillow Academy, for the planting family that donated the land. Journalist Richard Rubin writes that “Patterson, and by extension Greenwood, came to symbolize the last best hope of segregation in Mississippi.” That same year, over in Phillips County, Arkansas, the newly founded Marvell Academy—the first of its kind in the state—would enroll its first seventy-three white students. Three years later, Time magazine reported that of the more than two hundred similar whites-only schools that had “blossomed” in the South “for the sole purpose of excluding Blacks,” few were as “openly redneck,” to use the author’s language, as Marvell Academy. Its founders—members of the local Citizens’ Council—proclaimed that “integration is the corruption of the true American heritage by concept and ideology.” Harold Corkran, who led one of the county’s public schools, Marvell High, told the Chicago Tribune that he had watched the number of white students in his school dwindle each year since desegregation, while enrollment at Marvell Academy, just across the cotton field, grew. Yet Corkran confidently told the Tribune that the academy would not survive, “because anything that’s based on hate can’t exist alone—especially when it’s preached day after day [in school].” Nonetheless, in 1970, the same year Corkran predicted their demise, the number of segregation schools throughout the South doubled.
That year, Phillips County would add a second segregation academy, DeSoto School. All three segregation academies—Pillow, Marvell, and DeSoto—along with dozens of others across the Cotton Belt South have persevered to this day, each still enrolling hundreds of students.
Historian Michael Fuquay writes that in some instances, “entire student bodies moved from formerly all-white public schools to new private schools,” built with public funds, legally and otherwise. “Private” was largely a “romantic subterfuge designed to evade the requirements of federal law without sacrificing the benefits of public support.” Until it was deemed unconstitutional, the State of Mississippi provided tuition vouchers and other resources to help finance the schools.
Books and other school materials were transferred to the new schools, along with desks, blackboards, and even buses, secured via hastily organized “purchases” for pennies on the dollar. In 1970, the New York Times reported that eight hundred textbooks had been transferred from a public school in Jackson, Mississippi, to Woodland Hills Academy, a new segregation academy opening just outside town. Often, these unlawful seizures were aided and abetted by white school officials. When confronted by federal authorities, R. B. Layton, the assistant superintendent of the Jackson public schools, claimed, “These are surplus books we don’t need because of our reduced enrollment. I don’t see what the fanfare is about.” Yet the private schools seized more than just books and furniture. They also tried to steal the identity and legacy of the local public schools. “They took along the trappings of the old school, its colors, its teams, mascots, symbols, its student newspaper, leaving behind the shell of the building,” write educational historians David Nevin and Robert Bills, who authored the authoritative text on the topic, The Schools That Fear Built. Some teachers followed their white students to the private schools as well. Debbie Hewitt Smith, who as a teen was part of the wave of whites fleeing the local public high school in Leland, Mississippi, for a segregation school, remembers these blatant attempts at appropriation: “Pep rallies were positively depressing compared to what they had been at Leland High. There was no pep band, no victory torch, no tradition. In fact, when it came to creating traditions at Leland Academy, we basically stole them from our old public school. For the new school mascot, the student body came up with the Bruins, only a baby step away from the public school’s Cubs. Our fight song was the same one used at Leland High, just substituting the old school colors of maroon and white with [the] new colors red and blue.”

Among the Cotton Belt counties, it was in the Delta, rich in income and property, where whites were best positioned to donate the land for, fund the construction of, and pay the tuition for the new schools. Mississippi whites in other areas of the state, however, soon followed suit as the United States Supreme Court’s 1969 decision in Alexander v. Holmes County Board of Education led to immediate integration in thirty school districts in Mississippi and the eventual integration of every public school in the state.
In less than a year, all but two of Mississippi’s school districts had been forced to adopt desegregation plans. A mass exodus of whites from public schools followed. In Mississippi alone, enrollment in private academies—standing at about 20,000 at the end of the 1960s—more than doubled over the next five years. And in the Arkansas Delta, where Marvell was one of only a handful of segregation academies established in the 1960s, that number would more than triple in the next half decade. Nevin and Bills estimated that by 1975, approximately 750,000 white students were attending segregation academies in the South.
Delta whites weren’t only the best resourced; they were also the most eager for the private schools. An obvious factor was the large Black majority in the region. Without segregation, not only would classrooms be predominantly Black, but properly educated Black people might regain the franchise by passing literacy tests. In 1971, Auburn University professors John Walder and Allen Cleveland reported that the “‘white flight’ [from public schools] continues to be especially strong in those areas where the school population is predominantly black. For example, in at least half a dozen [such] Alabama counties public school enrollment is now almost totally black. Only a comparative handful of white youngsters remain in the public schools. Resegregation is virtually complete in these districts.” Whites in Lowndes County, Alabama, where Blacks in the county seat of Hayneville outnumbered whites four to one, established Lowndes Academy, enrolling its first 265 students the same year Pillow and Marvell opened. While in many other Cotton Belt communities some level of integration was achieved, at least for a brief period, Hayneville School had not a single white student by 1970. Meanwhile, nearby Lowndes Academy enrolled 335 white students in twelve grades. As Charles S. Johnson and the other researchers studying the Cotton Belt in the 1930s had shown, Black-majority counties were characterized by a distinctive set of social relations: namely, the obsessive monitoring of the color line. While inequities were rife across the South, in the majority-Black Cotton Belt counties there was a huge chasm between the facilities, per-pupil funding, length of term, and teacher preparation for white schools and those for Black schools. In these places, the prospect of integrating the schools was especially terrifying for whites because the education of Black children had been so degraded.
Segregation academies were also ideal vehicles to ensure that white supremacist beliefs were passed on to the next generation. Central to the mission of these academies was to inculcate the mythic history of the white, Protestant South, the myth of the “Lost Cause,” and the supposed horrors of Reconstruction. This was especially necessary in the 1960s, as the civil rights movement succeeded in shifting the national political culture to one that “viewed white supremacy as evil and its defenders as un-American.” In this context, according to historian Michael Fuquay, “segregationist parents hoped to recreate the social, cultural, and ideological environment of their own upbringing and thus nurture in their children a set of beliefs then being rejected by the outside world.” Segregation academies were supported (and sometimes financed), as we’ve seen, by the Citizens’ Councils, once described as the “uptown Klan.” White churches as well, from a variety of denominations but especially the Southern Baptists, rushed to lend support. With academies springing up almost overnight in little towns, the Sunday school classroom was very nearly the only spot academy classes could meet. As one expert put it, segregation academies found not just space but legal legitimation, too, “under the umbrella of the church school movement.” The NAACP Legal Defense and Educational Fund reported that of the private schools and “education centers” that had been opened for white students fleeing the Memphis public schools, fully twenty-six of forty-three were sponsored by or housed in Southern Baptist churches. The same report noted that twenty Baptist churches operated segregation academies in South Carolina, that dozens of Louisiana’s segregation academies were Southern Baptist enterprises, and that Catholic, Methodist, Lutheran, Episcopal, and Seventh-day Adventist churches also operated segregated schools.
While segregation academies may have been created in response to desegregation mandates, their defenders only rarely made this explicit. It is important to note the coded language used then, as it still is today. A 1969 feature in Time magazine profiled Sandy Run Academy, a segregation school in Swansea, South Carolina, just outside the state capital of Columbia. Headmaster William Jackson, a retired public school teacher, insisted that he and his staff were motivated purely by concerns about quality. “We’re not concerned with integration, de-integration, or whatever,” he stated. “We’re concerned with quality education.” Appealing to “quality” was indeed the most common justification voiced for these schools. Despite such denials, Time reported that several segregation academies in the state honored their graduates with diplomas and pins that featured a Confederate flag with the word “survivor” engraved across it.
Many Citizens’ Council members were less subtle. In 1964, W. J.
Simmons, a prominent Citizens’ Council leader known as the brains behind the group, was featured in Esquire. Under the headline “The Segs,” the blurb reads: “Perez, Harris, Shelton, Maddox, Simmons, the five most influential men in the southern resistance, tell you exactly what they think.” In 1966, Simmons wrote in the Citizens’ Council publication The Citizen, “[Parents] want their children to be raised and educated free from the tensions of racial conflict in the classroom, free from the frustrating drag of mass mediocrity, and free from the blight of self-styled progressive educators whose avowed aim is to turn young Americans from the established inheritance of their fathers to alien theories of collectivism and anti-white racism.”

Not all segregation academies were created equal. In 1973, the Yale Law Journal identified three classes of segregation schools, roughly corresponding to the socioeconomic conditions of the white community: lower-class “rebel yell” academies; white community schools; and upper-class day schools. Rebel yell academies were organized by poorer white families and provided only a rudimentary education. Teachers did not have professional training. These schools were located in private homes, churches, or vacant commercial buildings. White community schools catered to middle-class families. Some may have eventually adopted an “open enrollment” policy, as required by the IRS to claim tax-exempt status, but their student population was almost entirely white. Tuition was occasionally waived for poor white families whose children would otherwise be forced to attend desegregated public schools. Finally, upper-class day schools offered complete academic programs. They creamed staff from public schools, sought accreditation by state and regional authorities, and built modern campuses with amenities such as athletic fields and science laboratories.
These schools offered guidance counseling and foreign language instruction alongside other academic courses and the ubiquitous Bible classes.
Despite these distinctions, many of the schools failed to provide a basic education. In 1970, the New York Times reported that at southern segregation schools, “curriculums are generally not on a level with public schools in the same area. Their teachers as a rule earn less and are therefore not usually as qualified as their public-school counterparts. And their facilities and equipment are seldom comparable to those available in the public system.” A 1976 study came to a similar conclusion. “Analysis conducted on the eleven schools indicates that the schools operate under severe handicaps,” the study noted. “They frequently have an insufficient pool of children from which to draw to assure adequate financial support, inadequately prepared teachers, weak headmasters who lack training and experience in administrative roles, and a restricted curriculum.” Yet the report also pointed out that parents had strong positive attitudes toward segregation schools. “The schools offer solutions to situations which the parents believe would be catastrophic for their children,” it concluded.
As part of an oral history project, former pupils at segregation academies reflected on their schools’ deficiencies. Renee McCraine Taylor, who attended a Citizens’ Council–funded school in Jackson, Mississippi, said of her experience, “I started fourth grade at a just-organized private school operating at Hillcrest Baptist Church. Later on, I moved to Citizens [sic] Council School Number 2, later known as McCluer Academy. . . . The overnight all-white schools had barebones curriculum and resources. . . .
There were no foreign language or art classes at McCluer. No tennis courts or volleyball teams either. There were very few advanced classes available to students.” Others noted darker aspects of these schools. In a follow-up to an online essay that went viral in 2019 titled “Are You a Seg Academy Alum, Too? Let’s Talk,” Pillow Academy graduate Ellen Ann Fentress wrote, “In my school in the ’70s, slurs and racist jokes were as common as acne and eight-track tapes. Yet running even deeper was a persistent unspoken pathology: a willed ignorance to the world beyond our chosen white one. I never knew any Black kids my age in my Mississippi hometown, which was split racially 50-50.” We could not ascertain the degree to which the Lost Cause and other elements of white supremacist ideology have been passed down to the current generation of segregation school students, whether explicitly or obliquely, although there is some evidence this is so. Fentress wrote of the Pillow Academy curriculum in the early 1970s: “What we learned in Pillow history class was distorted. . . . Enslaved people had enjoyed good treatment and Reconstruction—the brief years when black Mississippians held office and voted in substantial numbers—was an era of white suffering like the Civil War itself. None of us heard a word about the lynching of Emmett Till in our hometown’s backyard, although the visiting Chicago teen’s death had drawn international coverage in 1955 and launched the civil-rights movement. When I finally heard about the Till case—I was 25, living 260 miles away. . . . I recognized the last names of classmates I’d known whose parents and grandparents had been in law enforcement or led the winning defense of Till’s murderers. . . . 
Bryant’s Store, the site where Till allegedly flirted with the owner, was nine miles from my former public school.” We could not learn much, either, about the current quality of the education in these private schools, which are not required to administer standardized tests. While the website of Pillow Academy boasts nearly 100 percent college attendance, the website Niche, which rates thousands of private schools through reports from students and parents, suggests that only about half of Pillow’s graduates go on to a four-year school and few of those leave the state. Niche gives the academy a B grade. Indeed, the quality of the private academies has been questionable from the outset, according to Michael Fuquay. “Remarkably, although private school advocates emphasized that their primary interest was educational quality, no one expressed concerns about the quality of education that would be provided by an under-resourced, upstart school run by admitted novices.
Segregation was the first and last word in educational ‘quality.’” In fact, Fuquay found, the racism of white families meant that children ended up going to worse schools than they would have if their parents had supported integrated public schools.
No comprehensive list of the South’s segregation academies exists, but researcher Christine Jang-Trettien’s archival research conducted for this book shows that for the 140 most disadvantaged counties in the South as measured by the Index of Deep Disadvantage, nearly half (65) mounted a full-frontal attack against desegregation by forming at least one all-white private school. Jang-Trettien documented a total of 136 such schools in these counties. The reports of such low educational quality in the segregation academies themselves and the rapidly declining performance of the public schools they left behind suggest a question: Is one central mechanism linking places of deepest disadvantage to their pasts the degree to which school systems have remained segregated by ethnicity and race?
CHICANO RESISTANCE AND WHITE FLIGHT
In the Winter Garden region of South Texas, Rochelle Garza, now in her early fifties, told us that she and her nine siblings, like so many other South Texas children of migrant workers, missed weeks of school each fall, delayed from returning home by the northern harvest. Come spring, the kids missed more school, as soon as the field work began in the cotton-growing regions of Texas or even farther north. Of her five-year-old brother, Garza recalled, “They’d give him a little bucket and he knew what he had to do [to help with the harvest].” In communities across South Texas, this pattern repeated itself. “Home” was a place where a family stayed for only a season. Agricultural work was available if the parents were willing to traverse the country with their family in tow—often in a rickety bus or even the back of a pickup truck.

In the Winter Garden, the Anglo minority controlled everything in these migrants’ hometowns. Elections were held in the summer, when the large Mexican American majority was away. Across South Texas, public office and key community leadership roles such as school board and city council membership were held exclusively by Anglos. Many in the Mexican American community felt they had no choice but to go along with the whites in power, because the Anglos were the source of all the jobs. Over time, more Mexican American citizens remained in Crystal City year-round, some working at the giant Del Monte canning plant or for other employers around town.

Faced with Brown v. Board of Education, the Anglo minority had to figure out how to maintain power over the Mexican American majority in the key social institution of the schools. They were good at it. In the wake of Brown, both the Carrizo Springs school district in Dimmit County and the Crystal City schools just down the road chose to desegregate, but both maintained de facto segregation in the elementary schools (the only somewhat integrated elementary school was the white “American” school).
Several other South Texas school districts followed suit, including Brownsville and Laredo.
By the end of the 1960s, the vast majority of students at Crystal City High were of Mexican origin—roughly 85 percent. Yet a look at the yearbook from the time clearly indicates that most of the roles of distinction within the school—homecoming king and queen, for example—were reserved for Anglos. As the number of Mexican American students enrolled in high school rose, one Mexican American cheerleader was allowed on the squad, but the other three spots were reserved for Anglo girls. “It wasn’t a written rule, but everybody knew,” one student at the time, Diana Palacios, recalled when we spoke with her in 2021. Palacios had been a cheerleader in the segregated junior high school she attended. Once she enrolled in the “American” high school, though, she knew exactly what to expect: “The one Hispanic that got in would be [on the squad] until she graduated, and then someone else would come in.”

At about the same time, a new generation of Hispanic leaders began to gain influence in Crystal City. Adopting the identity of “Chicanos,” these activists were looking for ways to challenge the system. One, José Gutiérrez, had grown up in Crystal City before becoming the rare Mexican American kid from town to go to college and earn a graduate degree. Once back home, he started a newspaper for the Chicano community, which “became very, very successful because people could read about themselves for the first time ever. You could make the front page, [which they] never did in the other [papers], unless you killed somebody or you were charged with a murder,” Gutiérrez told us in 2021. He and other young leaders wanted to fight what they saw as an unjust system: unfair voter registration laws, unfair labor practices, systemic racism in all its forms.
But, as Gutiérrez explained, “people don’t understand that bullshit.” Instead, he said, “tell somebody, ‘They don’t want your daughter to be a cheerleader because she’s got brown legs, those are ugly, only white legs are pretty,’ [and they will understand that, because] no father has an ugly daughter.” In the spring of 1969, Diana Palacios wasn’t going to try out for the cheerleading squad because the Mexican American slot was already filled by her friend Diana Teres. But, she told us, another friend “kept after me, ‘You should try it out. You should try it out.’ I didn’t want to because I knew it was useless. Then she finally convinced me, [and] I thought . . .
‘What do I have to lose?’” Palacios’s father was a business owner, so his job wasn’t at risk, but the family knew there could be other consequences.
In the 1950s, when Palacios’s father had run for Zavala County sheriff, “he got a beating from the Texas Rangers. They wanted him to take his name off the ballot,” she told a newspaper reporter in 2001.
Even though she knew what she was up against, Palacios decided to try out. Once again, the teachers judging the girls filled three of the four slots with Anglos. “The students [were] upset, and they started talking about a walkout,” Palacios told us. The powers that be offered the students both a stick and a carrot. The stick: walk out and you lose all course credits for the term. The carrot: a larger team with three Anglos and three Mexican American girls. While they opted for the latter, Palacios said, “we weren’t satisfied because three and three is still not fair because we’re eighty-five percent of the student body!” Over the summer and into the fall of 1969, the students continued to meet. They decided not to back down. In fact, they upped the ante: not only should cheerleaders be selected on merit rather than the color of their skin, but the school also needed Hispanic teachers and counselors. “Most of the [Anglo] counselors, they would advise us to be a beautician . . . or be a maid in a hotel. Heaven forbid that you want to be a doctor, lawyer, or something!” Palacios recalled. “What happened with the cheerleaders opened our eyes to the actual discrimination, and how we were being shortchanged. . . . It got to the point that we’re like, ‘No, we’re not accepting that.’ And we’re also not accepting that we can’t speak Spanish in school. We [needed] books that gave us our history; any time [our textbooks] talked about a Mexican, he was having a siesta under a tree. It was always a lazy Mexican.” That fall, the students planned their offensive, partnering with Chicano movement leaders, including Gutiérrez. “We started going to school board meetings asking for things to change, but it all started escalating. When our parents went to a meeting and [the school board members] wouldn’t listen to them either, we decided to walk out,” Palacios recalled. 
Many of the student leaders’ parents—mostly independent business owners—were already known for their civil rights activism, a story we tell later in the book. The tension built to a crescendo on December 8, 1969: “Everybody started chanting: ‘Walkout! Walkout!’” After that day, Palacios said, “every school day started at the steps in front of the school. We said the Pledge of Allegiance and a prayer, and the rest of the day we walked around the school, [some of us holding posters reading], ‘Brown legs are beautiful, too, we want Chicana cheerleaders.’” Over time, more and more students and families joined their ranks.
Gutiérrez recalled that “by the end of the seventeenth day of the walkout, we had like seventeen hundred kids out, almost eighty percent of the school.” They carried signs that read, “We are not afraid to fight for our rights,” “Chicanos want to be heard,” and “Education and discrimination don’t mix.” As momentum built, those who had been hesitant to challenge the power structure lent their support. Parents “would bring us tacos and coffee and water. They’d be out there to watch [so] the police wouldn’t beat us,” Gutiérrez remembered.
Members of Congress brought the student protest leaders to Washington, DC, to discuss the matter. Diana Serna Aguilera, another cheerleader, told us she remembers asking for Senator Ted Kennedy’s autograph: “It meant a lot to me because he was a Kennedy. . . . Mexicans just love the Kennedys.” When the cheerleaders returned home, the local media were waiting. “We answered questions about what we did in Washington, what we accomplished. And that’s when we informed the community that . . . the federal government was going to come down to mediate because the school board had refused to meet with us, even though we were legally on the agenda. And it was our right to be heard and to petition the government.” Finally, on January 9, 1970, in the face of federal scrutiny, the board caved in to the students’ demands. The victory was far bigger than an extra spot on the high school cheerleading squad. This shift in the political winds was as strong as a twister. With one exception, before the 1971 elections only Anglos had served as city manager. But from then on, only Mexican Americans held that position. While the school superintendents had all been Anglo before 1971, they were all Mexican American thereafter. Yet Anglo backlash was as severe as it was sure. After the Anglos lost control of the schools, nearly all the Anglo students moved to nearby Carrizo Springs High School, which had a larger proportion of whites than Crystal City High, transferred to private schools, or left the area altogether. While the school had been roughly 85 percent Hispanic before these pivotal events occurred, the figure rose to 98 percent following the protests and white flight.
Nevertheless, what is now known locally as “the cheerleader revolt” or simply “the walkout” is still a point of pride in the town, and for good reason; it was a moment of great awakening for the girls and the Mexican American community at large. Soon after, Diana Serna Aguilera was part of the first all-Chicano cheerleading squad at Crystal City High. When that team went to the regional competition, she recalled, “we won superior ratings. . . . We brought home all these ribbons and all that [even though] we were competing with mostly Anglo girls. . . . It was so exciting and so affirming that ‘Hey, there was really no reason to keep us out of cheerleading.’ [We could] cheer as good or better than [the Anglo girls; the exclusion] was just because we were brown.”
QUALITY EDUCATION
These days at Marvell Academy in Phillips County, Arkansas, the homecoming court is all white. So are the football and basketball teams.
The cheerleading squad is all white. Yet the website reassures prospective students and parents that the academy was “founded on Christian principles” and that it strives to foster empathy, among a list of other virtues. At DeSoto School nearby, there are no students of color. Its website features an all-school photo with the youngest pupils in red, those a little older in blue, and the high schoolers in white, matching the flag in the foreground. The school’s purpose? A familiar refrain: “simply and directly stated as quality education.” The website tells us that DeSoto strives to give students “an understanding of our cultural heritage and respect for our country” along with “a strong moral character.”

Like many segregation schools in Mississippi and elsewhere, Greenwood’s Pillow Academy adopted a nondiscriminatory admissions policy in 1989 to gain tax-exempt status and to qualify for grants, but a decade later there had yet to be a single Black student, according to the New York Times Magazine. Today, the academy has more than seven hundred K–12 students, roughly 90 percent of whom are white and only 3 percent African American. Pillow’s website features an all-white administrative and teaching staff. The school’s diversity statement reads, in part, “Diversity is key to the mission of Pillow Academy. . . . Pillow Academy will continue its commitment to include diversity among the students and staff along with their perspectives into curriculum and activities.” To live this out, however, the school will have to address its long and shameful legacy.
Meanwhile, there are ample data on how the public schools in Leflore County are faring. In 2018–19, less than a quarter of the students in the school district tested proficient in math and less than a fifth in reading, compared to 47 and 42 percent, respectively, statewide. Leflore’s graduation rate lagged well behind the state’s average, and the proportion of students deemed college or career ready was less than a third of the state average.
ACT scores at Greenwood High averaged 15, considerably below the minimum (18) needed to attend one of the four leading state universities.
Mississippi schools in general perform below the national average, according to Quality Counts 2020, Education Week’s system for rating the nation’s schools, although Mississippi has made positive strides in recent years. Indeed, education officials have lauded the state’s improvement on the National Assessment of Educational Progress, which charts student achievement nationally in core instructional areas. In school districts across the Delta, however, low scores on state tests are still endemic. According to data from the Mississippi Department of Education, well over a dozen Delta school districts merited an F grade in the 2018–19 school year—the one just before the COVID-19 pandemic, after which education statistics became skewed or were not collected at all. Another dozen or so earned a D, and a few more rated a C. A smattering achieved higher grades, such as the Western Line School District in Washington County, which got a B. Schools in this district are far closer to evenly split between Black and white students. Outside the Delta, several Mississippi districts with a diverse student body earned an A grade, and there are numerous districts where Black students score at or above state averages in English and math.
The challenges faced by schools in Mississippi’s Cotton Belt are shared by other Cotton Belt schools. Of the Alabama public schools labeled failing in the 2018–19 school year, 43 percent were in Cotton Belt counties, a disproportionately large share. Sharp educational deficits are seen in the Georgia Cotton Belt as well.
Surprisingly, none of our low-income interviewees in Leflore County complained about the schools. We can’t say for sure, but perhaps this is because lower-income Black families have been exposed to segregated, low-performing schools for so long that they have become inured to the reality. However, Angela Curry, a Black community leader chairing the Greenwood-Leflore-Carroll Economic Development Foundation, sees a big problem: “Our public schools are probably ninety-eight percent African American. It’s not diverse at all. I see that being a problem within itself, because that’s just not the way the world is.” In the first decade of the 2000s, the city contemplated elements of the new Greenwood Comprehensive Plan and held a series of community forums. Attendees complained that the schools were “dated” or even “terrible.” Nonetheless, they noted that voters would not support a bond issue to renovate them. A report concluded that “public meeting and stakeholder input indicates that outmigration is due primarily to the perceived lack of quality of the public education system in Greenwood.” These stakeholders also noted that strong schools are key to recruiting new industry.
Teacher shortages, which are rampant in the Delta, are an important factor affecting the quality of education. One recent study found that the odds that a school district in the Mississippi Delta will have a teacher shortage are 115 times greater than in a non-Delta district in the state. Here again, in the challenge of teacher recruitment and retention, there is a direct link to the region’s past. As Michael Fuquay notes, “In communities where the academy became the de facto white public school, white elites, who had always resented the expense of public education, suddenly found it possible to dramatically slash property tax assessments.” The Hechinger Report, a publication that covers inequality in education, notes that, due to low stakes for whites, bond issues to build new high schools in Delta counties have failed to gain support. The publication points to Holmes County, adjacent to Leflore, where “educators and students . . . dream about what new school buildings, enough licensed teachers, new books, or even just a fresh coat of paint on peeling classroom walls would mean for them.”

In the modern structure that replaced the historic 1930s-era Crystal City High School (now a middle school) where the cheerleader revolt occurred, only a handful of the school’s students are white. This situation is not unique to Crystal City: across Texas, more than a million Black and Hispanic children attend schools with few or no white peers. News stories regularly feature the ongoing struggle for integration in Texas schools. Yet Crystal City High School, with its brand-new auditorium, is the “pride point of the town,” helping “students have more pride in their school, not having to come into a run-down building,” one school official told us.
“Our town is known for its high school band. More so than the athletics, . . .
[the] band has always been the gold star for our talent.” Before, the band had to perform in an “old, run-down auditorium” that eventually became a storage area. Now, the auditorium is worthy of showcasing the young talent in town.
In 2019, that venue hosted a celebration of the fiftieth anniversary of the cheerleader revolt. Many of the original protesters and “dignitaries from all over the place” were on hand, Dr. Maricela Guzman, a guidance counselor at the high school, told us. “Then, of course, our [whole] town participates, so that’s something really neat.” School pride is also the glue that holds the community together in Brooks County, in the Trans-Nueces region of South Texas, where Falfurrias is the county seat. A recent graduate of Falfurrias High told us that people in his community are always eager to “hype up [the school teams or the] academics. . . . If one of our teams makes it to [a] playoff, they’ll decorate the town all in green, the school’s color. It’s like, ‘Who would do that?’ In this town, we do that. . . . We’re just very supportive in this town.” In dozens of conversations with Falfurrias residents, we heard stories of school spirit again and again.
We gleaned no such narrative from those interviewed in Greenwood, Mississippi. But as Dr. Tamala Boyd Shaw, founder and head of school at Leflore Legacy Academy, the city’s new charter high school, explained, it is hard to build community pride when the schools are so divided by race.
There are five high schools serving the small community: two public, which are nearly all Black; one private, which is almost all white; one for students with disabilities (North New Summit School); and one for boys who have struggled in the public schools (the privately run Delta Streets Academy).
Further, there is little collaboration across the schools. During the COVID-19 pandemic, Boyd Shaw suggested to her peers that “we should be talking about ‘What is it that you’re doing to prevent the spread of Covid in your schools?’ ‘How are your virtual classes going?’ ‘What are the subjects that you offer at your high school?’ We should be asking Pillow, ‘How is it that all of your seniors are getting these scholarships . . . ?’ [But] there [is] no collaboration. Everybody’s cordial, but . . .”

Boyd Shaw, a Greenwood native and graduate of the city’s all-Black Amanda Elzy public high school, is an experienced administrator who returned to Greenwood in her early forties to “give back.” In college, despite earning one of the top GPAs at her high school, she learned that “my peers were leaps and bounds beyond my knowledge.” She said, “I had teachers who did the very best they could with what they could. But at Amanda Elzy High School . . . we had books that were . . . reject books, the old books that other kids had [had]. Like my name would be the last name [in a textbook with] ten spaces to put a name. The books were that old. . . .
But over at Greenwood High School, air [conditioning], beautiful buildings, tissue in the restroom . . . and majority-white at that time.” Boyd Shaw aims to provide all students at Leflore Legacy Academy— including white kids whose parents can’t afford Pillow’s tuition—with the resources and enrichment she was denied. “What we want to do at Legacy is, we want our scholars to go play soccer . . . to go get on the softball team.
We want them to go to [the] theater. Somebody can tell me, ‘So Dr. [Boyd] Shaw, you just basically want to turn this into a Black Pillow Academy?’ Call it what you may. We want to improve outcomes for children.” If her school does manage to attract a more diverse student body than others have, and if Pillow Academy welcomes more Black learners, even these small gains in integration should be counted as progress against the seemingly immutable patterns of the past that have mired the community in educational mediocrity, not to mention robbed the town of a source of pride.
Meanwhile, in South Texas, the good news is that the public schools— still nearly all Hispanic—boast high graduation rates, a sharp departure from the past. The bad news is the poor student performance on standardized tests. As a result of the low test scores in both the Falfurrias and Crystal City districts, some of the elementary schools earned an F rating from the State of Texas in 2018–19, though both high schools in these districts merited a B. The school administrators we talked to were hard-pressed to explain the contradiction between high graduation rates and low test scores. But one read is that the paradox is simply another example of the present being linked to the past, when children spent precious little time in school but were sent on to the next grade nonetheless.
Dr. Guzman, one of the guidance counselors at Crystal City High, believes that tests themselves are the problem: “Those tests are not developmentally appropriate,” she told us. “So at a very young age, our kids get discouraged, because teachers are under pressure, campuses are under pressure, principals are under pressure to just make the grade on accountability.” Due to the pressure, she said, the kids “get their own stigma, ‘Oh I’m always going to fail,’ and you do have kids that have failed every single time. . . . Our kids in poverty tend to be the ones who score the lowest, and so they get discouraged.” She complained that, due to government mandates, teachers must “teach for the test. We should be teaching more creatively, like we used to.” Her school district is not alone in struggling to recruit and retain “our stronger teachers, because again, finances. [Because of the low pay] we lose good teachers sometimes. And we don’t develop [our new teachers] as well as we should, . . . so [their students] don’t do well on these standardized assessments.” But the quality problem may go deeper than the perceived troubles with the tests themselves. In interviews with nearly three dozen Zavala County residents, we heard one refrain repeatedly: Graduates from Crystal City High can’t make it at a four-year college. To catch up, they have to start at a junior college. Those few who told us they had transferred out of the Crystal City schools to a district in another region were shocked at the differences in students’ educational progress, as were those who had transferred in. Graduates who had gone to college often completed their degrees only in their thirties, or even into their forties, after multiple tries.
The young women involved in the cheerleader revolt, each of whom received scholarships to a prestigious school, completed college only after stopping and starting multiple times, as they told us when we interviewed two of them.
But there is another potential explanation for the problem. Many of the grandparents and some of the parents of kids who are now in school completed so few grades that they may struggle to help their kids with homework. In South Texas, adult literacy rates are among the lowest in the nation, just as they were at the height of the internal colony that operated there. In Brooks County, fully 55 percent of adults are at or below level one literacy, roughly comparable to that of a first grader just learning to read, while the figure is 60 percent in Zavala County. Student test scores in reading are especially low, perhaps a reflection of this reality.
Undaunted, Dr. Maria Casas, superintendent of the Brooks County schools, has been busy putting together a college readiness curriculum. Early results, she told us, have been dramatic. “Last year, for the first time in the history [of the school district], we had ten students, which is ten percent of the [twelfth-grade] student population, graduating with an associate [of arts] degree. Ninety percent of our students graduated with college hours and/or certifications. . . . One hundred percent of the students graduated with a financial aid application and applied for college. . . . We did that with an investment of less than $10,000.” Brownsville, Texas, has had a similar program in place for years. Despite poverty rates that nearly rival those in Brooks County, test scores regularly exceed the state average (though that average is admittedly low). What may seem impossible, Dr. Casas insists, can be done.
Andi Guerrero, principal of Dr. Tomas Rivera Elementary School in Crystal City, described her school as a “campus of joy.” Even so, the school has faced severe challenges. “Texas will rate your campuses on an A to F scale, and my campus has been rated F . . . ever since I’ve been principal,” she said. The extraordinarily high rate of child poverty across much of South Texas poses challenges that are hard to overcome. Over the five-year period between 2016 and 2020, roughly a third of the population in Zavala County lived in poverty. For children, the rate was more than twice the national average.
Prior to the COVID-19 pandemic, Guerrero was optimistic that the necessary partnerships were forming; momentum was building for major change. While her school has a long history of students failing to meet state standards, Guerrero told us that the elementary and high schools are working “hand in hand” and seeing positive effects. When the pandemic hit, “everything kind of came to a standstill,” Guerrero said, but she hasn’t given up hope: “I think [the momentum is] on pause. I don’t think we’ve lost it.” One day, she hopes, there will be more to be proud of in Crystal City than the six-foot Popeye statue downtown. Meanwhile, the embedded history of separate and highly unequal education in South Texas is a key mechanism through which the deep disadvantages of the past are replicated in the present.
It’s worth remembering that these stories from the Delta and South Texas represent only a microcosm of the narrative of American education nearly seven decades after Brown. Today, America’s schools remain sharply segregated by ethnicity and especially by race. As economist Rucker Johnson powerfully illustrates in his book Children of the Dream, American schools have been rapidly resegregating while academic performance has been falling, which suggests how profoundly the two may be linked.
Johnson quotes a 1983 Reagan administration report, A Nation at Risk, which concluded, “If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war.” Johnson found that “the racial makeup and social conditions of our public school classrooms are nearly identical to those of the Jim Crow era.” What are the consequences of this lack of economically and racially integrated schools for Black, Hispanic, and white children? Johnson studied this question by following the story of the desegregation of schools after Brown. Bringing together data from surveys and from administrative records of each school district’s level of integration in a given year, he captured what is known as each individual’s “dose response”—in this case the number of years Black and white students were exposed to a school system under a desegregation order, ranging from zero for most people born before 1950 to up to twelve for those born later. He wiped out differences in family background or other variables by creating “virtual twins” who had experienced segregated and desegregated schools.
Johnson’s work sheds crucial light on a critical question: What if the places we have written about in these pages had made different choices about desegregation? What differences might it have made in the outcomes of their children today? He documented that following a desegregation order, segregation levels fell markedly, per-pupil spending increased dramatically, and there was a sharp reduction in class size.
Johnson then contrasted the experiences of Black students who had no exposure to court-ordered desegregation with those of students who experienced the full twelve-year desegregation dose. His statistical models found that twelve years of desegregated schooling for Black children was enough to eliminate differences in Black-white educational attainment.
Next he asked whether these benefits reverberated across a student’s lifetime. By comparing the average effects of five years of exposure to a desegregation order among Black students to no years of exposure, he found an increase in annual work hours and wages that combined to produce a 30 percent bump in annual earnings in adulthood, a significant decrease in poverty, large improvements in marital stability, and a sharp increase in the likelihood of being in good or excellent health. Perhaps more profoundly, given the structural cycle of violence described later in this book, he documented that for Black children, being exposed to a desegregation order beginning in the elementary grades yielded a 22 percent reduction in the probability of being incarcerated as an adult. These gains were driven by the increases in resources available in desegregated schools, the goal that animates the work of Dr. Boyd Shaw. Meanwhile, white children who experienced desegregation did as well on all these metrics as those who had not been exposed to a desegregation order. The conclusion: no one lost. Although Johnson’s analysis didn’t include the gains to Hispanic children from desegregation, there is no a priori reason the story should differ, as the patterns of segregation within their schools were so similar in the years before Brown.
The failure of separate schools in America is that they have never been even close to equal. Ongoing segregation and resegregation—both the resegregation we have seen in the Cotton Belt South, aided and abetted by segregation academies, and the resegregation in South Texas that has occurred due to Anglo flight—are the results of colossal human and policy failures. Segregation is a key mechanism whites have used to undercut the chance that Black and Hispanic children can do better than their parents.
The implications of Johnson’s results are that those who don’t have the benefit of attending equal schools will experience harms that extend beyond childhood into adulthood. School segregation casts a very long shadow— from before Brown to the present. Without decisive action, that shadow will persist.

3
Nothing to Do Here but Drugs
KEN BOLIN—A SOLIDLY BUILT sixty-something man with gray-blond hair and a scruffy beard, known locally as “Pastor Ken”—has led Manchester Baptist Church, in the county seat of Clay County, Kentucky, for many years. In 2019, he vividly recounted the events that finally spurred him to action in an interview with Bill Estep of the Lexington Herald-Leader. A twelve-year-old girl had started hanging around the church. It is hard to keep a secret in a small town, so Bolin knew her mom suffered from addiction and supported her habit through prostitution. He tried to find the woman help, but treatment beds were scarce. When he finally found a placement, she refused to go. Just days before Christmas 2002, she set out on foot to a drug dealer’s house to score, then collapsed on her way home. She died in the cold. Bolin officiated at her funeral on Christmas Eve. “It was devastating to me,” he told Estep. The death of that young mom instilled a deep conviction in Bolin that Clay County’s churches must step up to fight the scourge that, even in those early days of the opioid epidemic, had already begun to consume this mountain community.
Meanwhile, Doug Abner, pastor of one of the local Pentecostal churches, was learning of youths who were getting addicted, overdosing, and even dying. These kids weren’t only from tough backgrounds. Some were children of local business owners and other community leaders—even the daughter of the school superintendent had come close to death. For Abner and the whole community, what was happening to these young people was a real wake-up call: “When your kids start dying, you look at things different.” Thus began a regular Saturday prayer meeting where a small group of pastors gathered to pray for the community. More and more people—not just clergy but also parishioners, some with loved ones who had perished— joined the group. Quickly, they realized they had to take a public stand.
They decided to organize a march to confront the dealers who had taken over the town and the corrupt public officials who were complicit, willfully turning a blind eye.
On May 2, 2003, the day of the march, the weather turned rainy and cold. Bolin, Abner, and the other organizers feared turnout would be low.
That morning, Abner told the Lexington Herald-Leader, one local official— Abner wouldn’t say who—had called warning him that children shouldn’t attend. There was a rumor that a drug dealer planned to plow a truck through the crowd. “It was another one of those veiled threats,” Abner recalled. He hung up on the caller.
That afternoon, congregants from roughly sixty churches arrived at the parking lot of Eastern Kentucky University’s Manchester campus to walk to City Hall. As the Manchester Enterprise reported in a 2019 retrospective commemorating the event, “Nobody could have predicted what would happen that day. 3,500 people showed up for what many say was the largest single gathering in our county’s history. Nothing before had brought that many people together for one purpose. . . . It was evident to all, this group meant business. They weren’t intimidated by the political structure or power. This was an effort to save our county, our children and our families.”
America’s internal colonies are some of the sickest places in the nation, but the sickest region of all is central Appalachia. The most disproportionate cause of death here compared to the nation as a whole is the sharp rise in mortality due to drug overdose over the past thirty years. On November 17, 2021, local listeners tuned in to regional newscaster Carrie Hodousek’s reporting that following the onset of the COVID-19 pandemic and the rise in substance use that accompanied it, “yearly overdose deaths have topped 100,000 for the first time . . . according to new federal data published Wednesday.” While overdose deaths increased nearly 30 percent between April 2020 and April 2021 nationwide, “Kentucky ranked third with a 55 percent spike.” Hodousek concluded, “Nationwide drug overdoses now surpass deaths from car crashes, guns, and even flu and pneumonia. The total is close to that for diabetes, the nation’s No. 7 cause of death.” From the earliest days of the opioid epidemic, Clay and its neighboring counties have seen some of the highest opioid prescribing rates in the nation. When we began our fieldwork there in 2019, thirteen pharmacies were operating in the tiny hamlet of Manchester, a city of only eighteen hundred residents, all but two opening since Kentucky expanded Medicaid in 2014. Apparently, there has been plenty of business to go around. Even in 2019, when the dangers of opioids were well-known, 1.3 prescriptions for these powerful and addictive painkillers were filled annually for every person—man, woman, and child—in the county. Residents we spoke with reported high rates of concurrent substance use (e.g., methamphetamines and prescription opioids) and believed that methadone and buprenorphine (commonly known by the brand name Suboxone) were increasingly used recreationally. By the early 2020s, black tar heroin had arrived in town, a cheap but potentially lethal substitute for prescription opioids. 
Research has shown that most people who start using heroin were previously using prescription opioids.
The drug epidemic in central Appalachia is the fallout from a modern-day extractive industry—led by Big Pharma—that in many ways mimics industries like Big Coal and Big Timber that have come before. We tell the story of these other industries later in this chapter, but here we focus on the modern-day version of capitalist extraction, one in which capitalists depend on the local family practitioner—and the pharmacist willing to turn a blind eye—to do their bidding. Witting or not, the resource these accomplices extract is the health and well-being—really the bodies—of their own community’s most vulnerable residents. The wreckage of the timber and coal industries pockmarks the topography of the region, with its hillsides washed clean of topsoil, its muddy streams, flat-topped mountains, piles of slag, rusted tipples, and ruins of former company stores.
Everywhere, too, are the signs of the human wreckage of opioids—on the roadsides, in the emergency rooms, and at the morgues.
Why central Appalachia? It is tempting to look no further than the easy cultural stereotypes that flow from memoirs such as J. D. Vance’s Hillbilly Elegy, which can be read as a reprise of the age-old notion of the backward mountaineer. Instead, Princeton economists Anne Case and Angus Deaton have attributed part of the rise in fatal overdoses and other “deaths of despair” (suicide and liver-related mortality) to a long-term decline in the life chances of the white working class, a trend as evident in this region as anywhere else. Surely the infiltration of drugs was driven by a surfeit of local demand caused by worsening economic conditions. But the bituminous coal region didn’t become an epicenter of opioid addiction as a result of long-term economic decline alone. Other researchers point out that these same economic conditions characterized other regions as well, yet the epidemic didn’t manifest in those places to nearly the same degree.
Patrick Radden Keefe, author of Empire of Pain: The Secret History of the Sackler Dynasty, places the blame squarely on Big Pharma. Keefe found that Purdue Pharma intentionally chose the back roads of central Appalachia to market its new blockbuster long-acting opioid OxyContin.
Why? Long before OxyContin was approved by the Food and Drug Administration in the mid-1990s, enduring problems in the region had laid the foundation for an unprecedented human catastrophe. These vulnerabilities put a target on central Appalachia’s back.
Purdue “targeted certain regions in particular—places where there were a lot of family physicians,” whom they assumed would be more naive and thus more susceptible to persuasion, notes Keefe. Central Appalachia fit the bill, as nearly all the doctors there were family physicians. A related factor weighed heavily as well: central Appalachia topped the nation in disability claims. People on temporary (Workers’ Compensation) or permanent (SSI or SSDI) disability automatically qualify for Medicaid—providing a means to pay for prescription painkillers even before the Medicaid expansion of 2014. Many had qualified for these programs due to chronic pain.
Keefe describes how Richard Sackler himself, the president of Purdue, led the marketing campaign for the new drug, and how he said the company “focused our salesmen’s attention” on physicians “who write a lot of prescriptions for opioids.” A doctor who did so was an invaluable asset to Purdue. “Like casino employees talking about an especially profligate gambler,” Keefe writes, “the sales reps referred to these doctors as ‘whales.’” Physicians in the region had been prescribing, and probably overprescribing, other, less potent forms of opioids such as morphine, combined with Valium, well before the epidemic. Indeed, the pattern may well have gone back generations. In his 1963 classic, Night Comes to the Cumberlands: A Biography of a Depressed Area, Kentucky lawyer Harry M. Caudill writes that “pain-masking sedatives became commonplace in the region’s coalfields decades ago as doctors, stretched thin and pressed to help legions of injured miners and sick poor people, handed out ‘bags and bottles of pills.’” By most early measures, the May 2003 church-led march in Manchester was a remarkable success. It sparked a federal RICO (Racketeer Influenced and Corrupt Organizations Act) investigation into the corruption of public officials. Drug dealers went to jail, along with some complicit politicians who had provided protection for dealers—a theme we will explore in more detail later in the book. But in one vital respect, the march failed. Three years afterward, in 2006, when the rate of opioid prescriptions had begun to take off nationwide, there were 258 opioid prescriptions for every hundred people in Clay County, nearly three and a half times the national average.
The impact of these prescribing practices across central Appalachia was seldom subtle. According to Keefe, some “communities began to resemble a zombie movie, as the phenomenon claimed one citizen after another, sending previously well-adjusted, functioning adults into a spiral of dependence and addiction.” Research has found that state laws targeting “pill mills”—clinics and pharmacies that prescribe or dispense a disproportionate number of opioids—are linked to thousands of lives saved from prescription opioid overdose.
We saw a bit of the devastation of addiction firsthand when we stopped for gas while visiting Clay during a bitterly cold January day in 2019. A woman, pale and rail thin, with no winter coat, hat, gloves, or even shoes, approached our car asking if we could give her a ride home. She told us she had walked to town in search of cigarettes, yet we noticed that the package she was holding held none. When we delivered her down a narrow dirt road to her address, it became clear that she was living not in a house, but in a car parked on the side of the road. Repeatedly, Clay residents across the class divide had warned us about the danger posed by “walkers,” people living in the hollows (small groupings of homes amid the hills) who, they alleged, appeared in town—zombie-like—to buy, beg, or steal for a fix.
Today, nearly thirty years after OxyContin was first marketed in central Appalachia, these high-potency opioids are still prescribed, albeit in a more tamperproof form. Across the nation, the prescription rate peaked in 2012 at 81 prescriptions per 100 persons and then declined. Yet as of this writing, there are still more opioid prescriptions than people in Clay County. Several of the many pharmacies there will still fill those prescriptions. And Big Pharma continues to rake in enormous profits from its sales in the region.
In the months our team spent in Clay County during the summer of 2019, we would learn that part of what made the human field so fertile for this modern-day form of extraction was that there was “nothing to do here but drugs.” It was a singular refrain—voiced over and over by the dozens of residents we spoke to. In short, the crumbling social infrastructure of the area was seen as a reason for the drugs. We asked dozens of residents across the class spectrum what would help the community the most. The top recommendation for change was jobs. After that came places providing things to do—a community center, a movie theater, an arcade.
Truth be told, the first few times we heard “nothing to do” as a cause of the epidemic, we dismissed it as unlikely, a cliché. Yet after hearing the same claim repeatedly, we began to take it seriously. Perhaps we needed to look beyond economic factors, government policies, and the behavior of Big Pharma to understand fully the sources of vulnerability to the epidemic.
What if having “something to do”—something to knit the community together and give young people the sense of belonging that could fend off the anomic allure of drugs—was a vital, yet unrecognized, contributor?
One Manchester resident who goes by the nickname Sweet Pea* put it this way: “There’s really nothing around here for kids. That’s why they go to drugs.” Her house sits at the end of a well-kept gravel road. Its two shades of vivid pink are the result of a well-meaning group of “missionaries” visiting the area who didn’t realize they had bought two different hues. When we spoke, she wore two versions of her favorite color (a bright purple T-shirt and eggplant-colored sweatpants) while leaning back in her plastic chair and flashing an infectious smile. Nothing at all to do in Clay and not much across the county line either, she told us: “There’s nothing [there] anymore, since their skating rink burned . . . they don’t got anything except one movie theater.” Political scientist Robert Putnam, following the nineteenth-century French political writer Alexis de Tocqueville and others who have argued that voluntary associations are the key to building social bonds, made famous the decline of the bowling league. He warned it was a harbinger of the eventual degradation of democracy. But in the small rural hamlets in central Appalachia, it wasn’t that bowling leagues had fallen out of fashion.
Rather, there was no longer a bowling alley at all.
Dolly’s small hands shook as she dipped her spoon in the bowl of soup on offer at God’s Closet, a homeless shelter and multiservice center run by Pastor Ken’s congregation. Her bright blue glasses were a little too large for her face. When we asked her what people should know about her community, she offered, “I just want things to change. I mean, better for the kids, better for the teenagers; stuff that the teenagers can do instead of getting on drugs. Parks for the little kids, something for the teenagers to do to get them out of trouble. Stuff that they can do.” Crystal, whom we also met at God’s Closet, offered a similar analysis: “There’s nothing really here for kids, and then they wonder why they get on drugs. Because there’s nothing for them to do. Like we had the movies a long time ago. And like I said, [the movie theater has] turned into a church. And there ain’t nothing here really for young’uns to do.” Travis lives with his girlfriend, Helena, in a trailer lacking air-conditioning and running water. The structure sits on her family’s land; the two live there free of charge in exchange for working to fix up the place. He offered us a Mountain Dew from a dorm-sized fridge perched precariously on a countertop while a pair of kittens frolicked amid a stack of DVDs borrowed from the library. A horror movie played silently on the TV. This slight man with short brown hair told us he wasn’t fond of his hometown and didn’t have happy memories of growing up here. “There ain’t nothing around here to do,” he said, slouched in a chair tucked in one corner of the stiflingly hot trailer. “One time when I was younger, we was [hanging out] in Walmart’s parking lot, just listening to our stereos and stuff, [and] the cops come and run us off. . . . That’s the big flaw around here. That’s why I think everybody turns to drugs around here.” Marie moved to Clay County from Indiana while in her teens.
She told us she is desperate to go back because “[they have] more stuff for the kids. . . . Down here, they just want to build roads and . . . drugstores, so it’s not nothing that you can really do down here.” Middle-class and poor, resident after resident in Clay County linked the opioid crisis to the loss of the bowling alley, the roller-skating rink, the swimming pool, and the movie theaters. Several noted that even the one park with a playground for young children had recently been bulldozed for the construction of a new highway through town. In his 2018 book, Palaces for the People, sociologist Eric Klinenberg argues forcefully that “the future of democratic societies rests not simply on shared values but on shared spaces: The libraries, childcare centers, bookstores, churches, synagogues, and parks in which crucial, sometimes life-saving connections are formed.” These spaces, including something as seemingly simple as a beauty salon in a neighborhood or town, play a vital role in forging the human connections that weave communities together. They can provide a tailwind that lifts people’s chances in life and, like a net for a tightrope walker, can catch them when they fall.
Klinenberg opens Palaces for the People with the story of the Chicago heat wave of 1995. As the temperature rose to 106 that July, the Centers for Disease Control and Prevention recorded almost 740 more deaths in Chicago than usual, “roughly seven times the toll from Superstorm Sandy and more than twice as many as in the Great Chicago Fire.” The morgues overflowed. To understand who was most vulnerable and what could be done to prevent a similar tragedy in years to come, researchers descended on Chicago, visiting hundreds of homes, comparing “matched pairs” of victims and survivors. Not surprisingly, a working air conditioner cut the risk of death by 80 percent. But social isolation also proved deadly. In Klinenberg’s own research, he used “matched pairs” as well—but of similar communities rather than of comparable people. Through the course of his work in the aftermath of the heat wave, he writes, “I’d discovered that the key differences between [like] neighborhoods . . . turned out to be what I call social infrastructure: the physical places and organizations that shape the way people interact.” He concluded that in the heat wave, living in a neighborhood with robust social infrastructure was “the rough equivalent of having a working air conditioner in every home.” Klinenberg shows that some forms of social infrastructure—like public libraries and community centers that offer regular programming, plus churches and schools that provide space for recurring interaction—foster durable relationships. Others, such as playgrounds and parks, support looser ties, connections that may grow as people form deeper bonds. “Countless close friendships between mothers, and then entire families, begin because two toddlers visit the same swing set,” he writes.
“But when the social infrastructure gets degraded, the consequences are unmistakable,” Klinenberg warns. People hunker down and stop frequenting public spaces. Social ties atrophy. Disorder and crime rise. Old folks grow lonely, and the young get high. Mistrust increases. Civic participation, such as volunteering and voting, fades. This description fits closely with community residents’ claims about what has happened in Clay County.
We don’t mean to overdraw the portrait of Clay as a place utterly lacking in anything to do but drugs. In all of our interviews, we collected detailed information on what people do for entertainment and recreation.
Cable television is key: almost all, whether middle-class or poor, have subscriptions to basic cable, as well as to streaming platforms like Netflix and Hulu. One could argue that it’s far more isolating to stream personal content than to consume content the old ways, where families watched TV together or young people from across the community attended the Saturday matinee. But there’s at least some social contact built into the fact that some residents cannot afford the cost of these subscriptions and so depend on someone else’s Wi-Fi to download movies onto their phones. Helena, for example, lives with Travis in the trailer owned by her mother-in-law, Margie. She doesn’t have a car, so her social calendar is limited. When asked what her daily routine is like, she replied, “[I’ll] walk up to Margie’s [in the morning] and get on the internet, download a couple movies [that I’ll watch later on].” The internet is also a huge pastime, though people blame it for all manner of ills, including marital breakups.
Seventy-one-year-old Susie practices the time-honored Appalachian ritual of sitting on her porch “visiting” with her neighbors in the evenings.
Her children visit on Sundays, she told us. “My kids, they come every Sunday and have dinner after church. And I get to see them and my grandbabies.” Stephen and his wife also have a weekly visiting ritual: “We hang out and stuff on the weekends [at the home of a friend], and he’s got a pool at his house and it’s kind of what we do. . . . We made ribs last weekend . . . and cake.” While visiting the area, we were surprised at the number of aboveground pools we saw, often fronting 1970s-era trailers or tiny, century-old Jenny Lind coal camp homes, holdovers from the company towns that once dotted the area. Even more frequent were trampolines, a poor family’s substitute for a visit to the pricey Air Raid trampoline park in London, a city with considerably more amenities about forty minutes away.
Middle-class families in Clay have access to a greater range of leisure activities than poor families. Biking, car racing, fishing, and ATVing are all popular, and there is an emerging music scene. But these activities require equipment, some of it costly. In 2020, the price of an ATV ranged from $5,000 to $11,000. Also, while some residents use fishing and ATVing as ways to get together with others, there are no official, maintained spots for these activities. Jake is among the 4 percent of county residents who are Black. He is an avid fisherman, but, as he explained to us, it doesn’t earn him many social connections: “There’s a lake at the end of Beech Creek. If you didn’t know where you were going, you’d get lost. It’s not in plain view. . . . There’s no garbage cans out there. There’s no Porta Potties or anything like that out there. It’s just a lake.” Many in Clay County go to considerable lengths to seek out things to do. Jake regularly takes his three children to the movie theater in London.
Angel drives to David’s Steak House and Buffet in Corbin, nearly an hour away, for special occasions. Scarlet’s kids attend an arts program at Eastern Kentucky University’s Manchester campus, though she told us that the enrichment activity is only open to a few: “You’ve got a lot of people here who don’t participate because maybe they don’t have the gas money to get there. [Manchester would be a much better place to live] if they had the programs and could do the funding and know how to write grants for music and for arts and things.” A few local traditions are still alive in the area, though. Angel lives with her daughter and her daughter’s boyfriend, who spends some of his free time working at cockfights, preparing the animals and watching over them during events. Despite its illegality (not to mention cruelty), this is okay with Angel because “if that makes [the kids] happy and gives them something to do, and it ain’t drug-related, go ahead, peck them chickens, baby.” Aurora has a boyfriend who raises and fights chickens for extra income: “[He’ll] chicken fight during the summer, and [the owners] give him money for taking care of them. . . . He’ll help tag them, he trains them, and he’ll fight them. . . . He don’t have the money to get in there and [fight his own chickens]. But yeah, if they win, which they’ve won a couple times, [he gets some of the winnings].” While gatherings with a few kin or friends on a porch or at a cookout are not rare, what is notable about other forms of entertainment in Clay is that, with the exception of cockfighting, which can draw big crowds, they either involve solo pursuits—watching TV, streaming a movie—or are usually done with a close friend or relative, as with fishing or ATVing. For a while, one rural church congregation hosted a youth soccer league, but that fizzled.
Chad’s Hope, a faith-based addiction recovery program established in the aftermath of the 2003 march, is perhaps one of the town’s most important antidotes to the “nothing to do but drugs” scourge. The program’s modest space was built on land donated by local businessman Charlie McWhorter, who joined Bolin and Abner’s prayer group after he lost his son Chad to a drug overdose. When the Manchester Enterprise ran an ad seeking volunteers for Chad’s Hope, ninety people showed up.
Among the lower-income residents we spoke with, the institution on most people’s lips was Pastor Ken’s church. Under his direction, it operated the nonprofit God’s Closet, offering used clothes, hygiene products, and—on Mondays—a noon meal, until the COVID-19 pandemic closed it down.
It is perhaps due to the charity work of this church that Pastor Ken is the most beloved figure in town, according to our interviewees.
Not only does Manchester Baptist Church provide sustenance to the needy, it also offers vital opportunities for community engagement.
Volunteers include middle-class parishioners and welfare recipients alike.
When the soup kitchen was open, moms on welfare could complete their community work hours ladling soup or folding and organizing the used clothing at God’s Closet. For some, the opportunity to volunteer has been a literal lifesaver. James, who lives in the church’s homeless shelter, told us that volunteering for Pastor Ken was his only opportunity to interact with others. Prior to the pandemic, he helped with the Monday meal, but he also worked with out-of-state mission teams who often travel to the area to do home repairs and install wheelchair ramps for those in need. James said the work was important to him “because I’m not sitting [around] all day. I could never do that. I would commit suicide if I had to do that. Yeah. I don’t even have a wife, you know. Ain’t nothing [else] to live for.”
Could such a mundane establishment as a soup kitchen really count as the kind of social infrastructure that can turn the tide against the allure of opioids? Political scientist Michael Zoorob and epidemiologist Jason Salemi have found that the density of nonprofits and civic organizations within a community, along with voting and other civic behaviors, is indeed strongly tied to overdose death rates. But they did not capture what are likely the key sites of social infrastructure in a community, such as its bowling alleys, movie theaters, beauty shops, and arcades.
For our own investigation, we partnered with sociologist Michael Evangelist to exploit a unique form of government data—a census of all US businesses. Using these data, we measured losses (or gains) of key sites of social infrastructure in every US county between 1996 and 2015. We then examined whether there was a relationship between changes in the presence of these forms of social infrastructure and changes in the overdose death rates (which are more reliably measured than addiction rates) during those same years, all else held equal. These data have big limitations, chief among them being that they include only institutions that have at least one paid employee other than the proprietor, leaving out many barbershops, churches, and other one-person operations. Parks and libraries are excluded, too. This isn’t a statistical analysis that we are ready to bet our careers on.
Yet even with these imperfect data, we have found evidence that places that maintain spaces promoting interaction at the community level have seen lower rates of overdose deaths. This is true to an extent rivaling the effects of other, better-accepted economic factors, such as wages and unemployment rates. While far from definitive, our research provides some support for the view of so many of the people we talked to in Clay County—that when these spaces are lost, the civic safety net a healthy community forms to catch people when they fall is torn apart.
As further evidence for this claim, Klinenberg points to what he characterizes as “a growing body of neurological research showing that opioids are, chemically speaking, a good analog for social connection.” In a New York Times guest essay, “Opioids Feel Like Love: That’s Why They’re Deadly in Tough Times,” science writer Maia Szalavitz offers a summary of this research: “Opioids mimic the neurotransmitters that are responsible for making social connection comforting—tying parent to child, lover to beloved. The brain also makes its own opioids. These endogenous ones include endorphins and enkephalins that are better recognized for their roles in pleasure and pain but are also critical to the formation and maintenance of social bonds.” While opioids may impersonate love, it is clear from our research that they increase the agony of social isolation. At the neighborhood level, responses to widespread addiction have made strangers out of neighbors and kin. Scores of marriages have broken down. Parents have been separated from their kids and desperately struggle for the sobriety that might help them regain custody. Anyone who drives the back roads of the region will note the plethora of placards affixed to telephone poles advertising for foster parents. With the breakdown in social ties at all levels in central Appalachia, it is hard to imagine a place in the nation where social isolation among the poor is more pronounced, where social cohesion is more fragile.
Demographer Emily Miller and sociologists Liv Mann and Lanora Johnson—all collaborators on this research—embedded in Clay County during the summer of 2019, combining forty-seven in-depth interviews (with twenty-two low-income families and twenty-five community leaders) with over fifty hours of participant observation in three months’ time.
Repeatedly, they were struck by the forcefulness with which many of the lower-income participants emphasized that they kept to themselves, as if it were a point of pride, or even a signal of morality. As they would learn, nearly all of these experiences and attitudes are colored by the scourge of drugs.
Aurora is afraid that her relatives, many of whom suffer from addiction, will be a drain on her limited resources: “I’ve had people that they’ll come down, ‘Well let me stay a night or two?’ And then we end up with them for months. . . . You let them borrow a little bit of money and . . . they won’t pay you back. And then there’s just some that you can’t be good to. No matter what you do, you can’t be good to them, [or else] they’ll steal from you. Let’s see, I had one little girl that . . . was kin to [my husband], and she stayed a couple weeks with us, but she robbed us [to buy drugs].” Similarly, Miranda has a sister who is battling addiction and “takes advantage of me sometimes. . . . She’ll come and steal something and sell it. . . . I have to watch her. She aggravates me when she comes over because she goes through my stuff. I tell her not to, but she won’t listen to me. So I have to stand over her and watch her, every move she makes.” For some, the sheer prevalence of drug addiction has led to wholesale withdrawal from the community. “You can’t hang out with people anymore because half of them is on drugs and stuff,” Lulu said. “When they ain’t on drugs, they’re drinking. That’s what they want to do. . . . I don’t want to hang out with people with drama and crap like that.” For similar reasons, Helena claimed she has no close relationships. She used to live in Ohio with her mother until she realized she had to get her daughters away from local kin, many of whom were addicted to heroin. Ironically, due to her husband’s kinship ties, the family landed in opioid- and methamphetamine-rich Clay County. Another participant said she will no longer open the door to neighbors if they come calling for fear that they are using and might harm her.
Addiction also tears at the fabric of the nuclear family. After several years of dating, Crystal finally married the father of her youngest child. A year later, they divorced because “he got on drugs. . . . Drugs took over, and I’m not like that. I don’t do that.” Similarly, Lulu described what happened when her husband, whom she had been with since she was fourteen, became addicted: “Money started disappearing. He was staying out, staying out all night, getting high and stuff like that. His priorities just weren’t with me and the girls. . . . One day . . . when he was gone . . . I left. Got [in] my car. I left everything there. Didn’t [take anything with me]. Took off.” Especially poignant are the stories of mothers who became addicted and lost custody of their kids. Four in our small sample saw their children taken by the state. All claimed they were working hard to stay clean and regain custody. Loretta, a mother of two, told us, “I just knew that if I got out there on drugs [again] that I wouldn’t get to keep my kids, or I wouldn’t have a life for them. And it was either my kids or the drugs.” Paige, a mother of three, said, “So I’m staying clean and trying to do the right thing for my kids.” This brings us to the role of churches, so numerous in Clay County that they are difficult to count. Despite their role in building social capital, we found that these key institutions—consciously or not—can also function in ways that inhibit social cohesion across the community. Klinenberg acknowledges that institutions can “set boundaries that define who is part of the community and who is excluded. They can integrate or segregate, create opportunities or keep people in their place.” To illustrate how this can occur, Klinenberg draws on the history of America’s municipal swimming pools, chronicled in historian Jeff Wiltse’s 2007 book, Contested Waters. 
“Swimming pools and other social infrastructures with potential to facilitate sustained, intimate interaction across group lines can easily be used to segregate instead,” Klinenberg writes. According to Klinenberg, Wiltse shows how swimming pools have been places both of cohesion and conflict, precisely because they are spaces in which residents not only build a sense of community belonging but also define the boundaries of who is an insider and who is an outsider. When integration came, many southern towns chose to fill in the municipal pool with cement.
Our statistical analysis described earlier in this chapter revealed that on average, the continuing presence of churches is a net positive and associated with a lower level of opioid deaths. Some churchgoers—mainly those from the middle class—described to us the deep sense of belonging these institutions foster. Lindsay, for example, told us, “I’ve lived here my whole life, and I have been a member of the same church my whole life. It becomes a family. . . . My church family is as close to me as . . . what a lot of my biological family is.” Yet it’s not clear whether a majority of people living in central Appalachia feel as welcomed by the churches as Lindsay does. Despite the abundance of churches in the region, membership has always been notably low.
Few of our low-income interviewees said they are involved in local congregations. Marie told us that while God played a big role in her life, church did not. “I just don’t go to church, because a lot of churches are hypocrites. . . . Most of the churches I’ve went to they’ve always talked about me, or my makeup, or pants. So I just don’t go.” Miranda said that God is the most important thing in her life, a virtue she hopes to instill in her children. But when it comes to church, she said, “Problem is with church, there are people that look down on people. If there weren’t people like them, there’d be more people in the church. I’m not judging those people because that’s their business. That’s between them and God. But if it weren’t for people like that, I think a lot of people would go to church. I know I would.” The Christian faith is an important source of identity in the community.
Aurora, for example, told us, “I would say ninety percent of the people [in Manchester], they believe in God. They might not go to church, but they all believe in God.” Dolly reminisced about her childhood in the hollow as one of thirteen children whose family practiced subsistence farming. To Dolly, faith is part of Manchester’s DNA: “God is the most important thing in this little town we got.” Yet as Klinenberg points out, “Social cohesion develops through repeated human interaction and joint participation in shared projects,” not just from “principled commitment to abstract values and beliefs.” Manchester Baptist Church, with its many charitable endeavors, has clearly been a critical source of social cohesion. Other congregations seem effective at reaching across the class divide in meaningful ways as well.
Sweet Pea has had three husbands. She attends a Holiness church every Sunday. “I was born Catholic,” she explained. “Then I met my kids’ dad.
He’s Baptist, or whatever you want to call it. . . . And then I met the guy I’ve got now, he’s a holy roller. . . . [When I first visited his church], you could feel love when you walked in the door. . . . And they help you out.
Like, my husband had a stroke. After he came home, and within about a week, we went to church and everything. . . . They took an offering up and gave it to us and we wasn’t even expecting it. Just to help us. They gave us close to $200 and we weren’t expecting it.” There is clearly power for good in Clay County’s places of worship.
Indeed, the 2003 march against drugs would not have happened without them. Members of more than sixty area churches, presumably both the more established ones and the “holy roller” variety that Sweet Pea and her husband attend, participated in the march. Fully 3,500 people showed up—roughly one in seven county residents and nearly double the population of the town of Manchester. This suggests that it wasn’t just well-to-do parishioners who came out. Indeed, when we mentioned the march, our low-income interviewees uniformly praised it.
THE COLLAPSE OF BIG COAL
This is not the first time central Appalachia’s communities have experienced an implosion of social infrastructure. In a dramatic upheaval seven decades ago, much of the social infrastructure that existed in this region collapsed all at once, leaving only remnants found in county seats like Manchester and in the larger towns. The collapse came with the shuttering of the company towns.
Big Timber and Big Coal provided much of the early social infrastructure as the internal colony here began to develop and expand. For a time, the companies owned the movie theaters, bowling alleys, and other institutions that brought people together and shaped interactions. Due to near-complete segregation by race, however, these institutions were never truly inclusive of all those living in the towns. Furthermore, the pluses of living in a company town came at a heavy price: sharp limits on residents’ autonomy. Still, their benefits were felt.
Coal towns began to appear across the central Appalachian landscape as early as the 1880s. Construction peaked in the 1920s. Hundreds of company towns dotted the map of central Appalachia during that time. Towns were often named after the mine owners, their wives (or mistresses), or other officials. Commenting on this feature of the Appalachian landscape in a 1970s oral history, local activist and historian Warren Wright exclaimed, “We really honor our exploiters here, we mountain people do. Almost every town in the area is named for some bastard from outside the region who gutted it.” Life could be tough in the early days of the industry, especially in the smaller mining operations, like the string of fourteen independently owned coal camps that began just south of Manchester and moved west along Horse Creek, as well as the rough-hewn coal camp of Barwick, Kentucky, where Robert F. Kennedy visited that one-room school. But in places where the mines were owned by large corporations such as US Steel and International Harvester, the towns were often planned with worker satisfaction in mind. In southwestern Virginia, Stonega Coke and Coal was the force behind many of the company towns. SC&C was convinced that paternalism—what some corporations called “contentment sociology”— was the key to securing a stable labor force and keeping out the unions.
Knowledge is exceedingly scarce about what life was like in many of these SC&C and other coal towns, including in Clay County. Yet we know a great deal about Wheelwright, a model coal town in eastern Kentucky’s Floyd County, built by the Elkhorn Coal Company in 1916 and later purchased by Inland Steel. Town manager E. R. “Jack” Price was a huge believer in contentment sociology. Under his direction, Inland modernized the town by building a water system, filtration plant, and sewage system and, in 1942, installing flush toilets in every home. That same year, a lot on which a boardinghouse had stood was transformed into a swimming pool, and a bowling alley was built. Inland loaned the county $15,000 for the construction of a high school. Under Jack Price’s deft management, the town grew and prospered, boasting a fine library, an inn and restaurant, a hospital, a movie theater, and a department store. There was even an eighteen-hole golf course available for miners’ use. Many came to call Wheelwright “the town that Jack built.” Life in these towns was often far superior to that of a subsistence or tenant farmer, especially once the “frontier stage” of the coal towns was superseded by the “paternalist phase,” which came with greater mine consolidation and labor shortages during the First World War. White miner Melvin Proffit grew up in a farm family but went to work in the mines in 1921 when he was seventeen. When he married, Proffit returned home from the mines just once a month. “I had to do that in order to support the family,” he said. In his home county, “there has been a number of people to starve and almost went naked. Where I worked on one of these farms for seventy-five cents a day to [buy my] bread and meat, well I was making seven to ten [dollars] in the mines.” In 1944, young Melba Kizzire, from a Black tenant farming family in Tennessee, moved with her parents to a mining town on the edge of central Appalachia. 
She said that it was the first time she had ever seen a telephone or a “flushing john” or had lived in a painted house. What really struck her, though, was the wallpaper. “The most beautiful wallpaper I had ever seen,” she remembered. “I was stunned for about a week. It was like I had gone to heaven you know.” As the paternalistic period advanced, medical care and education were also far superior to that received at the county facilities because salaries were paid by the company, and the clinics, hospitals, and schools were run by it as well.
Not all amenities were accessible to everyone. Like most coal towns, Wheelwright was thoroughly segregated by race. Hall Hollow was the name of its Black settlement. In Coal Towns, historian Crandall Shifflett tells the story of Hilton Garrett, a Black miner who came to Wheelwright from Alabama in 1923: “When asked why he could not go to the soda fountain and be served, Garrett explained it was due to Jim Crowism. The interviewer responded: ‘I never heard of that. How’d that get started?’ Garrett replied in a way that suggested he believed the interviewer had to be putting him on.” Wheelwright’s Black students had their own school on the periphery of town, well away from the white school in the town center. It had only three or four teachers. When the swimming pool was built in 1942, Black families were not allowed to use it, and there were separate facilities for churches and recreation, too.
No coal town was without a company store, the “social and economic nexus of the company town.” Beyond groceries and clothing, a typical company store stocked furniture, appliances, buckets, nails, radios, garden tools, and household gadgets. You could get your hair cut and styled, have a shoeshine, and get your laundry done. A 1947 government report characterized the role of the company store as the “mecca for everyone in every coal camp. Even when the store is closed, the men gather there in their free time, frequently after working hours, and on Sundays and holidays. It is a common sight in summer to see miners, and often the women and children in the community, sitting or squatting on the porch or steps of the store, relaxing in idle talk.” It was the only place of relaxation for the wives of the miners, where they would “learn and dispense the local news, read their mail, and meet their friends, as well as buy their groceries and supplies. . . . It is commissary, club room, and bulletin board rolled into one.” The company church was also part and parcel of contentment sociology, just like the schools and all the leisure and recreation amenities. In Wheelwright, there were Methodist, Baptist, Church of Christ, and Church of God churches, often two of each kind—one for Black and one for white congregations.
As company towns across central Appalachia closed in the early 1950s, much of this company-owned social infrastructure came crashing down in just a few years. The churches, whose operating costs had been covered by the company, were left to fend for themselves, as were the schools and hospitals, whose relatively high-skilled staff had usually been paid a premium to work in the model towns. Those movie theaters, bowling alleys, and the like? All owned by the company. Few survived. Same with the inns and restaurants. All of it, within just a few years, gone.
Hilton Garrett, the Black miner hailing from Alabama, described the mammoth changes in Wheelwright in the span of just a few decades’ time.
According to him, Wheelwright in the 1930s was loaded with “a good bunch of people” and “a bunch of youngsters.” But in the 1950s, few people remained except the elderly. “Gone too were the filling stations, shoe shops, a soda fountain, and the bathhouse. The trains no longer came through.
Even the train station disappeared,” Garrett said.
The story of the demise of social infrastructure is richly—and poignantly—illustrated by the case of baseball. Prior to the Second World War, the Sunday afternoon baseball game was the most revered social ritual in a company town. It was the miners’ sport. Every little town had a team.
Picnic baskets of corn bread and fried chicken were enjoyed by all. Children cavorted while their mothers gossiped and cheered for the home team. Mine operators engaged in stiff competition to lure the most talented players to their camps. Even today, as we noted on visits to the region, nearly every municipality has a baseball field in the center of town. Yet the same forces that beset the coal towns led to the decline of baseball. Shifflett explains: “When the UMWA [United Mine Workers of America] organized the miners, the coal operators ceased to sponsor the teams, buy uniforms, and organize leagues. Without company backing, the number of teams dwindled. . . . In the 1950s, the closing of the company towns virtually ended the sport.” It is remarkable that so much social infrastructure was destroyed all at once, as if by war.
A question remains: How did the loss of these places, and the bonds they fostered, get etched into the bodies of the residents and the fabric of the community there? Across a broad array of diseases, central Appalachians die younger and at higher rates than people in other parts of the country. The myriad health problems facing the region have been severe for many years. Even before the opioid epidemic, exposure to environmental toxins, limited access to health care, and risky health behaviors such as smoking were all challenges the region faced. We see evidence, though, that the loss of social infrastructure also played some role. Possibly because social infrastructure is difficult to measure, we know of no research linking its demise to a decline in health outcomes, other than the nascent work tentatively linking the loss of social infrastructure to drug overdose deaths reviewed here. Yet given what we’ve learned about the social determinants of health, we think the idea is worth exploring.
This chapter started with a march against drugs that took place on May 2, 2003, in Manchester, Kentucky. In another southern town, Selma, in the heart of the Alabama Cotton Belt, a march took place nearly forty years earlier, on March 7, 1965, for the cause of civil rights. Its participants also numbered in the thousands.
We visited Selma as part of our 2021 tour. The city was to be our headquarters for exploring the cluster of Cotton Belt counties there.
Determined to avoid yet another meal at one of the ubiquitous chain restaurants one finds on the main drag of nearly every small southern town, we headed to what promised to be the fanciest place to dine in Selma—the Tally-Ho. Like scores of other eating establishments across the South, this hunting lodge turned supper club had resisted integration by becoming “members only.” To eliminate the possibility that any Black Americans not in the know would dare try to dine there, the original owners installed a solid door with a peephole—or so the former owner (his son-in-law now owns the restaurant) told us when he stopped by our table to ask where we were from and what we were doing in town. Even in this town of 18,000, drivers with out-of-state license plates are clearly the object of curiosity.
Quickly, the Tally-Ho’s former owner launched into what was obviously a favorite story. As a salesman from Columbus, Ohio, whose territory included parts of the Deep South, he had frequently traveled to Selma. On one trip, he learned that the infamous Tally-Ho, still members only in the mid-1980s, was for sale and jumped at the chance to buy it. The first thing he did was tear down that door and integrate the club, he told us. While we were dining, several other regulars stopped by to say hello, both Black and white, including the president of Selma’s Rotary Club, a Black entrepreneur who has a business near the foot of the Edmund Pettus Bridge, site of the 1965 march. She was eager to describe what the club was doing for the city.
For instance, they had recently installed an attractive bench near the bridge.
Then we got to chatting with our waiter, also Black, who was very interested in hearing about our research. He asked whether we were going to visit Selma again. We hoped to, we said. He replied, “Next time, you’ve got to bring something with you for our city.” We asked what Selma needed most. He paused, considering, then declared, “A bowling alley! We’re short of that sort of thing around here. There is nothing for young people to do in this town.”

4
A Tradition of Violence
ON A HOT, SUNNY DAY in late June, we headed from Memphis down into Tunica County, Mississippi, and to the little town of Dundee. Here we crossed the Mississippi River on US 49 and drove into Helena, Arkansas, the Phillips County seat (home to Marvell Academy and DeSoto School, both mentioned in chapter 2). Our GPS pointed us to the courthouse, standing just shy of the levee. Steps from this edifice, on the courthouse square, we stumbled upon a memorial meant to reflect the features of a church, its centerpiece a 14,000-pound granite slab resembling an altar. The engraving read: “Dedicated to those known and unknown who lost their lives in the Elaine Massacre.” On this visit, part of our 2021 fourteen-state road trip, we had been in the Delta for about a week. From our Airbnb farm stay in Shaw, Mississippi, 122 miles south of Memphis, we headed out each day to explore the dozens of counties in the Delta regions of Mississippi, Louisiana, and Arkansas, which ranked among the most disadvantaged on our index. In place after place, we discovered astonishing stories about the industries that fueled the rise of our nation, the workers who sustained them, and the histories of human suffering they wrought. We then traveled to Memphis, from which we could investigate the Delta’s northernmost counties. Here we learned that Phillips County, first on our itinerary on this day, was the site of one of the most violent acts of retribution by whites against Blacks in American history. A quick Google search revealed that on the evening of September 30, 1919, roughly one hundred Black farmers gathered in the Hoop Spur church near the town of Elaine for a meeting of the Progressive Farmers and Household Union. Late in the evening, as they discussed how to negotiate with their white landlords for fair pay, a car with one Black and two white passengers pulled up outside. Shots were fired and returned, leaving one of the white men injured and the other dead.
As rumors spread of an impending race war, an armed white mob amassed and descended on the county. The mob was joined by more than five hundred soldiers from nearby Camp Pike, dispatched by Arkansas governor Charles Hillman Brough, who personally accompanied the troops and gave orders to “round up” the “heavily armed negroes” and “shoot to kill any negro who refused to surrender immediately.” Accounts indicate that hundreds were slaughtered, not only men but any women and children unfortunate enough to be caught in the crossfire. One Black ex-soldier wrote that it was like the victims were “nothen But dogs.” Hundreds of Black Americans were hauled off to jail, where many were beaten and tortured. The first twelve tried, known thereafter as the “Elaine Twelve,” faced charges ranging from murder to night-riding. After deliberating for a matter of minutes, all were sentenced to the electric chair by the all-white jury. Their appeal would eventually work its way up to the United States Supreme Court, which would vote six to two to overturn the convictions.
For years, the story whites spun about these events was patently false: that it was a planned insurrection. They claimed that Black folks were out to kill white people, that hardly anyone ended up dead, and that order was quickly restored. But Ida B. Wells secretly interviewed members of the Elaine Twelve while they were in jail. She offered a different account of what took place in Phillips County, labeling it an “orgy of bloodshed.” The Equal Justice Initiative has documented 245 killings of Black people—considered lynchings because of the mob violence involved—in Phillips County, more than in any other county in the United States. The death toll from Elaine rivals estimates for the better-known 1921 Tulsa Race Massacre, which range from twenty-six to nearly three hundred killed.
Reading histories of the Cotton Belt, we came across other stories like that of the Elaine Massacre, some well-known, some not. What became clear was that these events should be understood as a fact of life in the region for decades, an ever-present reality, not an occasional tragedy. Lynchings were built into the very fabric of Cotton Belt society, serving an explicit goal of racial subjugation. Thumbing through historian John Willis’s Forgotten Time: The Yazoo-Mississippi Delta After the Civil War while trapped at home during the early months of the COVID-19 pandemic, we came across a brief mention of an event described as the Leflore Massacre. We would soon learn that thirty years and one month before the massacre in Elaine, a strikingly similar set of events took place in another Delta county on the other side of the Mississippi River. The backdrop was an economic crisis: cotton prices were falling, threatening the solvency of Black and white farmers alike. Meanwhile, the number of independent Black farmers—and Black farmers’ unions—was on the rise. This led to a growing sense of unease among whites that would come to an explosive—and deadly—head on August 30, 1889.
As in Phillips County, the inciting spark was the meeting of a union, the Colored Farmers’ Alliance, organized by a man named Oliver Cromwell.
The goal of these farmers was to “improve themselves financially,” which involved, among other things, encouraging other farmers to stop purchasing from local merchants and to give their trade instead to the Farmers’ Alliance cooperative store. Local whites took notice of Cromwell’s activities and began to spread rumors about him and to send him threatening letters. How many attended the meeting that August day is not known, but the number may have been as high as three hundred, many of them armed due to fears of white retaliation. A group of union members marched to show their support for Cromwell. Rumors about the events swirled in the white community, and an armed group of whites seeking retribution began to organize. Fearing a race war was imminent, the sheriff telegraphed Mississippi governor Robert Lowry. Three divisions of the National Guard arrived, accompanied by Lowry himself, ostensibly to persuade the white vigilante mob to disperse.
What happened next remains unclear. One white witness, a traveling salesman, claimed he had seen men, women, and children pulled from their homes and “shot down like dogs.” Exact numbers were impossible to come by as Black witnesses were too terrified to speak. Meanwhile, members of the National Guard remained tight-lipped about what they had seen—and done. Four leaders of the Colored Farmers’ Alliance were shot and then hanged, according to several news sources. Without irony, news outlets reported that their crime was “resisting arrest.” These deaths, perhaps as many as one hundred, are not all included in the Equal Justice Initiative’s count of lynchings due to the murkiness of the historical record.
Nonetheless, Leflore County, where this event occurred, is tied for second in the nation in the number of documented lynchings between 1877 and 1950—forty-eight Blacks murdered in extrajudicial killings by white mobs.
Back in Phillips County, Arkansas, on September 29, 2019—exactly one hundred years after the first victim of the Elaine Massacre was slain—the historical narrative was finally set right when the memorial we came upon during our travels was erected in remembrance of the hundreds who were murdered. There is no memorial in Leflore County to commemorate the Leflore Massacre, and none in many locations where other horrific episodes of mob violence erupted in the Cotton Belt South with unnerving regularity for generations following the Civil War. These atrocities remain unacknowledged or, in the case of the Leflore Massacre, all but lost to memory outside of the Black community. Yet one lesson we have taken away from our study of these places is that the economic and social relations that marked these communities many decades or even a century or more ago still shape their fortunes today. In this chapter, we will show that violence is one key mechanism linking the events of the distant past to the present.
The well-known and horrific sin of human bondage in the United States was most concentrated in the Cotton Belt. Legions of enslaved Black people—outnumbering whites by large margins—were required to bring in the cotton crop, and the presence of large Black majorities in the region today reflects this past. Precious little changed for the Black majority once Reconstruction surrendered to Jim Crow and tenant farming. Lynching—the ultimate tool whites used to keep Blacks down in the Jim Crow era—was especially common and particularly brutal in Cotton Belt counties like Phillips and Leflore. Lynchings are often associated with the history of Black Americans, but they are equally fundamental to the history of white Americans. Nearly always, it was whites who made the accusations of Black transgressions. White people gathered the mobs and pursued the victims, while white law enforcement either turned a blind eye or was actively involved. White people committed these murders in brutal and heinous ways, and whites made up stories in the aftermath with the goal of absolving themselves of guilt. It was white perpetrators who almost always escaped blame.
Government bodies controlled by whites allowed lynchings to occur without any real fear of punishment. Northern white policy makers were all too willing to accept southern states’ promises to crack down on lynchings, despite having no reason to believe such efforts would materialize. Over the course of more than a century, there were nearly two hundred failed attempts to enact an anti-lynching law before the federal government finally passed the Emmett Till Antilynching Act in 2022.
Part of reckoning with this history is acknowledging it publicly and grappling with its brutality. Another important step is to understand precisely how violence operated in the Cotton Belt, then and now. Across America’s internal colonies, the historical record is sometimes thin, but a treasure trove of interview and ethnographic data collected during the late 1920s and 1930s by Black and white researchers who were embedded in communities across the Cotton Belt offers a rare window into the ongoing role violence played during that period. These narratives were written during the very years when this internal colony—like the others we write about in this book—reached its peak.
Charles S. Johnson was trained at the University of Chicago by the eminent urban sociologist Robert E. Park. His first work, commissioned by the Chicago Commission on Race Relations, was a close analysis of the Chicago race riot of 1919. His second, initiated once he joined the faculty at Fisk University in 1926, where he would go on to become the institution’s first Black president, was the first comprehensive study of the Cotton Belt’s tenant farmers. During the late 1920s and early 1930s, Johnson and his team interviewed roughly 600 tenant farmers cultivating cotton in the fields surrounding the county seat of Tuskegee, in Macon County, Alabama. His book Shadow of the Plantation (1934) offers a rich description of the family life of these farmers: their economic straits, schools, and health conditions, along with the role of the church. The sequel, Growing Up in the Black Belt (1941), draws on interviews with more than 2,000 Black youths across the Cotton Belt South, including those in Macon County, Alabama; Greene County, Georgia; and Bolivar and Coahoma Counties in the Mississippi Delta. In that book, Johnson’s goal was to illuminate the devastating impact of Jim Crow on the well-being of Black youths. At about the same time, two white scholars from Yale, anthropologist Hortense Powdermaker (whose work was facilitated by Johnson) and psychologist John Dollard, chose Sunflower County, next door to Leflore, to conduct their ethnographic studies. Powdermaker’s After Freedom (1939) and Dollard’s Caste and Class in a Southern Town (1937) focused on the family, economic, social, and religious lives of those living in the county’s largest town, Indianola, and the rural areas surrounding it.
In 1933, two couples, one Black and one white, trained in anthropology by the social anthropologist W. Lloyd Warner, moved to Natchez, Mississippi. Allison Davis and his wife, Elizabeth, and Burleigh and Mary Gardner immersed themselves in the local cultures of their respective “caste groups” for nearly two years, meeting on deserted rural roads in the dark of night to compare notes. Their landmark work, Deep South (1941), documented the lived realities of American racism. Allison Davis would go on to become the first Black scholar with a full faculty appointment at an elite white university (the University of Chicago). The authors were aided by research assistant St. Clair Drake, who would later coauthor the classic work about Chicago’s Black communities, Black Metropolis (1945).
The counties where these studies took place were between 60 and 80 percent Black. In each county, most Black men, plus their wives and children, toiled on tenant farms. The significantly outnumbered white elite maintained an aggressive stance toward their Black neighbors. Writing about this dynamic in Deep South, Davis and colleagues reported that the culture required that a white person “must always be ready to maintain his superordinate position, even by physical violence.” Black people lived “under the shadow of an ever-present threat [that] whites can and will enforce their authority with punishment and death.” The severity of the violence whites unleashed on the Black majority was far beyond what was required to maintain the color caste line. Whites’ “real and neurotic fear” compounded to “build up a permanent necessity for severe measures against Negroes on the part of the white caste,” Dollard observes in Caste and Class in a Southern Town. Those in the white caste perpetrated violence in both legal and extralegal forms. Whites controlled all means of official law enforcement—the local sheriff, the police, the judges, and the all-white juries. Everyone knew that the so-called law was merely a tool of oppression of Black people. Perhaps more important, though, were the ways whites deployed extralegal force. “There are no socially effective patterns which confine violence to the rational and measured tread of the law,” Dollard writes—though in the Cotton Belt, “rational and measured” certainly overstated the merits of the law.
Black community members had to be vigilant. Should they make even the smallest misstep, they could suffer severe consequences. Whites punished any sign of social equivalence to an extreme degree. A Black person could not shake hands with a white person, could not enter through a white person’s front door (even to collect the rent from a white tenant).
Black people of all ages were required to call a white man “sir” and a white woman “Mrs.” or “Miss,” but were almost never accorded the same respect, even if they were professionals holding college degrees. Instead, whites referred to their Black fellow citizens in terms that reduced them to childish figures—through the use of “boy” or “girl”—or used overly familiar terms such as “uncle” and “auntie.” The litany of rules and restrictions was all but limitless. Powdermaker labeled these “the little things that prick.” Underlying these codes of conduct was the fundamental myth that physical contact with Black people was polluting. “The belief in organic inferiority of the Negro reaches its strongest expression in the common assertion that Negroes are ‘unclean,’” Davis and colleagues observe. Dollard notes that most white households provided separate dishes for their servants’ use. Any show of economic success among Blacks was greeted with alarm. They could expect to be “challenged if seen dressed up on the street during week days; again their place is in the fields.” Any Black person “who has achieved advancement beyond lower-class status . . . has been made aware of this envy and resentment at his aggressive mobility,” and any actual successes “are punished as aggressive acts.” Lynchings were the white minority’s ultimate tool because they reverberated so powerfully throughout the Black community, particularly among the young. Here it is worth quoting Charles S. Johnson’s Growing Up in the Black Belt at length: “In some communities, incidents have occurred which have left a vivid imprint on the minds of youth. . . .
Although some . . . are relatively quiet affairs and are accepted with resignation as ‘private’ concerns or as justified punishments for the indiscretion of some irresponsible Negro, others are mob outbursts which leave deep scars of horror, fear, and dismay. Lynchings usually followed a pattern. The mob, the man hunt, the brutality, the terrorization of the entire Negro community are standard features. The effect on children is profound and permanent.” Dollard concurs, writing that the “threat of lynching is likely to be in the mind of the Negro child from earliest days.” Both emphasize that the point was not to punish the individual, but to keep the entire Black community down. According to Dollard, “The posse wants to get the right man, of course, but it is not too serious a matter if it does not, since the warning is even more clear when it lands the wrong one. . . . Every Negro in the South knows that he is under a kind of sentence of death; he does not know when his turn will come, it may never come, but it may also be at any time.” Johnson calculated that between 1900 and 1931, lynchings were most common in the Mississippi Delta, where “the plantation system still flourishes. . . . In these counties lynching or the possibility of lynching is part of a cultural pattern. A type of adjustment to it exists. During and shortly after a lynching the Negro community lives in terror. Negroes remain at home and out of sight. When the white community quiets down, the Negroes go back to their usual occupations. The incident is not forgotten, but the routine of the plantation goes on. The lynching, in fact, is part of the routine.” White perpetrators deployed a trope to justify their actions, claiming that most lynchings were in response to the rape of a white woman by a Black man and thus deserved. 
That a Black man would soil a white woman’s virtue was such a deep-seated paranoia in the Cotton Belt that the conventional wisdom whites espoused was that it was “safer to lynch Negroes than to endure a spreading epidemic of attacks on white women.” Of course, the real rape victims were almost always Black and the perpetrators almost always white. Indeed, among the most disturbing aspects of reading historical accounts about this era in the South is the frequency of the rape of Black women by white men, with no consequence for the perpetrator or recourse for the victim. While Black-on-white rape was often the pretext, Davis and coauthors captured what may have been a common sentiment among whites in the Cotton Belt for many decades after the Civil War: “A very influential government official . . . felt that, ‘when a [n——] gets ideas, the best thing to do is to get him under the ground as quick as possible.’” Any Black person seen to have “ideas” about his standing was in danger. In Lake City, South Carolina, Frazier Baker, a Black citizen, was appointed postmaster by President William McKinley, who would himself be assassinated in 1901. About six months into Baker’s term, witnesses reported that a posse of white men circled his home one night, set it on fire, and fired up to one hundred bullets into the dwelling, with Baker and his wife and six children inside. Baker was shot to death while trying to escape.
As his wife fled carrying their daughter Julia, the infant was shot dead in her mother’s arms.
Ida B. Wells was spurred to become a fierce anti-lynching crusader after a white mob murdered three Black owners of People’s Grocery in Memphis, which was proving successful in enticing customers away from a white-owned competitor. Wells meticulously documented the circumstances of lynchings and asserted that they were means to beat back economic progress made by Black Americans. In fact, a Wells editorial making this case led to a riot and threats against Wells herself, causing her to permanently flee her hometown of Memphis.
Dollard devoted an entire chapter of Caste and Class in a Southern Town to the phenomenon of the widespread violence in Sunflower County, Mississippi. He advanced what would come to be known as the frustration-aggression hypothesis: “Some of the hostility properly directed toward the white caste is deflected from it and focused within the Negro group as well. . . . Since the hostility of the Negroes against whites is violently and effectively suppressed, we have a boiling of aggressive affect within the Negro group.” While “against the white rivals he is helpless, [yet among] his Negro rivals he can fight and shoot, and he does.” Writing in the early 1930s, Charles S. Johnson described “a tradition of violence which seems to mark personal relations to a high degree. . . . A woman who was asked about sleeping with her windows closed replied that ‘people do’s so much killing round here, I’se scared to leave ’em open.’ Another explained why they stopped attending dances: ‘Dere’s so much cutting and killing going on.’ One notes either casualness or fatalism in recounting deaths in the family by violence. . . . ‘My other boy got kilt. He was just stabbed to death. Oh, they sent the boy what done it to the reformatory.’” Johnson documents that, among the myriad possible causes of death in Macon County, including the many serious health conditions that plagued the Black community referred to earlier, violence was the leading cause of death according to health department records.
Dollard also advanced the argument that whites used strategic underpolicing of Black areas to bolster the idea that Black communities were inherently more violent: “It is clear that this differential application of the law amounts to a condoning of Negro violence and gives immunity to Negroes to commit small or large crimes so long as they are on Negroes.” He concludes, “It seems quite possible that lack of adequate legal protection of the Negro’s life and person is itself an incitement to violence.” Whatever the cause, he adds, “one cannot help wondering if it does not serve the ends of the white caste to have a high level of violence in the Negro group, since disunity in the Negro caste tends to make it less resistant to the white domination.” Echoes of this line of reasoning can be found in the work of later criminologists, and especially in sociologist Jock Young’s assertion that violence may function as a response to the humiliation of poverty and structural exclusion.
Most Americans think about violence as mainly an urban problem, yet this is not so. For example, between 1999 and 2013, gun deaths were most likely to occur in counties that were poor and rural, after accounting for population size. Similarly, in twenty-one of thirty-three states studied by criminologists between 2008 and 2014, homicides involving guns were just as common in rural as in urban communities—and in some cases more common.
Deaths by Interpersonal Violence per 100,000 Residents in American Communities
Source: Authors’ analysis of 2009–2014 data from the Institute for Health Metrics and Evaluation.
How violent are the most disadvantaged places in America? Comparing across communities is difficult. In some places, law enforcement does a poor job of reporting violent crimes. Legal entities sometimes misclassify or ignore crimes, and some violent acts do not come to the attention of authorities at all. Because we were concerned about these well-known problems in crime data, we turned instead to data from the Institute for Health Metrics and Evaluation (IHME), which draw on official death certificates to provide counts of deaths by interpersonal violence, the violent killing of one person by another, along with other causes of death. Violent deaths are counted more reliably than other violent acts because a body must be reckoned with. With the IHME data as the basis for our analysis, we could bypass some of the issues of how effectively local law enforcement reports violence.
Using these data, we averaged the rate of deaths due to interpersonal violence for all counties in the United States between 2009 and 2014 (the most recent data available to us). By this measure, the prevalence of violence varied dramatically across the country. The average county saw 4.9 violent killings per 100,000 residents during that period, yet among the 200 most disadvantaged counties according to our index, the figure was 12.7, more than two and a half times the average. At the other end of the spectrum, in the 200 places of greatest advantage, the rate was approximately 2 violent killings per 100,000 residents. Appalachia and the Cotton Belt are among the most violent places in the nation (for reasons this chapter will make clear; South Texas is an exception to this pattern). Clay County, Kentucky, for example, had a rate of 9.2 violent killings per 100,000 residents. That was close to the rate for Cook County, Illinois, home to Chicago (10.5). In the Mississippi Delta, Leflore’s rate was more than twice as high as Clay’s at 19.2.
Some Delta counties were even more violent by this measure.
In Clay County, some of the violence stems from the local corruption that we write about in chapter 5. Travis lounged on the couch in his Clay County trailer while his girlfriend, Helena, clad in rainbow leggings and a spaghetti-strap shirt, sat cross-legged on the floor. “A few years back, they had a couple of the high officials in the police department linked with drug dealing,” he recalled. But as anyone who peruses the Manchester Enterprise, the newspaper serving the area, can attest, violent acts are routine among ordinary citizens as well. We met with Paige over coffee in a local diner, her blond hair pulled back into a ponytail. When we asked her to describe her relationship with the father of her youngest child, she told us he had been killed, “which I think made me go into my drug addiction even more.” Stevie shared an eerily similar story. She was in her mid-twenties when we met her, but she looked younger, with a heart-shaped face and fine, dark hair that she ran her fingers through as we talked. Speaking in a soft voice, she described how the father of her first child “had another child with someone else. . . . And [the other mother] was on drugs and he was taking her to court to get custody, and she paid two other men to [murder] him. . . .
There’s still not been a trial.” Pointing to a home near his own, Travis recounted how the “two people that lived there was actually shot and killed out in front of their house.” Henrietta, whose thick black hair cascades over her shoulders, told us in a hushed, conspiratorial whisper about a neighbor accused of murdering his own parents: “They were shot and killed at the house that my daughter lives at [now]. . . . And [at the spot] where they had shot the guy outside, his cane was still there . . . and, you know, bloodstains [are] still there on the cement.” In a six-month period, from March through August 2021, we examined the weekly Manchester Enterprise for reported incidents of violence within Clay County. On March 10, a front-page story recounted how a group hunting for mushrooms in the tiny community of Red Bird, about twenty miles from Manchester, had found a skull. There was speculation that it might belong to any one of several people who had gone missing in the county over the past several years. Two weeks later, the newspaper reported that a second skull had been found in the same area by children playing in the woods.
“Indictments!” shouted a headline on April 28, referring to three persons charged with the murder of former Eastern Kentucky University football player Jeremy “Ta-Ta” Caldwell, gunned down in a parking lot in December 2020. The story noted that this was probably drug-related, as the victim had earlier been charged with possession of large quantities of methamphetamines and cash.
Almost a month later, the Enterprise reported both a murder and an attempted murder. In the small Clay County town of Oneida, two men got into an altercation. It ended with one reportedly shooting the other with a semiautomatic rifle. In the second story, a local man was charged with attempted murder for allegedly stabbing his own mother ten or eleven times and shooting her four times.
The following week’s issue of the Enterprise recounted an act of unusual candor. A woman called 911 late on a Sunday evening to admit that she had just shot her boyfriend in the hip. She was charged with attempted murder and public intoxication. July brought news of a fatal shooting outside a Clay County residence, the result of a physical altercation between two middle-aged men.
Finally, August saw two more homicides resulting from altercations between male acquaintances. The first incident was the stabbing of a man in his thirties in his home; two men were charged with the crime. The second was a fatal shooting that occurred after one man threatened to enter the home of another and do him violence. When he entered the home, armed with a gun, the two men exchanged fire and the intruder was fatally wounded.
An unnamed woman also present was shot and taken to the hospital in critical condition.
Depending on where you go in town, you might never learn that Greenwood, Mississippi, is an extremely violent place. In the part of town that lies north of the Yalobusha River, the main thoroughfare is lined with more than a thousand oaks meticulously spaced to form a cathedral arch.
The old homes along Grand Boulevard are surrounded by spacious lawns.
On the sidewalks, pets are walked, babies are pushed in strollers, and an occasional jogger passes by. This is where most of the people who run things in town live. These people worry about how to stem population decline and lure newcomers, how to attract more industry, how to further beautify the downtown. Carolyn McAdams, just weeks into her fourth term as mayor, told us her top priorities were to finish a road-paving project and to put in a dock where the river passes through the downtown so fishermen can tie up to have lunch in one of Greenwood’s upscale restaurants. When asked about the violence that was engulfing the city, she shrugged. There wasn’t much she could do about that, she told us. Her diagnosis? It’s the parenting.
McAdams’s priorities were not on the radar for people living on the predominantly Black south side of town. There is a vacant feel to this part of the city, with empty lots far outnumbering occupied ones. The homes that still stand are aged one-story shotgun shacks—many strikingly similar, with clipped-gable roofs and cramped front porches—the houses packed in so tightly they almost pile up on one another, leaving little space for ventilation or light. Whereas North Greenwood’s Grand Boulevard is so quiet you can hear the birds sing and the tree frogs emit their pulsing buzz of white noise, down in South Greenwood stereos blare from open car windows while old windowpanes shake with the vibration of the bass. Folks on this side of the river that divides the town told us that just days before our final trip to Greenwood in June 2021, a shooting by an unknown gunman had left a toddler dead and his mother and her three other children hospitalized. The family was simply driving along Champagne Street in the Rising Sun neighborhood on a Tuesday evening.
Indeed, while North Greenwood seems to slumber peacefully under those mighty oaks, South Greenwood is in the throes of a crisis that has caused many residents to withdraw into their homes, leaving the streets deserted even in the middle of the day. At regular intervals, the community finds itself mourning deeply for yet another victim of a violent crime. It is not unusual for kids to get caught in the crossfire. Because the south side is a place where everybody knows everybody, the mourning reverberates throughout the entire community. As one Black community leader we spoke with put it, “Everyone that’s [involved in] gun violence down here, you either going to know them or know their parent.” That rings true for Ebony, whom we met at the motel where she lives on the edge of downtown Greenwood, next door to a deserted bowling alley.
The place has seen better days, although Ebony’s room is brightened by Mardi Gras decorations, which she found in a dumpster, arranged artfully on the wall, giving the place a celebratory air. When she opened the door, we crowded inside the room, two of us perched on the edge of the bed while the other took a seat in Ebony’s wheelchair, parked near the entrance.
Ebony presided over our conversation from the room’s only armchair.
Ebony has lived here with her boyfriend, Lee, for several years. Nearly all the patrons of the motel are permanent residents. For most, it’s here or the homeless shelter, she explained. The place has some advantages over other lodgings: no first and last required, no credit check, no lease. Utilities and phone are included in the $650 rent. Still, that puts quite a dent in her $771 disability check. Ebony needs a wheelchair to get around, and everything she needs is low to the ground, lined up in the narrow four-foot corridor between the wall and the bed: toaster oven, hot plate, deep fryer, Crock-Pot, and mini fridge. Dishes? “Oh, I wash them in the bathroom sink, there.” When we asked about her family, Ebony leaned over to rummage in a cabinet below the TV and extracted a clear plastic file folder with a purple flap. She pulled out a photo album with a blue-and-yellow-checked cover.
Opening it, she pointed to a daughter, an aunt, a grandchild. She turned the folder upside down, and dozens of pictures spilled out onto the brocade bedspread: a kid’s school photo, a daughter in cap and gown, a relative formally posed in front of the false backdrop of a library with leather-bound volumes, a father-daughter shot. This is when Ebony, in her late forties, told us that each of the three fathers of her four children was dead, all of them shot to death, one by his “daddy-in-law.” There were other documents reading “In loving memory” and “Order of Service.” “So this looks like these are programs for funerals,” one of us said. Ebony concurred and pointed to several yellowing news clippings.
“Obituaries and stuff,” she replied. She indicated one obituary: “This is my stepdad.” She showed us another: “This is my cousin’s wife.” “That’s my aunt Sara.” “That’s my uncle Ned.” We pointed to another clipping, asking, “And how about this gentleman?” Ebony told us, “He had a shop. He used to be a drug dealer, like off the block. But he had a shop. And he died.” We noted that he looked young in the photo. “Yeah. The street got him.” Barely pausing, she moved on. “The [boyfriend] I got now, we been together about twenty-some years before his momma was murdered,” she said, pointing to another clipping. “[Her husband] was a dealer. . . . I guess a friend told [him] she was ready to leave him alone, because she go to church and stuff.
And I guess a friend told him about it, and he got in the car. . . . Killed her.
Put her in a ditch out there. Shot [her] in the head. . . . He went to prison for all of it. . . . But he died over there.” Then Ebony related the story of a friend “who got killed just about a year ago. Last year, got shot over there at a little house, gambling. . . . Mr.
Harold, we called him. He was like a son because he run with my children.
He slept in the same bed, eat the same food, he come over when his mom gone or dad gone and come and be my kid. . . . But this child had growed up. He was a little dealer out there in the streets, they say. . . . Next thing you know, he was shot. . . . He drove himself to the hospital, and he died.
It’s like a little small Chicago now [in Greenwood]. Just that bad.” She shifted her focus. “I’m going to find my baby looking like a convict,” she said.
Finally, after riffling through several newspaper clippings showing mug shots, she located one headlined “Man Charged in Fatal Killing.” “Yeah, [he was only] twenty-seven. . . . Says [my son] has been charged with murder and the shooting death of a thirty-five-year-old auto mechanic.
[That man] used to come stay here with us and eat and everything. I couldn’t understand it, Lord! But he said he didn’t do it. Whoever did, they shot the man’s face off, Lord! It’s so sad.” In 2019 and 2020, sociologist Ryan Parsons and other members of our team held extensive one-on-one conversations with Ebony and thirty-four other lower-income residents of Greenwood, nearly all living on the south side. Though we were prepared to engage in wide-ranging exchanges about their life experiences and views of the community, we hadn’t planned to ask them directly about violence. Nonetheless, it kept coming up. When asked what they believed was Greenwood’s biggest challenge, every one of the Black residents we interviewed put violence at the top of the list.
Sandy, fifty-eight, described how her son usually walked to and from work at a fast-food restaurant. He was getting a ride home from a coworker one night, she said, “and as he was getting out the guy’s truck, somebody shot all in the truck.” Her boy was hit three times. “I was at Jackson [hospital] with him for four weeks until they was able to get him back on crutches and stuff because he had to learn to walk all over again.” Jarvis, twenty-one years old, lives with his uncle and works odd jobs.
He recalled two close calls in 2017; both times he was shot at in the middle of the day. “I’m coming across Zhen Market . . . and I got shot at. [Just two days later] I was going to visit my sister on Avenue I. Somebody parked across the street, and they was shooting at me. Luckily, they missed.” Rena, forty-nine, is no stranger to violence. She grew up in Compton, California, in the late 1970s, when her neighborhood was “gang central.” She told us she witnessed five people die on the streets when she was young. Yet on a recent evening, she said, this was the scene at the motel where she lives: “Everybody was in here on the ground because somebody was out there talking about shooting each other, and they were pretty loud. . . . And I was like, ‘This is crazy. Who would’ve thought I’d be in here on the bathroom floor like I’m in Los Angeles?’” When Sandy first moved with her husband to Greenwood’s McLaurin neighborhood, she was thrilled by the proximity to a park, not realizing its notorious reputation for drugs and violence. “I was like, ‘Oh my God, there’s a park right here that my kids can go to. . . .’ And then . . . I started hearing gunshots going across the park. You got to run [to] try to get your kid off the park. I can remember being in the middle of the park on my knees praying. [The neighbors are] like, ‘Oh, she crazy, she stupid.’ [No,] you need to be praying, too, because they [are] shooting past your house as well!” These accounts were underlined by reports in the local daily, the Greenwood Commonwealth, which we tracked for two years. The first homicide of 2019, on January 15, was a fatal shooting of a man at the Curtis Moore Apartments, a low-income housing complex on the edge of town. A second man was shot in the same apartment complex that day in a separate incident, though he survived. 
Most shootings occurred on public streets or in commercial areas with innocent bystanders present: at the Williams Landing Apartments, on George White Circle; at the W. J. Bishop Apartments; at the Greenwood Gin; at the Wash Time Car Wash. Four people were shot at 5:30 P.M. on August 12, right behind a small grocery store in the south side’s Baptist Town neighborhood. Gun violence continued into 2020. Ten homicides were recorded in the first five months of the year in this city of 13,000 residents. Kenton Johnson was the first victim, killed on the corner of Broad Street and Avenue I in the middle of the afternoon, just as our researcher Ryan Parsons was conducting an interview a few hundred feet away. The second homicide occurred on Elzy Road at 3 P.M. on February 25. Three men were shot around 1:20 P.M. on March 1 in front of the Shell gas station on Highway 82. One of the year’s only non-gun-related homicides took place on March 28 when Labrandon Baugh was beaten with a baseball bat on the corner of Main and McGhee. He was taken to Greenwood Leflore Hospital, pronounced brain-dead, and removed from life support. The suspected perpetrator in the killing had participated in local anti-violence rallies and had written letters to the editor of the Greenwood Commonwealth, urging fellow citizens to take a more active stance against violence.
June 2020 began with relative calm. Then, on the tenth, police were called to the intersection of Avenue H and McLaurin around 9 P.M., where they found two victims with multiple gunshot wounds. A half hour later, another call came in; shots had been fired at the Curtis Moore Apartments along Martin Luther King Drive. Two more gunshot victims were taken to the hospital. On June 16, a twenty-one-year-old male was shot outside the Curtis Moore Apartments around 9:30 P.M. He was pronounced dead at the hospital an hour later. A second victim was transported to a hospital in Jackson, about an hour and a half away. The following day, Valdemir Beverly, a twenty-two-year-old veteran who had served in Afghanistan and had only recently returned to town to take over a family business, was gunned down on Jackson Street. The shooting was a case of mistaken identity. Beverly died at the hospital around 12:40 P.M. On June 24, the Commonwealth reported that in the past two weeks, nine shooting incidents had occurred in Greenwood and the area just beyond the city limits. The following day, the newspaper announced another. The victim was a thirty-eight-year-old from a nearby county who had been visiting friends.
For those who live on Greenwood’s predominantly Black south side, the fear of pervasive gun violence is palpable. Brent, age twenty-three, who works several part-time jobs, told us, “Greenwood’s got too much violence. . . . Just like in Memphis!” A week after Greenwood’s first 2020 homicide, on January 23, the city finally took action, announcing it would install surveillance cameras around the south side. “This is the only way to get some [evidence] of the shootings and crimes in the city,” explained an exasperated Mayor McAdams. In March, the Leflore County Board of Supervisors voted to allocate an extra $200,000 to the sheriff’s office for additional staff and resources to aid in the investigation of gun crimes. At that meeting, board president Robert Collins exclaimed, “You can’t even drive down the street without a bullet hole being put in your car. It’s like World War II—shooting people, people laying out in the middle of the road.” In the wake of the many shootings, Lavoris Weathers, a six-foot-seven former all-state basketball player for Greenwood High, decided to take a stand. In June 2020, he organized Operation Peace Treaty, which sponsored a series of barbecues in several of the south side neighborhoods most affected by the shootings: at the Robert Moore recreation building down the road from the Boys & Girls Club; at the Greenwood Mentoring Group’s building on Avenue G, near downtown; at the W. J. Bishop Apartments, a low-income housing complex whose light poles have no lights but plenty of surveillance cameras; at the M. A. Snowden Jones Apartments on the southern edge of town; at Broad Street Park (“Stokely Carmichael gave his Black Power speech in this park,” Weathers pointed out); at a small park at the north end of Avenue A in the Baptist Town section; and at a car dealership near the intersection of Main and MLK. 
This last location was just steps from the site of a mass shooting that would occur a few months after the barbecue held there. In late October 2020, at a family gathering following a grandmother’s funeral, gunfire left two visitors from Chicago dead and eight local residents wounded.
At each Operation Peace Treaty event, attendees were asked to fill out a short survey. The 527 surveys that were completed and tallied corroborated our own findings: nearly two-thirds of the respondents claimed gun violence was what they were most afraid of in their community, while only six respondents cited COVID-19. In open-ended answers, people noted that gun violence in Greenwood was arbitrary and that there was little one could do to avoid it. One five-year-old at the event complained that he was not allowed to come to the park to play because of all the shooting. Another community organizer active with Operation Peace Treaty, Shun Pearson, described how he spent summer evenings sitting in his yard listening for gunshots. Eugene, fifty-seven, had served for twenty years in the Air Force.
He was retired and living on his military pension in the McLaurin section, one of the town’s most beleaguered neighborhoods. He elaborated on this theme when we spoke with him: “You don’t want to be in that part of town.
You scared. You don’t know what they’ve done to somebody else. And you don’t know who’s going to come through there shooting and you get hit.” Fredrick, thirty-four, was living in the same hotel as Ebony but was about to move to Texas with his girlfriend, in part because of the pervasive violence. As he deep-fried a pig’s tail, he told our team, “That’s why I don’t really associate with nobody, because you don’t know who to associate with. Everybody is beefing with everybody. So there’s no need to put myself in that, in their way.” As we talked in her motel room, Ebony declared, “I don’t go to [church] picnics and stuff because, to me, these outings and stuff [are dangerous]. When you get around a lot of folks, it’s going to be trouble.” Evangeline, a fifty-one-year-old on disability because of lupus, told us much the same thing: “I don’t participate in a lot of things in Greenwood. . . . I don’t participate because I’m scared of the violence. A lot of things they have downtown, the Christmas parades and all that, I don’t participate because I’m scared somebody might get to shooting or something might happen. When something is going on outside and the public is invited, I do not participate.” What were the reasons people gave for the prevalence of gun violence in Leflore County? Several pointed to interpersonal disputes. Brent declared, “People get angry over dumb stuff. People say something about your mama, and they get mad over it. I be seeing people talking about people, messing with people, every day. Ready to kill them when people get mad at them.” Another resident put it this way: “It’s just . . . we at the club, I might get into [it] with you. You from [the] Bishop [housing complex], I’m from [the McLaurin neighborhood]. Now here we go. And now we shooting at each other. Three, four times out of the week. . . . It’s just about nothing. 
Because I don’t like you and you don’t like me.” But as he collected surveys one Saturday afternoon, Lavoris Weathers insisted it is not about the beefs or the drugs. There just isn’t enough money in drugs to spark the level of violence seen in the community, he said, which he characterized as “worse than Chicago in the eighties.” No, he explained, the cause of the violence is “mostly hunger”—hunger being a metaphor for lack of opportunity. The violence felt in these communities today didn’t appear out of thin air. It is baked into the histories of these places. A growing body of research has directly linked a history of violence to violence in the present day.
Criminologist Steven Messner and colleagues have shown that in southern states, the frequency of lynchings is strongly associated with current rates of homicide, a finding that “underscore[s] the relevance of the historical context.” Legal scholars Nick Peterson and Geoff Ward have found that violent opposition to the civil rights movement is related both to the lynchings that preceded it and to elevated homicide rates in subsequent years. Jhacova Williams and Carl Romer of the Economic Policy Institute also have found that counties where lynchings were more common (after accounting for population size) have higher rates of police officer–involved shootings today.
We have seen that sociologist Charles S. Johnson and anti-lynching crusader Ida B. Wells argued that lynchings were public spectacles used by whites—including prominent citizens and elected officials—to send a message to the Black community at large about the consequences of challenging the racial hierarchy. Both held that the primary objective was to keep the balance of economic power in whites’ favor and, to use our terms, to shore up the internal colonies they had created. Their efforts were met with remarkable success: they managed to block the economic mobility of their Black neighbors for generations. Through the years, white elites used myriad other means to keep Black citizens down, as the other chapters in this book reveal.
Today, much as with lynchings, the fallout from police shootings is not limited to the people directly involved. These shootings send a message, intentional or not, to the entire community about how authorities view the value of Black Americans’ lives. Using data from a survey of more than 100,000 Black respondents, public health researcher David Williams and colleagues demonstrated that Blacks who lived in a state with a recent police killing faced significantly more mental health challenges than those who did not. There are echoes of Charles S. Johnson and John Dollard in these results.
Research has shown that a child’s exposure to community violence is linked to long-term trauma, delayed cognitive development, and problems at school. For example, sociologists Patrick Sharkey and Gerard Torrats-Espinosa have demonstrated that high rates of violence in a place limit the chances that poor youths growing up there can rise to the middle class. In essence, growing up with violence sharply constrains a child’s life chances.
Thanks to Sharkey and Torrats-Espinosa’s research, we can say with relative certainty that violence within a community hinders intergenerational social mobility—the chance that a child who grows up poor can rise to the middle class or beyond. But what if the opposite is also true? What if blocked mobility within a place incites violence? This is what John Dollard argued a century ago, and what Greenwood native Lavoris Weathers is asserting with his “hunger hypothesis”—that the high rate of violence in Leflore County is a response to blocked opportunity. The sense that it is nearly impossible to get ahead, plus the lack of ability to defend oneself against white reprisals, means that the anger boiling up in the Black community has no outlet except against those who are similarly oppressed.
That the white authorities don’t care only further stokes the anger.
Turning to big data, we decided to test Weathers’s hunger hypothesis.
Our team, led by researcher Liv Mann, examined whether simply growing up in a place where social mobility is hindered may itself spark higher rates of violence in that community. Consistent with Weathers’s theory, nowhere in the nation is intergenerational mobility less common than in the Cotton Belt. Of the one hundred places with the lowest rates of mobility in the United States, more than half are in this region. Leflore County, Mississippi, has the ninth-lowest rate of social mobility of the 3,600 counties and large cities across the nation.
Drawing on mobility data gleaned from IRS records, plus the administrative data from a variety of sources, Mann examined the impact of rates of social mobility among kids born in the 1980s on the violent killing of one person by another when this cohort reached young adulthood. She compared the strength of this relationship to other, more common predictors of violence in a community. She found that a low probability that a poor child growing up in a given community could rise to the middle class was a powerful predictor of a high rate of violence in that community, a relationship that was more important than the number of police officers on the streets or the level of inequality in the community, and one that rivaled the community’s poverty rate.
South Texas—which earned its place on our Index of Deep Disadvantage mostly due to the extraordinarily high rates of poverty there—has intergenerational mobility rates that actually exceed the national average. Even in Zavala and Brooks Counties, the rates are just below the national average. Perhaps due to the Anglo flight that occurred in the late 1960s and 1970s, these areas have been less successful in keeping the “have-nots” down. Nearly all the middle-class community leaders we spoke with there described childhoods as migrant laborers, yet many had earned advanced degrees. South Texas is the exception that proves the rule: its low rate of violence may well reflect the relatively high rates of mobility there.
Mann’s results lend credence to Weathers’s theory that a lack of opportunity for intergenerational mobility in a place will itself spark violence. But, as Sharkey and Torrats-Espinosa have shown, violence can then hinder mobility. Taken together, these dynamics can trap communities in a structural cycle of violence, with impacts that cascade through time.
Current estimates, compiled by the Equal Justice Initiative, of the number of lynchings in Leflore County, Mississippi, extend only to 1950. They thus fail to include what is most certainly Leflore County’s most notorious lynching. In 1955, fourteen-year-old Chicago native Emmett Till was seized from a relative’s home outside Greenwood, beaten with a tire iron, shot, tied to a cotton gin fan with barbed wire, and then drowned in the Tallahatchie River, all for the alleged crime of whistling at a white woman. Dr. Joyce Ladner, a Black sociologist who would eventually become a professor and interim president of Howard University, was a twelve-year-old girl living near Hattiesburg, Mississippi, at the time. In a 1987 magazine article, she explained that she was part of what she called the “Emmett Till Generation”—the group of young Black Americans growing up in the 1950s who bore indelible scars as a result of his murder. She recalled, “I had a scrapbook and I used to clip these articles from the local paper and from magazines. And I had a friend. . . . I used to go up to her house and we would talk about Emmett Till. We would lie on the floor and look at these pictures and cry. I would feel absolutely powerless.” A historical marker installed in 2008 memorializes the site where Till’s body was recovered from the Tallahatchie River. The marker has been repeatedly vandalized—shot through with hundreds of bullets—replaced, and then vandalized again. In 2019, the marker was removed after an image surfaced of three white University of Mississippi fraternity brothers posing with guns next to the bullet-riddled sign. This time, it was replaced with a bulletproof marker and equipped with a webcam. The second of the four signs that preceded it is now on exhibit at the Smithsonian’s National Museum of American History. It features 317 bullet holes.
For Ladner, agony over Till eventually turned from despair to activism.
She became a volunteer for the Student Nonviolent Coordinating Committee. “I can name you ten SNCC workers who saw that picture [of Emmett Till’s disfigured body] in Jet magazine, who remember it as the key thing about their youth, that was emblazoned in their mind,” she recalled.
Another member of the Emmett Till generation, Ida Mae Holland, was growing up in poverty in the Gee Pee section of Greenwood when Till was murdered. In her memoir, From the Mississippi Delta, she writes, “After Emmett Till’s death, I began to see, even if I did not understand it all, that we black folks had to be careful around whites, and mind never to get out of our ‘place’ around them.” There were further scars. Holland recounts the story, passed down from generation to generation in the Black part of town, about the horror of an event that had occurred roughly ninety years before.
“In my hometown . . . they celebrated as a holiday the Leflore County Massacre of 1889, in which . . . blacks seeking political rights had been slaughtered by white posses. Here, in this paradise lost, black people could take nothing for granted—not life, not liberty, not ‘the pursuit of happiness.’” Like Ladner, Holland, who was raped by her white employer at the age of eleven, would be transformed by her work with SNCC. There, she rubbed shoulders with Ladner, Bob Moses, Fannie Lou Hamer, Harry Belafonte, and Martin Luther King Jr. She helped organize Freedom Summer, then traveled to Atlantic City as a representative of the Mississippi Freedom Democratic Party at the Democratic National Convention in 1964.
The MFDP demanded that they, instead of the official Mississippi delegation, be seated, due to the lack of voting rights for the state’s Black citizens. Holland, who later added “Endesha” to her name, toured the country advocating for civil rights and went on to become a prizewinning playwright and professor of African American studies at the University of Southern California.
Resistance in the face of repression is not the exception. It is emblematic of each of the regions where the deepest disadvantage in our nation is felt. But resistance has been costly, as we will show in the coming chapters.


5
Little Kingdoms
“MISSISSIPPI IS REJECTING Nearly All of the Poor People Who Apply for Welfare and the State Won’t Explain Why,” read the headline of an April 2017 article on the website of the progressive Washington, DC, think tank ThinkProgress. The analysis, based on 2016 data, found that roughly 11,000 families had applied for the state’s Temporary Assistance for Needy Families (TANF) program—known colloquially as “welfare”—which offered a maximum of $170 per month to a family of three, by far the lowest benefit of any state.
Of that number, only 167 applications were approved—a rejection rate of over 98 percent.
This anomaly caught the eye of veteran reporter Jimmie E. Gates, who was covering the government beat for the Clarion-Ledger, a statewide daily published in Jackson. He sought verification from the Mississippi Department of Human Services. Paul Nelson, the MDHS spokesperson who confirmed the numbers, told Gates, “There are many reasons . . . an application should be denied.” But state representative Jarvis Dortch wasn’t buying it: “The question is we have a lot of poor people in Mississippi so why are there just under two percent being approved for TANF?” Gates’s story piqued the curiosity of a newbie on the Clarion-Ledger’s staff, investigative journalist Anna Wolfe. She embarked on a quest that would take her far beyond where she might have imagined at the start.
“Over the next two or three years, I filed dozens of public requests . . .
fought the agency while they put up big walls . . . lots of secrecy. . . . They would send me these very meager, vague reports . . . about how they were spending the money. And it didn’t show which entities were receiving funds or what they were doing with the money. And so, it was just a constant fight for years.” Ironically, Wolfe never unearthed the drivers of Mississippi’s 98 percent welfare rejection rate; that remains a mystery. But she did find out where many of the state’s welfare dollars were going. Her queries revealed a sprawling public embezzlement scheme, the largest known in state history, one that ensnared a family of all-star wrestlers, several retired professional football players, and a state university’s athletic foundation, among others.
It all started when Greenwood native Nancy New, founder of the nonprofit Mississippi Community Education Center and a darling of Mississippi’s then governor Phil Bryant and first lady Deborah Bryant, conspired with the director of MDHS, John Davis, to embezzle or redirect millions of dollars from the TANF program. They allegedly misspent tens of millions more of funds meant to aid the poorest Mississippians—$77 million according to state auditor Shad White and independent forensic auditors.
Some of the money went into the bank accounts of celebrity athletes who hailed from the state, such as former Los Angeles Rams running back Marcus Dupree, who ran a nonprofit with the mission of providing “equestrian activities for underprivileged children.” Nancy New’s nonprofit signed a six-year lease agreement for Dupree’s newly purchased fifteen-acre ranch, which included the five-bedroom home where Dupree resided.
We can find no evidence that any robust programming actually took place, although Dupree continues to deny any wrongdoing.
Meanwhile, retired World Wrestling Entertainment star Brett DiBiase, son of the all-star wrestler Ted DiBiase, “the Million Dollar Man”—whose memoir is ironically titled Every Man Has His Price—contracted with the state to provide drug addiction classes for an up-front fee of $48,000.
Instead, he ended up in rehab himself, clocking four months in a luxury facility in Malibu, California. According to a forensic audit, this treatment was paid for with TANF funds—at a cost of $160,000. TANF dollars were also flowing into the coffers of Ted DiBiase, who ran a Christian organization called Heart of David Ministry with his other son, Ted Jr. (also a retired celebrity wrestler). State auditor Shad White ordered the ministry to repay $722,299 to the State of Mississippi, while Ted Jr. must reimburse the state $3.9 million. But what really grabbed the headlines was a scathing 2020 state audit alleging that NFL Hall of Famer Brett Favre, the quarterback who led the Green Bay Packers to a Super Bowl victory in 1997, was paid $1.1 million by New’s nonprofit for speaking events. According to the state auditor, no record of such events exists. Favre vehemently denies the state’s claim.
Subsequently, some of Favre’s text messages were released to the public, including one where he asked Nancy New, whose nonprofit was the conduit for the money, whether there was any way the media could become aware of the payment to him. She was quick to offer assurances that they could not. Favre later informed New—in a message punctuated by emojis— of his excitement concerning the money’s arrival. Favre has since paid back the money he received—but not the interest that the state says he owes.
Welfare money was also funneled—again via New’s nonprofit—into a new volleyball stadium at Favre’s alma mater, the University of Southern Mississippi, where his daughter was on the volleyball team at the time.
New’s nonprofit sent $5 million in TANF dollars to the project with the thin justification that the facility would host “activities that benefit the area’s underserved population.” The single such event we are aware of is a Healthy Teens rally held at the Coliseum in October 2018. Money was also invested in a pharmaceutical start-up Favre was involved in called Prevacus.
How was all of this possible? Since a significant reform to the nation’s welfare program in 1996, funding for states’ welfare programs has come largely from a federal block grant. This means the federal government sends a set amount to the states based on how many families got cash aid in the early 1990s, no matter how many families they serve now. To spend the funds to help needy families, the states must navigate myriad rules and reporting requirements. But to use the money for other purposes, they need only justify that the expense is relevant to one of the core purposes of the program: to increase self-sufficiency, curb nonmarital births, or promote marriage. These criteria leave a lot of wiggle room, to say the least.
Mississippi, a state that ranks among the most corrupt by any measure, took that wiggle room to the extreme.
Not surprisingly, the state’s welfare rolls have fallen dramatically in the years since 1996. While nearly 34,000 parents and children received benefits in 2000, that number plummeted to 11,387 in 2016, then dropped again to 6,125 by 2019, covering only a tiny fraction of the nearly 200,000 children living in poverty that year. Mississippi had the highest official child poverty rate in the nation in 2019, at 28 percent. That year, only 5 percent of the state’s welfare dollars were being spent on cash assistance for those needy children and their caregivers. Meanwhile, MDHS director John Davis and Nancy New were traveling the country garnering attention for how, under their leadership, the state had embarked on a “multigenerational, collaborative approach which addresses barriers associated with poverty.” In fact, Davis essentially privatized the state welfare program in 2017 when he funneled tens of millions of TANF dollars into two nonprofits, including the one run by Nancy New. In the words of Greenwood Commonwealth editor Tim Kalich, “One person at the Department of Human Services, John Davis, was given nearly limitless authority to decide how tens of millions of dollars in [TANF] grant money was awarded. If that one person was either incompetent or corrupt, not only would massive amounts of taxpayer money be squandered, but the poor would be worse off. Welfare . . . executed this way, would increase poverty, not relieve it.” In 2020, Nancy New and her son Zach New, welfare commissioner Davis, and three other codefendants were charged with multiple crimes, including conspiracy to embezzle, embezzlement, making fraudulent statements, and conspiracy to commit mail fraud. 
Then in 2021, new charges were brought against both News at the federal level, this time for defrauding the Mississippi Department of Education out of $4 million in public education dollars for students not actually enrolled in New’s New Summit schools—which received state dollars for serving children with disabilities—and for teachers not employed at these schools (including New herself). Mother and son were charged with conspiracy to commit wire fraud, money laundering, and conspiracy, among other charges. They faced sentences of up to 218 years in prison and $5 million in fines each. It was not until the spring of 2022 that Nancy and Zach New pleaded guilty in both state and federal court, and agreed to testify against Davis and the other codefendants, potentially including others who have not yet been charged. Their guilty pleas saved them from the prospect of spending any time in the notorious Parchman Farm or another of the state’s barbaric penal institutions.
The celebrity “welfare recipients” involved have been ordered to pay restitution despite claims they have done nothing wrong and didn’t know where the money originated. Nonetheless, Brett Favre insisted he “would never do anything to take away from the children I have fought to help!” Anna Wolfe questioned that claim in a conversation with former US congressman Ronnie Shows on his radio program: “So where did they think the money was coming from . . . ? [It was from] a nonprofit, so how do you get a million dollars from a nonprofit? I mean, nonprofits are charities.” When asked by Shows to reflect on all that her reporting had revealed, she said, “It’s kind of mind boggling to me . . . that the folks who rail against these social safety net programs . . . are often the same people who are making their careers at the government trough, privatizing these programs and getting cushy jobs for their [friends].”
Corruption is a common theme across America’s internal colonies and is a key mechanism through which the sins of the past continue to wreak havoc on these places today.
In the Brooks County seat of Falfurrias, in South Texas, Mayor Pro Tem Letty Garza became embroiled in an illegal gambling scheme in the 2010s, landing her and her coconspirators in federal prison. Illegal casinos like the ones she abetted are all too common in the communities we visited in the region. Over in Zavala County in 2016, the mayor, the city manager, and three current or former members of Crystal City’s city council were charged as part of a conspiracy and bribery scheme. Another council member had previously been indicted on human trafficking charges. That left only one councilor to run the town of 7,300. News coverage implicated not only the city but also the region: “Why is Public Corruption So Common in South Texas?” A Washington Post headline proclaimed, hyperbolically, “This Might Be the Most Corrupt Little Town in America.” Meanwhile, in September 2009, over in the Pee Dee region of South Carolina, a story in Charleston’s Post and Courier heralded the good news: “A county with one of the highest unemployment rates in South Carolina is getting a jobs boost.” According to a news release from the South Carolina Department of Commerce, Georgia-based Softee Supreme Diaper Corporation was investing $6 million in Marion County and bringing in 262 jobs. Several years later, another headline revealed the bad news: “Foul Smell for Taxpayers in S.C. Diaper Plant Project.” In 2015, plant owner Jonathan Pinson, a former South Carolina State University board chair, was sentenced to five years in prison on multiple counts of embezzlement, bribery, kickbacks, and filing false papers. Later on, some of these convictions would be vacated while others would stand, and his sentence was reduced to four years. 
According to the indictment, Pinson and others “devised a plan to submit falsified invoices to Marion County for engineering services supposedly provided to the diaper plant, illegally billing Marion County at grossly inflated rates for work which was not always completed.” In the interim, there had been more bad news. On October 18, 2011, the Florence, South Carolina–based news source SCNow (“The Voice of the Pee Dee”) reported that “Former Marion School District 2 Superintendent Dr. Nathaniel Miller will spend six years in prison for theft of more than $500,000 in public funds.” But of all of the regions examined in this book, it is in eastern Kentucky where the corrosive effects of corruption are most keenly felt. As we will show, for generations the region’s penchant for rampant political corruption has been a millstone around its neck, stunting economic growth and thus miring the region in disadvantage so deep that it has persisted even in the face of substantial federal aid meted out during the War on Poverty and beyond.
Nowhere is this tale of Appalachian corruption more lurid than in Clay County, which, along with Kentucky’s other counties, functioned as a “semi-autonomous state” from its inception—a “little kingdom,” to use historian Robert Ireland’s colorful description, plagued by a “tradition of parochialism, corruption, and inefficiency” and a “prevailing lawlessness.” When we first spoke with Carmen Lewis, former one-term mayor of the Clay County seat, Manchester, in the summer of 2019, she pointed to the area’s poor showing five years earlier in a New York Times analysis, which had named Clay the “hardest place in America to live.” The Times averaged every US county’s rank on a grab bag of indicators: education, income, unemployment, disability, life expectancy, and obesity. Clay County topped the list. In this community, it all boiled down to a single dynamic that had been centuries in the making. Since as far back as the early 1800s, Clay’s citizens had been held captive by a corrupt political elite they kept voting into office.

TWO CENTURIES OF CORRUPTION IN CLAY COUNTY
It began with a vital commodity, salt, leading to the first capitalist enterprise in central Appalachia. Recall that salt, mined from the brine along the banks of the county’s Goose Creek starting around the turn of the nineteenth century, was the commodity first linking Clay County to regional markets.
For a half century, its role in this community (and several others across central Appalachia) was akin to that of cotton in the Deep South. The salt industry, as with cotton in the Delta, laid the foundation for the political and social order still at play in Clay.
From the colonial era, salt was in constant demand, tying early settlers to the coast and limiting westward expansion and development. One cannot overstate the significance of salt, which played a key role in the development of roads, the improvement of river transport, the commercialization of the livestock industry, and urbanization. Furthermore, “salt was the one commodity readily converted into cash and the one commodity readily accepted in barter. Participation in the salt trade, therefore, proved requisite [for trade in that era],” writes historian John Jakle.
In Kentucky’s salt industry, Clay County quickly took center stage. As early as 1810, four salt capitalists there were generating 7,000 bushels of salt each year, a fourth of the state’s entire output. That figure would skyrocket to nearly a quarter of a million bushels at the industry’s peak in the 1840s. Not only did salt link antebellum Clay County to markets beyond its borders, but it also led to the importation of hundreds of enslaved Americans and enabled the rise of a local elite that would come to exercise control of the region from its founding, in 1807, to the present day.
Abner Baker was one such member of the local elite. The younger son of a Virginia slaveholder with a large estate, he left Virginia for Kentucky to seek his fortune in 1793. After a decade and a half in the Bluegrass region, Baker found a foothold in the newly formed county of Clay when he was appointed clerk of the county and circuit courts. A cousin would soon begin to mine salt along Goose Creek, and Baker would eventually join him. He was perhaps the first to discover the key to control of the region: the twin pursuit of both political and economic power. In this way, the Baker family soon rose to the top of Clay’s highly stratified social order. The Road to Poverty, sociologists Dwight Billings and Kathleen Blee’s lucid analysis of the roots of long-standing poverty in the central Appalachian region, takes Clay County as its case study, and the story as relayed here is deeply indebted to their work. They note that during even the very earliest days of its development, Clay was vastly unequal, due to the great inequality not only between enslaved people and their owners but also between capitalists and subsistence farmers. As John Campbell, one of the earliest chroniclers of the history of the region, observes, the success of the early saltworks contributed to the “picture of wealthy landlords . . .
living on baronial mountain estates in almost feudal fashion, surrounded by slaves and retainers.” The most important of these backcountry elites other than the Bakers were the Whites and the Garrards. James White, a wealthy Virginian, first learned about the opportunities for salt mining in Clay while serving in the Revolutionary War. Just after the turn of the century, he and his brother Hugh, along with the scions of other rich slave owners, acquired land and built up the salt industry on the county’s riverbeds. Hugh made the county his home and would become one of its most powerful men.
Like Abner Baker, Hugh White built power not only in the economic but also in the political sphere through a series of public offices he and other members of his family held. White’s descendants would serve in pretty much every local political post over the next two centuries, as well as in the state legislature and the halls of Congress. Hugh White’s son John was speaker of the US House during the Twenty-Seventh Congress.
Meanwhile, the Whites would mine salt using the labor of enslaved people and amass vast tracts of land—roughly 20,000 acres by 1860.
For James Garrard, who became Kentucky’s second governor, the story was much the same. Hailing from the Bluegrass region of the state, Garrard sent his son Daniel to Clay County in 1806 to establish a salt mine. The Garrards also secured more than 45,000 acres of land for farming. For Daniel, involvement in local politics quickly followed. He and his sons James and “T.T.” would also rise to the highest level of the power structure in Clay. Together, the “Whites and the Garrards, along with a few other families, thus established economic and political dynasties in Clay County based on enslaved labor, salt manufacturing, commerce, and large-scale farming that persisted throughout the antebellum and early postbellum periods and, in some cases, even into the modern era.” Before the Civil War, these salt capitalists had enormous power. As highway commissioners, they could direct the building of roads in ways that linked their mines to regional transportation. They could compel locals to undertake road maintenance and improvements. Records show they reimbursed themselves for making the river improvements needed to transport salt—at local taxpayers’ expense. They had the authority to appoint patrols to monitor the activities of the people they enslaved. As justices, they were charged with administering relief and widows’ pensions, but they also had the standing to remove children from the homes of poor families, binding them as apprentices to the wealthy, not an uncommon practice at the time. Across central Appalachia, as scholar Mary Beth Pudup documents, “the intersection of professional occupation, political office, county-seat residence, and property ownership—all within the compass of a certain few families—was the rule, not the exception.” Ostensibly, it was the murder of salt miner Daniel Bates in 1844 by Abner Baker’s son Dr. Abner Baker Jr. that precipitated one of the more infamous Kentucky mountain feuds: the war between the Garrards, who believed Baker Jr. 
was insane and thus could not be executed by law, and the Whites, who wanted to see him hanged. Baker Jr. had accused his wife—James White’s daughter Susan—of adulterous affairs with several men, including Bates. But as many Appalachian scholars have argued, this feud, like so many others in the eastern Kentucky mountains, stemmed less from the revenge motive of a husband who believed himself a cuckold than from competition between economic elites. Ultimately, the Whites won, and Baker was hanged. According to historian Altina Waller, who has written the definitive text on the topic, “Conflicts identified as feuds or vendettas were conflicts between gentlemen of property and standing.” The consequences of the White-Garrard feud for Clay County were both formative and tragic, for the conflict deformed the local government and public spheres, and Clay’s society was altered to reflect the rivalries among these warring factions.
During the mid-1800s, the salt industry began to collapse in the face of competition from other regions where the commodity was easier to extract and bring to market. During the Civil War, Goose Creek’s saltworks were destroyed by the Union army for fear they would be seized by the Confederates. This dealt the industry a death blow. After the war, “the county slipped into an era of deepening economic and geographical isolation that lasted more than a generation.” At the same time, subsistence farming was becoming an unviable way of life. Between 1875 and 1900, the population grew because of the extraordinarily high fertility rates among farm families, which required subdividing the land. Thus, food production per capita fell sharply. Harry M. Caudill quipped, “The stork outran the grubbing hoe and plow.” This not only reduced living standards but, in the words of historian Paul Salstrom, softened residents up “for later industrial exploitation.” Meanwhile, little changed in the trench warfare between the Whites and the Garrards. Billings and Blee recount the story of a traveling preacher who arrived in Clay County in 1897 hoping to win souls. While his message of salvation fell on deaf ears, he was able to record an outbreak of violence that coincided with his arrival: two killings along the banks of a local creek, an ambush-murder-arson, a gun battle, an assault, a murder by a mob of “Ku Klux,” plus “quarrels, brawls and pistol drawings . . . too tedious to mention. . . . As I sat in my room in the second floor of the Lucas Hotel, I could hear the bullets whiz through the air.” He would soon learn that the violence was far from random. It was driven in large part by the stiff competition for political power between the two rival clans. 
Billings and Blee observe that “for the first decades after the Abner Baker conflict, these wars of position took place through local elections and the offices of congressman, clerk, treasurer, and state representative oscillated between the Garrards and Whites.”
Between the late 1880s and 1920, a massive, systematic drive to clear-cut the central Appalachian Mountains by the American timber industry revived a regional economy that had floundered after the twin collapse of the salt industry and subsistence farming. In fact, by the turn of the century, the region was producing 30 percent of the hardwood in the United States.
Logging had long been a side occupation for local farmers who culled a limited number of hardwood trees each year. Now it became a full-time job as companies assembled large crews and housed them in logging camps and towns. By 1920, the “virgin forest was gone, except for the pathetic remnants of a few hundred acres.” Meanwhile, many loggers had completely abandoned their farms.
While the leading families pursued timbering and mining to a limited extent, they had neither the capital nor the credit to do so on a large scale. It was absentee capitalists rather than the local elites who would go on to become the principal industrial ownership class. Due to their hegemony over the political and social order, however, mountain elites like the Whites and Garrards were ideally positioned to benefit from connections to the corporations that invaded central Appalachia in successive waves during those years. Because these elites owned much of the land, they profited handsomely from real estate speculation and the leap in land values as corporate interest in the region’s potential mineral wealth grew. It was Clay’s elites—with their preexisting advantages in the old economy—who profited most from the arrival of the railroad in 1917. Both a Garrard and a White reaped huge profits by selling rights-of-way for the new lines.
Summing up the period, Mary Beth Pudup chronicles how, among the mountain elite, some “maintained their holdings to become coal owners and operators on their own account, others sold out and made a fortune, while others traded their land for equity positions in new coal and land firms.” Taken together, neither the timber boom nor the arrival of Big Coal did much to bridge the chasm between the tiny cadre of elites and the much larger numbers of struggling subsistence farmers and wage laborers employed in resource extraction. Rather, these industries fostered what has been called “the development of underdevelopment” in central Appalachia.
Historian Ronald Eller, author of Miners, Millhands, and Mountaineers: Industrialization of the Appalachian South, 1880–1930, writes, “By the turn of the century, the Appalachian South had become the economic colony of the urban northeast.” The patterns of political corruption and elite capture of resources from the early 1800s onward were richly on display a century and a half later, when Clay’s Republican leadership controlled the patronage in the county—especially jobs in the schools, the largest employer—while its Democratic leadership controlled state programs, such as the maintenance of local highways, another prominent source of employment. Political patronage served both to reward and threaten—with promises of jobs and threats of dismissal used as tools to maintain political power. In the early 1960s, business leaders formed the Clay County Development Association to attract new industry to Manchester. Douglas Arnett, a historian who studied the program, reported that “the local elite was willing to tolerate the work of the development association as long as the innovations . . .
[wouldn’t] threaten the social structure.” Later that decade, the War on Poverty would infuse more than $1.5 million in federal funds into the community but also would require that the county ensure what the War on Poverty architects referred to as “maximum feasible participation” among all groups, including the poor. This requirement was a “direct challenge to the local political machine and its control over patronage,” according to Billings and Blee. To meet the perceived threat, the school superintendent attempted to co-opt the efforts of the US Office of Economic Opportunity (OEO) by incorporating an entity that would be the conduit of federal funds. Arnett described how, when this failed, local officials mounted an intimidation campaign to ensure that the populace voted for the political machine’s representatives to serve on the board of the OEO’s Community Action Program (CAP). Due to persistent corruption, Clay’s CAP was the first in the nation to be defunded for self-dealing, as well as an utter failure to achieve “maximum feasible participation” of the poor to any meaningful degree. It later merged with a multicounty CAP the OEO hoped would be less vulnerable to elite exploitation. As we will show, that would not be the case.
Throughout this period, just as in other internal colonies, local officeholders relied on the denial of suffrage—in the form of vote buying and voter intimidation—to retain power. Historian Thomas Kiffmeyer described the experiences of an anti-poverty group that began its work in the region just before the launch of the War on Poverty. These volunteers witnessed myriad instances where contentious elections resulted in both discriminatory hiring and firing, with the end goal of controlling local voters. When local elected officials felt threatened, they resorted to the use of bribes of food, moonshine, and cash to bring voters around.
In 1997, graft in the now regional Community Action Program was alleged. Lexington Herald-Leader reporter Karen Samples wrote, “Strange things were going on at the Daniel Boone Development Council [the regional CAP] when state officials arrived in September, 1996, for an annual review.” Some of the poverty relief agency’s money was ending up in secret bank accounts controlled by a private company, records show. “A truckload of donated floor tile had gone to Clay County residents who weren’t poor . . . middle-class folks had been renting the council’s vans for vacation trips—even though the agency’s mission is to help the disadvantaged.” In the following decade, public corruption continued. Daugh White, Manchester’s long-serving mayor and direct descendant of the early nineteenth-century Clay County settler Hugh White, pleaded guilty to racketeering conspiracy in 2007 for demanding kickbacks from companies bidding on city contracts. The companies had been coerced into making the payments to the city manager, a position created by Daugh and filled by his son Kennon. While serving time, Daugh was also implicated in a scheme in which a drug dealer burned down a vacant home in Manchester, which allowed the city to then purchase the property to build a new police station.
As was true in antebellum Clay County, the power of local officials in eastern Kentucky is nearly absolute these days, too. They still control most of the well-paying jobs. Clint Harris, a circuit court judge, told us he was offered his job as an assistant county attorney because he was “Jack Harris’s kid.” Harris explained, “I was getting ready to . . . go take the bar exam, and [the county attorney] called me and he said, ‘I talked to my daddy, and Daddy said, Call Jack Harris’s boy. He’s in law school, so I believe he’d be a good one.’” Favoritism in hiring is most noticeable in the schools. One high school employee told us that cronyism in hiring is ubiquitous: “In this town, it’s more likely that someone will get hired because of who they are more than they can do their job well. It will always be like that because that’s your friend’s dad or son.” This is not new, as the historical record attests: “Over the course of the nineteenth century local political officeholding in the mountains gained a high premium. . . . The distribution of teaching positions in each county became one of the most closely guarded patronage preserves.” Many among our low-income interviewees know full well that a person needs to have connections—preferably family ties—to get one of the good jobs in town. Paige put it this way: “It’s all about who you know.” Jake described Manchester as a “family-owned town [where] they only care about [themselves] and their money. . . . They don’t care about everybody else. . . . The rich gets richer, the poor gets poorer. . . . Really, there’s probably four or five big names, and they run the town as far as jobs and businesses . . . they allow to come in and don’t allow to come in. . . . It’s always been like that, honestly, it really has.” But Clay’s experiences in recent years vividly illustrate how control of local offices enables public servants to facilitate any number of illicit moneymaking schemes—some of which may risk their constituents’ lives.
Most shocking has been the willingness of Clay elites to make common cause with criminals, bringing drugs and illegal painkillers into the county.
Despite years of allegations of corruption, it wasn’t until Daugh White was under federal investigation that he was finally unseated after seven terms as mayor. Daugh’s family patriarch Jennings White, who served as county clerk, pleaded guilty to laundering money for one of the largest known cocaine and marijuana drug rings in eastern Kentucky, headed by kingpin Kenneth Day, a Manchester pawnshop owner. Day also served on the county board of elections in the 1990s and held the post of Republican election commissioner, a position that allowed him to select precinct officers. These positions were instrumental in the Whites’ ability to buy their way into public office year after year. Denver Sizemore, a convicted drug dealer, testified that Jennings White gave him $25,000 to kill one of White’s political opponents—a charge White still denies. Vernon Hacker, a city councilman and director of 911 operations, who was also aligned with White, used his position to tip off dealers and facilitate the illicit sale of OxyContin and cocaine. A number of our interviewees relayed stories about police arriving to raid a drug house and finding that the last call received there was from their own police department, presumably warning about the impending raid.
By 2002, circuit judge Cletus Maricle and school superintendent Doug Adams, both of whom had daughters who had fallen prey to illicit pain pills, had had enough. They decided to wage war against the Whites. The 2002 election was a particularly violent one, as the New York Times reported: “In one of the bloodiest election seasons in more than 50 years in these fabled Kentucky hills, Sheriff Edd Jordan of Clay County is watching his back.” Violence included the “riddling of County Clerk Jennings White’s van with bullets,” although later court testimony revealed that White shot up his van himself in an attempt to gain the sympathy of voters. That election cycle also included the attempted murder of the man Denver Sizemore testified he was paid to kill but didn’t. The target was a private investigator who—whether by Sizemore or someone else—was shot six times in the back (and yet somehow survived) on his way home from Indiana where he was said to have gotten some dirt on White. Violence wasn’t limited to Clay County in that election cycle; in another nearby county, the Times reported that “two candidates in sheriff’s races have been killed,” and yet other races in the region were “punctuated with gunfire and fistfights, and there [were] widespread accusations of swapping votes for liquor, cash and even the addictive prescription pain-killer OxyContin.” After the 2002 election, the opioid crisis in the county—as well as the collusion by officeholders—only intensified. A coalition to defeat the Whites and their allies continued into the elections of 2004 and 2006.
Problem was, in their quest to unseat the corrupt White clan, crusaders Cletus Maricle, Doug Adams, and their allies relied on an age-old eastern Kentucky tactic—vote buying—perhaps believing that the ends justified the means. In 2006, newly minted Manchester mayor Carmen Lewis (who had refused the coalition’s backing and would later accuse it of extortion) and her chief of police, Jeff Culver, chose to cooperate with a federal investigation of the vote-buying scheme of Maricle, Adams, and their allies.
In the end, the prosecution relied on testimony from none other than members of the White slate, who agreed to testify in exchange for lesser sentences on their own convictions.
The story of public corruption in Clay County is a cautionary tale bringing into sharp relief how, even in one of the most deeply disadvantaged places in our nation, there are still plenty of spoils ripe for elite capture. If a place has even the basic elements of local government—schools, city and county government, and local law enforcement—blood can be squeezed from a stone. In Clay, the graft has been obvious—even flamboyant. Yet for nearly two centuries, Clay’s citizens have mostly stayed silent. As one of Mayor Carmen Lewis’s chief supporters, a former high school principal, told us, “You speak up, you lose your job. . . . Your family doesn’t eat.” Mayor Lewis and Chief of Police Culver probably risked their lives to help end the corruption so pervasive that it had stifled any prospect of economic progress. Both received death threats and lost their posts in the next election (a new chief of police was appointed by the incoming mayor).
They became, in Lewis’s words, “two of the most hated people in Clay County.” Meanwhile, corruption of this kind was occurring across central Appalachia. In 2010, when Maricle, Adams, and their codefendants were all convicted, Clay was just one of eleven eastern Kentucky counties in which officials were charged with corruption. Lewis referred to it cynically as “mountain politics.” During the sensational trials of those on the Maricle-Adams side, Lewis told us, locals minimized the crimes as “only vote buying,” which they claimed was “not that bad”—or at least not as bad as committing arson for profit, accepting kickbacks, tipping off drug dealers, and laundering money for a drug cartel while in public office, crimes for which some of the White clan and their allies had been convicted. She said ruefully, “It is as if the whole traumatized populace is still in the grip of Stockholm syndrome,” referring to the rare psychological condition where traumatized victims bond with their abusers. In line with these observations, on a trip to Greenwood, Mississippi, in June 2021, we spoke with several local leaders who mentioned Nancy New. They played down her alleged crimes and praised her schools, especially the New Summit Academy in Greenwood, which does “so much good.”
FROM TURNING A BLIND EYE TO ABSOLUTION
Much as many of Greenwood’s community leaders tend to downplay the violence on the south side while also blaming it on those who live there, a number of Clay County’s leaders seem willing to turn a blind eye to rampant public corruption while blaming the poor, or the government programs that serve them, for the community’s problems.
In his 2018 book, The Left Behind: Decline and Rage in Small-Town America, sociologist Robert Wuthnow describes the dozens of rural communities he studied as places that, while rich in favorable attributes, tend to exclude those who don’t belong, often “the poor who townspeople figure are on welfare and probably up to no good.” These include “newcomers of different ancestry who don’t quite fit in.” Paul Bowling is a fifty-four-year-old principal officer of the Clay County Cancer Coalition, a nonprofit that relies on government grants to help with the gas bills of cancer patients who must travel significant distances for treatment. Clearly an insider, he explained to us that as “friendly as people are, when you’re an outsider, you’re an outsider. . . . Our people are our people.” Another prominent scholar of rural poverty, Cynthia Duncan, refers to this state of mind in rural America as one of “good rich people” and “bad poor people.” But in America’s internal colonies, where inequality is among the highest in the nation and has been for generations, how does the moral community operate? Here, our evidence suggests, divisions between the haves and the have-nots may be reinforced to an extreme degree. This in itself may be a mechanism by which the life choices of people today are hindered—through racial, ethnic, and class cleavages that have persisted for hundreds of years. For the poor of eastern Kentucky, there is an extreme level of social exclusion that has a lot in common with that seen today in the Cotton Belt and in South Texas. For elites, the dividing line between the classes is defined as a moral—rather than a racial, ethnic, economic, or political—divide. 
This construct offers an absolution of sorts, for the moral narrative shields the elite from having to grapple with the grinding poverty in this poorest white-majority place in the nation, with the history of extraction and exploitation that has marked the lives of the have-nots for generations, and with the very real corruption that is crippling the community and perpetuating the class divides.
For the community leaders we spoke to in Clay County, the divide comes down to one central distinction: Do you work or collect a government check? Government checks, said Pastor Ken Bolin of Manchester Baptist Church, make people feel entitled. They no longer know what it feels like to work for a living, he told us. Nearly every other community leader agreed. Almost all extended their critique to child disability. Government benefits are choking the educational aspirations of poor children, they claimed, because poor kids’ parents coach them to do poorly in school so they can qualify as disabled and collect “stupid checks.” Stories about “stupid checks” were on nearly every community leader’s lips when we were conducting fieldwork in Clay. The term is perhaps an echo of a centuries-old program authorized by the Kentucky legislature in 1793 that provided those deemed “pauper idiots”—many of them children—with modest financial support.
Nicholas Kristof, reporting from neighboring Breathitt County for the New York Times in 2012, quoted a local school district official, Melanie Stevens, who said, “The greatest challenge we face as educators is how to break that dependency on government. In second grade, they have a dream. In seventh grade, they have a plan”: to draw government assistance for life.
Christy Rice, a Clay County High School guidance counselor, shares the belief that what she sees as government dependence is due to “culture.” While acknowledging the high rate of poverty in the county, the many health challenges, and the lack of well-paying jobs, she told us she doesn’t approve of those who claim government checks, which she believes have “alienated” generations from the “culture of work.” It is worthwhile to consider whether the numbers lend credence to these assertions. Among our participants, some had certainly applied for SSI (Supplemental Security Income) on behalf of their kids. Yet of the roughly 4,200 children in Clay County in 2020, only 213 were getting a check from the SSI program, a rate of about 5 percent. Only 185 kids in Breathitt County received SSI in 2019, a rate of about 6 percent. Meanwhile, local nonprofit leader Jason Bailey, writing for The Daily Yonder, a rural Appalachian news outlet, wondered in 2013 whether the school district’s woes might have a different cause: “Breathitt County’s school system was recently taken over by the state due to corruption and mismanagement at the top. The superintendent was indicted for vote-buying; he is the eleventh county leader (including the sheriff) to have been recently convicted or pled guilty to illegal activity associated with political battles among county elites.” What is really holding back children from success in school in Breathitt County? A relatively small fraction of children receiving a modest monthly check, or persistent and rampant corruption in the institutions charged with serving them?
Joe Farmer, the forty-six-year-old founder of the Axis Coffee Shop & Gathering Place in Manchester, battled both poverty and addiction in his younger years. Now a business owner, he, too, blames government assistance for Clay County’s woes, telling us that it has “killed our communities. It really has. The War on Poverty began in the Appalachians . . . and people are worse off because [they] have forfeited their rugged individualism that made this country great.” Fifty-two-year-old Jeff Culver, Carmen Lewis’s former chief of police, is now head of security at the Manchester hospital. He also is eager to condemn government checks, as he explained to us: “Help us, but don’t give it to us. . . . We’re a very proud country. When you take someone’s pride from them, you’ve stripped them of their dignity. Feed us but feed us with jobs. Feed us with hope. Feed us with a life to look forward to, to get up and go to work. To make something. To be productive in your community.” One cannot dismiss these narratives out of hand. Clay County ranks third in the nation in adult disability claims, when adjusted for population size. As of 2019, the county’s official labor force participation rate was only 37 percent, on a par with most of its eastern Kentucky neighbors but far below the national average of 63 percent. Among the working-age population, we estimate more than 20 percent receive some form of disability benefit. Furthermore, Culver’s comments are not completely at odds with our low-income interviewees’ accounts. Seven of twenty-two were receiving SSI, while another was benefiting from her husband’s SSDI (Social Security Disability Insurance). Many of them described their boredom and isolation.
None of them, however, was making as much money from the program as some local elites who have exploited it, such as Eric Conn, a lawyer specializing in disability claims in eastern Kentucky who was convicted of defrauding federal disability programs. Conn was indicted for conspiring with doctors who falsified medical records and with a Social Security Administration appeals judge who pleaded guilty to taking more than $600,000 in kickbacks in exchange for approving Conn’s cases. Conn—who cut off his ankle bracelet, fled to Honduras, was captured, and pleaded guilty to conspiracy to defraud the government and to retaliation against a witness—is now serving a twenty-seven-year prison sentence. His clients continue to face the fallout of his crimes. One of them, Bryan McCown, told a local reporter that he had “fractured his neck and back in a fall from a truck at a Pike County coal mine.” He claimed he was wholly unaware that Conn was bribing a judge on his behalf, but that didn’t keep the Social Security Administration from cutting off his disability payments after the fraud was uncovered.
While there may be some whose desire to work is sapped by the lure of a government check, it is also true that blaming government benefits for breaking the spirit of the poor is just the latest guise in a long history of elites blaming Appalachian highlanders for the region’s problems. In fact, most of the low-income people we spoke with told us they were desperate to do more than rely on a disability check. Even those with severe disabilities hoped to have a job one day so they could contribute to society.
Paige believes that the lack of jobs is the real problem: “If they had more jobs besides just restaurant work, or something like that, it would help a lot of households.” Jake concurs: “It all comes down to more jobs, more jobs for people to survive and live, because if not, man, everybody’s going to be on welfare. . . . That’s what most people turn to.” One Clay County resident admitted he had given up: “I tried working and they don’t really pay enough to actually survive down here. So I do a lot of little side hustles, I guess,” including selling his prescription Suboxone on the black market to pay his rent. Yet others remained determined to work. Stevie told us she would “walk to work in the rain, in the snow. The police picked me up a couple of times and asked me what I was doing out because it was storming.” Similarly, Helena described her determination to keep her job, which meant walking to work, even while eight months pregnant.
To claim that government assistance has led to a decline in people working also poses a “chicken and egg” problem. Which came first? A report examining the decline in workforce participation prepared by the Kentucky Chamber Foundation concludes that the causes “are many, including demographic change, poor health outcomes, substance abuse, incarceration, among many others.” “Demographic change” refers to the aging of the population: the share of adults in or approaching retirement in the region has grown relative to younger individuals in their prime working years. The report does mention government benefits, though this is not high on the list. Even then, the problem is cast as one of bad economic trade-offs faced by beneficiaries rather than moral failings. That is, even given their meager benefits, disability programs may offer a more stable and livable income than most available jobs, at least for those without connections or a college degree. Given that, is the right solution to cut off the one income source providing some stability? Or is it to increase opportunities for living-wage jobs for those able to work?
Isolated rural locations such as Clay County have low living costs, which is a plus when a disabled individual’s income ranges from $840 to $1,200 per month, depending on which disability program they qualify for.
For those with limited incomes, places like Clay are “sticky”—meaning that people can seldom afford to move away. Real differences in morbidity are also at play. Economists Anne Case and Angus Deaton, among others, have found that reports of pain are sky-high in Appalachia and several other rural regions, possibly as a result of the history of physically demanding work in these locales. Among the twenty-two low-income Clay residents we interviewed, fully three-quarters reported a disabling condition, about three times the number claiming disability benefits. In fact, their disorders were often serious and overlapping. Three had been diagnosed with cancer, three with hepatitis C (possibly related to past drug use), and two with arthritis.
Three were on medication for high blood pressure. Other conditions included seizures, epilepsy, diabetes, high cholesterol, migraines, degenerative disc disease, lupus, Raynaud’s disease, Graves’ disease, bulging discs, and fluid on the hips. Eight were being treated for one or more mental health disorders, including bipolar disorder (two), anxiety (four), depression (three), and schizophrenia (three). These underlying vulnerabilities are precisely why places like Clay County were targeted by pharmaceutical companies like Purdue for the rollout of OxyContin and other painkillers.
Each of the communities we examined for this book suffers not only from a legacy of grossly underfunded and often racially segregated schools, violence, and a collapse of the local social infrastructure but also from weak and often corrupt local government. In this way, America’s internal colonies are similar to former colonies established by foreign nations around the world, where government corruption “has deep historical roots that go all the way back to their colonial experience.” In Clay County, public corruption has emerged in an especially virulent form, arguably due to unique historical processes, including the formation of a political and economic elite that coalesced well before the Civil War.
Corruption is a seldom-recognized form of exploitation in which an elite few are allowed to live off the spoils of public office and to preserve the status quo. Under these conditions, it is nearly impossible for a community to improve. In 2011, the Lexington Herald-Leader editorialized, “How does a business know it can count on roads being maintained, public utilities providing the best service at the lowest cost, the police and courts treating all fairly, schools hiring the best educators they can find?” A later editorial in the same newspaper opined, “Economic development will be hard, if not impossible, in places governed by small-time political machines that maintain power in ways that can’t bear scrutiny.” It noted that the “State Department recently devoted an issue of ‘The Foreign Service Journal’ to the topic of corruption. . . . Former secretary of state John Kerry was quoted as saying, ‘Corruption is an opportunity destroyer because it discourages honest and accountable investment; it makes businesses more expensive to operate; it drives up the cost of public services for local taxpayers.’” What is true around the world is most certainly true in Clay County, Kentucky, and America’s other internal colonies.