American Dream


  While the state of the ghettos demanded redress, the problem of self-destructive behavior wasn’t one that liberalism was programmed to confront: it smelled too much of blaming the victim. The phrase “blaming the victim” itself stems from what might be considered the protobattle over welfare reform: the outcry over Daniel Patrick Moynihan’s 1965 report on the black family. In sounding an alarm about the percentage of black children born to single mothers, Moynihan didn’t blame welfare. He blamed “three centuries of exploitation,” from slavery to industrial unemployment. But with phrases like “tangle of pathology,” the report drew famously bitter condemnations and left most liberals reluctant to discuss the social problems of the ghetto—welfare included. If anything, they romanticized the lives of poor single mothers, turning Angie Jobes into Tom Joads. With economists in control, most poverty academics had gotten out of the business of talking to poor people altogether; tenure passed through data sets, not inner-city streets. The experts spoke a desiccated, technical language, mostly to themselves.

  Liberals further constrained their influence when they began to argue that mothers were right to stay on the rolls until they could land “good jobs”—no maid work and no Burger King. The quest for better jobs was generally a good thing, and for southern black women it had a special resonance, since they had been exploited for generations as field hands and domestics. But as it played out in the welfare debate, the good-jobs philosophy proved problematic. Substantively, it required long stays in training programs that typically proved ineffective. Politically, it made liberals look as though they had an inherently prowelfare bias. Even at the end of the Reagan era, prominent congressional Democrats were still denouncing modest work proposals as “slavefare.” The silence about self-defeating behavior, combined with the rejection of entry-level jobs, left the liberals with an increasingly cramped message: Don’t expect too much from people in the ghettos. In particular, don’t expect them to work—at least not in the kind of jobs most could actually get. A common liberal move was not to talk about poor adults at all, but instead shift the locus to children, who were innocent. The leading advocacy group of the eighties was the Children’s Defense Fund, whose logo featured a child’s crayon drawing.

  In 1984 an obscure social scientist named Charles Murray published a book called Losing Ground, which purported to explain what had gone wrong: welfare had ruined the poor. AFDC was one program that Murray had in mind, but also food stamps, Medicaid, subsidized housing, even workers’ compensation, all of them skewing the normal incentives to work, marry, and form stable lives. Such suspicions were ancient ones, but Murray gave them fresh legs with a calm marshaling of statistics and a tone of abundant good intentions. He also pushed the logic to a radical new conclusion. Don’t reform welfare; abolish it. The “lives of large numbers of poor people would be radically changed for the better.” Elegant, accessible, in sync with its times, the book created a sensation. Within a few years, when Hollywood wanted to cast a hip, tough-minded undergraduate, it showed him crossing Harvard Yard with a copy of Losing Ground. Pre-Murray, welfare’s main critics had attacked on equity grounds: it was costly, wasteful, unfair to taxpayers. Post-Murray, the criticisms became much more profound: welfare was the evil from which all other evils flowed, from crime to family breakdown. To cut was to care.

  Murray ignited a liberal revival. While some critics attacked on empirical grounds—if welfare drove poverty and nonmarital births, why had both conditions continued to rise as benefits declined?—others began working toward underclass theories of their own. Nicholas Lemann started his research on the ties to sharecropper life, bringing in a racial link that Murray had ignored. William Julius Wilson published a book of masterful sweep called The Truly Disadvantaged, which established the reigning explanation for the rise of the underclass. Where Murray pointed to welfare, Wilson described a complex interplay between industrial decline (which deprived inner-city men of decent jobs), desegregation (which allowed middle-class blacks to escape), and self-defeating cultural forces (which took on a life of their own in communities stripped of middle-class ballast). He put little emphasis on welfare, but he left no doubt that ghetto life had entered a tragic new state, defending the word underclass from those who found it too harsh and urging fellow intellectuals on the Left to speak more bluntly about disturbing ghetto behaviors.

  Another important writer to emerge post-Murray was the journalist Mickey Kaus, who called for saving the underclass with guaranteed government jobs. In a long essay in The New Republic in 1986, Kaus, a self-styled “neoliberal,” attacked standard liberal positions that had emphasized voluntary training and education and produced modest results. “Our goal, in contrast, is to break the culture of poverty” through work requirements, he wrote. Kaus took an essentially Marxist view of work’s centrality to life; his subsequent book, The End of Equality, quoted everyone from Eugene Debs to George Orwell on the dignity of menial labor. If welfare was replaced with government jobs, Kaus wrote,

  the ghetto-poor culture would be transformed. . . . Once work is the norm, and the subsidy of AFDC is removed, the natural incentives toward the formation of two-parent families will reassert themselves . . . if a mother has to set her alarm clock, she’s likely to teach her children to set their alarm clocks as well. . . . It won’t happen in one generation, necessarily, or even two. But it will happen. Underclass culture can’t survive the end of welfare any more than feudal culture could survive the advent of capitalism.

  Murray and Kaus hailed from opposite poles—one called for a radical constriction of government, the other, for several million government jobs. But they proceeded from a common assumption, that welfare was destroying American life.

  Still, not much happened. Murray’s plan was too radical even for Murray—he couched it as a “thought experiment,” and even a politician as antiwelfare as Ronald Reagan wouldn’t get close to it. Reagan praised workfare programs, but given the costs and complexity involved he didn’t push for one. Instead, his first budget, in 1981, simply cut the existing program (trimming the rolls 7 percent) by making it harder for recipients to collect aid once they got jobs. The 1981 law also gave states greater latitude to experiment with mandatory work and training programs. More than half the states opted to do so, including Arkansas.

  By 1983, as the experiments were starting, half the mothers of preschool children worked outside the house, further eroding the rationale for letting welfare recipients stay home. That year, a groundbreaking study by two Harvard professors raised new welfare concerns. Until then, the basic facts about program usage were subject to dispute. While conservatives warned of long-term dependency, liberals said the average recipient left within two years. Mary Jo Bane and David Ellwood proved them both right. Most people who entered the system did leave within two years. But a substantial minority stayed, and over time they came to dominate; the average woman on the rolls at any given moment would draw aid for ten years. To illustrate the concept, Bane and Ellwood used the analogy of a hospital ward, with two beds turning over daily and eight devoted to chronic care; though lots of people came and went, the typical occupant of the ward was in the middle of a long stay. While liberals continued to emphasize the turnover, it became impossible, in the light of the data, to dismiss long-term welfare receipt as a figment of conservative bias.

  Not long after, the first results of the work experiments appeared, with an encouraging report. Out of eleven state programs studied, nine raised employment and earnings, albeit modestly. The studies were conducted by a prestigious nonprofit organization, the Manpower Demonstration Research Corporation, and they used control groups, which lent them the gloss of hard science. Critics had called mandatory programs ineffective and punitive, arguing that the money would be better spent on volunteers. But participants told MDRC they considered the work rules fair. Plus the programs saved money—while some up-front costs were involved, the investment more than paid for itself, usually within five years. As Jonah Edelman has written, if Ellwood and Bane showed long stays were a problem, the MDRC studies showed that “mandatory work and training programs were a viable solution.” Soon blue-ribbon panels were hailing a “New Consensus.” Conservatives would agree to “invest” more in welfare-to-work programs; liberals would agree to require some people to join. Spend more, demand more—it was a compelling idea.

  It takes more than consensus to pass a bill. It takes politicians. After ignoring the issue for much of his presidency, Ronald Reagan was preparing his State of the Union speech in January 1986 just as Bill Moyers aired Timothy McSeed. Responding to the clamor, Reagan announced he would appoint a task force on welfare reform. The group’s plan, unveiled at the end of the year, urged states to turn AFDC, food stamps, and Medicaid into a “block grant,” with capped federal funding but expanded local control. It was an old Reagan idea, but deaf to the politics of the moment. It arrived DOA in a Democratic Congress, which regarded it as a stalking horse for more budget cuts—far from the spend-more, require-more “New Consensus.” Congress had begun hearings, which emphasized the new theory of mutual obligation. The welfare commissioners put out a spend-more, ask-more plan, and so did the National Governors Association, led by the chairman of its welfare task force, Bill Clinton, who rejected block grants. The Clinton plan called for spending an extra $1 billion to $2 billion a year but emphasized the eventual savings. Getting the governors to endorse higher welfare spending wasn’t easy, but Clinton was ambitious, intelligent, and charming, and he pressed hard. The vote was 49 to 1 (with Tommy Thompson the lone dissenter). Moynihan, by now a senator, took to calling a similar plan of his own “the governors’ bill,” and started his hearings by quoting a Clinton welfare speech. It would prove, in the years ahead, a rare moment of harmony for the two.

  No freestanding welfare bill had passed Congress in twenty-six years, and before any plan became a law, it would have to find common ground between Ronald Reagan, who controlled the veto, and the liberal Democrats, who controlled the House. Liberals wanted to raise benefits, especially in the South. Conservatives wanted to hold down costs and ratchet up work demands. The safest bet was that nothing would happen, but the governors’ involvement was an unprecedented plus and lent the effort a bipartisan air. And Clinton lobbied furiously, with calls to nervous southern Democrats, who worried about raising welfare costs. At one point, Clinton, by now chairman of the whole NGA (and visibly interested in higher office), virtually acted as a legislator himself, sitting in as House members drafted the bill. The bill still ran the risk of a veto as it sat in a House-Senate conference in the summer of 1988. But it got an unexpected boost from Vice President George Bush, who wanted to neutralize the issue for the fall elections, since his opponent, Governor Michael Dukakis of Massachusetts, had the more accomplished welfare record. Even with so many moons aligned, the bill had a near-death experience, squeezing through a crucial committee by a single vote.

  The Family Support Act was signed in a Rose Garden ceremony on October 13, 1988. The next day, Jewell had her first child and went on the rolls; Angie by then had three. The law created the JOBS program—the one that sent Angie to nursing aides’ school—and offered states up to $1 billion a year in matching funds. In exchange, when fully phased in, it required states to make sure that 20 percent of their eligible recipients enrolled. (About half the caseload was exempt.) Spend more, demand more—the law seemed the very embodiment of the “New Consensus,” and its passage was celebrated as a historic breakthrough. Moynihan was especially ebullient, predicting it would “bring a generation of American women back into the mainstream.” Instead, over the next few years, the caseloads swelled by a third. By then Clinton was back in the picture, and he had a new idea.

  SIX

  The Establishment Fails: Washington, 1992-1994

  The speechwriter whose slogan ended welfare had never met anyone on welfare. That hardly disqualified him from a leading role in the spectacle about to unfold. Over the next five years, the drive to end welfare would attract an impassioned cast of the sort not found in civics class. There were the pollsters and admen of the primary season, awed by the power of the pledge to win votes, and the professors of the Clinton Camelot, vexed by its technical challenge. There was a grandiose Republican Speaker of the House, promising to liberate the poor, and his off-message troops likening them to “alligators” and “wolves.” By turns surprisingly earnest and shamelessly cynical, the process swirled around an enigmatic president whose intentions were impossible to read. When the ink had dried, many would gripe that the process was driven by expediency and bias rather than by a somber reading of the welfare literature. Which isn’t to say that where it wound up was all wrong.

  The speechwriter, Bruce Reed, came from a prosperous family in Idaho, a sparsely populated, overwhelmingly white state, and the one with the smallest percentage of children on public aid—Milwaukee alone had six times as many welfare recipients. With bookish parents who treated their children to European vacations, Reed couldn’t have spent his formative years farther from the ghetto. After leaving Coeur d’Alene, he studied English literature at Princeton and Oxford (as a Rhodes Scholar), then made his way to Washington as a speechwriter and policy entrepreneur. He was trying to jump-start a struggling campaign when he set down the resonant phrase “end welfare as we know it.” Reed’s real interest wasn’t welfare per se but the fate of liberalism. His parents had been stalwarts of a state Democratic party sliding toward extinction, and Reed had spent his childhood as a door knocker for increasingly doomed causes—“struggling,” as he put it, “to defend every tenet of liberalism at the wrong end of the gun.” The defining political moment of his youth came in 1980, when Ronald Reagan won the White House and Frank Church lost his seat in the United States Senate, depriving Idaho of its liberal icon and Reed of his boyhood hero. Reed’s father felt so alienated he bought a shelf of books on the Middle Ages and repaired to the twelfth century. Reed, in his junior year of college, started to question his politics. One place where liberalism had erred, he decided, was in its defense of welfare.

  In 1990, Reed took a job at the Democratic Leadership Council, a group formed in the Reagan years to rethink the party’s liberal commitments. A few months later, the group appointed an exciting new chairman, Bill Clinton, whose distillation of his prolific interests into the twin themes of “opportunity” and “responsibility” seemed genuinely new. Clinton’s tenure at the DLC reached its high-water mark in Cleveland in May 1991, when he warned that voters no longer trusted Democrats “to put their values into social policy” and used the example of welfare checks that came from “taxpayers’ hides.” “We should invest more money in people on welfare to give them the skills they need,” Clinton said. “But we should demand that everybody who can go to work do it, for work is the best social program this country has ever devised.”

  Reed was dazzled. In person, he was boyish and smiling, littering his speech with pauses and “ums.” But three weeks after the Cleveland event, Reed sent Clinton a fire-breathing memo, urging him to “build a mad as hell movement” and “say and do what it takes to win.” When Clinton declared his candidacy in October 1991, his vague call to move families “off the welfare rolls and onto work rolls” wouldn’t win the attention a dark horse needs. With a speech at Georgetown University, Clinton had another chance. It took Reed a half-dozen drafts, but he finally put down a phrase he liked: End welfare as we know it.

  The slogan had arrived before the policy. What did it mean?

  The next morning, Sunday, October 20, a colleague gave Reed a paper by a young Harvard professor named David Ellwood. The paper elaborated on ideas that Ellwood had laid out in his book Poor Support, which endorsed time limits on welfare but only as part of a larger expansion of aid. Ellwood pictured universal health care, job training, child care, and child support “assurance”—in effect, a guaranteed income for single mothers, since the government would make support payments if fathers did not. With those “poor supports” in place, Ellwood argued, the government could limit welfare to between eighteen and thirty-six months; then recipients would be offered a public job. Ignoring Ellwood’s preconditions, Reed zeroed in on the most provocative issue—time limits—and chose a midpoint of two years. After that, he decided, welfare mothers should work.

  The move from vague calls for work requirements (which Clinton had long supported) to time limits (which no prominent politician had endorsed) was a quantum leap. How would the government come up with the jobs? What would they cost? But Reed wasn’t running a seminar. He gave Clinton the Ellwood paper the day before the speech, and Clinton signed off. On October 23, 1991, Clinton set forces in motion. “In a Clinton administration,” he said,

  we’re going to put an end to welfare as we know it. . . . We’ll give them all the help they need for up to two years. But after that, if they’re able to work, they’ll have to take a job in the private sector or start earning their way through community service.

  The pledge worked, in part, because of Clinton’s credibility. As the son of a low-income single mother, he was no stranger to struggle. His friend-of-the-poor record was strong, and so were his calls for health care, child care, and wage supplements. And since Clinton had shown a long interest in welfare, both as the governors’ point man and in Arkansas, he couldn’t be accused, like his opponent, George Bush, of concocting the issue in a pollster’s lab. But much of the electoral power radiated from the phrase itself. “End welfare” sounded definite and bold; most voters heard it as a cost-saving pledge; and as his fears about being likened to David Duke made clear, some whites welcomed it as an attack on blacks. “As we know it” offered an all-purpose hedge. After all, Clinton wasn’t really proposing to end welfare; he was proposing that people work for it, in government-created jobs. Fiscally, his plan wasn’t conservative at all. “I think we ought to end welfare as we know it by spending even more,” Clinton said early in the race, even as his pollster, Stan Greenberg, argued that the plan “shows Clinton’s skepticism about spending.” While the unknowing took “end welfare” as a vow to end welfare, nervous elites detected a wink from a man they judged one of their own. Ending Welfare proved the perfect pledge for the perfectly protean candidate.