Post-World War II federal urban renewal is today widely viewed as a failure. Yet cities are repeating the mistake with tax increment financing.
Walk today through the parts of my hometown of Charlottesville, Virginia, that underwent “urban renewal,” and you start to grasp the policy’s failures nationwide. The southern city that’s home to Thomas Jefferson and UVA is defined by charming historic neighborhoods. But several downtown areas disrupt this fabric with an oddly suburban aesthetic of parking lots and insular apartment complexes. Without foreknowledge, one might guess that these resulted from a free market that plopped down modern development without regard for context. But the opposite is true.
Throughout the 1960s and 1970s, Charlottesville’s government demolished these areas, most notably Vinegar Hill, a functioning black neighborhood. Aiming for slum clearance, the city razed nearly 40 structures, replaced them with a road and empty lots slated for redevelopment, and moved the tenants into public housing. But not until 20 years later did Vinegar Hill attract an anchor development—the Omni hotel—and the neighborhood remains underwhelming. Much of Charlottesville’s black community, meanwhile, remains in the government projects, which are mired in crime and unemployment. And the episode itself is universally despised, prompting a city apology in 2011.
The absurdity of this story—of a city sabotaging its own people and character—might make Charlottesville seem abnormally cruel. But it has been common across the U.S. After World War II, federal money was funneled into such renewal, particularly of black areas, and it produced similar results, as countless neighborhoods went from vibrant to barren. Yet such policies are still employed by public authorities bent on redevelopment. The policies have become more localized, empowered legally by the Kelo v. New London Supreme Court decision, and financially by tax increment financing. But they are similarly wasteful and destructive.
The desire for urban renewal in U.S. cities actually predated federal intervention, having become local policy in the 1930s. By then New York City, in particular, had begun moving families out of tenements and into some of America’s first public housing. The federal government followed in 1937, helping cities replace their substandard housing with government projects. But the concept didn’t become mainstream until Title I of the 1949 Housing Act—which authorized “slum clearance and community development”—and follow-up legislation in the 1950s. More than just building housing, these bills were meant to holistically reorient cities around the Modernist principles being practiced in Europe. The Modernist movement—expressed aesthetically in the International Style of architecture—assumed that industrial-era cities were anachronisms that needed retrofitting. This could be accomplished, planners believed, through redevelopment programs that cleared dense areas using eminent domain, replaced them with highways and open space, and put displaced residents into public housing. But urban renewal also had a private component, demolishing neighborhoods for glassy new malls, offices, and entertainment districts that were considered essential for middle-class retention.
By 1974, the federal government had spent $57 billion (in 2013 dollars) on 2,100 projects, affecting cities great and small. But many initiatives, says Purdue University historian Jon Teaford, were supplemented with local and state funding, and especially appealed to large liberal cities. New York City remained the movement’s leader, with whole swaths reinvented by Robert Moses. The leading per capita recipient of federal money was New Haven, Connecticut, which under Mayor Dick Lee razed several interior neighborhoods, including for the expansion of Yale University. Detroit demolished Black Bottom—an African-American hub described as “better than Harlem”— while Chicago executed similar plans.
Despite their intent, the policies proved unpopular, thanks partly to the displacement they caused, which was estimated at one million people in 993 cities. Such removals had been justified because tenants would go from substandard housing into what were considered comfortable new projects. Due to delays, however, by 1962 only one unit had been built for every three demolished, leading to severe housing shortages, particularly for black populations already limited by housing discrimination. The policies also generally favored large developers, embodying a crony capitalism that would look familiar today. Neighborhoods that were removed—which included not only black ones, but Irish, Italian, Mexican, and Japanese ones—made way for projects like Manhattan’s Stuyvesant Town, an enormous apartment complex run by MetLife Insurance. And urban renewal imposed an unwanted aesthetic, replacing human-scale neighborhoods with outsized structures.
But perhaps the main strike against many projects was their lack of success. Another rationale had been that such developments would finally place modern uses onto prime real estate, helping cities generate more tax revenue. But like in Charlottesville, many cities saw their targeted lots sit empty for years. A Buffalo housing project that was approved in 1954, and that displaced 2,200, had little development a decade later, causing one local to quip that the 29-block stretch could lead “plane pilots to assume the city was constructing a landing strip.” Another project in Pittsburgh’s Lower Hill neighborhood that displaced 8,000 residents still had large vacant parcels after 25 years. A 1966 study found that urban renewal projects averaged over a decade to complete, and many businesses that finally emerged failed to reverse decline, or defaulted.
“In one city after another,” wrote Teaford, “middle and upper-income Americans shunned areas whose reputations had been so odious that they required renewal.”
By the 1960s, what began as neighborhood agitation against these policies became mainstream outcry, led by books like Jane Jacobs’ The Death and Life of Great American Cities and Martin Anderson’s The Federal Bulldozer. But the most searing indictment came from the affected communities, some of which rioted.
The common narrative behind the 1960s black race riots is that such communities had long simmered from segregation and police brutality, and erupted following Martin Luther King, Jr.’s assassination. But one understated factor was urban renewal. In the decades leading into the riots, black communities had been put through the wringer of property confiscation, displacement, and warehousing. It was no surprise that they eventually rebelled, especially in the cities with the harshest schemes, like Chicago, Newark, and Cleveland. Economist David Henderson notes that in Detroit, the black area that “avoided rioting had also successfully resisted urban renewal,” while the most riotous one had been a longtime target.
For these reasons, urban renewal was by the early 1970s considered a failure, marked by the dynamiting of St. Louis’ Pruitt-Igoe, a 33-building complex that two decades prior was meant to showcase the policy’s potential. In 1974, Congress ended Title I, and decades later the federal government was tearing down other infamous projects that it had first built.
But this didn’t prevent cities from continuing with redevelopment. Most of them, after all, still had public corporations around from the urban renewal days. Ones like the Boston Redevelopment Authority—an unelected body with dubious financing methods and unrivaled zoning control—had previously organized vast slum clearances, and their ambitions didn’t simply wane once the federal money dried up. But they now needed to find local funding sources, and the one most settled on was tax increment financing (TIF).
This method allows local or state agencies to draw boundaries around an area for redevelopment. The agencies then sell bonds, and use that money to create incentives—usually cash or free land—for target businesses. Future sales or property tax revenue that comes from within the boundary is then used to pay the bonds.
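The arithmetic behind that increment can be sketched with a few lines of Python. The figures, rate, and function name below are invented for illustration and do not come from any real project: the district’s assessed value is “frozen” at a base when the boundary is drawn, taxing bodies keep collecting on that base, and only the tax on growth above it goes to repay the bonds.

```python
# Hypothetical TIF arithmetic; all numbers are invented for illustration.
BASE_VALUE = 10_000_000  # assessed value inside the district when the boundary is drawn
TAX_RATE = 0.02          # combined property tax rate (2%)

def split_revenue(current_value):
    """Split one year's property tax between the general fund and bond service.

    Taxing bodies keep the revenue on the frozen base value; revenue on any
    growth above the base (the "increment") services the TIF bonds instead.
    """
    total = current_value * TAX_RATE
    to_general_fund = min(current_value, BASE_VALUE) * TAX_RATE
    to_bond_service = total - to_general_fund
    return to_general_fund, to_bond_service

# Suppose redevelopment raises assessed value to $14M.
general, bonds = split_revenue(14_000_000)
print(general)  # 200000.0 -- the general fund still sees only the frozen base
print(bonds)    # 80000.0  -- tax on the $4M of growth goes to the bonds
```

The sketch also shows the mechanism’s fragility, a point the article returns to: if the district’s value stagnates or falls below the base, the increment is zero and there is nothing to repay the bonds with.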
In 1974, TIF was legal in only 9 states. But in the decade following urban renewal, it became legal in 28, and today it is legal in every state except Arizona. By the early 1990s, a majority of cities with over 100,000 people had used TIF, and from then until 2010, TIF bond sales nearly doubled to $3.3 billion. TIF is now a go-to funding source for stadium, retail, condo, and other developments, having become urban renewal’s modern incarnation.
This is not to say that they are identical, most notably not in scale. One only need look at New York City’s numerous postwar public housing projects, with their brick towers stretching block after block, to recognize the grand ambitions that inspired urban renewal. TIF, meanwhile, emerged after top-down planning became unpopular, and is used more tactically. It also has less of a social justice element: while first designed for inner cities, TIF has evolved into a way for any area to spur growth. According to law professor Richard Briffault, it “is now an all-purpose local government tool for financing public investment in market-oriented development rather than simply a mechanism for combating blight.”
But TIF still shares urban renewal’s problems, for example by perpetuating crony capitalism. At first glance, its subsidies don’t seem like handouts, since they supposedly pay for themselves through increased revenue from new projects. But it is unclear whether revenues truly increase because of these projects, or merely from inflation. And the money that pays for them would otherwise fund core services, causing misplaced priorities in many cities. For example, the luxury grocer Whole Foods has received a combined $16.5 million in public money to locate in Detroit and Chicago, two cities that can’t even provide adequate policing. Chicago has also used TIF to lure a Walmart and a Marriott hotel.
TIF also plays an unfortunate role as an enabler for eminent domain, which would otherwise be unaffordable for cities, argues Cato Institute economist Randal O’Toole. Since urban renewal ended, some of the worst cases of localities seizing private property for private uses—like Detroit’s demolition of Poletown for a G.M. plant—were TIF-funded. In 2005, such takings were further validated by the Kelo v. New London case. The decision—which found that New London, Connecticut could, in the name of economic development, raze resident Susette Kelo’s home for a Pfizer plant—was blasted by critics, and countered by pro-property rights legislation in most states. But this hasn’t prevented further government looting. A 2006 Los Angeles deal cleared 30 businesses for a development spearheaded by the W Hotel. Denver, which was employing aggressive redevelopment plans before Kelo, has since declared much of its historic Five Points neighborhood blighted. And a recent state court ruling found that the New Jersey Casino Reinvestment Development Authority could demolish an Atlantic City man’s home—and dozens more—for a development near the recently shuttered Revel casino (which three years ago received $260 million in TIF money). Two decades before, courts had prevented this authority from razing another home to expand a casino for Donald Trump. But the recent case was decided under a more confiscatory legal climate.
As with urban renewal, TIF projects also generally don’t work, encountering the same misallocations and delays. This was summarized in two studies that showed their ineffectiveness, respectively, at producing growth and tax revenue. A study by David Merriman and Richard Dye found that “cities that adopt TIF grow more slowly than those that do not,” since concentrating spending in one area causes service reductions elsewhere. The other, by Dean Stansel and Carrie Kerekes, found that eminent domain activity doesn’t increase government revenues, and may decrease them. Perhaps no better example of this futility exists than in New London, where, despite the $78 million public expenditure, Kelo’s lot remains undeveloped over a decade later.
The strongest similarity between past and present deals is not their failure, however, but the fact that they have generally been applied by the same cities, concentrated in ones with liberal political heritages. Chicago was a leader in urban renewal initiatives that destroyed much of the city, yet has become a leading TIF user, splurging on downtown luxury projects despite terminal decline. New York City experienced similar destruction, yet has become the poster child for 21st-century confiscation, placing blight designations that would’ve made Robert Moses blush on serviceable properties near the Atlantic Yards and Columbia University. Despite Charlottesville’s Vinegar Hill experience—marked not only by displacement, but the city’s eventual $11.3 million bailout of the Omni—it continues to subsidize developers. And this congruity exists in Cleveland, Detroit, and other liberal cities. Officials within them are hardwired to believe that, even without thriving markets, growth can occur through top-down government plans. But their ideological bubble prevents them from noticing how such plans can be wasteful and abusive—which would be evident if only those officials glanced into the past.
[This article was originally published by Forbes.]