But what exactly does the Census Bureau mean by poor? And how did it determine that Miami is now the poorest big city in America? Well, if that single mom had two youngsters to feed, clothe, and house, she was considered poor if she made less than $13,423. In Miami, 3398 similar single-mom families fell below that magic number. Overall, 19,779 families (100,405 individuals) were poor in 1999. That amounted to nearly one out of every three people living in the city, and it was enough to boost Miami from the fourth-poorest big city (population over 250,000) a decade ago to the top spot today. This ignominious honor was achieved during a time when most of the nation was enjoying unprecedented levels of prosperity fed by a booming Wall Street, a projected federal budget surplus, and a red-hot job market.
All that melted away about a year and a half ago, when people began to realize that much of their prosperity was built on little more than hype and creative accounting. So it's more than a little ironic that this nation's official method for measuring poverty is likewise based on faulty assumptions that purposely paint a rosier picture than actually exists.
It's bad enough that the 2000 Census found Miami sinking under the weight of poverty, but in fact the situation is even worse than the numbers suggest.
The formula used to calculate those numbers was the idea of a plain-faced, no-nonsense Social Security Administration economist by the name of Mollie Orshansky, who in 1963 worked out a simple method for measuring American poverty. Ms. Orshansky figured that if you took the bare-bones amount of money it cost a family to feed itself (a number she got from the U.S. Department of Agriculture) and multiplied that by three, you'd arrive at an acceptable figure for a minimum standard of living. The food number was multiplied by three because economists at the time assumed most families spent one-third of their incomes on food. The other two-thirds was used for shelter, clothing, and other basic needs. Of course families come in different sizes, so Orshansky proposed to adjust for the number of people in the family and whether they were under 18 years old or over 65 (kids and senior citizens eat less, or so the theory holds).
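Orshansky's rule is simple enough to sketch in a few lines. The figures below come from the article itself; the food-budget number is back-derived from the $13,423 threshold for a three-person family, not an official USDA figure.

```python
# A minimal sketch of Mollie Orshansky's 1963 poverty formula.
# The food budget here is back-derived from the article's $13,423
# threshold for a single mom with two kids; it is illustrative only.
ECONOMY_FOOD_BUDGET = 13423 / 3  # annual bare-bones food cost, three-person family

def poverty_threshold(annual_food_cost):
    """Orshansky's rule: minimum income = bare-bones food budget x 3,
    because Sixties-era families were assumed to spend one-third of
    their income on food (the rest going to shelter, clothing, etc.)."""
    return annual_food_cost * 3

print(round(poverty_threshold(ECONOMY_FOOD_BUDGET)))  # 13423
```

In practice the thresholds also vary by family size and by whether members are under 18 or over 65, as the paragraph above notes; this sketch shows only the core multiply-by-three logic.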
Since 1965 our nation's leaders have been using this formula to draw a neat fence around a terribly messy problem. It became the basis for President Lyndon Johnson's War on Poverty and has remained the one absolute even as the social, political, and economic structure of American life has warped in unprecedented, unpredictable ways. Although the War on Poverty was largely undone by Johnson's successors in the Eighties and Nineties, its measuring standard remains, and it continues to drive inadequate national and local policies designed to identify and assist our disadvantaged fellow citizens.
The census determines a city's poverty rate by dividing the number of people living under the poverty threshold by the total number of city residents. (This doesn't include people living in group quarters, such as military barracks, jails, and institutions.) For Miami that now works out to a poverty rate of 28.5 percent, making the city numero uno, surpassing the grim realities of such rust-belt contenders as Buffalo and Cleveland.
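That division can be checked against the article's own numbers. The census base population is not stated here, so the figure below is an assumption implied by the reported 28.5 percent rate and the 100,405 poor individuals.

```python
# Sketch of the census poverty-rate calculation, using the article's figures.
poor_individuals = 100405   # Miamians below the poverty threshold in 1999
city_population = 352298    # ASSUMED: implied base (excludes group quarters),
                            # back-derived from the reported 28.5 percent rate

rate = poor_individuals / city_population
print(f"{rate:.1%}")  # 28.5%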
But even that figure is an overly optimistic assessment of real need in many areas, such as Miami's inner-city neighborhoods, according to poverty experts who've been arguing for years that the government should adopt a more realistic measure. In 1995 more than 40 economists, sociologists, researchers, and public-policy experts from some of the nation's most prestigious institutions sent a letter to the federal government calling for major revisions. Citing a report by the National Research Council, the letter pointed out the painfully obvious -- that times have changed since the Sixties. A few examples: the explosive growth of single-parent homes and working women; steeply higher rents and health-care costs; grandparents raising kids; radical changes in food consumption (think fast-food nation, prepackaged supermarket goods, and the effects of industrialized agriculture on food costs). Nor does the old method measure the benefits of government programs such as food stamps or Medicaid that provide some buffer for those who qualify.
But government officials were unmoved and today persist in using Mollie Orshansky's formula, which is adjusted each year only for inflation. It doesn't account for even basic cost-of-living differences around the nation -- meaning that Uncle Sam thinks a dollar goes as far in New York City as it does in Fargo, North Dakota. And since patterns of consumption have changed, the idea that most families still spend one-third of their income on food is simply wrong, experts argue. "Those assumptions are no longer accurate, if they ever were," says Keith Kilty, co-editor of the Journal of Poverty and a professor of social work at Ohio State University. "Poor people probably spend at least half their income on rent, especially in large cities like Miami."