
"Economic Growth and Poverty: Lessons from the 1980s and 1990s,"

by Jared Bernstein January/February 2001 issue of Poverty & Race

Over the course of the 1990s business cycle, the U.S. economy racked up some pretty impressive statistics. Unemployment in 2000 fell to a 30-year low, and the tightening labor market finally led to wage gains for many groups of workers whose wages had stagnated or declined for decades. Productivity growth — a measure of how efficiently the economy is generating its output — sped up for the first time in years, handily paying for the aforementioned wage gains. Even the intractable problem of growing inequality slowed considerably in the latter half of the 1990s (it continued to grow over the decade, but half as fast as during the 1980s).

The poverty rate, however, was virtually the same in 1998 as it was in 1989.

Actually, it was one-tenth of a percentage point lower in 1998: 12.7% of the population was poor in 1998 vs. 12.8% in 1989, a statistically indistinguishable difference. And since the population grew over that period, more people were poor in 1998 than in 1989 (34.5 million vs. 31.5 million).

How did this come to pass? Has the recent wave of prosperity washed over the poor? Are low-income families immune to the benefits of the so-called “new economy”? And if so, then what’s so new about it?

Before addressing these larger issues, some facts are needed for context. First, a word about how we measure poverty. Each year, the Census Bureau collects a representative sample of data on family incomes, adds in the value of cash transfers (such as welfare benefits), and compares this amount to the poverty thresholds, which are adjusted for family size. In 1998, the threshold for a family of four with two children was $16,530.
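
To make the mechanics of that comparison concrete, here is a minimal sketch in Python. The $16,530 threshold for a family of four with two children is the figure cited above; the other thresholds and the example family are illustrative placeholders, not official Census values.

    # Minimal sketch of the official poverty determination described above.
    # Real Census thresholds vary by family size, number of children, and age
    # of householder; only the (4, 2) figure below is from the article, the
    # others are illustrative placeholders.
    POVERTY_THRESHOLDS_1998 = {
        # (family size, number of related children): threshold in dollars
        (4, 2): 16530,   # cited above
        (3, 1): 13000,   # placeholder
        (2, 0): 11000,   # placeholder
    }

    def is_poor(earnings, cash_transfers, family_size, num_children):
        """Compare money income (earnings plus cash transfers such as welfare
        benefits) to the poverty threshold for this family type."""
        resources = earnings + cash_transfers
        return resources < POVERTY_THRESHOLDS_1998[(family_size, num_children)]

    # Example: a family of four with two children, $14,000 in earnings and
    # $1,500 in cash assistance, falls below the 1998 threshold.
    print(is_poor(14000, 1500, 4, 2))  # True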

The current thresholds were derived in the early 1960s, based on data from the mid-1950s, and most experts agree that they are far out of date. They reflect neither the true resources available to many low-income families (for example, the cash value of food stamps), nor the higher thresholds that would prevail if the original measures were updated. Recent analysis by the U.S. Census Bureau shows that an updated method would lead to poverty rates on average about 3.5 percentage points higher in the 1990s. The trends over time, however, of the official and alternative rates are similar, and because of the pervasive use of the official rate, I will focus on it throughout. But it is well within our means to do a better job measuring poverty, and we should do so.

In much of what follows, I stress trends in the poverty rate, i.e., how it has evolved over time, since this is the best way to address the above set of questions. For 1998, the most recent year for which data are currently available, 12.7% of the population was poor by the official definition, as noted. Poverty rates for minorities are higher, since their income levels are lower, on average. The rate for African-Americans was 26.1%; for Hispanics, it was 25.6%. The child poverty rate was 18.9%, and the black child poverty rate was 36.7%. Child poverty rates in the U.S. are typically at least twice those in many comparably advanced economies.

Some Key Trends

In order to examine the connection between economic growth and poverty trends, we must begin by looking more closely at some key trends over the last few decades. As the 1980s wound down, economists noticed that poverty rates had been pretty “sticky” over the decade. That is, they did not respond to economic growth as they had in the past. Of course, poverty continued to be cyclical, rising in economic downturns and falling in recoveries; but, while the rate rose as high as 15.2% as a result of the early 1980s recession, it never fell back as far as would have been expected, ending the decade a point higher in 1989 than in 1979 (12.8% vs. 11.7%).

The beginning of the 1990s looked like more of the same. Despite the relatively shallow nature of the early 1990s recession, poverty rates rose to 15.1%, about as high as in the earlier, much deeper recession of the early 1980s. Thus, in terms of their response to overall growth, it appeared that poverty rates were less likely to fall, and more likely to rise, for a given amount of overall growth: the worst of both worlds!

Numerous factors were behind this unfortunate dynamic. The most important was the striking increase in inequality that took place over these years. Whether we look at wages, incomes or wealth, the indicators show that the gap between the top, the middle and the bottom expanded to historic highs. Low-income persons were disproportionately affected, and actually lost ground in real terms. For example, between 1979 and 1989, the average income of the bottom 20% of families fell 6% in real terms, while that of the top 20% grew 20%. In a situation like that, aggregate indicators, such as GDP growth or productivity, tell us very little about who's getting ahead. In fact, the pie was growing, but the poor were getting ever-smaller slices.

Also, we know now that the labor market never really tightened up in the 1980s and early 1990s relative to later years. The overall unemployment rate averaged 7.1% over the 1979-89 period, well above both the averages of prior business cycles and, most importantly for this analysis, the 5.7% average that prevailed over the 1989-99 period. It wasn't until things really heated up in the mid-1990s that more of the rising tide finally began to reach some of the smaller vessels.

One of the most common explanations for poverty's failure to respond to growth is the growing share of family types more vulnerable to poverty, specifically mother-only families. It's important to examine this claim closely, because if it's true that this is a major explanation, then all the economic growth in the world won't help much. But, while it seems like simple accounting — more single-mom families = more poverty — a look at the relevant trends challenges a simple demographic story ("trends" is the key word: any explanation of changes over time must address not simply the level of poverty but the trend). When female-headed families were forming most rapidly, in the 1970s, their poverty rates were actually falling. Over the 1980s, their growth as a share of the population fell to one-third the 1970s rate, but their poverty rates grew. And over the 1990s, their poverty rates fell again, while their share continued to grow. (One relevant counterargument here is that if faster growth led to fewer single-parent families, then poverty would fall as a result.)

Thus, it's not so simple. While at any given point in time, more single-parent families lead to a higher poverty rate, there is no evidence that this factor has become more important over time. In fact, a more thorough decomposition shows that it's become considerably less important. In the 1970s, this shift raised poverty rates by two percentage points. In the 1990s, the effect was one-half of one percentage point.
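
The decomposition referred to here is, in spirit, a shift-share calculation: the overall poverty rate is a share-weighted average of group rates, so its change over time can be split into a piece due to shifting family-type shares and a piece due to changing group rates. The sketch below illustrates the mechanics with made-up shares and rates; it is not the author's calculation.

    # Illustrative shift-share decomposition. The overall poverty rate is the
    # share-weighted average of group rates, so its change splits exactly into
    # a "composition" term (shares move, start-period rates held fixed) and a
    # "group rate" term (rates move, end-period shares held fixed).
    # All shares and rates below are made-up numbers, not data from the article.
    start = {"married-couple": (0.80, 0.07), "mother-only": (0.20, 0.40)}
    end   = {"married-couple": (0.76, 0.06), "mother-only": (0.24, 0.36)}

    def overall(groups):
        return sum(share * rate for share, rate in groups.values())

    composition = sum((end[g][0] - start[g][0]) * start[g][1] for g in start)
    group_rates = sum(end[g][0] * (end[g][1] - start[g][1]) for g in start)
    total_change = overall(end) - overall(start)

    # The two pieces add up exactly to the total change in the overall rate.
    assert abs(total_change - (composition + group_rates)) < 1e-12
    print(f"total change {total_change:+.4f} = "
          f"composition {composition:+.4f} + group rates {group_rates:+.4f}")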

Clues from the 1990s

The latter half of the 1990s was a better economic period for the majority of American families, including the poor. Poverty rates fell from 15.1% in 1993 to 12.7% in 1998. For African-Americans, the decline was much steeper, from 33.1% to 26.1%. We expect poverty rates to fall over a recovery, but these declines were greater than those that occurred over the 1980s.

Of course, this positive trend cannot diminish the deeply disturbing fact that over one-quarter of the black population in this country is poor, with incomes far below the level needed to meet their most basic needs (the same goes for Hispanics, whose poverty rate, as noted, was 25.6% in 1998). But the question at hand is: what useful lessons can we learn about the economy of the late 1990s? After all, we will never lower these levels if we fail to understand what drives the trends. As we will see, the past few years offer some important lessons about how our nation can reduce poverty.

For the first time in 30 years, as the 1990s came to a close, the labor market was once again operating at or close to full employment. This doesn’t mean everyone who wanted a job had one. It does, however, mean that, on average, the ranks of the unemployed were much smaller (as a share of the workforce) than in the past. It also meant that labor demand was stronger than it had been in decades, finally giving a lift to the millions of workers left behind in previous upturns. Again, the level of joblessness was still much higher for minority workers than for whites — the rule of thumb that the black unemployment rate is more than twice that of whites has not been revoked. But just as minority poverty rates fell the most in the latter half of the 1990s, so the labor market indicators for many minority workers improved the most.

But how did this affect the poor, who purportedly don’t work very much? In fact, in tandem with welfare reform, it led to a dramatic increase in their success in the labor market. It is true that the hours spent by poor families in the labor force are much lower than those spent by the non-poor. But the trends are again very clear.

In both 1979 and 1989, the share of poor families who did not work at all was 41%. By 1998, that share had fallen to 34%. After growing 4% over the 1980s, the average hours worked by poor families grew 11% over the 1990s, to 1,112 annual hours. Among poor white families, average hours were unchanged in the 1990s: they worked about 1,145 hours in both 1989 and 1998. But the annual hours worked by poor African-American families grew 21%, from 723 to 825. For poor single mother families with children, average hours increased by 40%, from 577 to 808, an increase that partially reflects the welfare-to-work component of welfare reform. Poor families still work much less than non-poor families, but they are working much more than they have in the past.

These increases, coming off such low bases, should not be expected to send poverty rates down to zero, but we would hope that they made a big difference in the poverty rates of poor families. Yet, while poverty rates for most families fell in the 1990s, the gains could be characterized as disappointing. For mother-only families, the group whose hours increased the most in percentage terms, poverty fell from 42.9% to 38.7% over the 1989-98 period. Black family poverty fell from 27.9% to 23.2%; Hispanic poverty, from 23.4% to 22.7%. These gains all point in the right direction and reinforce the argument that the trends of the 1990s have made a big difference. But we might have hoped that "the best economy in 30 years," in tandem with increased work effort among the poor, would have knocked the rates down further.

The reasons these hopes were not realized have to do with the relatively few hours worked by the poor, their low wages and the loss of cash benefits. Regarding the first point, it is often stressed that the poor do not work very much, and the annual hours cited above support this assertion. But why don't the poor work more? First, about half are children or elderly. When we take out the disabled and those going to school, and look just at those in their prime labor market years, we find that over two-thirds worked at some point during the year. But a very small share, relative to the non-poor with the same characteristics, worked full-time, year round.

If we are committed to a labor market strategy for lowering poverty, we need to learn more about why this is the case. For example, research on welfare-to-work often finds that the lack of reliable, safe child care is a significant barrier for single mothers.

However, even if the poor were to double their annual hours spent in the labor force — an extremely tall order — their market wages (the pre-tax wage) would leave them below the poverty line. The average hourly family wage — family earnings divided by family hours — for poor families headed by someone aged 25-54 was about $6.30 in 1998. (The average family hours statistics cited above include families with zero hours; these family wage calculations exclude such families.) Their average annual hours that year were 1,273. If this were a family of four, and they doubled their hours, they would still be poor, at least before adding in the Earned Income Tax Credit, or EITC (a wage subsidy for low-income workers), and food stamps, which would lift them above the poverty line.
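
The arithmetic behind that claim, using only the figures cited in this paragraph, is a quick back-of-the-envelope calculation:

    # Back-of-the-envelope check using only the figures cited above: a $6.30
    # average family wage, 1,273 annual hours, and the $16,530 threshold for a
    # family of four with two children. The EITC and food stamps, which the
    # article says would lift such a family over the line, are left out here.
    family_wage = 6.30      # dollars per hour, poor families, head aged 25-54, 1998
    annual_hours = 1273     # average annual family hours for this group
    threshold = 16530       # 1998 threshold, family of four with two children

    current_earnings = family_wage * annual_hours        # about $8,000
    doubled_earnings = family_wage * (2 * annual_hours)  # about $16,000

    print(f"doubled earnings: ${doubled_earnings:,.0f}")   # $16,040
    print(f"still poor? {doubled_earnings < threshold}")   # True

Even at twice the hours, market earnings of roughly $16,000 fall a few hundred dollars short of the threshold, which is why the EITC and food stamps matter so much in this calculation.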

But another trend was the decline in cash transfers to poor families that occurred over the late 1990s. Some of this was legislated, as in welfare reform, but some of it, particularly the decline in the food stamp rolls, appears to be due in part to unauthorized administrative tightening of eligibility. Thankfully, the EITC was considerably expanded in the 1990s, and it has proven to be an important antipoverty tool.

Lessons from the 1990s

The lessons of the above are not hard to identify. If we as a nation agree that work in the paid labor market is our primary antipoverty strategy, then the path is fairly clear. We have to identify and address the barriers that keep poor families from working more hours. To the extent that these barriers reflect economic constraints, such as lack of access to affordable child care of acceptable quality, we must subsidize access to such care. At the same time, we need to implement policies to raise both the pre- and post-tax wages of the working poor.

On the pre-tax side, the best plan is to make sure we stay at full employment. One of the key lessons of the 1990s is that this is the best way to create enough pressure in the labor market to raise the wages of those left behind when there is too much slack, as was the case in the 1980s and early 1990s. Full employment is not an accident of fate; it results in large part from actions by the Federal Reserve, which, in the latter years of the 1990s, finally allowed the unemployment rate to fall. The minimum wage can also be increased — in real terms, it remains over 20% below its level of the late 1970s. And of course, providing education and training is a key longer-term strategy to raise the earnings the poor can command in the job market.

On the post-tax side, expanding the EITC is a popular plan with considerable political support. Finally, we need to make sure poor families get the benefits to which they still are entitled, and add more benefits that are tied to work, such as child care subsidies.

Some of these ideas mean increased public expenditures (others, like full employment and the minimum wage, do not). But that is the inevitable outcome of a work-based strategy in an economy with such a large and flourishing low-wage labor market. It’s simple: if we insist that the able-bodied poor work, and the low-wage labor market does not generate the wages needed to avoid poverty, even with full-time work, then we have two choices. We either make a very bad deal with the poor, i.e., you work, but you and your children continue to live in privation; or we apply public policy in the spirit of a good-faith agreement: you work, and we as a society will make sure you have enough income to meet your needs. In my view, and, I sincerely believe, in the view of most Americans, only the good deal is acceptable.

Jared Bernstein, a senior economist at the Economic Policy Institute, is co-author of The State of Working America. In 1995-96, he was deputy chief economist at the U.S. Dept. of Labor. jbernstein@epinet.org
 

Notes:

A more detailed treatment of this subject, with data through 1999, can be found in The State of Working America, 2000-01 (Ithaca: ILR Press, 2001), co-authored with Lawrence Mishel and John Schmitt.


 