For Americans born in 1940, the math was overwhelmingly in their favor. Roughly 92% of them grew up to earn more than their parents in inflation-adjusted terms. That wasn't a story about exceptional talent or grit. It was arithmetic: a growing economy distributing its gains broadly enough that the rising tide actually did lift most boats [1].
For Americans born in 1984, the number is 50%. A coin flip.
The economy didn't stop growing. Real GDP per capita increased roughly sixfold from 1940 to today [3]. The United States is, by any measure, fabulously wealthier than it was when those 1940-cohort children were starting their lives. And yet when the Federal Reserve asked Americans in late 2024 how they'd handle an unexpected $400 expense, 37% said they couldn't cover it with cash [2]. Only 29% rated the national economy as "good" or "excellent" (down from 50% just five years earlier), even as the stock market sat near record highs [2].
Something happened between 1940 and now. Not to the economy's capacity to generate wealth, but to who gets it.

The Era That Wasn't a Myth
The postwar decades, roughly 1945 to 1975, are often dismissed as nostalgia. They weren't. The economic data from this period is among the most remarkable in modern history, and it's worth examining without sentimentality because it establishes the baseline against which everything since must be measured.
From 1948 to 1979, net productivity in the American economy grew 118.4%. Over the same period, typical worker compensation grew 107.5% [4]. Productivity and pay rose in near-lockstep. A factory worker, a clerk, an electrician — each could see in their paycheck a direct relationship between how much more the economy produced and how much better they lived.
This wasn't charity. It was structural. Union membership reached 33.5% of the workforce in 1954 [25]. Top marginal income tax rates sat at 91–92%, not because politicians hated the rich, but because at those rates, it made more economic sense for a business owner to reinvest profits into equipment, wages, and expansion than to extract them as personal income [15]. The GI Bill sent a generation to college and flooded the housing market with affordable mortgages. Interstate highways knit the economy together. Corporate governance operated under a "retain and reinvest" model, where firms plowed most earnings back into productive capacity rather than distributing them to shareholders.
The results speak in concrete terms. A median-income worker in 1960 could buy a median-priced home for approximately 2.3 times his annual gross income [8]. First-time homebuyers made up roughly 40% of the housing market [9]. A single earner, usually without a college degree, supported a family of four, owned that house, put money away, and expected his children to do better still. And 92% of the time, they did [1].
This system had deep flaws. The GI Bill's benefits were systematically denied to Black veterans. Redlining excluded minority families from the primary wealth-building mechanism of the era. The prosperity was concentrated among white households in ways whose compounding effects persist today: the median white household still holds 6.4 times the wealth of the median Black household [26]. The postwar economy worked spectacularly for those it included, but it excluded millions by design.
What matters for the present argument is this: the engine worked. Economic growth translated into broadly shared prosperity through identifiable institutional mechanisms. Those mechanisms were later dismantled through identifiable policy decisions.
The Break
Beginning in the 1970s and accelerating through the early 1980s, the American economy's distributional architecture was fundamentally altered. The changes were not the product of invisible market forces. They were discrete decisions, made by specific people, on specific dates: some seeded decades earlier, some arriving decades later.
The Taft-Hartley Act of 1947 had already planted the seed by authorizing "right-to-work" laws that allowed workers to benefit from union contracts without paying dues. By 2024, 27 states had enacted such laws, and union membership had collapsed from its 1954 peak of 33.5% of the workforce to about 10% overall, just 6% in the private sector [25]. The union wage premium (roughly 10–20% above comparable non-union workers) didn't just benefit union members. It set a floor that pulled up wages across entire industries through competitive pressure. When that floor fell away, so did the mechanism that connected worker productivity to worker compensation.
In 1982, the Reagan-era SEC adopted Rule 10b-18, creating a safe harbor for stock buybacks that had previously been legally risky due to market manipulation concerns [18]. Before the rule, corporations returned profits to shareholders primarily through dividends; the rest went into R&D, equipment, and wages. After it, buybacks exploded from roughly $6.6 billion in 1980 to over $1 trillion in 2018, the year after the Tax Cuts and Jobs Act [18]. Between 2003 and 2012, the 50 largest S&P 500 buyback spenders returned 91% of their earnings to shareholders through buybacks and dividends, leaving 9% for actual investment in their businesses or workers [18].
From 1981 to 1988, the top marginal income tax rate was cut from 70% to 28% [15]. The top corporate rate followed a similar trajectory: from 52.8% in 1960 to 21% after the 2017 TCJA [16]. The effective tax rate on the 400 highest-earning Americans dropped from roughly 56% in 1955 to approximately 23% by 2018 [15]. And in 1999, the Gramm-Leach-Bliley Act repealed the Glass-Steagall firewall between commercial and investment banking, enabling the rise of "too big to fail" financial conglomerates whose collapse a decade later would trigger the largest economic crisis since the Depression.
Four policy shifts. Four traceable dates. The economy kept growing after each one, but the distribution of that growth changed direction.
The Productivity Heist
The single most important graph in American economics shows two lines. From 1948 to 1973, they rise together. After 1979, they diverge: one keeps climbing, the other flatlines.
The climbing line is productivity. From 1979 to 2023, net productivity in the American economy rose 80.9%. The other line is typical worker compensation. Over the same 44 years, it rose 29.4% [4]. Productivity grew nearly 2.8 times faster than the pay of the workers generating it.
If median compensation had tracked productivity growth since 1948, the typical American worker would earn approximately $27 per hour instead of the actual $17 per hour measured in 2018 [4]. That's roughly $12,000 to $15,000 per year in income that was generated by workers but never reached their paychecks. Scale that across 44 years and tens of millions of workers, and the aggregate sum transferred from labor to capital runs into the trillions.
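The divergence arithmetic is simple enough to check in a few lines. A rough sketch using the 1979–2023 growth figures above (the $27 counterfactual in the text uses a 1948 baseline and particular deflators, so this simplified 1979-forward calculation lands a little lower):

```python
# Back-of-envelope check of the productivity-pay divergence, 1979-2023,
# using the growth figures cited above.
prod_growth = 0.809   # net productivity: +80.9%
comp_growth = 0.294   # typical worker compensation: +29.4%

# How much faster did productivity grow than pay?
ratio = prod_growth / comp_growth
print(f"Productivity grew {ratio:.2f}x faster than pay")  # ~2.75x

# Counterfactual wage if a $17/hour worker's pay had tracked
# productivity over just the 1979-2023 window.
actual_wage = 17.00
counterfactual = actual_wage * (1 + prod_growth) / (1 + comp_growth)
print(f"Counterfactual hourly wage: ${counterfactual:.2f}")  # ~$23.77
```

Even this conservative version implies a gap of nearly $7 per hour, which at 2,080 hours a year is roughly $14,000, consistent with the $12,000 to $15,000 range above.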
Where did it go? The answer is visible in every dataset that measures the top of the distribution. The Congressional Budget Office found that after-tax income for the top 1% grew 226% from 1979 to 2016. The middle quintile managed 46% [5]. Emmanuel Saez's data from IRS records show that during the 2009–2018 recovery, the top 1% captured 52% of all income growth [19]. The bottom 50% of American households now own 2.5% of the nation's total wealth, while the top 10% hold 67.2% [20].
CEO compensation tells the story at a personal scale. In 1965, the CEO-to-worker pay ratio was 20:1. By 2000, it had reached 379:1. It has since settled around 281–344:1 [23]. The ratio didn't explode because executives became 17 times more productive over four decades. It exploded because stock-based compensation (mechanically inflated by the buybacks those same executives authorized) made extraction enormously lucrative once the tax code stopped penalizing it.
The Two-Income Trap
In 1960, approximately 30% of married women participated in the labor force. By 2023, that figure had reached 61% [7]. Dual-earner married couples went from roughly 25% to over 60% of all married households. The entry of women into the workforce was one of the defining social advances of the twentieth century.
It also masked an economic crisis.
Real median male earnings have been essentially flat since the early 1970s, measuring approximately $55,000 in 2022, scarcely different from $54,000 in 1975, adjusted for inflation [7]. The rise in household income since the 1970s was driven almost entirely by women entering paid employment, not by men's wages growing. Households doubled their labor input and, by every structural measure, ended up less financially secure.
Elizabeth Warren and Amelia Warren Tyagi documented the mechanism in 2003. A single-earner family in the early 1970s spent approximately 54% of its income on fixed costs: mortgage, one car, healthcare, taxes. By 2000, a dual-earner family spent roughly 75% of its income on the same fixed costs, despite having two incomes [6]. The second paycheck didn't create surplus. It was absorbed by the inflation of non-negotiable expenses (housing, healthcare, childcare, transportation) that rose far faster than wages.
Childcare illustrates the trap precisely. In the single-earner household of 1960, childcare was not a major budget item because a parent was home. For today's dual-earner family, it costs $10,000 to $15,000 per child per year nationally, and $20,000 to $35,000 in metro areas [6]. The Federal Reserve's 2024 survey found that among parents who used paid childcare, just over half spent at least 50% as much on childcare as they did on housing [2], itself the single largest household expense. The second income didn't double the family's purchasing power. It created a new category of mandatory spending that consumed much of the gain.
The safety net of the 1960s household, the non-working spouse who could step into the workforce in an emergency, no longer exists. Both incomes are already committed. When one is lost, the system collapses. Financial fragility in a dual-income economy isn't a paradox. It's an arithmetic inevitability when fixed costs consume three-quarters of total household income.
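The trap's arithmetic can be made concrete. A minimal sketch using the fixed-cost shares above; the 75% income bump from the second earner is a hypothetical round number for illustration, not a figure from the text:

```python
# Warren/Tyagi-style comparison, with incomes normalized so the 1970s
# single-earner household income = 1.0.
single_income = 1.00
single_fixed = 0.54 * single_income   # 54% to mortgage, one car, healthcare, taxes

dual_income = 1.75                    # hypothetical: second earner adds 75%
dual_fixed = 0.75 * dual_income       # 75% of the larger income is committed

# Discretionary income barely moves despite the second paycheck...
single_discretionary = single_income - single_fixed   # 0.46
dual_discretionary = dual_income - dual_fixed         # 0.4375

# ...and the committed costs now exceed what either income alone can cover.
print(f"Dual-earner fixed costs: {dual_fixed:.2f}x one full income")
```

Discretionary income is essentially unchanged (0.46 versus roughly 0.44), while losing either paycheck leaves the household short of its committed 1.31 incomes' worth of fixed costs, which is exactly the fragility described above.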
The Costs That Changed the Equation
Three categories of expense account for the gap between what the economy produces and what families can afford: housing, healthcare, and education. Each was a manageable household cost in 1960. Each is now capable of destabilizing a middle-class family on its own.
Housing was the single greatest wealth-building mechanism for the American middle class. The median home price-to-median household income ratio held at approximately 2.3 during the 1940s through the 1960s. By 2022, it had reached 5.8 nationally, and 10 to 15 times median income in coastal metros like Los Angeles and San Francisco [8]. First-time homebuyers fell to 26% of the market in 2022, the lowest share since tracking began in 1981 [9]. A record 22.4 million renter households were cost-burdened in 2021, spending 30% or more of income on rent. Median reported rent hit $1,200 per month in 2024, rising roughly 10% per year since 2022 [2]. Homeownership for adults ages 30–34 dropped from 52% among Boomers to 42% among Millennials [26].
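What the ratio shift means in sticker prices is easy to see. A sketch assuming a hypothetical $75,000 median household income (a round figure chosen for scale, not taken from the text; 12.0 stands in as the midpoint of the 10–15x coastal range):

```python
# Price-to-income ratios from the text, applied to a hypothetical income.
median_income = 75_000  # assumed round figure for illustration

eras = [
    ("1940s-1960s", 2.3),
    ("2022 national", 5.8),
    ("2022 coastal metro", 12.0),  # midpoint of the 10-15x range
]
for era, ratio in eras:
    price = median_income * ratio
    print(f"{era:>17}: {ratio:>4}x income -> ${price:,.0f}")
```

The same income that once faced a $172,500 home faces a $435,000 one nationally, and $900,000 in a coastal metro.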
Healthcare consumed 5% of GDP in 1960. By 2022, it consumed 17.3% [11]. The average annual premium for employer-sponsored family health coverage went from $5,791 in 1999 to $23,968 in 2023, roughly quadrupling while wages stagnated [10]. The United States spends approximately twice per capita what comparable wealthy nations spend and achieves lower life expectancy. Unlike in any peer country, a single medical event can wipe out a family's accumulated savings; healthcare-related costs are implicated in roughly two-thirds of personal bankruptcies.
Education was either free or trivially affordable for much of the postwar period. The Pell Grant covered 79% of the cost of a public four-year institution in 1975–1976; by 2023, it covered 29% [12]. Total outstanding federal student loan debt stands at $1.77 trillion across 43.2 million borrowers, a 17-fold increase in 25 years [13]. Student debt at this scale simply did not exist before the 1980s. It now functions as a generational tax on the young, one their parents never paid.
The combined effect is that Millennials, despite following the prescribed path (degrees, jobs, dual incomes), hold roughly 5% of national wealth at ages comparable to when Boomers held 21% [26]. Their median net worth at 35 was approximately $39,000, compared with $83,000 for Boomers at the same age.
Perception and Truth
There is a particular form of gaslighting embedded in American economic discourse. It runs something like this: the economy is strong, unemployment is low, GDP is growing, so if you're struggling, something must be wrong with you. Work harder. Get a side hustle. Stop buying lattes.
The Federal Reserve's own data refutes this framing with unusual precision.
In 2024, 37% of American adults could not cover a $400 emergency expense with cash or its equivalent [2]. Only 51% spent less than they earned in the prior month [2]. Sixty percent said price changes over the previous year had made their financial situation worse [2]. Only 35% of non-retirees said their retirement savings were on track [2]. These are not the numbers of a prosperous nation experiencing a perception problem. These are the numbers of a nation where prosperity is real but unevenly distributed to the point of structural dysfunction.
The perception gap itself tells a story. When 29% of Americans rate the national economy as "good" or "excellent" despite low unemployment and record stock market levels, the obvious question is: good for whom? The stock market's performance is largely irrelevant to the bottom 50% of households: 84% of corporate equities are held by the wealthiest 10% [18]. Consumer sentiment, tracked by the University of Michigan since 1952, diverged dramatically from objective macroeconomic indicators after 2020, settling at levels typically associated with recessions even as GDP grew. Lower-income households consistently report worse economic perceptions because economic recoveries since 2009 have, in measurable fact, disproportionately benefited those who already owned assets [2].
The counter-arguments deserve honest engagement. Consumer goods — electronics, appliances, clothing — have become dramatically cheaper in real terms. A smartphone in 2025 delivers more computing power than a 1990s university lab. Medicare and Medicaid provide healthcare access that didn't exist in the 1950s. Some economists argue the CPI overstates inflation, making real wage growth appear worse than it is [27]. These points are valid. They are also insufficient. You cannot defer buying a home, paying for healthcare, or sending your children to college the way you can defer buying a television. The three largest household expenditures (housing, healthcare, and education) have inflated at three to five times the rate of general consumer goods for 50 years. Cheaper phones do not offset a housing market that requires 5.8 years of household income where it once required 2.3.
The gig economy is perhaps the most telling indicator that the "work harder" narrative has been fully absorbed and is still failing. Thirty-six percent of employed Americans now identify as independent workers (gig, freelance, contract, or temporary), up from 27% in 2016 [24]. Among them, 26% said they did it "out of necessity to support basic family needs," nearly double the 14% who said so in 2016 [24]. The share doing it for discretionary income halved, from 40% to 20% [24]. This is not a generation discovering the joy of entrepreneurship. It is a labor force already stretched thin trying to plug the gap between what one or two full-time jobs pay and what basic life costs. Sixty-two percent of independent workers would prefer permanent employment if they could get it [24]. The Fed's survey confirmed that a meaningful share of gig workers said that without gig income, they would have trouble making ends meet [29].
The Trickle-Down Ledger
Supply-side economics has now had four decades to prove its central claim: that reducing taxes on top earners and corporations will generate growth that benefits everyone. The empirical record is settled.
The 1950s, when the top marginal rate was 91–92%, produced average annual real GDP growth of approximately 4.2%. The 1960s, with rates between 70% and 91%, produced 4.5% [21]. The 1980s, after Reagan cut the top rate to 28%, produced 3.2% [21]. Correlation is not causation; postwar infrastructure spending, the GI Bill, and strong union membership all contributed to the earlier boom. But the data flatly contradicts the claim that high marginal rates suppress growth. The two decades with the highest tax rates on record produced the fastest sustained growth in American history. Diamond and Saez estimated the revenue-maximizing top rate at approximately 70%, based on empirical elasticity data [28]. The IMF's own cross-country study of 159 nations found that higher income shares for the top 20% are associated with lower GDP growth, while higher shares for the bottom 20% are associated with higher growth [30], the precise inverse of trickle-down theory.
What the tax cuts unambiguously did produce was concentration. The top 1%'s pre-tax income share was approximately 10% in 1980. By 2015, it had doubled to 20–22% [19]. The 2017 Tax Cuts and Jobs Act delivered its benefits along the same gradient: 37% of the total tax reduction in 2018 went to the top 1% of earners. As individual provisions for middle-income households expire while corporate cuts remain permanent, the top 1%'s share is projected to reach 53% of total TCJA benefits by 2027 [17].
The buyback surge following TCJA illustrates how the mechanism works. The corporate rate cut from 35% to 21% was sold as a catalyst for investment and hiring. What it catalyzed was a record $1.1 trillion in stock buybacks in 2018 [18], money flowing directly from corporate balance sheets to shareholders, 84% of whom are in the top decile of the wealth distribution. In the years after the TCJA, multiple S&P 500 companies spent more on buybacks and dividends than on total capital expenditure, meaning they returned more to shareholders than they invested in productive capacity [18]. Real nonresidential fixed investment (the actual measure of whether companies are building, hiring, and expanding) remained flat or declining as a share of GDP despite record corporate profits.
The American middle class shrank in response. In 1971, 61% of American adults qualified as middle-income. By 2021, that figure was 50%. The middle class's share of aggregate income fell from 62% to 42% over the same period, while the upper-income share rose from 29% to 50% [22]. The middle is not disappearing into prosperity. Most of the departure is downward.
The Fertility Verdict
There is a number that captures the cumulative weight of everything described above more viscerally than any Gini coefficient or productivity-pay ratio: 1.62.
That is the total fertility rate in the United States in 2023, the average number of children a woman will have over her lifetime. In 1960, it was 3.65. It fell below replacement level (2.1) for the first time in 1972 and has continued dropping [14].
The birth rate is often treated as a cultural or demographic curiosity. It is actually an economic signal of extraordinary power. People are not making an abstract philosophical decision when they choose not to have children, or to have fewer. They are performing the calculation that the economy demands of them: Can we afford this?
The median age at first marriage has risen from 20.3 for women (22.8 for men) in 1960 to 28.6 (30.4) in 2022 [14]. Young adults are delaying marriage and children because they are still paying off student debt, because they can't afford a home with enough space, because they know that childcare alone will cost $15,000 to $35,000 a year on top of the 75% of dual-income household earnings already consumed by fixed costs. The birth rate decline tracks with rising housing costs, student debt burdens, and delayed household formation in the data [14]. It is not a mystery. It is a referendum.
To tell a generation drowning in structural costs that they should work harder, that needing a side hustle or struggling financially is a personal failing, is to misdiagnose a systemic condition as a character defect. The data shows a workforce that is already working. Two incomes where one once sufficed. Gig work on top of full-time employment. Childcare costs that didn't exist a generation ago. Family health premiums approaching $24,000 a year. Housing that consumes two and a half times the share of income it did in the 1960s.
Following the Money
None of this is mysterious. The money is traceable.
Between 1950 and the mid-2010s, labor's share of business income fell from approximately 65% to 56–57%, a decline of 8 to 9 percentage points [4]. At current GDP scale, that shift represents roughly $2 trillion per year redirected from workers to capital owners. Corporate profits as a percentage of GDP doubled from 5–6% in the postwar decades to 11–12% by 2012–2015 [4]. The financial sector expanded from 3–4% of GDP in the 1950s to 7–8% by the 2000s and now captures 25–30% of all corporate profits while employing a small fraction of the workforce [18].
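The $2 trillion figure is reproducible from the shares given above. A rough sketch; the $28 trillion GDP level and the business-income share of GDP are outside assumptions for illustration, not figures from the text:

```python
# Rough scale of the labor-share decline in current dollars.
gdp = 28e12                   # assumed current US GDP, ~$28 trillion
business_income_share = 0.75  # assumed share of GDP that is business income
share_decline = 0.085         # midpoint of the 8-9 percentage-point drop

annual_shift = gdp * business_income_share * share_decline
print(f"~${annual_shift / 1e12:.1f} trillion per year from labor to capital")
```

This lands at about $1.8 trillion a year, which rounds to the roughly $2 trillion cited above; the exact figure moves with the GDP level and the business-income share assumed.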
The postwar era demonstrates that a different distributional architecture is possible within a capitalist market economy. When unions are strong enough to bargain, when tax rates make extraction less attractive than reinvestment, when financial regulation prevents speculation from dominating productive activity, when public investment builds infrastructure and human capital, growth gets shared. That isn't theory. It was American policy for 30 years. It produced the highest sustained growth rates, the broadest middle class, and the greatest intergenerational mobility in the nation's history.
The obstacle to restoring some version of that architecture is not economic. Every mechanism that redirected gains upward (the tax code, antitrust enforcement, labor law, financial regulation, the legal framework for buybacks) is a policy lever that was deliberately moved and can, in principle, be moved back. The obstacle is political: the same concentration of wealth that these policies enabled now funds the political apparatus that protects them.
A child born today has a coin-flip chance of outearning their parents. That was once 92%. The distance between those two numbers contains the whole story — not of an economy that failed, but of one that succeeded spectacularly and then changed the locks on who was allowed to benefit.