Many Americans have seen graphs like this:
Much of this is due to the use of different price deflators for productivity and compensation: productivity is typically deflated with an output price index, while compensation is deflated with a consumer price index. Over the past forty-one years, U.S. output has risen in price much more slowly than the cost of living for the typical American worker. When the same price deflator is used for both series, the compensation-productivity divergence looks much more recent, having begun in late 2001 rather than in 1973:
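The deflator effect can be sketched with hypothetical numbers (not actual BLS data): even if nominal productivity and nominal compensation grow at identical rates, deflating one with a slower-rising output price index and the other with a faster-rising consumer price index manufactures a steady "divergence".

```python
# Hypothetical illustration: identical nominal growth, different deflators.
# Suppose nominal output per hour and nominal compensation per hour
# both grow 5% per year for 10 years.
years = range(11)
nominal = [100 * 1.05 ** t for t in years]

# Assume the output deflator grows 2%/yr while consumer prices grow 3%/yr
# (made-up rates chosen only to show the mechanism).
output_deflator = [1.02 ** t for t in years]
cpi = [1.03 ** t for t in years]

real_productivity = [n / d for n, d in zip(nominal, output_deflator)]
real_compensation = [n / d for n, d in zip(nominal, cpi)]

# "Real compensation" now lags "real productivity" by about 1% per year
# purely from the choice of deflator.
gap = real_productivity[-1] / real_compensation[-1]
print(round(gap, 3))  # → 1.102, i.e. (1.03/1.02)**10
```

Under these assumed rates, a ten-percent gap opens in a decade even though workers were "paid" every cent of nominal productivity growth, which is why deflator-consistent comparisons matter.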
Alternatively, the same graph can be drawn on a logarithmic scale, so that an exponential growth trend appears as a straight line rather than a curve:
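Why a log scale straightens exponential growth can be checked directly: for a hypothetical series growing at a constant percentage rate, the differences between successive logarithms are all equal, which is exactly the definition of a straight line.

```python
import math

# Hypothetical series growing at a constant 3% per year (illustrative only).
series = [100 * 1.03 ** t for t in range(20)]

# On a log scale, constant-percentage growth plots as a straight line:
# successive differences of log(series) all equal log(1.03).
log_diffs = [math.log(b) - math.log(a) for a, b in zip(series, series[1:])]
print(all(abs(d - math.log(1.03)) < 1e-12 for d in log_diffs))  # → True
```

This is why a deviation from the post-1973 trendline is easier to see on a log plot: a constant-growth benchmark is a straight line there, and any shortfall shows up as a widening vertical gap.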
I strongly suspect a good portion of the divergence from the exponential growth trendline after 1973 is due to falling and stagnating worldwide per capita oil production.
The Heritage Foundation suggests that most of the recent wage-productivity divergence, after accounting for the difference between the deflators, is due to overstating input price increases and to not accounting for depreciation in output measurements. I doubt the latter, as depreciation did not significantly increase in 2008. It is interesting that measured average real hourly productivity increased during the 2008-9 recession (a phenomenon almost unique to the United States). I suspect real hourly wages did not rise as fast as real hourly productivity in the U.S. in 2008-9 due to the effects of higher unemployment on wages. In the 1990, 2001, and 2008 recessions in the U.S. (but not before), low-productivity workers raised their hourly productivity to avoid being fired, leading to rapidly rising real hourly productivity while wages stagnated due to the labor surplus. I do not know why this pattern only arose in the U.S. after the 1980s.