Analyses of the savings rates needed for a successful retirement (SSRs) typically assume constant real earnings growth throughout one’s career. However, data on the life-cycle earnings patterns of millions of U.S. workers suggest that earnings growth does not occur at a constant rate that matches inflation. Instead, earnings tend to increase at a decreasing rate during the early years of a career and decrease at an increasing rate in the later years. Using simulations of saving and dissaving over the life cycle based on both historical market returns and forecasted returns, the authors examine the impact of assuming these more realistic earnings curves relative to constant inflation-adjusted growth. Results indicate that failing to account for realistic earnings curves over the life cycle may overstate SSRs for lower-income households while understating them for higher-income households, and understate SSRs for younger households while overstating them for older households. Furthermore, after accounting for realistic earnings curves and Social Security benefits, historical SSRs of 10% or less are found for all but the highest-income households, though the specific effects vary with the simulation method used.
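The hump-shaped earnings pattern described above can be sketched as a log-quadratic curve in age, a common functional form for life-cycle earnings; the specific coefficients, ages, and the `constant_growth` and `humped_curve` helpers below are illustrative assumptions, not the authors' estimates.

```python
import math

def constant_growth(base, age, start_age=25, real_growth=0.0):
    # Constant real growth assumption: earnings merely keep pace with
    # inflation (0% real growth), so real earnings are flat over the career.
    return base * (1 + real_growth) ** (age - start_age)

def humped_curve(base, age, start_age=25, peak_age=50, curvature=0.001):
    # Log-quadratic earnings curve: rises at a decreasing rate early in
    # the career, peaks at peak_age, then falls at an increasing rate.
    # Normalized so that earnings at start_age equal `base`.
    return base * math.exp(-curvature * (age - peak_age) ** 2
                           + curvature * (start_age - peak_age) ** 2)

for age in (25, 35, 45, 55, 65):
    print(age, round(constant_growth(50_000, age)),
          round(humped_curve(50_000, age)))
```

Under these illustrative parameters, the constant-growth path leaves real earnings unchanged at every age, while the humped path peaks near mid-career and declines thereafter, which is the discrepancy driving the income- and age-dependent SSR differences the abstract reports.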