Tuesday, January 22, 2008

Inconsistent Reports of Long-Term Stock Returns

I intend to review the book Worry-Free Investing by Zvi Bodie and Michael Clowes, but for now I want to discuss one part of the book that reports higher historical U.S. stock market returns than I was expecting.

In Chapter 6, the authors discuss U.S. stock returns from 1926 to 2000. They actually give real returns, meaning returns after subtracting out inflation. Based on the historical data, they calculate the average yearly real return on stocks to be 9.3%. But others say that the long-term real return on U.S. stocks is 7%.

This may not look like a big difference, but if you play around with one of the many free retirement calculators available online, you’ll find that an extra percent or two of return on your investments each year makes a big difference over the long term. So, which one is the correct historical average return?

It turns out that they are both right because they are talking about different things. The 9.3% figure comes from taking the returns from each of the 74 years, adding them up, and dividing by 74 (a simple average). This is what we usually mean by the average return. It is sometimes called the arithmetic average to distinguish it from other types of averages.
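For anyone who wants to see the calculation spelled out, here is a rough Python sketch of the simple average. The yearly returns below are made-up placeholders for illustration, not the actual 1926 to 2000 data:

    # Arithmetic (simple) average of yearly real returns.
    # These returns are hypothetical placeholders, not the actual 1926-2000 data.
    yearly_real_returns = [0.12, -0.05, 0.30, 0.08, -0.20, 0.15]  # 0.12 means 12%

    arithmetic_average = sum(yearly_real_returns) / len(yearly_real_returns)
    print(f"Arithmetic average return: {arithmetic_average:.1%}")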

According to Bodie and Clowes (see pages 86 and 87), $100 of stock in 1926 rose to a value of $20,000 in the year 2000 (after factoring out inflation). If the stock market had risen by the same percentage every year instead of jumping up and down, what return would have given the same performance? Instead of 9.3%, the answer is 7.4%. This is the average compounded return (also called the geometric return).
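Here is the same compounded-return calculation as a short Python sketch, using the $100 and $20,000 figures from the book and the same 74-year count as above:

    # Compounded (geometric) return implied by growing $100 to $20,000 over 74 years.
    start_value = 100.0
    end_value = 20_000.0
    years = 74

    compounded_return = (end_value / start_value) ** (1 / years) - 1
    print(f"Compounded return: {compounded_return:.1%}")  # roughly 7.4%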

The difference between the average return and the compounded return depends on how much returns vary from one year to the next. The more an investment’s value jumps up and down, the bigger the difference between the two kinds of average. This may seem similar to what is going on with risk-adjusted returns, but they are not the same. Risk-adjusted return calculations artificially penalize volatile investments by more than this difference between the averages.
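To see how volatility pulls the two averages apart, compare two hypothetical investments that have the same simple average return but very different amounts of year-to-year jumpiness (these numbers are invented for illustration only):

    # Two hypothetical return series with the same arithmetic average (5%)
    # but different volatility. The jumpier one compounds to less.
    steady = [0.05, 0.05, 0.05, 0.05]
    jumpy = [0.30, -0.20, 0.30, -0.20]

    def arithmetic_mean(returns):
        return sum(returns) / len(returns)

    def compounded_mean(returns):
        growth = 1.0
        for r in returns:
            growth *= 1 + r
        return growth ** (1 / len(returns)) - 1

    for name, series in [("steady", steady), ("jumpy", jumpy)]:
        print(f"{name}: simple average {arithmetic_mean(series):.1%}, "
              f"compounded {compounded_mean(series):.1%}")

The steady investment compounds at its full 5%, while the jumpy one compounds at only about 2% despite having the same simple average.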

Which of these two averages is the right one to use? That depends on what you are using it for. If you want to know the average return for one year, then use the arithmetic average. If you want to know what is likely to happen if you let some money ride for a long period of time, use the compounded return.
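For anyone who wants to check this with a quick experiment, here is a rough Python simulation using a made-up return distribution (a coin flip between +30% and -10% each year). It shows that the typical long-term outcome tracks the compounded return, not the simple average:

    # Let $100 ride for 30 years with random yearly returns, then compare the
    # typical (median) outcome with what each kind of average predicts.
    # The +30%/-10% distribution is made up for illustration.
    import random
    import statistics

    random.seed(0)
    yearly_choices = [0.30, -0.10]
    simple_average = sum(yearly_choices) / len(yearly_choices)   # 10%
    compounded_average = (1.30 * 0.90) ** 0.5 - 1                # about 8.2%

    years = 30
    trials = 10_000
    outcomes = []
    for _ in range(trials):
        value = 100.0
        for _ in range(years):
            value *= 1 + random.choice(yearly_choices)
        outcomes.append(value)

    print(f"Simple average predicts:    ${100 * (1 + simple_average) ** years:,.0f}")
    print(f"Compounded return predicts: ${100 * (1 + compounded_average) ** years:,.0f}")
    print(f"Median simulated outcome:   ${statistics.median(outcomes):,.0f}")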

This may all seem paradoxical. How can both averages have any meaning? For those who want to understand it without much math, see this post for a simple example that explains this paradox and its connection to risk and volatility.
