If you are an investor, you cannot escape forecasting.
Those who dismiss all predictions commit an obvious blunder. If you are making investment decisions, your conclusions -- along with the hidden assumptions and reasoning behind them -- depend on forecasts.
So don't kid yourself. You are a consumer of forecasts made by others. These predictions influence your investment decisions in dramatic ways. Suppose, for example, you have decided to sell all of your stocks. Josh Brown explains why this is a mistake:
When you go to 100% cash, something changes in your mind. You get to this place where mentally you won't be satisfied unless you pick the perfect entry point. It's harder to do some buying at 100% cash than it is at 75% cash or 50% cash - at least then you're "adding to positions" rather than taking a stand from scratch. Price will drive you crazy without an existing position or average cost to anchor you, whether it's higher or lower than the current quote.
Do not rely on this single quote from a fine article; I urge you to read the entire piece. The concept of avoiding extremes may be the single best way to improve your investment performance.
The first step in this process is realizing the most important fact: Like it or not, you are a consumer of investment forecasts.
Putting a finer point on this, you are probably terrible at evaluating these predictions. When you go to buy a car or a refrigerator, you do some real work. How much effort do you make when evaluating investment forecasters?
Avoiding Bad Pundits and Gurus
In a very fine Harvard Business Review article on the general topic of pundit reviews (once again, well worth reading in full), Scott Anthony identifies "Four Strategy Gurus to Avoid." There is a lot of worthwhile nuance, but here is the quote that you will find most useful:
The opiner. Michael Mauboussin, the Chief Investment Strategist at Legg Mason Capital, eloquently describes how the key thing to evaluate when looking at equity investors isn't what they buy but how they choose what to buy. In other words, ask "What models do they use to make decisions? What process do they follow to gather data?" The same holds true for pundits. Offering opinions without explaining underlying assumptions or mental models isn't helpful. The real world is complicated and any prediction ought to at least spell out the assumptions that would have to be true for a prediction to hold. Wrong predictions based on well-reasoned (at the time) assumptions are useful because they help strategists develop their own instincts for assessing future technologies.
What to Evaluate
With all of this in mind, here is my summary of how to evaluate forecasters:
- Is the forecaster an expert in the subject at hand?
- Does the forecaster have a track record?
- What model is being used?
- What assumptions have been made?
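The checklist above works as a conjunctive filter: a forecast deserves your attention only if every test passes. Here is a minimal sketch in Python; the field names and the strict pass/fail framing are my own illustration, not something from the article:

```python
from dataclasses import dataclass

@dataclass
class Forecaster:
    """Hypothetical record of what we know about a forecaster."""
    name: str
    is_subject_expert: bool    # expert in the subject at hand?
    has_track_record: bool     # verifiable history of past calls?
    states_model: bool         # explains the model being used?
    states_assumptions: bool   # spells out the assumptions made?

def passes_all_tests(f: Forecaster) -> bool:
    """The forecast is worth weighing only if every test passes."""
    return all([f.is_subject_expert, f.has_track_record,
                f.states_model, f.states_assumptions])

# Illustrative examples only -- not real evaluations.
pundit = Forecaster("TV pundit", is_subject_expert=False,
                    has_track_record=False, states_model=False,
                    states_assumptions=False)
strategist = Forecaster("equity strategist", is_subject_expert=True,
                        has_track_record=True, states_model=True,
                        states_assumptions=True)

print(passes_all_tests(pundit))      # False
print(passes_all_tests(strategist))  # True
```

The point of the all-or-nothing check is that a single missing element (no track record, or opinions with no stated model) is enough to disqualify a source.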
If major media outlets actually applied these tests, there would be a crisis! They would have no content!
Major Arenas of Forecasting
Here are the most important issues. If you knew the answers to these questions, you could make big profits from your predictions.
- Will there be a recession in the next year? This means a specific definition with a time frame. No BS.
- What is the forecast for corporate earnings in the next year? Be specific.
- Will the European governments successfully contain their debt problems?
- Will the Supercommittee reach a conclusion that will be enacted into law?
Investment Conclusion
Regular readers of "A Dash" know that I am working on all of these fronts. The average investor has been bombarded with negative news: economic headwinds, Europe, recession odds, and plenty of politically-charged commentary.
My own take is that the ill-informed punditry is too negative. Most of the popular pundits fail all of the tests we recommend. This is good news -- very good news -- since it suggests an extreme in market valuation. It helps to explain why stocks like Microsoft can trade at six times next year's cash flow.
I have written past pieces on all of these specific topics, but today it is important to pull it all together. I plan more specific analyses of each of the major arenas.
Acknowledgment
While I read many sources, I have a special reliance on Abnormal Returns. When I am on the road for a few days (as I was last week) I know how to make sure that I am completely in touch with the market. This story was inspired by links from AR. People should realize the synergy in the blogosphere. Excellent curation of financial stories stimulates new stories.
[long MSFT]
I wanted to add to my previous comment specifically on the assessment of experts. Jeff produced a list of good questions:
"With all of this in mind, here is my summary of how to evaluate forecasters:
Is the forecaster an expert in the subject at hand?
Does the forecaster have a track record?
What model is being used?
What assumptions have been made?"
...but here is the problem. In fourth-quadrant situations (where both the probability of outcomes and the consequences of outcomes are very difficult to estimate), experts have a very poor track record of predicting low-probability/high-consequence events. Some typical examples are the Arab Spring, the fall of the Berlin Wall, the 1929 crash, World War I, etc.
Which brings me to the Europe situation, which is deep into quadrant four. If Europe does turn into a disaster, it will be against the predictions of people who are truly experts on Europe (by the criteria of Jeff's list).
Now, just because real experts don't predict some disaster doesn't mean that every wacko doomsday prediction will come true! But it does mean that some unexpected and high consequence events will happen that are not predicted by experts.
One last point about Europe. We don't know much about what the consequences of various "bad" outcomes might be. But what we do know is that if there is a "bad" outcome, the larger the amount of eurozone debt that goes into default, the more damaging it will be for the world economy and financial system.
As all of the "solutions" in Europe so far involve loading up the weakest debtors with more debt, I think the potential damage from a "bad" outcome is getting larger, even if the probability of a "bad" outcome is not.
Posted by: Angel Martin | October 22, 2011 at 01:53 PM
Jeff, thanks for this article.
It's an important reminder for us to look again at the forecasts we make or use: what we have done ourselves vs. what we take from others, what the model is, what is being assumed, what the track record is, how solid it is, etc.
Angel
Posted by: Angel Martin | October 22, 2011 at 08:54 AM
A very good post today Jeff. Thanks...
Posted by: louis | October 21, 2011 at 09:37 AM
Brian -- It is pretty easy.
No one believes the people whose entire careers are involved in forecasting earnings -- the people you are citing.
Instead they choose to believe some guy who says "earnings estimates are too high," even though there is no evidence and no method behind this opinion.
So it is a simple choice: believe the equity strategists who you cite or believe some pundits who say they are wrong.
I am curious. How do you choose?
Thanks for joining in.
Jeff
Posted by: oldprof | October 20, 2011 at 10:54 PM
I don't see how you can argue that most forecasters are pessimistic when almost all equity strategists at asset firms and banks predict robust double-digit earnings growth next year and year-end targets 10+ percent higher than current levels.
Posted by: brian | October 20, 2011 at 10:23 PM