It takes money to make money?
Does it really take money to make money?
There was a comment last week on the blog that came at just the right time to bridge an important mental gap for me. In reference to capital investments made in support of a development pattern (in this case, an entertainment district), the following was shared:
It takes money, to make money. Will it work, who knows?
I am confident that, at some point in my professional career as an engineer and planner working for and advising local governments on capital improvement projects, I used the phrase, "it takes money to make money." It is a distinctly American idiom, and it is particularly poignant as we move through the Desperation Phase of the Suburban Experiment.
The idea here is that one needs to put something at risk if one is going to experience a gain. A related idiom is "nothing ventured, nothing gained." And clearly we can see that, within the Ponzi Scheme of Growth we find ourselves in, local governments that do nothing to generate growth are generally seen as falling behind (in the near term).
I have about a dozen directions I want to take this, but for today I'm just going to focus on this one line of thinking: Why do we seem to culturally believe this is true? What makes a broad mass of people generally supportive of speculative public projects like new highways, stadiums, a reconditioned downtown or a new "entertainment district," all sold to the public as "growth"?
The Inevitability of Disaster
My favorite Malcolm Gladwell essay is called Blowing Up: How Nassim Taleb turned the inevitability of disaster into an investment strategy. This was my introduction to the Patron Saint of Strong Towns Thinking (Taleb) and I've listened to the essay on audiobook at least 40 times. It resonates with me so deeply because I professionally identified with Taleb's nemesis, Victor Niederhoffer, while I intellectually identified with Taleb. I continually go back to the essay to help make sure I haven't slipped back into my old ways professionally.
The essay looks at how people judge risk, the assumptions that go into making those judgments, and then how people respond. Niederhoffer, a brilliant empiricist quoted as saying, "everything that can be tested must be tested," is the ultimate expert. He has the knowledge, the experience and the rigorously tested equations that give him the ultimate confidence to wager hundreds of millions of dollars on the direction of the market.
By contrast, Taleb was confident he knew nothing about the direction of the market. More precisely, he was confident that other people—those considered "experts"—knew less about the direction of the market than they believed they did. The brilliance of Taleb as a hedge fund manager is that he made his money essentially off of other people's arrogance. The Niederhoffers of the world would confidently sell options with the full belief that their PhDs and back-tested equations allowed them to identify safe bets. Understanding uncertainty and complexity, and thus having an appreciation for the unpredictable, random events that the very orderly Niederhoffer approach did not account for, Taleb would buy many of those options.
On most trading days, Niederhoffer would make a small amount of money while Taleb would lose a small amount of money. This had the perverse effect of inflating Niederhoffer's unfounded confidence, as each winning day made him more and more sure that he knew what he was doing. Of course, there were those unpredictable events, ones that could not be back-tested, that would hit the scene every now and then (think 9/11). When they did, a Niederhoffer investor would "blow up," losing everything at once, while Taleb would not only survive, but thrive.
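If it helps to see the asymmetry in miniature, here is a toy simulation. Every number in it—the premium, the payout, the odds—is a made-up illustrative assumption of mine, not anything taken from Gladwell or Taleb:

```python
import random

# Toy model of the asymmetry: the option seller collects a small
# premium most days; the buyer pays it. On a rare, unpredictable
# day, the seller owes a catastrophic payout and the buyer collects.
PREMIUM = 1              # small daily gain for the seller (made up)
BLOWUP_PAYOUT = 5_000    # rare catastrophic payout (made up)
BLOWUP_ODDS = 1 / 2_500  # chance per day of the rare event (made up)

def simulate(days, seed=0):
    """Return cumulative P&L for (seller, buyer) after `days` days."""
    rng = random.Random(seed)
    seller = buyer = 0
    for _ in range(days):
        seller += PREMIUM               # most days: seller wins small...
        buyer -= PREMIUM                # ...and the buyer bleeds small
        if rng.random() < BLOWUP_ODDS:  # ...until the rare event hits
            seller -= BLOWUP_PAYOUT
            buyer += BLOWUP_PAYOUT
    return seller, buyer

print(simulate(250))     # one typical year: the seller looks brilliant
print(simulate(25_000))  # a long career: the blow-ups dominate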
Niederhoffer was me as a young engineer and then later as a young engineer/planner. I knew what I was doing. I was extremely confident. I had the education, the standards and the theories to demonstrate that I was the expert. I had licenses and accreditations that even gave me initials behind my name. If you were a city and wanted to bet on your future—to invest money to make money—you would have come to me. I was a Victor Niederhoffer in my own professional domain.
And many times, what I recommended worked just fine. Or at least worked long enough that nobody remembered me when it stopped working. I projected population growth, traffic counts, sewage flow and whatever else I needed to. That was my job and I was an expert. To say I didn't know would not only have been looked at as professional malpractice, it would have been suicide for my career. I was never faced with that dilemma, though, because I believed deep in my heart that I knew the answer. While I could have accepted that I made a mistake in my calculations—I obsessively checked them for that reason—I never paused to consider that my entire approach might be deeply flawed.
The Danger of Overconfidence
I'm re-reading Antifragile by Nassim Taleb, the most important book of the past year. As my mind was disembarking from vacation during the plane ride home yesterday, I read this quote (page 215 of the hard copy):
Expert problems (in which the expert knows a lot but less than he thinks he does) often bring fragilities, and acceptance of ignorance the reverse.* Expert problems put you on the wrong side of asymmetry. Let us examine the point with respect to risk. When you are fragile you need to know a lot more than when you are antifragile. Conversely, when you think you know more than you do, you are fragile (to error).
*Overconfidence leads to reliance on forecasts, which causes borrowing, then to the fragility of leverage.
How many times did I recommend a project go forward based solely on my expert opinion, however informed it may have been? How many times did I rely on projections to justify enormous expenditures, projections I didn't understand or have any business making? How many times, instead of taking a deep and critical look at what I really knew, would I throw out a reassuring axiom like "it takes money to make money"? And how many times would I believe it myself?
The answer, sadly, is too many times to count.
On how many of those recommendations did I have any downside risk for being wrong? Which of these projects, if I had grossly overestimated in my projections, would have damaged my career or my earnings? Was there any scenario where a project I was leading went forward and I didn't get paid a bonus? Would I ever have recommended that a project desired by the city officials not proceed?
I've been honest with myself and can acknowledge the truth: never.
So what makes us so confident that investing someone else's money to build an entertainment district, expand a highway, extend a utility line or construct a stadium is a good investment? We never calculate—let alone track—the public's actual return-on-investment (dollars in versus dollars out over multiple life cycles) when we do a project. We never even ask the question. And most critically, we never learn from our mistakes; there is no feedback mechanism other than total collapse (which we would just blame on someone else anyway).
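To make the missing calculation concrete, here is a back-of-the-envelope sketch of dollars in versus dollars out across life cycles. Every figure is a hypothetical assumption of mine, chosen only to show the shape of the problem:

```python
# Back-of-the-envelope version of the question we never ask. All
# numbers are hypothetical, illustrative only.
def city_net(annual_revenue, annual_maintenance, years,
             replacement_cost, city_pays_capital):
    """Net dollars to the city over one life cycle of a project."""
    net = (annual_revenue - annual_maintenance) * years
    if city_pays_capital:
        net -= replacement_cost  # the bill for rebuilding the asset
    return net

# First life cycle: the capital cost is covered by someone else (a
# developer, the state, a federal grant), so the books look great.
print(city_net(300_000, 150_000, 25, 10_000_000, False))  # +3,750,000

# Second life cycle: same revenue, but now the city owns the
# replacement bill, and the "investment" is deeply underwater.
print(city_net(300_000, 150_000, 25, 10_000_000, True))   # -6,250,000
```

The point of the sketch is not the specific numbers but that the question is answerable, cheaply, before the money is spent—and that we don't ask it.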
We engineers, planners and other "experts" do projects because that is what we do. Period. There is no other deeper wisdom at work here.
Our cities should not be risking failure
I plan to build on this line of thought in subsequent posts, but for the time being, the takeaway for today is found within the Gladwell piece:
Unlike Niederhoffer, Taleb never thought he was invincible. You couldn't if you had watched your homeland blow up, and had been the one person in a hundred thousand who gets throat cancer, and so for Taleb there was never any alternative to the painful process of insuring himself against catastrophe.
This kind of caution does not seem heroic, of course. It seems like the joyless prudence of the accountant and the Sunday-school teacher. The truth is that we are drawn to the Niederhoffers of this world because we are all, at heart, like Niederhoffer: we associate the willingness to risk great failure—and the ability to climb back from catastrophe—with courage. But in this we are wrong.
That is the lesson of Taleb and Niederhoffer, and also the lesson of our volatile times. There is more courage and heroism in defying the human impulse, in taking the purposeful and painful steps to prepare for the unimaginable.
Our cities should not be risking great failure; or any failure, for that matter. The concept of investing (other people's) money in order to make money, especially when pushed by a class of experts with no personal risk or skin in the game, is not one that local governments should be pursuing. We need a new model, a Strong Towns approach.
Coming Soon
In a subsequent post, I'm going to explain how cities should be investing public dollars so as to capture the upside potential of growth without the downside potential of failure and waste.