Monthly Archives: November 2013

Make Small Mistakes: A Mood Is Not A Personality

Alex Tabarrok has a new paper out showing that easy availability of guns increases the number of suicides. (Read the comments section on that post; it’s full of good tidbits, like the fact that there are more suicides farther from the equator.) This is an econometric study, but there is a psychological angle:

Our econometric results are consistent with the literature on suicide which finds that suicide is often a rash and impulsive decision–most people who try but fail to commit suicide do not recommit at a later date–as a result, small increases in the cost of suicide can dissuade people long enough so that they never do commit suicide.

In other words, while we think of some people as “suicidal,” this is just the fundamental attribution error rearing its ugly head. Another example:

  • That other person didn’t signal before changing lanes because they’re a bad driver.
  • I didn’t signal before changing lanes because I forgot, I’m tired, my kids are yelling in the backseat, etc.

People’s behavior is determined by the situation; their feelings are transient or generated on the spot. Very little of their behavior can be pinned to permanent characteristics or explicit intentions. But our first inclination is the opposite. If this idea tickles your fancy — if you’d like to learn a lot more about how the situation can affect your behavior — read The Person and the Situation, even though it didn’t make my list of top 5 behavioral economics books and even though Malcolm Gladwell wrote the Introduction. You should also browse this deck of cards showing how the physical design of the environment can affect your actions.

There are also fascinating implications for the study of crime. Gary Becker revolutionized the field by pointing out that crime isn’t done by “criminals” — it’s done by ordinary humans who face different costs and benefits than the rest of us. Of course, this isn’t the final word. Most crimes are crimes of passion; between a fifth and a third of prisoners were drinking at the time of their offense. To prevent crime, we don’t need to make 25-year sentences longer; we need to somehow get around all that System 1 decision-making. And an important new paper shows that Cognitive Behavioral Therapy is staggeringly effective:

The intervention … included … in-school programming designed to reduce common judgment and decision-making problems related to automatic behavior and biased beliefs, or what psychologists call cognitive behavioral therapy (CBT). Program participation reduced violent-crime arrests during the program year by … 44 percent … the benefit-cost ratio may be as high as 30:1 from reductions in criminal activity alone.

The paper also finds improved schooling outcomes. Reducing the importance of your automatic decision-making can have huge benefits. The complementary strategy — the one Tabarrok’s paper on suicide suggests — is to reduce the size of the mistakes your System 1 self can make. Your impulsive self can really screw you over if you’re not careful: make sure he doesn’t have a gun.

A Little Navel-Gazing

Just a brief trip five years down memory lane to revisit one of the better sentences written about me to date:

Potok’s sophomoric personal attacks against outgoing liaison Hollie Gilman may be gratuitous, but they do not detract substantively from his ideas or campaign.

I took this as a compliment, because I was only a freshman at the time! It pairs well with my recent letter to the same newspaper:

When I majored in economics at the University of Chicago, the coursework was taught with an eye towards the complexity of the outside world and an understanding that “models”—a word Golovashkina uses as an epithet—are the scientific way to best understand that complexity. A simple map does not imply a simple territory; I’d recommend Jorge Luis Borges’ “On Exactitude in Science” to anyone still confused about the usefulness of a life-sized map.

Finally, someone named Louis Potok was buying real estate in Chicago in 1922.

My Dictum and Your Blind Spots

In Thinking, Fast and Slow (one of my 5 behavioral economics must-reads), Kahneman lays out the memorable idea that “What You See Is All There Is.” He explains it nicely in a brief interview:

WYSIATI means that we use the information we have as if it is the only information. We don’t spend much time saying, “Well, there is much we don’t know.”

These are the famous unknown unknowns that I’ve written about — the gaps or blind spots you wouldn’t think to look for. These are so important that I think it’s worth updating Asimov’s Dictum: “The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘eureka!’ but ‘that’s interesting…’” Similarly, Potok’s Dictum is that whenever you’re planning, evaluating, or making decisions:

The most exciting phrase that heralds good decision-making is not “here’s the answer” but “oh, I hadn’t thought of that.”

Whenever a new idea, perspective, or fact appears, treat it carefully and feed your inner pigeon so that you learn to keep generating them! A useful addition to your cognitive toolkit would be a set of ways to find those blind spots you would otherwise totally ignore. Here are a few I’ve thought of or seen elsewhere:

  1. Explain it to a child.
  2. Ask a third party with “middle-level” knowledge of the issue: not an expert, but not a child.
  3. Find a new taxonomy to organize everything you know about the issue — the “worse” it seems, the better. For example, if you’re planning a project chronologically, try to list all the aspects of the project by department or division instead.
  4. Perform a premortem: ask yourself “If this project fails spectacularly, what will have caused it?”
  5. Twiddle the knobs, à la Daniel Dennett’s must-read advice about Intuition Pumps, or Polya’s guidelines for understanding mathematics. Change each part of what you know — pushing some parts to their extremes — to see what happens. “What if we had 10 years to do this? What if we only had three days?” “What if only 5 people show up to our event? What about 50, or 500, or 5,000?” Most of these questions will be a waste of time, but a few might completely change the way you’ve been thinking. (A toy sketch of this knob-twiddling follows the list.)
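
To make the knob-twiddling concrete, here’s a throwaway sketch in Python. Everything in it (the event, the profit model, the prices and costs) is made up for illustration; the only point is the mechanical habit of sweeping each assumption to its extremes and noticing which combinations surprise you.

    # Toy "twiddle the knobs" sketch: sweep each assumption of a plan to its
    # extremes and flag the combinations where the outcome changes character.
    from itertools import product

    def event_profit(attendees, ticket_price, fixed_cost=2000, cost_per_attendee=15):
        """Deliberately crude model of an event's profit; every number here is invented."""
        return attendees * (ticket_price - cost_per_attendee) - fixed_cost

    # The knobs, each with a few extreme settings to try.
    knobs = {
        "attendees": [5, 50, 500, 5000],
        "ticket_price": [10, 25, 100],
    }

    for attendees, ticket_price in product(knobs["attendees"], knobs["ticket_price"]):
        profit = event_profit(attendees, ticket_price)
        flag = "  <-- surprising?" if profit < 0 else ""
        print(f"attendees={attendees:>5}  price=${ticket_price:>3}  profit=${profit:>8}{flag}")

Most of the rows are boring, which is the point; the one or two that aren’t are exactly the blind spots you were looking for.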

Readers: how do you find blind spots? What tools should I add to this list? Have you ever completely overlooked something that was obvious in retrospect?

Mathematical Decision-Making

Whenever you make a decision, you have to consider a universe of facts. Suppose you’re trying to decide where to go on vacation.

First, you might make a list of all the cities you would consider as a vacation spot and all the facts you know about them. These are the integers, and you can get pretty far with a few simple operations. You know that Paris has great museums, London has bad food, and Berlin has great nightlife.

But there are also huge gaps in the facts you know. If I teach you division, you can turn the facts you know into a much larger set of facts you don’t know — the rational numbers. These are your known unknowns: questions you can formulate but haven’t yet answered. What are the museums like in London? How’s the food in Berlin, and the nightlife in Paris? You can spend your whole life wandering around in the rational numbers, making pretty good decisions and really honing your long division skills.

But there’s a much larger infinity of things you’ve never thought of. Can you take a staycation, or take a cruise? Is that hand gesture you always make considered rude in any of these cities? These unknown unknowns are like irrational numbers. You need new, exotic operations to find them but once you learn how to look for them they’re everywhere you look — and there are so so many of them.
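
The “much larger infinity” isn’t just a figure of speech. The underlying fact (standard set theory, nothing specific to this analogy) is that the rationals are countable while the irrationals are not:

$$ |\mathbb{Q}| = |\mathbb{Z}| = \aleph_0, \qquad |\mathbb{R} \setminus \mathbb{Q}| = 2^{\aleph_0} > \aleph_0. $$

However diligently you enumerate your known unknowns, the unknown unknowns are of a strictly larger order.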

We spend so much time practicing our long division for 3-, 4-, and 5-digit numbers — staying in the known unknowns of the rational numbers — and not nearly enough time developing and using the new tools that will bring the unknown unknowns to our attention. This is one of the most important ideas you can have in your cognitive toolkit.