…that you are wrong?

This is not a question enough people ask themselves about their own convictions.

Unless we define our disconfirmatory evidence before making a decision, proof that we are wrong tends to be a continuously moving goalpost.

Our instincts deploy an army of biases ordered to guard our opinions from the outside world: confirmation bias, fundamental attribution error, dissonance reduction, on and on. The water in the pot heats up, and we convince ourselves that the temperature is just fine.

The only way we can fully negate the power of these biases is to define the information or events that would falsify our beliefs ahead of time. Explicitly writing down the evidence you require to overturn an opinion can also reveal the comic extent to which you’ve fenced yourself into a dangerous position. Forget about calculating “the odds”; what is the consequence if you are wrong?

Responsible scientists define and test against the null hypothesis.

Responsible investors define the lowest price at which they’ll hold their stock before buying into the market.

Responsible thinkers define the evidence sufficient to disconfirm their assertion, especially (especially) before sharing it on social media.

Critic Gene Siskel asked the following question to determine whether a movie was worth watching: “Is this film more interesting than a documentary of the same actors having lunch?”

The mandate to creators might be: you aren’t done until your final product is more interesting than its ingredients. The raw materials: the people, the process, the stories, the histories.

This is not an easy problem to solve; every step of cooking a beautiful meal can be more arresting than what ends up on the plate.

For the frustrated creative, a deeper appreciation for your ingredients may present the path forward.

When it comes to advice, we prefer the intuitive.

Rules of thumb are best when shaped like thumbs.

As a result, we are provided with many thumb-shaped things, and told they are rules.

Beginning in the 1960s, a misinterpretation of nutrition studies led a coalition of researchers to assert that dietary fat caused fat to accumulate on the body and in the bloodstream. Makes sense, we concluded, that ingesting fat makes one fatter.

It took nearly fifty years for better thinkers to dismantle this mistaken narrative.

“You are what you eat,” they told us.

Take caution around thumb-shaped things.

The distinction is not between hard subjects and soft subjects (which implies a degree of difficulty) or between technical and non-technical (which implies the necessity of technique), but between non-narrative and narrative disciplines.

Non-narrative disciplines examine or manipulate the present moment and operate deterministically. Mathematics, most physics (that we are aware of), programming, surgery, carpentry, and casino games, to name a few. Interpretation is irrelevant in these disciplines: what works, works, and what doesn’t, doesn’t. Properties can be mapped empirically. In fact, that’s the only way they can be mapped.

Narrative disciplines examine or manipulate the past, the future, or imagined worlds. They do not operate deterministically, no matter how badly we’d like them to – or how emphatically “experts” insist. They are not “subject to” interpretation; they are interpretation. Artists, historians, economists, chefs, social scientists, UX designers, and war fighters all share in this domain.

The danger is when non-narrative practitioners mistake their field for narrative, and vice versa.

In the former case, we narrate – tell stories about – why a given outcome has occurred. We declare we have beginner’s luck because we won the first round. We invent deities to explain the drought. Or, we dismiss evidence when we cannot easily produce an explanation: chiropractic treatment must be quackery because we don’t see a link between spinal manipulation and chronic knee pain. Besides, chiropractors don’t even have an M.D. (certification being another type of narration)!

In the latter case, we apply rules and statistics to systems that operate under randomness. We invent categories of people, predict the behavior of “rational” actors, forecast the next war based on the previous one. We are comforted when a man in dark-rimmed glasses tells us we have nothing to worry about. His model has taken into account the past, and runs on very expensive computers.

In any case, we’re sure that this time we’ve finally figured it out. And then we get a rude surprise.

Evan detests his cranky knees, his aching back, plastic tray tables, narrow armrests, and narrower seats. Four hours into an overnight flight from London to SFO, he considers kicking down the cabin door, blasting himself somewhere over the mid-Atlantic, and enjoying a couple minutes of supreme legroom as he makes his final descent.

No need to worry, passengers. Alas, he is wedged into a middle seat. He twists, desperately trying to crack his back, but the armrest blocks his rotation.

He shifts, left leg on right, right leg on left. Traces circles with his toes. Drums the armrest. Checks the time on his phone. He’s gotta move. He taps the middle-aged woman beside him on the shoulder and signals that he’d like to stand up. The woman pauses just long enough to create the impression that this is a great burden to her in-flight experience. He mouths a perfunctory apology as he lurches past her.

As he shuffles down the dark aisle, he considers how nice a long jog would feel after landing: something to sweep the crust from his joints, like shaking sand off a beach towel. Sadly, barring a miracle, he won’t have the time – the plane lands at 6 AM and he needs to be at school by 7:15: Evan “Mr.” Brosh is a seventh-grade teacher in the San Francisco Unified School District. He is returning from a prestigious conference-slash-networking event for gifted educators and ambitious tech investors. The conference was very productive: there was much talk of transformation and innovation and representation and he left with a dozen new LinkedIn contacts.

His mind wanders to his eighth period class. He feels a tightness in his chest, a weight behind his eyes. Several kids had become unruly over the past quarter. Their fidgeting had become so severe that he recently started writing names on the board. The thrill of returning from spring break would spike them even further. He’d have to spend the first half of class settling them down. He just knows it. He paces down the aisle, back to his seat. Why couldn’t those kids just sit still for forty-five minutes?

There is an irony to the white-collar worker who advocates fiercely for a Paleo diet, because the relationship between corporate employees and freelance artisans echoes the relationship between agricultural and hunter-gatherer communities.

For linguistic simplicity, let’s condense our subjects across timelines into hunter-artisans and agro-corporates.

Hunter-artisans follow abundance. Life as a hunter-artisan allows for, and often requires, regular movement to new fertile spaces. Agro-corporates tend to root to one place for as long as possible.

Hunter-artisans tend to share resources within their community: food, tools, technique, knowledge. Agro-corporates protect resources via private property, trade-secrets, and patents.

Gifts and favors animate hunter-artisan economies, while salaries and taxes form the basis of agro-corporate economies.

Gender equality is prevalent in hunter-artisan communities (though not a requirement). Women are frequently as powerful and influential as men. In agro-corporate cultures, patriarchy has prevailed for millennia.

In hunter-artisan communities, reputation within the group is vital; reputation outside the group is not relevant. In agro-corporate societies, the opposite is true.

Finally, it is curious to observe that the popularity of eating like a hunter-gatherer has reignited in parallel with the freelance economy. Call it the hunter-artisanal revolution.

Tip of the tongue phenomenon: when you have a specific word in mind, but can’t recall it. Like a masked villain, the word scampers from your custody at the critical moment. You scour the city. The word evades pursuit. You plead with a friend to name it. You reach for a thesaurus. The tension builds.

There’s a similar, less-researched, but equally common mental state: tip of the sword phenomenon. This is when you have a specific dragon to slay, some crucial action you must take, but can’t put your finger on what exactly it is. Over drinks, you complain to your mates. They guess wildly at what the problem might be: maybe you need to take a vacation. Exercise more. Find a new job. No, you say. That’s not it. Not exactly.

Homo sapiens have been around for a hundred thousand years, and yet we’ve found only one reliable cure for tip of the tongue, tip of the sword: long walks.

Your first name was fashioned by your parents: a gift, a prediction, a promise. Your surname holds hands with history; you were seated at the front of an ancestral queue. Your nickname was discovered by your friends, excavated from an inside joke. Your names are given to you.

Companies name themselves. Their names are not a mirror, but a selfie: a framing, a pose. Their promise, their history: all invention.

Imagine if it were the other way around.

It’s worthwhile to consider the ways the old instruct the young, and the young enlighten the old. Make no mistake: both generations have plenty to impart, and neither trusts the other.

And so, as we must, we teach with metaphor.

The old disguise their lessons of hard-earned experience and the assumptions of their age in the gloss of fiction, whether through Spiderman or Shakespeare, Jesus or John Cena. Salman Rushdie captures this dynamic in conversation with Paul Holdengraber:

“The thing about fairy tale, folk tale, and mythology is that these things in many ways contain the collective wisdom of the human race; these beautiful little things into which an enormous amount of moral and practical information is packed.”

The young teach the old through invention. New technology, new fashion, new relationships to work and to sex. Each innovation is a pocket manifesto that responds:

“Yes, and here is what you missed. This is the solution. This is what matters now.”

Story and invention, call and response. Should you need to teach a lesson, keep the age of your audience in mind.

A friend once told me about a game he’d play to spend the empty days as a teenager in Brooklyn. He would walk out the door, pick a direction, and keep walking. Every time he reached a traffic light, he’d turn left or right at random. For hours. No destination in mind; he’d end up where he’d end up. Always somewhere new. For curiosity. For adventure. For no reason.

We are tourists in a densely networked city of ideas, and there are tour guides at every corner. Friends. Blogs. Algorithms. Each suggesting which way to turn when we reach the intersection.

We tend to prefer this. We’d rather somebody hold our hand as we cross the street.

There is safety in curation.

Conversely, there is heroism in discovery.

There are few spaces left, physical or digital, where heroism is even possible. Curation and recommendation are so intrinsic to daily life that we confuse social proof for quality:

“This restaurant only has two reviews. Bad sign.”

“They’re not even on Spotify. Why would I go to their concert?”

“How helpful could this book possibly be if no one knows about it?”

This, I see, is the rare opportunity of a good public library: one of the last remaining cities without tour guides. A place for heroism. A place you can turn left, turn right, turn left, hand un-held. End up where you end up. For curiosity. For adventure. For no reason.