Why We Create

Before the year ends, read Jia Tolentino’s searing essay “The Worst Year Ever, Until Next Year”:

In any case, someone will tweet “worst year ever” every few minutes until 2016 is over, and then people will begin tweeting “worst year ever” as soon as 2017 begins. They will type “worst year ever” because of spilled drinks and late Ubers, a new Trump story, a new dispatch—if she miraculously manages to survive until then—from Bana Alabed, the seven-year-old girl in Aleppo who’s been tweeting, with her mother’s help, her fears of imminent death. There is no limit to the amount of misfortune a person can take in via the Internet, and there’s no easy way to properly calibrate it—no guidebook for how to expand your heart to accommodate these simultaneous scales of human experience; no way to train your heart to separate the banal from the profound. Our ability to change things is not increasing at the same rate as our ability to know about them. No, 2016 is not the worst year ever, but it’s the year I started feeling like the Internet would only ever induce the sense of powerlessness that comes when the sphere of what a person can influence remains static, while the sphere of what can influence us seems to expand without limit, allowing no respite at all.

Perhaps it is the horror that swells, perhaps it is our awareness of it.

Yet I have friends who agree that this year was terrible culturally, and declare that it was their most fulfilling and happy year personally. This doesn’t diminish the personal pain that many others have gone through, but allows that there is a limit to the usefulness of abstract empathy. Despair is a passive verb. Even anger is more useful. One can both mourn and feel joy.

A shitty year is the most compelling argument for building oneself a joyful refuge. It doesn’t help anybody to freeze out in the cold.

It might be your best year ever. Please, fiddle while Rome burns. More than ever we need your songs.

What Came Next

And then one year, all the stores raised their prices for Black Friday.

Everything was 100%–800% more expensive. During the month of November, the stores hyped their Black Friday mark-ups: $3001 for a bulky, standard-def TV; $801 for a blender.

Thanksgiving evening, the overnight lines for Black Friday “doorblocker” sales (6 AM to 8 AM, minimum 30 per customer) were meager. A few stalwarts huddled in the cold, driven more by stubborn tradition than genuine enthusiasm.

The bulk of Americans stayed home that Friday. They made breakfast with leftovers. They sipped coffee and chatted on Messenger. They wondered how to spend their free time.

The best Black Friday of all: so much time saved.

Why We Do Better

We learn the most from the events we are least prepared for. Assuming, of course, that we survive the initial collision.

In his essay on the wisdom of lifting barbells, Nassim Taleb applies the principle of tail risk to strength training. Our bodies get stronger not from the monotonous humdrum of routine activities (rising from bed, sitting in a car, sitting in the middle row at team meetings, walking to our car, etc.), but from exposure to infrequent extremes: lifting weight off the ground at the very edge of our muscular and skeletal capacity.

He uses the analogy of weight-testing a bridge:

“You will never get an idea of the strength of a bridge by driving several hundred cars on it, making sure they are all of different colors and makes, which would correspond to representative traffic. No, an engineer would subject it instead to a few multi-ton vehicles. You may not thus map all the risks, as heavy trucks will not show material fatigue, but you can get a solid picture of the overall safety.”

Frequent, trivial insults chip away at a system (low back pain, carpal tunnel, etc.). Rare, intense shocks may strengthen it.

For the past decade, Millennials have faced a glut of minor knocks but, outside of the 2008 U.S. recession, relatively few cultural hammers. Despite being the savviest participants in social media, the most connected and technologically capable, and having the broadest access to education and global impact, commentators describe the average Millennial as sheltered, anxious, and timid.

It makes sense that a generation insulated from failure and conflict would popularize the concept of microaggressions: frequent, trivial insults that chip away at self-esteem and dignity. And, like an overactive immune response, the battle against microaggressions has not strengthened Millennial political or social clout.

To the majority of voting Millennials, the election of Donald Trump was a tail event: an unthinkable catastrophe, an existential threat made concrete.

A macroaggression.

What if that was exactly what the generation needed? An extreme event that would organize, mobilize, and strengthen the entire system? What if the most connected, most educated generation was also the most politically engaged?

We learn the most from the events we are least prepared for. Assuming, of course, that we survive the initial collision.

Why We Learn

Before we get too smug about the past century of scientific progress, note that scientists developed the theory of special relativity before reaching a consensus on the existence of the female orgasm.

Scientific inquiry into sex didn’t begin until the 20th century, and until the 1950s remained at the extreme fringes of biology, medicine, and psychology. Participants were nearly impossible to recruit. Many researchers completed their studies with prostitutes, research assistants, spouses, and, when necessary, themselves:

“Rather than risk being fired or ostracized by explaining their unconventional project to other people and trying to press those other people into service, researchers would simply, quietly, do it themselves.”

Mary Roach, Bonk

Measurement instruments were even more difficult to acquire. Mid-20th century researchers like William Masters and Virginia Johnson built their own makeshift penis cameras to get a better look at the action:

“The dildo camera unmasked, among other things, the source of vaginal lubrication: not glandular secretions but plasma seeping through the capillary walls in the vagina.”

Mary Roach, Bonk

TIL.

Here’s the thing: The Hubble telescope recently photographed a galaxy more than 13 billion light years away, literally looking back in time toward the formation of the universe. It is very likely that we will develop 3D-printed human kidneys for transplant before we develop a complete model of the mechanics of human insemination.

We are taught to view technology as the bottleneck for understanding the world around us. If only we had more engineers and data scientists to build the gadgets and crunch the numbers, we’d usher in our age of abundance.

We could build a machine to perfectly record and analyze every detail of human sexuality, and we’d still be screwed without thousands of people willing to strip down and jump in, without governments and universities willing to fund the studies, without teachers and parents ready to broach the subject. Until we de-stigmatize human bodies and everything we like to do with them, we’ll never fully understand or heal them.

For our most important human problems, technology is not the bottleneck.

The bottleneck is people willing to talk frankly, to act shamelessly, to share generously.

The bottleneck is culture.


Bonk, by Mary Roach, is a frank, generous, and hilarious look at the history and science of sex. Check it out.

Why We Act

I can’t stop thinking about smash cakes since I learned about them last night.

Smash cakes are whole cakes that parents give to their babies on their first birthday to mash into with their faces, to dig into with their hands, to messily revel in, like a tiny infant hurricane tearing through a frosted beachside villa.

99 times out of 100, I’m sure parents just want to have a fun day and a cute photo op.

But, parental intent be damned, there is more than just batter in this cake.

What is a smash cake made of?

1. Vicarious indulgence: Every single 30-year-old I’ve talked to about smash cakes has replied with some variation of, “Jesus, I want that immediately.” When we watch an infant grip her cake with two small fists and smear her cheeks in frosting, we are reminded of how rarely we let ourselves plunge recklessly, shamelessly into pleasure. Cake smashes are no doubt fun for the baby, but they are cathartic to the adults hovering behind the highchair, cameras in hand. For ten minutes, our imaginations smash the cake too, fully present, carelessly free. Just like Pixar movies and trampoline parks, smash cakes are really for us, not them.

2. Ritualized destruction: I am reminded of sand mandalas, the exquisite, kaleidoscopic depictions of the divine universe created by Buddhist monks over days or weeks. After completion, mandalas are destroyed, brushed into an urn, and poured into a river to demonstrate the impermanence of all things. Similar rituals of artistic destruction appear throughout history and across cultures, all the way to present-day festivals like Burning Man. Smash cakes carry this lineage of sacred ephemerality. One could argue that infants are better suited than monks to carry out this act of destruction, for even their memories of the event are lost to time. Parents, as usual, miss the almighty point by documenting the occasion like a Kardashian wedding.

3. The first hit of sugar: Smash cakes provide many babies with their first taste of processed sugar. Parents see this as a moment to celebrate. I can’t help but mourn. For most in the Western world, sugar is less a treat than a chronic toxin, strongly linked to the wave of metabolic syndrome, diabetes, and obesity that is crushing entire communities. While sugar doesn’t create the physiological dependency that opioids do, the taste preferences and habits we acquire as infants are arduous to reprogram as we age. In this context, watching a cooing parent push a frosted slice under their reluctant child’s nose recalls the dread of a slasher flick. I yell at my screen, tell her to run, run. The protagonist is deaf to my cries.

4. Shut up, it’s just meaningless fun: You read all this and sigh, come on, man! It’s not a ritual or a meditation or a metaphor for jack shit. It’s a fucking cake and it’s a fun, silly thing. Shut up. It’s meaningless. But (I reply) that is meaningful. (You are on the verge of punching me at this point.) I continue: a first birthday marks the symbolic end of an age of meaninglessness.

We demand nothing of infants. They act on impulse, gleefully free of the cultural ideas and interpersonal norms that shape our every shudder. Outside of a few sensations (the sights and sounds and smells of parents prime among them), very little has meaning to them. They could crash a Rolls Royce into the last living polar bear without breaking a sweat, and no jury would convict them, because they don’t know what any of those things are.

Around 12 months old, babies begin to develop mental representations of the world. They notice that Buzz Lightyear continues to exist even when he is hidden behind mommy’s back. They form a hazy understanding of cause and effect, of goal and intent. As they begin to comprehend that a world exists beyond their field of vision, that world starts to place basic expectations upon them about how to exist. We snack on the fruit of knowledge, and suddenly we’re told to put on some damn underpants.

A first birthday is our grand entrance into civil society, with its rules and taboos and demands. In this light, smash cakes form the centerpiece of a sort of baby stag party, one last sensuous celebration of egocentric independence, a hedonistic abandon that will soon be wrenched away forever.

This means nothing to them. What a gift.

I can’t stop thinking about smash cakes.

Why We Talk

A middle-aged woman at the table next to you chokes up as she discusses with her friend the swift unraveling of Brangelina’s marriage. “They just seemed so politically aligned, too.” She sighs hard, a frustrated push, like forcing air from a bike tire. She clutches her friend’s forearm: “Ugh, and what about the kids?”

You get the sense from her nervous worry that she’s not really talking about Brangelina.

Conventional wisdom in the United States is not to talk intimately with strangers.

When we chat with a mutual friend at a birthday party, we might cover the recent celebrity breakup, but we don’t dare share our worry that we selected the wrong spouse.

Sharing our own turmoil feels too intimate, too vulnerable. We don’t trust that the person across the table won’t judge us, attack us, or run far, far away.

And so, we use intermediaries.

This is why we created banks. As commerce expanded between cities, strangers needed a way to track and exchange debt with each other, but didn’t have a way of verifying the amount of money each had, or a means to safely deliver their funds. When you said you’d pay me for a shipment of blue dye, I couldn’t trust you had the money, or that your money would reach my hands. So we created banks as a central ledger, a trusted, neutral 3rd party to verify and conduct the transaction.

Popular culture acts as a bank for our attitudes and beliefs. We might share our deepest fears, our guilty desires, our stubborn, fact-free opinions with our closest friends, in the same way that we pick up the tab for a pricey dinner, figuring it will all even out over the course of our relationship.

But most of us use pop culture as an intermediary, a trusted 3rd party. We use celebrity relationships to talk indirectly about our own relationships. We side with the actions of TV characters to vent our nagging suspicion that we’ve disappointed a friend. We argue about presidential politics to confront our fear that the person sitting in the seat next to us doesn’t want us to exist.

Banks are convenient. They are a useful, responsible means to conduct transactions. But those who delegate too much trust to them, who don’t actively work to manage their own finances, tend to get burned.

As we age, we also come to learn that we must stop using pop culture as a container to launder our personal feelings. That we must speak with honesty, vulnerability, and passion directly to the people we need to hear us: our spouse, our friends, and yes, that person in the seat next to us wearing the red hat. That we can’t store our identity in the bank; we must spend ourselves in the real world.

What Came Before

The misogyny and vicious trolling that sprouted around Gamergate was, in retrospect, a foreshadowing of the ideology and tactics that would coalesce around the alt-right’s political ascendancy two years later.

For whatever reason, the video game community tends to feel the first tremors of broader cultural and economic upheaval.

A second, less doom-laden example:

Once upon a time, if you wanted to learn about video games, you went to either Gamespot or IGN. These websites were the most trusted and visited sources for video game coverage, and by the mid-2000s had largely supplanted the print media industry.

In 2007, Gamespot fired editor-in-chief Jeff Gerstmann because he gave a mediocre review score to a game, and that game’s publisher happened to be a significant advertiser on the site.

In 2008, Gerstmann and several former coworkers started Giant Bomb, and over the next several years, Giant Bomb helped to forge the new landscape of video game journalism. They phased out written coverage and moved to 30-60 minute videos of them playing games as they talked and joked, the way you might if you were sitting on the couch with a friend. They were one of the first sites to have weekly podcasts, analyzing the week’s news and digressing into bizarre conversations and inside jokes for hours.

What most differentiated Giant Bomb was that the creators were the stars. Their personalities, insights, and senses of humor were front and center. They didn’t hide their preferences and non-gaming obsessions. Newcomers visited Giant Bomb for the first time to hear about games, but fans returned daily to hear from Jeff, Brad, Ryan, and Vinny, regardless of the topic. As a result, they’ve dabbled in spinoff podcasts about pro wrestling, Formula 1 racing, and life advice.

Giant Bomb happens to be the site where you can hear your favorite people talk about games; if those people leave Giant Bomb (particularly Jeff), the brand ceases to have value beyond its SEO ranking.

This shift in authority from institution to individuals has rippled out beyond video game journalism over the past five years. We see it across all media. Not long ago, if you were a politics journalist, there were a half-dozen publications that might be your ultimate goal: The New York Times, The Washington Post, The Wall Street Journal, etc. If you worked for The New York Times, your reputation was derived from your employer and the 150 years of credibility behind it. If you left The Times, readers couldn’t tell the difference.

Now, the opposite is true.

Nate Silver built his FiveThirtyEight blog on the strength of his work and participation in social media around the 2008 election. When The New York Times licensed FiveThirtyEight in 2010, they didn’t give him credibility – he gave them credibility (and huge traffic). And when Silver left in 2013, he took his fans with him.

Starting with video games, the power of publications to grant legitimacy to people has shriveled. In the age of the internet we care about individuals over institutions. Journalism and entertainment were the first industries to change in the new climate, but every industry historically dominated by institutional authority is at risk.

And schools are next.

Why We Love

My cousin moved into an adult dorm.

My friends fantasize about sharing a giant house. Perhaps it overlooks the eastern coast of Oahu. We meditate in the tangerine sunrise. We bake our own bread every evening.

And The Atlantic reports that young adults are flocking to communal living spaces around the country.

Thank Facebook. In fact, thank the entire constellation of apps and social networks that have led a generation of college grads into an uncanny valley of personal relationships. The more Facebook and Twitter and Tinder attempt to simulate the dynamics of human interaction, the more they push us to the precipice of disgust. In our monkey brains, we know the difference between face time and FaceTime. We feel connected as long as we lock eyes with our phones. Then we look up, and the room is empty.

What Came Next

When asked to imagine 100 years into the future, the first place your mind might go is to the new technology: what do we use to communicate? Are there chips implanted into our eyes? How common are personal robots and drones? What medical breakthroughs have we stumbled upon? Have we populated the solar system?

Science fiction has trained us to imagine the future first through the lens of technological advancement.

Perhaps it has limited our imagination.

Let’s travel 100 years into the past. We flag down a woman walking down 8th Avenue. We have a story to tell her about the future. We’re short on time (the time control device really eats up our phone’s battery), so we tell her two things:

  1. In one hundred years, we will have created a device that instantaneously enables you to send photographs and messages to anyone else in the world. Everybody will have this device, and it will fit in the palm of your hand.
  2. We will elect our first female president, the successor to our first black president.

Which statement will inspire more curiosity? More excitement? Which statement will compel her to see her present day in a new light? To change her mind? To take action?

Again, imagine 100 years into the future. Science fiction might be the least radical lens through which we may envision the world-to-come.

Why We Act

Go to a concert by yourself.

Listen to an album, front to back, with other people.

Go for a hike by yourself.

Read a book with other people.

Fly to Paris by yourself.

Take a shower with other people.

Whether by routine or taboo, we give our hobbies and habits a social orientation: private or public, alone or accompanied.

Insight blooms in new company.