Why We Learn

Before we get too smug about the past century of scientific progress, note that scientists developed the theory of special relativity before reaching a consensus on the existence of the female orgasm.

Scientific inquiry into sex didn’t begin in earnest until the 20th century, and until the 1950s it remained at the extreme fringes of biology, medicine, and psychology. Participants were nearly impossible to recruit. Many researchers completed their studies with prostitutes, research assistants, spouses, and, when necessary, themselves:

“Rather than risk being fired or ostracized by explaining their unconventional project to other people and trying to press those other people into service, researchers would simply, quietly, do it themselves.”

Mary Roach, Bonk

Measurement instruments were even more difficult to acquire. Mid-20th century researchers like William Masters and Virginia Johnson built their own makeshift penis cameras to get a better look at the action:

“The dildo camera unmasked, among other things, the source of vaginal lubrication: not glandular secretions but plasma seeping through the capillary walls in the vagina.”

Mary Roach, Bonk

TIL.

Here’s the thing: The Hubble telescope recently photographed a galaxy more than 13 billion light years away, literally looking back in time toward the formation of the universe. It is very likely that we will develop 3D-printed human kidneys for transplant before we develop a complete model of the mechanics of human insemination.

We are taught to view technology as the bottleneck for understanding the world around us. If only we had more engineers and data scientists to build the gadgets and crunch the numbers, we’d usher in our age of abundance.

We could build a machine to perfectly record and analyze every detail of human sexuality, and we’d still be screwed without thousands of people willing to strip down and jump in, without governments and universities willing to fund the studies, without teachers and parents ready to broach the subject. Until we de-stigmatize human bodies and everything we like to do with them, we’ll never fully understand or heal them.

For our most important human problems, technology is not the bottleneck.

The bottleneck is people willing to talk frankly, to act shamelessly, to share generously.

The bottleneck is culture.


Bonk, by Mary Roach, is a frank, generous, and hilarious look at the history and science of sex. Check it out here.

Why We Create

When Tim Ferriss observes, “Almost every friend I have who is a consistently productive writer, does their best writing between 10 p.m. and 8 a.m.,” he fires yet another missile in the endless conflict between owls and larks. Morning people extol the world-building virtues of their productivity. Night people insist that art, inspiration, romance – everything worth staying awake for – reveal themselves in the moonlight. Well-meaning scientists draw up non-rigorous studies that convince nobody of anything and both sides of their moral, intellectual, and societal superiority. Exhausted neutrals are asked to take a stand: Team Sunlight or Team Moonlight.

But:

Maybe you are not a morning person or a night person, but a morning person and a night person. Two people, each with their own sets of talents. Perhaps one is going unheard.

Some writers distinguish ideation from synthesis. Ideation periods are spent brainstorming, developing concepts, taking notes, reading, and outlining. Synthesis is when the “actual” writing takes place, turning that note slurry into solid (or shaky) prose.

For most of us, ideation and synthesis demand different levels of focus, distinct forms of mental energy, separate muses. Perhaps your curiosity rises with the sun. You spill over with ideas after breakfast. Yet you can’t sit still and bang out your masterpiece until your partner heads to bed. You tell your friends you are a night person, but the truth is a little more interesting: your ideation is a lark, your synthesis is an owl.

The mistake (as usual) is thinking you are one person.

Observe not whether, but how you are an owl and a lark.

Then, spend the afternoon in the noblest manner: napping.

What Came Before, What Came Next

At times, say while preparing one’s morning coffee, we can be struck by the unnerving sensation that we are presently living in someone else’s past.

We are surely not the first to experience the feeling. A young man in England around the time of the Renaissance is struck by the same realization as he walks to his class on medicine at Oxford. While most of his professors continue to lecture on the ancient Greek theory of the four humors, a small, rebellious sect within the college advocates for an empirical, observation-based approach to medicine. The young man suspects that in time, careful practice and experimentation will overturn centuries of theory.

It takes nearly two hundred and fifty years before Pasteur and Koch establish the existence of germs and their role in spreading disease. The following century sees widespread vaccination and plummeting infant mortality.

The young man at Oxford, in his moment of epiphany, understands that he is presently living in the past.

I prepare my morning coffee and consider that the field of modern psychology is barely a century old. We believe our technology to be advanced, our pharmaceuticals effective, the foundations of our theories sound. In reality, our understanding of the brain remains faint. We are at sea, squinting at the hint of a coastline through a fog.

Two hundred and fifty years from now, a young man walks to a seminar at Oxford titled Neuroarchitecture and Rearchitecture. On the way, he feels the distinct sensation of presently living in someone else’s future. He makes a mental note to ask his professor about the phenomenon after class.

What Came Next

I continue to dwell on Max Planck’s assertion:

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Life expectancy for a 50-year-old living in the UK has increased by ten years during the past century. Some of our sharpest thinkers aim to increase it indefinitely.

An ironic consequence of their possible success: the longer we live individually, the less time the human race may survive collectively. When generations are replaced at a slower rate, a species is slower to genetically adapt to environmental changes.

A second irony: longer life spans slow the acceptance of scientific truths that could further lengthen life spans.

Kurzweil suggests that life spans will lengthen at an exponential rate, but the truth is the opposite. Immortality is asymptotic; the closer we get, the slower we’ll go.

What Came Next, Why We Fuck Up

The Sugar Conspiracy is a long article that might save the lives of the people who take thirty minutes to read it. Rather than summarize its thesis and nutritional implications, I think there are three biases worth setting aside time to consider:

1. In-group bias and the politics of academia: It is incorrect to think of the general public as the target audience of academics. Academics write for other academics. As a result, the dynamics that pervade the psychology of any group – the drive for consensus, the alienation of outsiders, the convergence around charismatic leaders – subvert the supposed objectivity of the scientific process.

Further complicating this: funding for research is tied to its results. Just as movie studios and musicians produce whatever the crowd wants to buy, academics tend to produce what will be funded and what will allow them to continue their careers. Hard sciences with fewer direct implications for human behavior (e.g. physics) are less affected by this than social sciences, where results influence business and politics.

2. Confirmation bias and progress via death: Changing a person’s mind through reason is impractical. Add to this the social and monetary motivations for social scientists to remain steadfast against contradictory evidence, and you have a field that generates tremendous confirmation bias. That’s why this quote by physicist Max Planck sticks out to me:

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Feel free to substitute social, political, or culinary progress for the “scientific” kind.

3. Authority bias and the threat of the crowd: When the printing press was first invented, there was debate within the Catholic Church about whether the Bible should be made available to the masses. Until this point, literacy was largely limited to priests, whose responsibility was not only to read the Bible to the congregation, but to interpret its truths. Catholic leaders were concerned that ordinary people would not be able to “correctly” interpret the word of God without an intermediary. The availability of the Bible in homes led directly to the overthrow of orthodoxy by Martin Luther and the Protestant Reformation.

We have reached a familiar precipice thanks to the democratizing force of the internet. Over the past century, the scientific process has been the province of academics who take pride and pleasure in regulating which voices are valid. Blasphemers of academic orthodoxy have been No-True-Scotsman’d out of conferences, funding, and tenure. But once again, advances in communication technology have “flattened hierarchies everywhere they exist.” More voices do not necessarily mean better signal, but they do topple the prevailing authority bias throughout academia. In the next couple of decades, we will see a return to prominence of citizen scientists who crowdsource their validation with other non-credentialed enthusiasts.

Which, ironically, is how the scientific method first emerged.