The Breakthrough Fallacy—What Change Actually Looks Like


Before Gandhi was Gandhi, he was just a lawyer. And apparently, not a very good lawyer. Then in 1893, while taking a train in South Africa, he was asked to remove himself from the first-class cabin, you know, because racism. He threw a shit-fit and refused. The white people duly threw him off the train, and it was while sitting there in the cold, damp night, alone, that Gandhi vowed to fight for equal rights. After using his philosophy of nonviolent protest to win civil rights victories in South Africa, he returned to India, where he successfully led the movement for Indian independence. Or so the story goes…

Before Einstein was Einstein, he was a clerk at a patent office. And apparently, not a very good clerk. Known for being a poor student, growing up with a speech impediment, and being a bit on the lazy and disorganized side, Einstein suddenly found himself, in 1905, amidst a flurry of scientific epiphanies. At the age of just 26, he published four papers in major scientific journals. These papers included his ideas about the theory of special relativity and his famous equation, E=mc². He would become world-famous, the very icon of genius in Western culture. Or so the story goes…

Name any famous or influential person and you’re likely to also turn up a similar breakthrough story of their success. Michael Jordan being cut from his high school varsity team inspired him to work harder and never fail again. Steve Jobs getting fired from his own company forced him to re-evaluate the way he worked with others. Barack Obama giving the keynote speech at the 2004 Democratic National Convention ignited a new base of liberal voters. Harrison Ford happening to get hired by George Lucas to build cabinets for his new house led to a friendship that eventually landed him the role of Han Solo.

The list goes on and on. Everyone seems to have their own big “breakthrough” to explain their success.

We describe our own lives with these same breakthrough narratives as well. When I talk about the choice to quit my day job, it’s told as an experience I had on my first day at work. When people ask me how my girlfriend and I met, we both have a little story about a conversation we had that immediately made us incredibly interested in one another. When I explain my decision to drop out of music school, it’s always told with a little anecdote of something my guitar teacher said to me during a lesson. Somehow, in my mind, that single exchange with my teacher ruined everything for me.

My guess is that you can describe the major inflection points throughout your life in a similar way. Every major life decision that has been monumentally influential on you today, you can likely point to a single moment that felt as if it inspired or motivated that change or transition.


And similarly, we apply this logic to our future as well. In our careers, we wait for our “big break.” When trying to meet somebody, we hope that each new person is “the one.” When we try to learn a new skill or improve ourselves in some way, we hope to have our big breakthrough or stumble upon an epiphany that will forever change the way in which we view ourselves and others.

We’re always waiting for that next big breakthrough in our lives, never knowing when it will show up.

Some people get frustrated because they feel like they work and work and no breakthrough ever comes. They feel like there should be some glorious epiphany, some god-came-down-from-heaven-and-flushed-my-toilet kind of life-changing event that will finally, once and for all, rid them of whatever it is that is hurting so much. Yet, despite their efforts, all they get is the agonizing reality of imperceptibly gradual change.

Then you have other people who feel as if they’re constantly having breakthroughs. Any time they’re upset or frustrated with something, anything that makes them feel better is treated as some sort of life breakthrough that will leave them changed forever — a new conversation with a friend or family member, a new visit to their therapist, taking a new Buzzfeed quiz — “Oh look, my spirit animal is a meerkat, that explains everything!”

Yet, despite their constant perceptions of their own breakthroughs and monumental shifts in emotions and thought patterns, outwardly their lives also continue to evolve and change at a snail’s pace. The majority of their behavior remains the same. The majority of their thoughts revert to their stubborn ways.

I believe this is because the “breakthrough” framework for personal growth is mostly an illusion. This idea that single events have a disproportionate effect on our identities and how we grow is the result of a perceptual bias. It’s nothing but a trick our mind plays on us to make our experiences more comprehensible and our progress seem more reproducible.

A Trick of the Mind

Memory is a funny thing. If I asked you the name of the street your best childhood friend lived on, you could probably immediately tell me. But if I asked you what shirt you wore three days ago, you probably have no idea.

That’s because memory is based more on importance and meaning than it is on time, detail or even facts.

Biologically speaking, memory is expensive.1 It requires a lot of energy to rewire our neurons and synapses. Our world is too varied and complex for us to remember absolutely every detail about every event, so our brain decides to take a few “shortcuts” when it comes to organizing information. It organizes experience in terms of meaning first, facts and details second.

So when you have a long, complicated experience with a lot of subtle micro-experiences, instead of analyzing and weighing every individual factor, the brain simply infers the overall meaning and then constructs the “facts” into a narrative to fit that meaning.


This is why eyewitness testimony in court cases is notoriously unreliable. Witnesses have already decided what the event means to them, and their memory unconsciously alters itself to fit that meaning.

It’s why lawyerless suspects are so susceptible to admitting to things that they didn’t do — the police convince them of the meaning first (“you’re a loser, you’re so stupid that you didn’t even realize what you were doing was wrong”) and then the suspect magically remembers committing the crime.

It’s why when we’re angry at somebody we could swear that they said something that they actually never said. It’s why when we’re sad about something, we feel like everything in our life sucks, even though only a couple things are wrong. It’s why when we’re embarrassed about something, we’re convinced that more people were paying attention to us than actually were.

The facts are pieced together to match the meaning, not the other way around.

Our memory constructs itself in little cause/effect stories based on whatever meaning we glean from a situation.2 These little cause/effect narratives are less accurate than they are useful. They help us remember things that are important. They help us predict what events may happen in the future.3 And as a result, they condition us to view momentous life changes, in both ourselves and others, as these single major cause/effect events that can be replicated and achieved.

But unfortunately this isn’t true.

Take Gandhi: It turns out that Gandhi was born to a family of successful politicians. His mother was a highly devout Hindu and often fasted for days on end, especially if she felt particularly aggrieved about something.

Gandhi grew up resenting the British occupation of India and he originally took a job in South Africa to avoid the stifling influences of colonialism. But once he was there and confronted with even worse instances of colonialism, he resigned himself to fight it. After all, he seemingly had no alternative.

And initially, his fight was purely legal and legislative. He was a lawyer, and so he represented Indians who were discriminated against in business and in law. It took over a decade for his brand of nonviolent protest to take shape and it only did so after he had come into contact with ideas of civil disobedience and radical nonviolence written about by Thoreau and Tolstoy and others. It took Gandhi over 20 years to successfully achieve (small) civil liberties victories for Indians in South Africa, and another 30 years to lead India to independence… from a British Empire that had just been decimated by World War II and left willingly.

It’s easy to look at Gandhi’s life and view it as the singular transformation of a great soul on one fateful night on a train. After all, the nickname “Mahatma” itself means “great soul.” But the truth is that Gandhi was born and raised with many of these sensibilities, and the colonial world he grew up in slowly chiseled him into the eccentric and radical revolutionary that history knows today. And even then, Gandhi was no saint. He participated in multiple wars, was an unabashed racist himself, reportedly beat and neglected his wife, and had a variety of odd proclivities and habits.

Gandhi’s shit stank, just like everybody else’s

History is complicated. Not to mention messy. Yet, the story we usually end up with is “man is normal, X happens, man is now exceptional.” But it doesn’t work that way.4

Einstein’s story is often told in a similar way. He was a bad student. He had a speech impediment. He was unorganized and lazy. He couldn’t even get into a good university. But he was so brilliant, he revolutionized modern science almost as a hungover afterthought.

Again, not true.

Einstein was brilliant. But his discoveries didn’t just come out of nowhere. And they certainly didn’t come out of a patent office.

Einstein’s discoveries were not a single stroke of genius as much as a life-long passion. He first got the ideas for his theory of special relativity when reading a children’s science book at age 10. The book talked about how fast electricity traveled. Little Einstein began to wonder what a beam of light would look like if you could travel alongside it at the same speed — after all, it would appear stationary.

By age 13, he had read Immanuel Kant’s Critique of Pure Reason — a thick and dense philosophy book about the limitations of our empirical observations. If that doesn’t impress you, I suggest picking up a copy and seeing if you can make it past page three. Chances are you can’t.

By age 15, he had mastered calculus and differential equations and by age 16 he had already been published in a scientific journal — it was a paper about observing the speed of light.

Taken in the context of his life then, Einstein’s theory of special relativity in 1905 comes as no surprise. He struggled for more than a decade to accomplish that feat. And it took almost 20 years for his theories to be taken seriously within the scientific community. Not exactly an overnight success. He started at age 10 and only really made it by around age 40. That’s a lifetime of work.

But even then, he wasn’t celebrated as the iconic figure he is today. It wasn’t until after World War II that he was seen as a hero and a genius — after all, he was a German Jew who came up with the theories that gave the US the atomic bomb and won World War II. What wasn’t to celebrate about him?


The Breakthrough Fallacy

In his book The Black Swan, author Nassim Taleb calls this tendency to remember everything in terms of simple cause/effect events “The Narrative Fallacy.” It’s the human brain’s habit to compress experience into pithy, repeatable narratives for us to not only remember better ourselves, but also to better communicate to one another.5

Taleb says the danger of the Narrative Fallacy is that it causes us to underestimate some of the more random and unglamorous influences on major life events — the fact that Michael Jordan’s hyper-competitiveness is an ingrained part of his personality and has nothing to do with his high school team; the fact that Steve Jobs’ later career success is better explained by the massive changes that had occurred in the consumer electronics market in the 90s than by him coming to terms with his inner asshole; or the fact that Harrison Ford got the chance to build George Lucas’ cabinets and become Han Solo because he had already been auditioning for small roles and networking throughout Hollywood for eight years.

Harrison Ford’s big “break” was essentially auditioning and taking shitty roles for eight years straight.

If the Narrative Fallacy says that we underestimate the influence of subtle and unmemorable events, then the logical flipside of the Narrative Fallacy is that we’re also likely to overestimate the importance of our most memorable life experiences — call it “The Breakthrough Fallacy.”

The Breakthrough Fallacy suggests that we overestimate the most memorable facets of other people’s development — Gandhi getting thrown off the train in South Africa; Einstein the lowly patent clerk publishing the theory that changed the world — as well as the most memorable facets of our own development — my guitar teacher telling me that practicing six hours a day wasn’t good enough.

But most importantly, the Breakthrough Fallacy also suggests that the biggest and most important changes we can make in our future are not necessarily the most memorable or the most dramatic. On the contrary, they’re likely to be contextual, circumstantial, slow and arduous, and in some cases, possibly not even conscious.

I get emails from readers all the time asking me for that one thing they can do to get over their ex, to work up the nerve to quit their job, to become a popular blogger/writer, to fix their relationship with their spouse, to get over their social anxiety, and on and on.

And I have no good answer for the simple reason that these are the wrong questions.

Part of the Breakthrough Fallacy is our human nature. Our psychological biases working against ourselves. But part of the Breakthrough Fallacy is cultural — the quick fix, the magic pill, the limited-time-only solution for three easy payments of $69.95. We are all looking ahead for that memorable breakthrough experience that never comes.

When we get over an ex, we feel it should be some momentous occasion laden with fireworks and champagne corks. But it’s more likely to be a quiet and unnoticed moment, sitting on a bus or train, silently watching the moon, being alone and being OK with it.

When we get over an insecurity, we expect some explosion of hallelujah energy, this dizzying high of perfect freedom and pristine self-confidence. But in reality, our anxieties are like castles built in the sand, things that the sea of experience quietly erodes away and smooths over, until it becomes impossible to remember that there was ever something there to begin with.

When we constantly seek the next epiphany or the next breakthrough to “fix” ourselves, really all we’re doing is reaffirming a belief that we’re already broken. The desire for the singular breakthrough that rewrites our identity and remixes our entire lives is a subtle but persistent attack on our own self-worth: a seemingly noble desire on the surface that reverberates “I’m not good enough, I’m not good enough,” underneath.

Because nobody ever changes completely. And nobody changes all at once. Change is gradual. Breakthroughs are gradual, measured in decades and not moments.

There is no big breakthrough. Our breakthrough is now. This moment. And the next. And the next. Our life is a never-ending series of micro-breakthroughs, some of them obvious and consciously impactful, others subtle and unnoticeable.

And as long as we’re fixated on our next big life-changing moment, we’re likely to miss all of the small ones happening right now, right in front of us.

[Photo credit: RL Johnson and colinlogan]

Footnotes

1. Dukas, R. (1999). Costs of memory: ideas and predictions. Journal of Theoretical Biology, 197(1), 41–50.
2. Klein, K., & Boals, A. (2010). Coherence and narrative structure in personal accounts of stressful experiences. Journal of Social and Clinical Psychology, 29(3), 256–280.
3. Schacter, D. L., Addis, D. R., & Buckner, R. L. (2007). Remembering the past to imagine the future: the prospective brain. Nature Reviews Neuroscience, 8(9), 657–661.
4. White, H. (2005). Introduction: Historical fiction, fictional history, and historical reality. Rethinking History, 9(2–3), 147–157.
5. Taleb, N. N. (2010). The Black Swan: The Impact of the Highly Improbable (2nd ed.). New York: Random House Trade Paperbacks.