Fickle

Often, we fear changing our minds because we don’t want to seem like we aren’t solid in our opinions and beliefs. It’s as if we think people will dislike us if we don’t stick with our beliefs when they are challenged. In popular culture, there’s no better example than politicians in debates, who will never sway from their beliefs1.

But it’s not just in the mainstream media. Scientists do this, too. One only has to look at how long Ptolemy’s complicated geocentric model of the solar system survived to see how reluctant we are to ever abandon our beliefs. Or consider the Pythagoreans, who only accepted whole numbers (or ratios of whole numbers) as “proper” in mathematics, effectively shutting irrational numbers out of the limelight.

But this is exactly the opposite of what should happen.

Follow this line of thinking: Imagine you’re beginning to learn a new subject at school. At first, you have absolutely no clue about anything in the subject. As the class begins, you start to form mental models about the concepts that are presented to you. You extrapolate from what the teacher is saying to make assumptions about other applications of the idea.

The next class comes, and the teacher talks about the concept you had been thinking about earlier. However, you are surprised to learn that your idea of the concept wasn’t quite correct. Instead, there were other factors that you hadn’t taken into account, which renders your mental model incorrect.

Once the class finishes, do you go up to the teacher and say that they have it all wrong? Or do you take what the teacher just taught and incorporate it into your model, fixing what was wrong?

If you want to learn, you’re definitely picking the latter. It’s the rational option, since you’ve learned something new. As such, it makes sense that you’d change your mental model.

After all, the former option isn’t smart. You’re defending assumptions you’ve made against actual proof from a teacher2. In doing so, you’re setting yourself up for ignorance instead of actually learning a new idea.

The big problem is that trying to maintain one idea with unwavering belief is a recipe for stagnation. Case in point: back in 2007, when Apple unveiled the iPhone, BlackBerry wasn’t particularly concerned, infamously saying, “These are computer guys. They’re not just going to walk in here and solve the phone riddle.” (Link). We all know that the iPhone flourished, while BlackBerry steadily lost market share, in large part because they did not want to change their beliefs. As a result, they stagnated.

These are just a few examples, but plenty more can be found across domains throughout history. In each case, an unwillingness to change beliefs hurt the people who held them. By not keeping an open mind, they halted any potential for growth.

So now that we know what happens to these people, why in the world does this continue to occur? Surely people would see the evidence and try to change for the better?

I can’t resist, so here’s my three-word answer: global climate change.

The best reason I can find for why people don’t like changing their ideas is that they become attached to them. Particularly in the scientific community, original ideas are the currency. If you have an original research idea, you have a potential ticket to continued funding. As a result, people can get emotionally attached to an idea, not wanting to let it go even when most of the evidence points against it. While this is prevalent in science and academia, it appears in every other domain, too. People like their own ideas, and so aren’t particularly happy about changing them.

Therefore, we have to be open to new ideas and perspectives if we want to be lifelong learners and grow in our areas of interest. Clinging to our ideas is only a mechanism for protecting our pride and making us look “resolute”. Instead, we should worry about being blind to new and brilliant ideas because we are entrenched in our old worldviews. If we do this, we can create the habit of absorbing information first, and analyzing it after.

What this means is keeping an open mind to everything. By all means, call an idea “terrible” if it is so. However, you should give every idea the benefit of the doubt until you’ve understood it. Or, at the very least, try to keep yourself versed in other perspectives, so you can always test your ideas against them. That’s the key, in the end. We should be concerned with finding the best ideas available at the time, and working with those, no matter who they come from.

As people that are interested in learning and becoming the best in our domains, we should not fear change. We should fear staying rooted in old and outdated ideas, simply because we’ve become emotionally attached to them.

In the end, being fickle and willing to change your mind isn’t a bad thing. It’s the sign of a smart person on a path of lifelong growth.

  1. Personally, I don’t see how we can continue calling these events “debates”. In my eyes, a debate is an exchange in which one party presents arguments so strong that they sway their opponent. However, this obviously never happens.

  2. I’m not saying that teachers are always right and students’ assumptions are always wrong. However, when a rock-solid proof is given that contradicts your mental model, that is when it is better to listen to the teacher. While making mental models is a good exercise, they tend not to be the most accurate, and so often need to change.

Elegance

Ask any scientist or mathematician, and this is the quality that they would love their solution to have. They want the result to be elegant, simple, and intuitive.

To give you an example, I remember doing a problem in my calculus class that involved a bunch of trigonometric functions. Naturally, the integral kind of exploded as I worked on it, and the intermediate result was super complicated. However, after applying a bunch of different identities and swapping sines and cosines, the answer came back as simply the tangent of theta.
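The exact integral doesn’t matter; the simplification had the same flavor as an identity like this one (an illustrative stand-in, not the original problem):

$$\frac{\sin 2\theta}{1 + \cos 2\theta} = \frac{2\sin\theta\cos\theta}{2\cos^2\theta} = \tan\theta$$

A sprawling expression collapses into a single, clean term.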

When I got this result, I immediately knew I was right. The result was just too perfect after all that work for it not to be true (of course, this is a bias). Additionally, the answer made me feel good. It was a nice answer to look at, particularly after all the work required to get there.

This underscores our tendency in science and mathematics to revere simple answers. Consequently, we tend to “dress up” our equations and concepts in order to make them look much more compact than they really are. I have two examples to illustrate the point.

First, in physics (particularly in wave motion), there’s the notion of forced oscillations for a spring or other object being driven by some periodic force. The “dressing up” of the equation was so striking in this case that I felt moved to create a small comic about it:

[Comic: Forced Oscillation, parts 1 and 2]

Even as my teacher talked about this equation, she looked sheepish. As soon as we saw the whole equation written out, we could see why (and this was only the steady-state solution).
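For a sense of what gets hidden, here is the standard textbook form of a damped, driven spring (not necessarily the exact equation from my class):

$$m\ddot{x} + b\dot{x} + kx = F_0\cos(\omega t)$$

The steady-state solution looks innocent enough, $x(t) = A\cos(\omega t - \phi)$, until you unpack the “constants”:

$$A = \frac{F_0}{\sqrt{(k - m\omega^2)^2 + (b\omega)^2}}, \qquad \tan\phi = \frac{b\omega}{k - m\omega^2}$$

Write $A$ and $\phi$ as single letters and the whole thing fits neatly on one line; expand them and you see what the tidy version was hiding.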

The second example comes from the recent World Science Festival, where I watched the panel on gravitational waves. During this panel, at around the thirty-minute mark, the moderator (Brian Greene) walked through some of the equations of general relativity and showed just how complicated they can be. Despite looking relatively (sorry!) simple, the equations are just being dressed up to cover their complexities. There’s nothing necessarily wrong with this, but it does illustrate how equations in science and mathematics can be a bit more challenging than they appear. This is all done in the name of elegance. If we can make an equation more compact, we will do it.
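The field equations are the classic case. In their usual compact form, they read:

$$G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}$$

One short line of symbols, yet the Einstein tensor $G_{\mu\nu}$ by itself stands for ten coupled, nonlinear partial differential equations in the metric $g_{\mu\nu}$. The compactness is real, and so is the complexity hiding underneath it.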

Often, we seek the elegant answer, wanting something simple to hold onto after working through a bunch of mathematics. This leads us to cover up the complexities of many equations, which makes them difficult to understand when looking in from the outside.

Perhaps we should embrace a little more complexity?

Are You Willing to be Mediocre?

At first glance, the answer would be, “no”. However, giving that answer would be missing the point.

We all want to be experts. It doesn’t even matter what the expertise is in. If you ask most people, they’ll gladly accept being an expert at nearly anything. This is because achieving expertise means you are wise and have gone through the trials of becoming an expert.

What is missed, however, is the fact that expertise arises from mediocrity. To be an expert means to have been an amateur. It’s virtually impossible to leapfrog from not knowing anything about a subject to being an expert. This simply does not happen.

Instead, expertise occurs as a result of a lot of practice. And, more importantly, being wrong.

We don’t like to be wrong. Often, being wrong feels like a personal attack on our character, as if it marks us as less intelligent than the rest of our peers. We don’t like being wrong because we do not enjoy displaying a weakness to those in our social circles.

However, the reality is that expertise requires you to be mediocre. There’s no shortcut. In a way, being an expert means you were once an amateur and have learned from all your mistakes. We don’t like to think of it like this, though, because experts are thought to be people who don’t get anything wrong.

What we need to realize, then, is that those who become experts have done so because they’ve accepted the journey to get there. Mainly, they’ve accepted that being mediocre is just a phase in the process. Being only a phase, it will come to an end, and eventually lead to expertise.

The notion that experts were never mediocre is a comforting myth to spread, but it’s wrong. Experts become experts as a result of being mediocre, not in spite of it. Therefore, if you want to be an expert at what you do, embrace the phase of mediocrity. Through hard work, you’ll find that it is only a phase and does get better.

The path to expertise always includes mediocrity.

Slowly Chipping Away

Trying to solve a problem all at once is complicated, and usually too messy. When you try to solve something in one step, it’s easy to make mistakes or otherwise fail. That’s the nature of trying to find a faster way to get the task done.

It’s the same way with your goals. You can attempt to take on too much, too soon. You can fill your schedule with your goal to such an extent that everything becomes overwhelming. At that point, odds are you will stumble backwards from the pressure of it all.

If you look to the masters for inspiration, you will see the opposite approach all the time. Rarely will a master attack a problem head-on, because they know it won’t work. Instead, they develop a methodical strategy to slowly get part of the problem solved, one step at a time. This is because it is much easier to handle a small piece of the problem than to deal with the whole thing at once.

When you have a choice of how to tackle a goal or situation, the smart choice will usually be to slowly chip away at it instead of trying to get it done in one session. This way, you create momentum for yourself and keep yourself from being paralyzed by overwhelm. And, chances are, a big, ambitious goal will be tough to handle in one go.

Chipping away at your goal may be slower, but it ensures that you don’t get stopped by a goal that is simply too big to take on all at once.