Book Summary: The Great Mental Models by Shane Parrish and Rhiannon Beaubien
Updated: Dec 28, 2020
An alternative to reading: Animated Video (7min)
The Great Mental Models: General Thinking Concepts is the first book in The Great Mental Models series designed to upgrade your thinking with the best, most useful and powerful tools so you always have the right one on hand.
This volume details nine of the most versatile, all-purpose mental models you can use right away to improve your decision making, productivity, and how clearly you see the world. You will discover what forces govern the universe and how to focus your efforts so you can harness them to your advantage, rather than fight with them or, worse yet, ignore them.
3 Big Ideas
Adopting the right Mental Models will increase your chances of success in life and work.
To work with complex problems, your mental models must be continuously upgraded and refined.
The most successful people in life are those with the fewest blind spots.
2 Most Tweetable Quotes
"In life and business, the person with the fewest blind spots wins."
"The world does not isolate itself into discrete disciplines. You need to work hard at synthesizing across the borders of our knowledge, and most importantly, synthesizing all of the ideas you learn with reality itself."
Toby's Top Takeaways
The Fifth Discipline by Peter Senge is one of my favourite books on organisational change. He identifies Mental Models as one of the five disciplines of a learning organisation, so I was interested to see how this book would explore the topic.
The biggest value of this book is that it provides a simple way to categorise thinking patterns. I was reminded of familiar ideas such as First Principles and Mapping, and learned new ones such as Occam’s Razor and Hanlon’s Razor. Sharing this book with your team members would be a great way to build an understanding of each other's mental models.
The models described in the book are mainly applicable to simple and complicated situations, where cause and effect are known. They may therefore be less suited to complex situations.
Finally, the book focuses on individuals. There is little on Mental Models within groups, or on how you can harness collective intelligence. Diverse teams will consistently outperform a single individual when tackling complex problems. Further volumes are planned, so perhaps this will be covered then.
In summary, a good book to reinforce your existing thinking patterns but also a way to learn new mental models to improve your thinking.
I believe in the discipline of mastering the best of what other people have figured out.
Charlie Munger
Mental models describe the way the world works. They shape how you think, how you understand, and how you form beliefs. Largely subconscious, mental models operate below the surface. You’re not generally aware of them and yet they’re the reason when you look at a problem you consider some factors relevant and others irrelevant. They are how you infer causality, match patterns, and draw analogies. They are how you think and reason. A mental model is simply a representation of how you think something works.
In life and business, the person with the fewest blind spots wins.
There is an infinite number of Mental Models. This book shares Mental Models with the most utility:
Circle of Competence
Develop a latticework of mental models
“You've got to have models in your head. ... You've got to hang experience on a latticework of models in your head.”
Munger estimates that 80–90 mental models will carry you through 90% of life.
With complex problems, you increase your ability to act effectively when you can draw on a wide range of mental models. Think of them as lenses that allow you to see the world from different perspectives; it is this diversity of perspectives that helps you succeed.
The key here is variety. Most of us study something specific and don’t get exposure to the big ideas of other disciplines. People often don’t develop the multidisciplinary mindset that's needed to accurately see a problem. And because you don’t have the right models to understand the situation, you overuse the models you do have and use them even when they don’t belong.
The world does not isolate itself into discrete disciplines. You need to work hard at synthesizing across the borders of our knowledge, and most importantly, synthesizing all of the ideas you learn with reality itself.
"The chief enemy of good decisions is a lack of sufficient perspectives on a problem."
Alain de Botton
Seeing the System
A metaphor highlighting how you become blind to the system:
There are these two young fish swimming along and they happen to meet an older fish swimming the other way, who nods at them and says “Morning, boys. How’s the water?” And the two young fish swim on for a bit, and then eventually one of them looks over at the other and goes “What the hell is water?”
David Foster Wallace, This Is Water
Furthermore, when you are distant from the effects of your decisions, you become even more blind to the system. Without feedback, it is easier to keep your current views than to update them. This is prevalent in organisations: beyond a certain size, they often remove you from the direct consequences of your decisions.
The Map is Not the Territory
A map is a model of reality. It is not reality itself.
We use maps to make sense of the world. They help us move through life without the need to hold all the specific details of the territory in our minds. If a map were to represent the territory with perfect fidelity, it would no longer be a reduction and thus would no longer be useful to us. Maps are always a snapshot in time. Therefore they can represent something that no longer exists.
All maps are imperfect. Accept that the maps you hold will always be wrong and never a fully accurate representation of the territory. Every map also encodes its creator's biases, assumptions and beliefs. Keeping this in mind helps you think through problems and make better decisions.
“What makes these models so dangerous … is that the constraints that are assumed to be fixed for the purpose of analysis are taken on faith as being fixed in the empirical setting.”
"The London underground map is super useful to travellers. The train drivers don’t use it at all!"
Maps describe a territory in a useful way, but with a specific purpose. They cannot be everything to everyone.
Learn more: https://learnwardleymapping.com/
First Principles Thinking
"I don’t know what’s the matter with people: they don’t learn by understanding; they learn by some other way—by rote or something. Their knowledge is so fragile!"
Richard Feynman
First-principles thinking is one of the best ways to reverse-engineer complicated situations and unleash creative possibility. Sometimes called reasoning from first principles, it’s a tool to help clarify complicated problems by separating the underlying ideas or facts from any assumptions based on them. What remains are the essentials. If you know the first principles of something, you can build the rest of your knowledge around them to produce something new.
First principles expose the boundaries that you have to work within. From there, you can challenge the fundamental assumptions of the situation more effectively.
First-principles thinking helps you avoid the problem of relying on someone else’s tactics without understanding the rationale behind them. If you know the principles you can change the tactics. Reasoning from first principles allows you to step outside of history and conventional wisdom. When you really understand the principles at work, you can decide if the existing methods make sense.
"As to methods, there may be a million and then some, but principles are few. The man who grasps principles can successfully select his own methods. The man who tries methods, ignoring principles, is sure to have trouble."
2 techniques to identify the principles in a situation:
Socratic questioning generally follows this process:
Clarifying your thinking and explaining the origins of your ideas. (Why do I think this? What exactly do I think?)
Challenging assumptions. (How do I know this is true? What if I thought the opposite?)
Looking for evidence. (How can I back this up? What are the sources?)
Considering alternative perspectives. (What might others think? How do I know I am correct?)
Examining consequences and implications. (What if I am wrong? What are the consequences if I am?)
Questioning the original questions. (Why did I think that? Was I correct? What conclusions can I draw from the reasoning process?)
Second-order thinking, also known as the “Law of Unintended Consequences”, is thinking holistically. It requires you to consider not only the direct but also the indirect, more distant effects of a decision. Failing to consider the second- and third-order effects can unleash disaster. Very often, the second level of effects is not considered until it’s too late.
When making choices, considering consequences can help you avoid future problems. When making a decision, ask this question: And then what?
“You can never merely do one thing.”
Garrett Hardin - First Law of Ecology
"When we try to pick out anything by itself, we find it hitched to everything else in the Universe."
"Once a few people decide to stand on their tip-toes, everyone has to stand on their tip-toes. No one can see any better, but they’re all worse off."
Warren Buffett - Letter to Shareholders, 1985.
"The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function. One should, for example, be able to see that things are hopeless yet be determined to make them otherwise."
F. Scott Fitzgerald
Inversion means approaching a situation from the opposite end of the natural starting point.
There are two approaches to applying inversion in your life:
Start by assuming that what you’re trying to prove is either true or false, then show what else would have to be true.
Instead of aiming directly for your goal, think deeply about what you want to avoid and then see what options are left over.
"Anybody can make the simple complicated. Creativity is making the complicated simple."
Simpler explanations are more likely to be true than complicated ones. This is the essence of Occam’s Razor, a classic principle of logic and problem-solving.
Occam’s Razor is a general rule by which you select among competing explanations. You should prefer the simplest explanation with the fewest moving parts. If all else is equal, that is if two competing models both have equal explanatory power, it’s more likely that the simple solution suffices.
The simpler explanation is more robust in the face of uncertainty.
In another life-and-death situation, in 1989 Bengal tigers killed about 60 villagers from India’s Ganges delta. No weapons seemed to work against them, including lacing dummies with live wires to shock the tigers away from human populations. Then a student at the Science Club of Calcutta noticed that tigers only attacked when they thought they were unseen, and recalled that the patterns decorating some species of butterflies, beetles, and caterpillars look like big eyes, ostensibly to trick predators into thinking their prey was also watching them. The result: a human face mask, worn on the back of the head. Remarkably, no one wearing a mask was attacked by a tiger for the next three years; anyone killed by tigers during that time had either refused to wear the mask or had taken it off while working.
People tend to undervalue the elementary ideas and overvalue the complicated ones.
"Most geniuses—especially those who lead others—prosper not by deconstructing intricate complexities but by exploiting unrecognized simplicities."
Hanlon’s Razor states that you should not attribute to malice that which is more easily explained by stupidity. Instead of assuming by default that bad results are the work of a bad actor, look for less sinister explanations; assuming malice closes off options and makes you miss opportunities.
The explanation most likely to be right is the one that contains the least amount of intent.
The conjunction fallacy: you over-conclude based on the available information.
When you see something you don’t like happen and which seems wrong, you assume it’s intentional. But it’s more likely that it’s completely unintentional.
The theory says that of all possible motives behind an action, the ones that require the least amount of energy to execute (such as ignorance or laziness) are more likely to occur than ones that require active malice.