Introduction

  • Mental models: recurring concepts that explain patterns we see in life
  • Knowing mental models and forming mental pictures helps us understand what is going on and make better decisions
  • Each discipline has its own set of mental models, but a subset of them applies to general life patterns
  • Mental models give us access to higher-level thinking, but we need a toolkit broad enough to have the right model for each situation

Being Wrong Less

  • Inverse thinking: invert the situation and think it through
    • The inverse of being right more is to be wrong less
  • Unforced error: an error that stems solely from your own lack of skill, not from your opponent’s play (eg. missing a badminton shot because you are bad at clears, not because the opponent hit an awesome clear)
    • Notice unforced errors in life and reduce them as much as you can
  • Antifragility: get better as you experience shocks
    • You can make your thinking antifragile by getting better from your mistakes
  • First principle thinking: use basic principles and build up to your conclusion
    • Think of math proofs: we use the basic building blocks to get to where we need to be
    • Allows you to approach unfamiliar environments in innovative ways because the basic building blocks of your thinking have not changed
    • Avoiding the trap of conventional wisdom
    • Eg: career moves should come from your own first principles of what you require in work and then apply only to positions that satisfy that.
  • De-risking: any first principle is simply an assumption and could be false. You need to test your assumptions
    • First break down your assumptions as far as you can and then test easily
    • In real life, this testing can simply be getting more info or talking to people
    • Any plan you have initially can easily be wrong (battle plans never survive first contact) so this should help you
  • Premature optimization: perfecting work too early, before you know it is needed
  • Minimum viable product: the minimum amount of features that a product needs before it can be tested
    • Build your “MVPs” and test in order to derisk
  • Ockham’s Razor: the simplest explanations are most likely correct
    • Apply the razor to your principles and ask whether each assumption is actually needed to arrive at your conclusion
  • Conjunction fallacy: for events A and B, people who know B is quite likely judge the conjunction of A and B as likely too, when it cannot be
    • Eg: Linda is a social activist. Is it more likely that she is a bank teller, or a bank teller who attends protests? Given the prior info, protest attendance feels likely, but the conjunction of being a bank teller and attending protests can never be more probable than being a bank teller alone
    • The probability of a conjunction of two or more events is always less than or equal to the probability of any one of them (see the sketch below)
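A minimal Python sketch of why the fallacy is mathematically impossible; the probabilities are made up for illustration (the Linda problem assigns no real numbers):

```python
# Conjunction rule: P(A and B) = P(A) * P(B | A) <= P(A).
p_teller = 0.05                  # illustrative P(Linda is a bank teller)
p_protest_given_teller = 0.90    # even if protest attendance is very likely

p_both = p_teller * p_protest_given_teller

print(f"P(bank teller)             = {p_teller:.3f}")   # 0.050
print(f"P(bank teller AND protest) = {p_both:.3f}")     # 0.045
assert p_both <= p_teller  # holds for any choice of numbers
```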
  • Overfitting: deriving a model from past evidence that is far too specific to it
    • Often happens because the model is more complicated than the evidence justifies (violating Ockham’s razor)
    • Avoid this by asking whether the data could support multiple conclusions, not just the one we are overfitting to (see the sketch below)
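A quick numpy illustration on hypothetical data: a high-degree polynomial explains the past sample better, but generalizes worse than the simple model the data actually came from:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 12)
y = 2 * x + 1 + rng.normal(0, 0.2, size=x.size)   # truth is linear plus noise

for degree in (1, 9):
    coeffs = np.polyfit(x, y, degree)             # fit the past evidence
    x_new = rng.uniform(0, 1, 200)                # fresh data, same process
    y_new = 2 * x_new + 1 + rng.normal(0, 0.2, size=200)
    train = np.mean((np.polyval(coeffs, x) - y) ** 2)
    test = np.mean((np.polyval(coeffs, x_new) - y_new) ** 2)
    print(f"degree {degree}: train MSE {train:.3f}, test MSE {test:.3f}")
# The degree-9 fit hugs the sample points but does worse on new data.
```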
  • Reference frame: your perspective on an issue may be different to another’s
    • Just like a physics reference frame: on a train, you see yourself as standing still, but people outside the train see you moving
    • Understand your perspective differences when making decisions
    • Framing: frame a situation such that you make a particular perspective more likely, all because of the words you use (eg. frame an innovative initiative as beating the competition rather than a use of expensive resources)
    • Be aware of framing when situations are presented to you
    • Nudging: similar to framing, where words/design nudge you to interpret a situation in a way that benefits someone else (eg. restaurant menus often nudge you toward certain dishes through their design)
    • Anchoring: being overly influenced by your initial impression of a situation (eg. Trump anchors supporters to extreme positions so that later compromises don’t seem that bad/can be spun for political purposes)
  • Availability bias: your view gets biased by the information most readily available to you
    • Media often exploits this. Immigration from Mexico is actually quite low, but since everyone talks about immigration, you think it is much worse than it actually is
    • Often caused by giving too much weight to stuff within your reference frame at the expense of the big picture
    • Eg: end-of-year review by manager often swayed by recent performance
    • Filter bubble: algorithms will filter out stories they think you won’t like, so you get even more bias, leading to echo chambers
  • When it comes to people, realize that everyone is coming with different experiences and perspectives. In statistics, we call this our prior
  • There is always a third story in a conflict between two actors: a neutral observer’s account of the situation
    • Try your level best to be impartial during conflict
    • This disarms opponents, as it signals empathy and a willingness to find common ground
  • Most respectful interpretation: interpret the other party’s actions in the most respectful way possible, or give them the benefit of the doubt
    • Eg: You email a prof but don’t get a reply. Don’t assume the prof has ill intent; more likely they are really busy and trying to get back to you
    • Helps you build trust, which can be really useful
  • Hanlon’s Razor: never attribute to malice that which is adequately explained by carelessness
    • Someone caused a negative situation. More likely it was because they were careless and didn’t think it through rather than a deliberate malicious act
  • Third story, most respectful interpretation and Hanlon’s razor all attempt to overcome the fundamental attribution error
    • We frequently err by attributing others’ behavior to internal/fundamental motivations rather than external factors
  • Self-serving bias: you tend to view yourself as having better reasons for your actions than you grant to others for theirs
  • Veil of ignorance: imagine ourselves ignorant of our own positions and try to think from other people’s shoes
    • Eg: If we were thinking about immigration policies, think about how it is to be a refugee
  • Just world hypothesis: the belief that people always get what they deserve through their actions alone, with no role for randomness
    • Leads to victim blaming and ignorance of birth lottery
    • Inaccurate at individual level
  • Learned helplessness: some people can’t get out of a helpless situation because, after many failed attempts, they have learned to stop trying to escape it
    • If we can show them that their actions make a difference, we can break this mental model
    • Eg: Utah gave homeless people an apartment and a social worker to guide them through reintegration. A huge success: it reduced the homeless population by 72%
  • You can be anchored to an entire way of thinking, making it very difficult to be convinced of a new idea if you already believe a contradictory one
    • Paradigm shift: theories don’t change gradually, but rather in bursts. The old guard holds onto old ideas until they die out, and then change arrives in a burst
    • Semmelweis reflex: reflexively rejecting new evidence that contradicts the established paradigm (Semmelweis’s data pointed to the truth, but because his explanation was wrong, the conclusion was rejected as a whole)
  • Confirmation bias: we are biased to interpret information that confirms our preexisting beliefs
    • This is why startups are often founded by outsiders, because they don’t have the same preexisting beliefs as the insiders
  • Backfire effect: dig into preexisting beliefs further when presented with contradictory evidence
  • Disconfirmation bias: subject contradictory theories to higher burden of proof than preexisting theories
  • Confirmation biases and related models relate to the concept of cognitive dissonance, which is the stress produced when you hold two contradictory beliefs at once
  • Thinking in gray: most truths are not black and white
    • Don’t form an opinion on something until you have gathered as much information as you reasonably can
  • Devil’s advocate: try to argue from the opposite point of view
    • Either you can do it or get someone who already has an opposing POV
  • Following your intuitions alone in uncertain situations can lead to confirmation bias, framing, anchoring and more, so slow down and think deliberately in new situations
  • To build up intuition in uncertain situations, either build up conclusion from first principles or perform root cause analysis
    • The proximate cause is what immediately triggered an event; the root cause is the real reason (eg. the Challenger exploded because an O-ring seal failed, but the root cause was organizational failure)
    • Perform postmortems after events (whether good or bad) and analyze via the 5 Whys: keep asking why until you reach the root cause
  • Optimistic probability bias: you want something to happen so badly that you think the event is more likely than it is

Anything That Can Go Wrong, Will

  • Unintended consequences are often predictable
  • Tragedy of the commons: what benefits the individual may be bad for the community, esp. if it depletes a shared resource
    • Also known as the tyranny of small decisions: small individualistic decisions add up to terrible consequences for the entire community
    • Can be prevented if someone can foresee the cumulative result and veto these small decisions
  • Free riders: people who take advantage of a resource without paying for it
    • Free riding seems harmless, but if enough people do it, it degrades the resource, leading to a tragedy of the commons
  • Herd immunity: you don’t need the entire population vaccinated, just enough that those not vaccinated are still protected (see the threshold sketch below)
    • People then think that they don’t need to be vaccinated, which reduces the overall effectiveness of the vaccination program
    • Same concept with taxes: not literally everyone has to pay for the country to run, but if enough people evade taxes, things get really bad
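The threshold intuition can be made concrete with a standard epidemiology approximation (not from these notes): the immune share must exceed 1 - 1/R0, where R0 is how many people one case infects. The R0 values below are rough and illustrative:

```python
def herd_immunity_threshold(r0: float) -> float:
    """Share of the population that must be immune: 1 - 1/R0."""
    return 1 - 1 / r0

# Rough, illustrative R0 values.
for disease, r0 in [("flu-like", 1.5), ("polio-like", 6.0), ("measles-like", 15.0)]:
    share = herd_immunity_threshold(r0)
    print(f"{disease:13s} R0={r0:4.1f} -> ~{share:.0%} must be immune")
```

The more contagious the disease (higher R0), the less slack there is for free riders before outbreaks return.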
  • Externalities: unintended consequences, usually from some external source, that affect an entity without its consent
    • Eg: Pollution creates a negative externality for people living near the pollution source
    • Occur when there is a spillover effect, where the effect of an activity spills over beyond its core interactions
    • You can internalize an externality by making the entity that created it pay for it; a high enough price should stop the behavior
    • Coase theorem: you can use a natural marketplace to internalize an externality
      • Good example is cap-and-trade systems for pollution
      • Requires well-defined property rights, rational actors and low transaction costs
    • If you are in charge of any system, think ahead about the negative externalities you could create and try to avoid them
  • Moral hazard: you take on more risk when you have information that makes you believe you are more protected
    • Eg: More reckless driving in rental cars than in your own car; acting on behalf of someone else
    • When the issue is acting on behalf of others, it is usually caused by asymmetric information
  • Adverse selection: parties select transactions that will benefit them, due partially to asymmetric info
    • Can lead to market breakdown
    • Eg: healthy people know they don’t need to opt in to health insurance under the Affordable Care Act because their chance of ending up in the hospital is near zero. They pay the fine rather than the premiums, so premiums rise for everyone, which drives even more healthy people out
  • Market failure often arises from a lack of intervention, leading to a tragedy of the commons
    • Interventions can also fail
    • Eg: research into antibiotics has dropped even though there is significant risk of a bacterial infection outbreak. Since we want to use antibiotics sparingly, the drugs often expire before use, so there is little benefit from the company’s POV. Without government intervention, low profitability means business won’t produce these antibiotics
  • Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure
    • Eg: Facebook’s obsession over growth metrics led to breakdown of privacy and increase in filter bubbles
    • People will go to any lengths possible to meet a target, even when the method is harmful in the long run (perverse incentives)
    • Cobra effect: an attempted solution makes the problem worse
    • Streisand effect: trying to hide something leads to more attention
    • Hydra effect: removing one head leads to two more (eg. removing a drug lord creates two new drug lords who cause worse problems)
    • Observer effect: the mere act of observing something changes behaviour (has a chilling effect)
    • When you set targets, be very careful to not create perverse incentives and pay very close attention to incentive structure. People’s self-interest should support your goals.
  • Boiling frog: class of unintended consequences where you do not react to gradual change, either because you don’t notice it or choose to ignore (head in sand)
    • Arises when people fixate on the short term, which creates technical debt
    • These outstanding debts create path dependence, where your future choices are dictated by past decisions
    • Prefer choices that preserve optionality
    • The downside of keeping options open is that it requires more resources and increases costs
    • Use the precautionary principle to prune the paths you are pursuing: be very cautious of paths that may cause harm in some way or another
    • First understand the long-term problems, work backward to figure out how they arose, then take the necessary level of precaution and pay down your debt
  • Information overload: being overwhelmed with more information than you can use to make decisions
    • Also known as analysis paralysis, because you are analyzing so much that you can’t act
    • Related to the model of ‘perfect is the enemy of good’: not making a choice while waiting to decide is a decision in itself, one that opts for the status quo
    • Deal with this by categorizing decisions as reversible or irreversible. Irreversible decisions should be made cautiously, but don’t be afraid to make reversible ones quickly
    • Hick’s Law: more choices leads to slower decision time. Limit your choices in the first place or create multi-step decisions
    • Decision fatigue: more and more decision making in a limited time leads to worsening decision quality
    • You can frontload your decisions to a time when you are not that overwhelmed
  • Murphy’s Law: anything that can go wrong will go wrong
    • Just be prepared and have a plan for when things go wrong

Spend Your Time Wisely

  • North star: the guiding vision of the company/yourself
    • You can point your actions toward your north star and prevent short-termism
    • The north star can evolve as you progress
    • Really small steps can compound into large gains
  • Two-front wars: dividing your attention will lead to defeat
    • Eg: Germany was defeated in both world wars because it had to pay attention to both fronts
  • Multitasking is a form of two-front warfare since your focus is divided
    • Context-switching is simply extra overhead and quite expensive
    • Focusing on one thing can even help with unconscious thinking, as the mind will start to drift to your top priority during inconsequential tasks
  • Deep work: spend extremely long periods of time working on single problem
    • Helps you focus on the most important problem that you need to solve
  • How do you determine what you should focus on? Use the Eisenhower Decision Matrix, a 2x2 of urgent/not urgent vs. important/not important: important + urgent = do; important + not urgent = decide (schedule it); unimportant + urgent = delegate; neither = delete
  • Dedicate deep work time to the ‘Decide’ quadrant
  • Sayre’s Law: in any dispute, the intensity of feeling is inversely proportional to the value of the issue at stake
  • Parkinson’s Law of Triviality: organizations give disproportionate importance to trivial issues
    • Bike-shedding: avoid talking about the difficult issues and instead spend all your time on the trivial issues
    • Timebox or schedule ahead of time to avoid spending too much time on trivial things
  • To choose what to work on in more detail, consider the opportunity cost of each task and choose the task with the lowest cost
    • Similar frameworks: opportunity cost of capital, BATNA
  • Always consider the leverage of your choices, which is simply the outsized multiple of outcome you can get from a set input
    • Highest leverage activities have the lowest opportunity cost
    • Much like the Pareto principle, where 20% of effort leads to 80% of the outcome (a power law distribution)
  • Each additional hour spent on a task produces diminishing returns, or sometimes even negative returns

  • As you progress in a task, ask whether there is a higher-leverage task you could do instead. If the opportunity cost is high, switch
  • People procrastinate because of present bias, where near-term rewards are valued over long-term goals
    • Think of procrastination as debt accruing interest: the cost compounds the longer you put things off
    • Net Present Value (NPV) is the sum of future benefits discounted back to today
    • We tend to use hyperbolic discounting, valuing instant gratification far more than long-term gratification (see the sketch below)
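A small sketch contrasting the two discounting models; the rates are illustrative. Exponential discounting underlies NPV, while the hyperbolic curve devalues even short waits heavily, which is what makes instant gratification so tempting:

```python
def npv(cashflows, r=0.10):
    """Net present value: future cash flows discounted exponentially."""
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))

def exponential(value, t, r=0.10):
    return value / (1 + r) ** t

def hyperbolic(value, t, k=0.50):   # large k exaggerates the early drop
    return value / (1 + k * t)

print(f"NPV of 100/year for 5 years: {npv([100] * 5):.2f}")
for t in (0, 1, 5, 20):
    print(f"t={t:2d}y: exponential {exponential(100, t):6.2f}, "
          f"hyperbolic {hyperbolic(100, t):6.2f}")
# With k this large, waiting even one year feels like losing a third of
# the value -- the signature of present bias.
```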
  • To combat procrastination, you can commit yourself to a future in some way (eg. pay for a gym membership if you want to be fit)
    • Penalty for breaking it should be harsh and commitment should be specific
  • You can use the default effect as well, which is the effect stemming from people just accepting the default option
    • Organ donation rates are dramatically higher where donation is the default (opt-out) rather than opt-in
    • You can schedule your time out in advance to make use of default
  • Parkinson’s Law: work expands so as to fill the time available for its completion
    • Hofstadter’s Law: it always takes longer than you expect, even when you take Hofstadter’s Law into account
    • Counter this by remembering that you don’t need perfection to finish
  • Sometimes you want to quit a project prematurely because of loss aversion
    • Due to a shift in reference point: once some gain feels guaranteed, you want to lock it in and avoid further risk
    • Always use opportunity cost to determine whether you should walk away or stay in
  • Sunk-cost fallacy: because costs have already been spent on a task, you feel compelled to complete it
    • Concorde fallacy: the project became prohibitively expensive because people kept throwing money at it due to sunk costs
    • Again, evaluate via opportunity cost: could I do something better with the time and money I am putting into this project?
  • Use the third story model and pre/postmortems to help predict where you might run into project traps and mitigate accordingly
  • When you are working on a project, know that you are likely not the first person who has encountered this problem. No need to reinvent the wheel
  • Pay attention to the design patterns/best practices in your field
    • Anti-patterns: seemingly intuitive but actually ineffective solution to common problem
    • Try to predict in advance if you would be using an anti-pattern
  • Brute-force solutions: exhaustive, intellectually unsophisticated solutions
    • The problem is that they don’t scale
  • You can use a heuristic solution, which uses some heuristics and trial-and-error
    • Facebook’s content moderation policy is mostly a heuristic solution
  • Algorithms are another solution, though many of them are black boxes
  • Automate things that you do repeatedly to save time
  • Economies of scale: entity becomes more efficient with size
    • Amazon functions entirely on economies of scale
  • Another way to speed up a task is parallel processing
    • Much like divide-and-conquer strategies (see the sketch below)
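A minimal divide-and-conquer sketch using Python’s standard library; the chunked sum is just a stand-in for any CPU-heavy work that splits into independent pieces:

```python
from concurrent.futures import ProcessPoolExecutor

def chunk_sum(bounds):
    """Stand-in for expensive, independent work on one chunk."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 40_000_000
    step = 10_000_000
    chunks = [(i, min(i + step, n)) for i in range(0, n, step)]
    with ProcessPoolExecutor() as pool:        # one chunk per CPU core
        total = sum(pool.map(chunk_sum, chunks))
    print(total == sum(range(n)))              # same answer, computed in parallel
```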
  • Another way to get a solution in a hard situation is to reframe the problem
    • Disney didn’t try to reduce long wait times, but instead made those wait times enjoyable
    • Hackers reframed their problem to find out the best way to get your password rather than guess it, usually through social engineering

Becoming One With Nature

  • Natural selection: traits that give individuals a reproductive advantage helped select them over others
    • Good model for societal evolution and why certain ideas flourish
    • The evolution of ideas is happening much faster now due to high interconnection
    • This is why we need to be open to new ideas and be highly adaptable, because our skills may not give us a long-term advantage
  • Best way to improve ourselves is through scientific method
  • Inertia: resistance to change
    • You have very high inertia mentally due to biases
    • Sticking with a strategy despite changing conditions means paying a strategy tax: the cost of decisions made just to stay consistent with that strategy
    • This is why you don’t want to have rigid long-term strategies
    • Shirky principle: institutions will try to preserve the problem for which they are the solution
      • Ex: TurboTax lobbies against automatic filing from government because that would remove the whole problem they are solving
    • Lindy effect: the longer something survives, the more likely it is to survive longer (eg. Shakespeare, Beatles)
    • The peak is the turning point at which something starts to turn unpopular
    • Even then, it takes time for something to become unpopular, due to momentum (systemic processes that entrench it)
  • The inertia of culture is much greater than the inertia of company strategy. A plan that conflicts with the organization’s culture won’t do well (eg. the US government’s healthcare.gov release)
    • Because situations change quickly, a low-inertia culture, i.e. a highly adaptable one, lets you change fast to match your strategy
  • Model of inertia: a flywheel (like a merry-go-round, where starting it requires lots of effort but maintaining is easy)
    • Similarly, becoming an expert is hard to start but easy to maintain
    • This is why multitasking is bad: hard to develop momentum on anything
  • Changing something is hard because of homeostasis: deviation from normal induces the opposite effect (gets too cold, body warms up; gets too hot, body cools down)
    • Try to mitigate the underlying reasons why homeostasis occurs
    • Collect data to prevent homeostasis from occurring in orgs
  • Identify potential energy in organization and center of gravity (eg. key influencer) and try to change that to cause change
  • Identify the activation energy and the catalysts to reduce activation energy
    • Eg: takedown of Confederate statues had high activation energy and BLM acted as a catalyst
  • Forcing function: prescheduled event/function that forces you to take action
    • Use forcing functions to act as catalysts for change
  • Critical mass: minimum amount of material needed to start a chain reaction
    • To start changing dramatically, we need a tipping/inflection point
    • Eg: need a certain amount of people for a party to feel like a party, and final person to arrive to meet the critical mass requirement creates tipping point
    • If you have expertise in an area that is about to reach the tipping point, you have massive value as your leverage increases
  • Technology adoption life cycle:
    • Innovators: take risks and are connected to emerging fields
    • Early adopters: tries out new things once more fleshed out, don’t require social proof, often pushes idea past tipping point
    • Early majority: willing to adopt as soon as value prop established by early adopters. Don’t want to waste time/money
    • Late majority: skeptical of new things and requires social proof
    • Laggards: last group to adopt because of necessity
  • The root cause of reaching critical mass is the network effect, where each person who joins a service makes it more enticing
    • Metcalfe’s Law: a network’s value grows nonlinearly as you add more nodes (see the sketch below)
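A tiny sketch of the nonlinearity: pairwise connections grow as n(n-1)/2, roughly n squared, so each new user adds more potential value than the last:

```python
def possible_connections(n: int) -> int:
    """Metcalfe's law intuition: value ~ number of possible pairs."""
    return n * (n - 1) // 2

for users in (10, 100, 1_000, 10_000):
    print(f"{users:6d} users -> {possible_connections(users):12,d} possible connections")
# 10x the users yields ~100x the connections.
```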
  • Cascading failures: failure feeds into more failures
    • Eg: airline disasters because of cascading failures, not one failure
  • Chaotic systems: easy to determine trend, but near impossible to determine final state
    • Butterfly effect: small changes can lead to big outcomes in chaotic systems
    • Adaptability is key to success in chaos
  • Luck surface area: interact with more people to increase chance of luck
    • Of course, be judicious about the events you attend in order to preserve deep work times
    • This increases your personal entropy: more combinations of encounters can lead to a good outcome (a little disorder is not terrible, and can actually be good)
  • Make better decisions using concept of polarity and creating own 2x2 matrices/graphs
    • Avoid black-white fallacy: things don’t always fall neatly into categories and are often continuous
  • The fallacy arises from in-group favoritism and out-group bias. We need to realize that most situations are not zero-sum (where one group’s win is the other’s loss)
    • There are always several factors in a negotiation and not all of them are valued equally by both parties, which means you can use give-and-take

Lies and Statistics

  • Human brains are conditioned for anecdotal evidence, but that isn’t really representative of the general truth
    • Eg: people are more likely to write a review if they are extremely impressed or disappointed with an offering, which doesn’t reflect most people’s experience with it
  • Correlation does not imply causation!
    • Usually due to confounding factors (a third factor that influences both)
  • Always create a hypothesis before you want to test something out
    • Avoid sharpshooter/moving target fallacy where you change the experiment to get the right results
    • Best experiments are blind randomized controlled experiments
    • Observers should also be blinded to avoid observer expectancy bias where the knowledge of your treatment group can impact the behaviour of the observer, influencing the experiment
    • Placebo effect: positive impact despite receiving no real treatment
  • Endpoint metric for experiments may be hard to collect, so we often have to use proxy metrics
    • University rankings use proxy metrics to determine ranks
  • Selection bias: the data that you have is somehow selected using some bias, making it not representative of the population
    • Why do better-funded schools have better scores? Not only because of money: well-funded schools also select students who are likely to score well anyway
    • Nonresponse bias: segment of population doesn’t answer survey skews results
    • Survivorship bias: ignore the evidence of objects that did not make it through trial (eg. classic airplane armor example)
  • When presented with data, always ask yourself: who is missing from the sample population? Did any methods make this experiment non-random?
  • Response bias: knowing that your response matters may affect the way you answer the survey
    • Also caused by wording, order of questions, poor memory, etc
  • Try to call out these biases as best you can
  • Law of small numbers fallacy: drawing conclusions from small samples
  • Gambler’s fallacy: believing that the outcome of one independent event affects the next (it doesn’t: independence means outcomes do not influence each other)
    • Shows up particularly in sequences of probabilistic decisions (eg. judges may be less inclined to grant parole if the last 3 cases were granted); see the sketch below
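A quick simulation of a fair coin (seeded for reproducibility) showing that a streak tells you nothing about the next flip:

```python
import random

random.seed(42)
flips = [random.random() < 0.5 for _ in range(1_000_000)]   # True = heads

# Condition on the previous three flips all being heads.
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3] and flips[i - 2] and flips[i - 1]]
print(f"P(heads after three heads) ~ {sum(after_streak) / len(after_streak):.3f}")
# Prints ~0.500: independent outcomes don't "owe" you a tails.
```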
  • Clustering illusion: drawing patterns where there is nothing
  • Don’t confuse improbability with impossibility
  • Regression to the mean: extreme events followed by more typical events
    • Something rare has a low probability. Repeating that is even more rare
  • Note measures of centre and variance in a dataset
  • Normal distributions are quite common in our world and tell us that slight variance from mean is common
    • Central limit theorem: if you take sufficiently large samples and average them, repeatedly, the distribution of those averages approaches a normal distribution
    • Margin of error/confidence intervals tell us the range expected to contain the true parameter
    • The margin of error shrinks only with the square root of sample size: halving it requires roughly four times the data (see the sketch below)
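A simulation sketch of both claims using only the standard library: sample means come out roughly normal, and quadrupling the sample size only halves the margin of error:

```python
import random
import statistics

random.seed(0)

def margin_of_error(n, trials=2000):
    """~95% margin of error for the mean of n uniform(0, 1) draws."""
    means = [statistics.fmean(random.random() for _ in range(n))
             for _ in range(trials)]
    return 1.96 * statistics.stdev(means)   # the means are ~normal (CLT)

for n in (100, 400, 1600):
    print(f"n={n:5d}: margin of error ~ {margin_of_error(n):.4f}")
# Each 4x increase in n roughly halves the margin of error.
```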
  • Conditional probability is very useful because life is full of conditions
    • Fallacy: assuming P(B | A) ~ P(A | B), which discounts the base rate (P(A) or P(B))
    • You can relate these probabilities using Bayes’ theorem (see the sketch below)
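A worked Bayes’ theorem example with made-up numbers (the classic rare-disease test), showing how badly ignoring the base rate inflates P(A | B):

```python
# Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_disease = 0.001            # base rate P(A): 1 in 1000 people
p_pos_given_sick = 0.99      # test sensitivity P(B|A)
p_pos_given_healthy = 0.05   # false positive rate

# Total probability of a positive test, P(B):
p_pos = p_pos_given_sick * p_disease + p_pos_given_healthy * (1 - p_disease)
p_sick_given_pos = p_pos_given_sick * p_disease / p_pos

print(f"P(sick | positive) = {p_sick_given_pos:.3f}")   # ~0.019, not 0.99
```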
  • Bigger sample size is always better but it takes resources
  • 2 types of errors: false positives (falsely concluding an object is in the positive class) and false negatives (falsely concluding an object is in the negative class)
    • Decisions require you to think through the tradeoff between these 2 error types
    • Set the false positive rate, then determine the sample size needed to detect a real effect with good probability (power); see the sketch below
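A simulation sketch of the tradeoff, with an illustrative effect size of 0.5 standard deviations: fixing the false positive rate at ~5%, power rises with sample size:

```python
import random

random.seed(3)

def detects_effect(n, true_diff=0.5):
    """One experiment where the treatment really is 0.5 SD better."""
    control = [random.gauss(0, 1) for _ in range(n)]
    treated = [random.gauss(true_diff, 1) for _ in range(n)]
    observed = sum(treated) / n - sum(control) / n
    return abs(observed) > 1.96 * (2 / n) ** 0.5   # ~5% false positive rate

for n in (20, 50, 100):
    power = sum(detects_effect(n) for _ in range(2000)) / 2000
    print(f"n={n:3d} per group: power ~ {power:.0%}")   # roughly 35%, 70%, 94%
```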
  • Absence of evidence is not evidence of absence
  • Don’t focus too much on p-values because it can lead to black and white thinking
  • Replication crisis: we often cannot replicate the results we get from papers
  • p-hacking: running additional tests until you find a statistically significant result
    • Prevent this by specifying ahead of time the tests you want to run; the sketch below shows why extra tests inflate false positives
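A simulation of why running extra tests “until something is significant” breaks the 5% guarantee. Both groups come from the same distribution, so every “significant” result here is a false positive:

```python
import random

random.seed(1)

def looks_significant(n=30):
    """Two groups from the SAME distribution; any 'effect' is pure noise."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    diff = abs(sum(a) / n - sum(b) / n)
    return diff > 1.96 * (2 / n) ** 0.5     # ~5% false positive rate per test

trials = 2000
one = sum(looks_significant() for _ in range(trials)) / trials
twenty = sum(any(looks_significant() for _ in range(20))
             for _ in range(trials)) / trials
print(f"1 test:   {one:.0%} false positives")     # ~5%
print(f"20 tests: {twenty:.0%} false positives")  # ~64%
```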
  • Publication bias: results that are not statistically significant are not published
    • Papers with non-significant results are still important
  • Meta-analyses: analyses run over other analyses (eg. FiveThirtyEight’s poll aggregation)
    • Can get closer to the truth, but only if the sub-analyses are similar enough

Decisions, Decisions

  • To evaluate decisions, we often use pro-con lists
    • Major cons: lots of grey options (can’t be easily classified as pro or con), every item is implicitly weighted equally, no interrelations captured, and they may create a grass-is-greener mentality
  • Maslow’s Hammer: if you only have a hammer, everything looks like a nail
    • Pro-con lists have become hammers. Not well-suited to decision making
  • Can improve pro-con list by attaching numbers to each item (cost-benefit)
  • Talk to people who have made similar decisions in past and make sure that you didn’t miss out on any important factors
  • Use dollar values instead of points to value each pro/con
    • Intangibles should also be priced (think of how much you would pay for it)
  • Look at pros and cons over time to evaluate more thoroughly
    • Earlier benefits are worth more: they can be used immediately, they are more predictable, and inflation depreciates the value of later benefits
    • Use discount rate to discount future benefits (look at NPV calculations)
    • To choose a discount rate: use sensitivity analysis to see the benefit across different discount rates and think about which is most likely (eg. look at inflation, adjust for risk vs. reward); see the sketch below
    • Sensitivity analyses are broadly useful: if pricing an intangible is difficult, run the analysis across a range of dollar values and see how the final decision changes; this uncovers the key levers of the decision
    • Discount rates aren’t great for extreme long-term consequences, like climate change mitigation
    • Be wary of evaluating decisions in different timelines! Discount rates cannot account for this
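A minimal sensitivity analysis over discount rates for a hypothetical project (pay 100 now, receive 25 a year for five years); the rate where NPV flips sign is the key lever of the decision:

```python
def npv(cashflows, rate):
    """Net present value; cashflows[t] is the cash flow in year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

project = [-100] + [25] * 5   # invest 100 today, get 25/year for 5 years

for rate in (0.02, 0.06, 0.10, 0.14):
    print(f"discount rate {rate:.0%}: NPV = {npv(project, rate):7.2f}")
# NPV flips from positive to negative between 6% and 10% --
# the whole decision hinges on which rate you believe.
```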
  • Use decision trees when there is a lot of uncertainity and look at expected value
    • You can price in different scenarios and choose relatively accurate probabilities to arrive at your expected value
    • Use utility values to price in the cost of intangibles
  • Be wary of black swan events, which are events which seem like they have small probabilities but are actually more likely than you think
    • Black swan events come from fat-tailed distributions, so probabilities of extreme events are higher
    • You underestimate black swan events by not understanding their underlying distributions/causes or by not pricing in cascading failures
    • Eg: floods in Houston are supposedly ‘once in 500 years’ events, but there have been several in three years! The reason: climate change is fattening the tail of extreme events
  • Systems thinking: draw out diagrams to understand systems and where they are vulnerable
    • Use simulations if necessary, like Monte Carlo (see the sketch below)
    • Systems behave according to Le Chatelier’s principle: a system counteracts external stimuli and readjusts to equilibrium (possibly a new equilibrium, unlike homeostasis in that regard)
    • Hysteresis: system behaves depending on history (like path dependence)
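A minimal Monte Carlo sketch of a hypothetical project schedule; the occasional rework loop (a small cascading failure) is what fattens the tail:

```python
import random

random.seed(7)

def one_run():
    """Simulate one possible future of a made-up two-stage project."""
    design = random.uniform(10, 20)        # weeks
    build = random.uniform(20, 50)         # weeks
    if random.random() < 0.15:             # 15% chance of a rework cascade
        build *= 2
    return design + build

runs = sorted(one_run() for _ in range(100_000))
print(f"median:          {runs[len(runs) // 2]:.1f} weeks")
print(f"95th percentile: {runs[int(len(runs) * 0.95)]:.1f} weeks")
# The gap between median and tail is what single-point estimates hide.
```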
  • Systems thinking can help with achieving a global optimum rather than a local optimum
  • Think about known knowns, known unknowns and unknown unknowns
    • Try to make everything known knowns
    • Uncover unknown unknowns through scenario analysis, where you run through different scenarios and anticipate what could happen
    • Scenario analysis is challenging because you may get anchored easily, so always question assumptions and run thought experiments
  • Counterfactual thinking: thinking about “what if” if past was different
    • Beware of butterfly effect
  • Scenario analysis and counterfactual thinking is best done in groups, but this could lead to groupthink/bandwagon effect
    • Mitigate groupthink by questioning assumptions, evaluating all ideas critically, playing Devil’s advocate, ensuring diversity of thought, and using independent subgroups
  • Crowdsourcing for decision making works when:
    • Diversity of opinion
    • Independent thinking
    • Aggregation of thinking can happen
  • People who forecast well:
    • Are intelligent
    • Have domain expertise
    • Practice forecasting
    • Work in teams
    • Are open-minded and willing to change beliefs
    • Understand the probabilities of past events
    • Take their time
    • Revise predictions constantly
  • When you have come to a decision, write out your thought process; this enables you to find holes in your logic

Dealing with Conflict

  • Arms race situations (escalating conflict) are common in society
    • Eg: employers wanting increasingly selective schools creates a credentials arms race, as do races for more status symbols
    • Avoid arms races at all costs and focus on what makes you unique
  • Game theory: study of strategy and decision making in adversarial situations
  • Nash equilibrium: a set of choices from which no player can improve their own outcome by unilaterally changing strategy
    • Think of the prisoner’s dilemma: the equilibrium is mutual betrayal, because either prisoner staying silent alone would worsen that prisoner’s outcome
    • Best option != Nash equilibrium: in the prisoner’s dilemma the best joint outcome is both cooperating and staying silent, but either one can improve their own outcome by betraying, so cooperation is unstable
    • Seek out the Nash equilibrium of any conflict, because that is the most likely outcome (see the sketch below)
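A small sketch that finds the equilibrium by brute force, using the standard prisoner’s dilemma payoffs (years in prison, so lower is better):

```python
# payoffs[(my_move, their_move)] = (my_years, their_years); lower is better
payoffs = {
    ("silent", "silent"): (1, 1),
    ("silent", "betray"): (3, 0),
    ("betray", "silent"): (0, 3),
    ("betray", "betray"): (2, 2),
}
moves = ("silent", "betray")

def is_nash(a, b):
    """Nash: neither player can cut their own sentence by deviating alone."""
    mine, theirs = payoffs[(a, b)]
    return (all(payoffs[(alt, b)][0] >= mine for alt in moves)
            and all(payoffs[(a, alt)][1] >= theirs for alt in moves))

equilibria = [(a, b) for a in moves for b in moves if is_nash(a, b)]
print(equilibria)   # [('betray', 'betray')] -- even though (silent, silent)
                    # is jointly better, cooperation is unstable
```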
  • Cooperation is usually better than betrayal in conflict.
    • Only reciprocate bad behaviour if the opponent initiates it (tit for tat)
    • In games where reputation matters, cooperation is all the more important
  • Use a payoff matrix + decision tree to evaluate how to get to your outcome
  • Reciprocity: you tend to feel an obligation to return a favor, regardless of whether the favor was invited or not
    • Big way of influencing people
  • Commitment: being consistent with promises is important, otherwise would lead to cognitive dissonance
  • Liking: you are more prone to take advice from people you like, who share similar characteristics to you
    • You can use this via mirroring
  • Social proof: drawing on social cues as proof that you are making a good decision
  • Scarcity: you become more interested in opportunities the more scarce they are
  • Authority: more inclined to follow perceived authority figures
  • Conflicts can be framed in certain ways for a certain outcome
    • The American Revolution took off partly because Thomas Paine framed the conflict between the colonists and England as Americans vs. English
    • Some conflicts follow social norms and others follow market norms. Social norms involve no money; market norms do. As soon as you frame a situation one way or the other, things drastically change
    • Frame things such that they appear fair, but even fairness can be framed in different ways: is fairness equal distribution, or does whoever followed the procedure best get the reward?
  • Beware of strawmanning: reducing the other side’s argument to something much simpler and weaker, which loses much of its nuance
  • Beware of ad hominem attacks
  • Dark patterns of influence: Potemkin villages (fake shell of actual reality), bait-and-switch tactics
  • Some conflicts can be so Pyrrhic that the best move is not to play at all
    • You can use deterrents to prevent the other side from playing as well
    • You can use the stick (bad cop) and carrot (good cop) model to create better outcomes
    • Contain bad outcomes and stop the bleeding (quick and dirty solutions) if things go bad, to prevent any domino effect
    • You can even attract bad outcomes to one place, known as honeypotting, and simultaneously destroy them there (attempted in Iraq by the US)
  • Domino effect often misused because people don’t understand causality
    • Slippery slope argument: one small step triggers a chain of bad outcomes
    • Broken windows theory: visible disorder like broken windows creates an environment that encourages crime. Unclear if this is actually true
    • Gateway drug theory: use of one drug is a gateway to more drug use. Also unclear (correlation != causation)
      • Companies use this to lure people to buy products at cheap prices and hope that they inevitably buy higher-priced products
  • When considering if something will cause domino effect, write down list of all possible outcomes along with probabilities and then assess if you need to use containment/appeasement
  • You can use the red line tactic to create deterrence: declare that you will do something extreme to prevent others from taking a certain action
    • People may call your bluff, which puts your credibility on the line
  • War of attrition: do not get into one
    • If you do, you need to change the game. Eg: guerrilla warfare
  • Generals always fight the last war: people generally approach conflict like their last conflict, which may not be a good idea
  • Endgame: need to consider your exit strategy
    • Don’t burn bridges with the exit strategy, but sometimes necessary (eg. Cortes sinks his boats, Caesar crosses Rubicon)

Unlocking People’s Potential

  • Joy’s law: no matter who you are, most of the smartest people work for someone else
  • Rumsfeld’s rule: you go to war with the army you have, not the army you wish you had
  • 10x individuals: individuals who perform far above average. They can make a dream team, but they are very rare, and their performance often cannot be replicated in other roles
    • Since 10x performance depends on context, we can craft 10x talent by putting the right people in the right circumstances
  • First rule: people are not interchangeable due to personality traits
  • Personality traits:
    • Extroversion
    • Openness
    • Conscientiousness (organized vs. easygoing)
    • Agreeableness
    • Neuroticism (nervous vs. confident)
  • Other ways of classifying people: specialist vs. generalist, IQ & EQ
  • Every organization goes through three phases:
    • Commando: cheap, get things done, high damage
    • Soldier: builds on the commandos’ work, but needs infrastructure because there are so many of them
    • Police: hate change, but build economies of scale
  • Foxes are broad and draw on many ideas, while hedgehogs are focused on one big idea
  • Manage to the person rather than to the role, and create new roles if necessary
  • Peter’s principle: managers rise to their level of incompetence
    • Keep this in mind for promotions
    • Setting up career paths is essential if you want people to stay
  • Higher roles tend to require more strategy than tactics
  • Make roles & responsibilities crystal clear using directly responsible individual
    • Prevents bystander effect where people don’t take responsibility because they are in a group and they assume someone else will
  • Use deliberate practice for mentorship
    • Mentor can identify edge goals and how you can achieve them while providing feedback
  • Spacing effect: learning effects greater when spaced out over time
    • Mentor-mentee should use this and rotate skills to improve
  • Give specific feedback but make sure you have set groundwork to show that you actually care about the person
  • To give tasks to mentees, think about conviction-consequence matrix
    • If you have high conviction on a highly impactful idea, don’t delegate. If you have low conviction on a highly impactful idea, or high conviction on a low-impact idea, delegate sometimes. If you have low conviction on a low-impact idea, delegate it
    • You can give high-conviction, low-impact tasks to mentees
  • It is important to believe that team members can grow, rather than seeing their abilities as fixed
    • Pygmalion effect: high expectations lead to higher performance
    • Golem effect: low expectations lead to lower performance
    • Having high expectations of people can make them perform better
  • If you keep putting people into challenging situations, you can breed imposter syndrome
    • Combat it by recognizing how common it is, expecting small failures when operating outside your comfort zone, and connecting with peers who have felt it
  • Dunning-Kruger effect: beginners overestimate their ability; confidence then dips as experience reveals how much they don’t know, and recovers slowly with real expertise
  • Beware of Dunning-Kruger when coaching others: don’t let them get too cocky early on, but also provide support as their confidence dips
  • Maslow’s Hierarchy of Needs: physiological → safety → belonging → esteem → self-actualization
  • The hierarchy tells us that if we want to become amazing (self-actualization), addressing impostor syndrome (an esteem need) is a must
  • Memories of past are tainted by hindsight bias. You think certain things are obvious when looking back but they weren’t at the moment
    • Use counterfactual thinking to force you to consider other options that were present at the time
    • Record your decisions down so you can look back at it
    • Beware of self-serving bias
  • Every group has a culture: common beliefs, behavioral patterns & social norms
    • Low-context cultures require very little context in communication and are very explicit and direct. High-context cultures require much more context
    • Other dimensions to evaluate culture: tight vs loose, hierarchical vs egalitarian, collectivist vs individualist, objective vs subjective
  • Shaping a good culture: establish a solid vision, define clear values, reinforce them via frequent communication, create processes that adhere to the vision and values, lead by example, establish traditions, foster accountability, and reward cultural behaviours
  • Conflict is about much more than firepower; it is also about winning hearts & minds
  • People join companies either as loyalists or as mercenaries
  • Loyalists are drawn by leadership, mission, values & location
  • Use Paul Graham’s comparison of maker vs. manager schedule and adjust accordingly
  • Culture erodes as it scales: often due to Dunbar’s number of 150 which is the maximum size of a stable, cohesive group
    • Too many new people at once can destroy culture
  • Mythical man-month: adding more people to a late project won’t speed it up
    • New people have their own onboarding costs and can’t improve the pace that much
  • Boots on the ground: military terminology which means that example needs to be set in order for something to follow
    • To create good culture, your own boots need to be on the ground

Flex Your Market Power

  • Arbitrage: taking advantage of price differences for same product in two different settings
    • Reselling is a good example of exercising arbitrage
    • These opportunities don’t exist very long because others will do it
  • Signature of sustainable competitive advantage: market power
    • Power to profitably raise prices in market
    • This is easily done if you have a monopoly, like the maker of the EpiPen
    • Markets with no market power exhibit perfect competition, where everyone sells the same good, which prevents massive price discrepancies
  • Without market power, you are subjected to the whims of supply and demand
    • You need to pick an industry with high demand in the long run, and you need to differentiate yourself to develop some market power
  • If you want to be extremely successful, you need to make contrarian bets
    • Consensus-contrarian matrix: if you are right and you are in consensus, you get regular returns. If you are right and contrarian, you get outsized returns. Otherwise you get no returns
    • This requires an appetite for risk
  • Contrarian bets require information asymmetry that you can take advantage of
    • Known as a secret: something no one else has thought of, or something that seems way too risky
    • A secret can also be an ability to take a good idea to a great idea
    • Find people that are on the edge, the enthusiasts and hackers. They usually are on the cutting edge of the field
  • Contrarian bets require timing in order to make use of inertia
    • Apple Newton didn’t take off like iPad. Internet only became a huge hit after 2000
    • Ask yourself: why now? If new opportunity: now what?
  • Knowing something others don’t and having good timing still won’t guarantee success. Need great execution
  • First person to bring idea to market usually has first-move advantage but can also have a disadvantage if they make a lot of mistakes (others copy and avoid mistakes)
  • Being a first mover hinges on being first to reach product/market fit: the point at which the product fits the market so well that consumers demand more
    • First to P/M fit is much more successful than first to market
    • Much like resonant frequency
  • One of the best ways to reach P/M fit is to be customer centric and use scientific method to rapidly change product given feedback
    • Use MVPs to do the job
    • This can be applied everywhere: talk to the community before moving, talk to current employees before taking a job. Think about who your “customers” are and talk to them directly about your “product”
  • OODA loop: observe, orient, decide, act
    • Make your OODA loop as fast as possible so that you can reach P/M fit
    • Helps you adapt to changing circumstances much faster than usual
  • Pivot if cannot reach P/M fit
    • Difficult because requires going against organizational inertia
    • Useful if current strategy is not going anywhere. Consult with advisors for more objective standpoint. Look for bright spots that you can focus in on before determining to pivot, like a beachhead
    • Jobs-to-be-done: think about what the role of your product is and determine whether something else can do that job better
  • When talking to customers, they often describe solutions rather than problems
  • Not just about # of customers, but also about size of customers
    • Your customer development must differ across scale & size
    • Can develop personas to help you understand customers better
  • Protect your position by building a moat: protected property, specialized skills/processes that take a long time to build, exclusive access, trusted brand, substantial control of distribution channel, amazing team, flywheels, faster OODA loop
    • Be explicit and note down what your moat is
    • Combine different moats to create a ‘force-field’
    • In other words, it creates high barriers of entry
  • Organizations with moats make customers feel locked in because of perceived switching costs
  • Regulatory capture: regulatory agencies get captured by the special interest groups that they are supposed to be regulating, ultimately protecting them
  • Strong moats with regulatory capture can lead to winner-takes-most markets
  • Just because you ‘won’ the market doesn’t mean that you will win in eternity. As Andy Grove of Intel once said: ‘Only the paranoid survive’
    • Constantly re-evaluate strength of working moat and pivot if necessary
  • Innovations often leave incumbents with a tough choice: accept the innovation and cannibalize themselves, or be destroyed by it in the future
    • Incumbents should pay attention to even the smallest of threats and use the technology adoption life cycle to model them out

Conclusion

  • Richard Feynman: “I learned very early the difference between knowing the name of something and knowing something”
  • Cargo cult: imitating and hoping that end result happens
    • South Pacific islanders imitated airstrips, but no airplanes came
    • Usually happens when people don’t understand what they are doing
  • Dangerous to use wrong model in situation. Think carefully and deeply
  • How to become a real superthinker:
    • Get a partner to discuss with and get feedback from
    • Write: the act clarifies your thinking
  • These practices should increase your circle of competence. Be very careful when you are operating just outside of circle of competence