10+ Best Behavioral Finance Books to Read in 2024

FinMasters | Sun, 22 Jan 2023

Behavioral finance books help you understand the impact of psychology on financial decisions. Check out the top titles.

The post 10+ Best Behavioral Finance Books to Read in 2024 appeared first on FinMasters.

Investors often let emotions and biases guide their financial decisions, leading them to make careless mistakes. Individuals managing their personal finances also frequently let reason and facts take a back seat.

How can you become more conscious of risk, act in your best interest, and prevent emotions and biases from getting the best of you? Behavioral finance books can help.

The following titles will teach you how to make rational decisions and fewer mistakes, whether you’re managing personal finance or investments.

How We Chose These Books

We considered several factors when selecting books for this list, such as the author’s expertise, awards, critical acclaim, and online reviews. We also included new and noteworthy titles to provide readers with a diverse range of options and keep up-to-date with the latest trends.

Thinking, Fast and Slow book cover

1. Thinking, Fast and Slow

by Daniel Kahneman

Thinking, Fast and Slow (2011) by Daniel Kahneman is among the best investment books for beginners. It delves into cognitive biases and people’s two conflicting modes of thinking: intuitive and deliberate.

It explains how processing information instinctively can be helpful but often leads to errors in judgment. Putting emotions aside and allowing time for logical reasoning can help you make rational decisions.

🔑 Key takeaways:

  • Intuition and deliberation in behavioral economics;
  • How overconfidence causes biased, subjective decisions;
  • Objectively processing and responding to statistical information;
  • Prospect theory.

This best-seller's primary theme is the exploration of the many cognitive biases and heuristics that shape our judgment. Grab a copy to learn how to overcome them and take charge of your financial behavior.
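Prospect theory, one of the takeaways above, can be made concrete with a short sketch. The value function below uses the parameter estimates from Kahneman and Tversky's 1992 follow-up paper (α = β = 0.88, λ = 2.25); treat it as an illustration of the idea that losses loom larger than gains, not as investment math.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains,
    steeper (loss-averse) for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)

# A $100 loss is felt about 2.25 times as strongly as a $100 gain:
gain = prospect_value(100)     # ≈ 57.5
loss = prospect_value(-100)    # ≈ -129.5
print(round(-loss / gain, 2))  # 2.25
```

With α = β the loss/gain ratio reduces to λ exactly; this asymmetry is why people reject small gambles even when the expected value is positive.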

📖 Favorite quote from the book:

Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.

✍ About the author: Daniel Kahneman is a world-renowned psychologist and economist famous for his findings on behavioral economics, which earned him the 2002 Nobel Prize in Economic Sciences. He developed prospect theory, co-founded TGG Group, and teaches psychology and public affairs at Princeton University.

Read key ideas on Blinkist →


Freakonomics book cover

2. Freakonomics

A Rogue Economist Explores the Hidden Side of Everything

by Stephen J. Dubner & Steven Levitt

Freakonomics (2005), by Stephen J. Dubner & Steven Levitt, is an unusual book that correlates economics and sociocultural phenomena. It explains ways for you to apply economic theory to numerous seemingly unrelated subjects, including pop culture.

The book also discusses controversial ideas like abortion and gun violence, for which it has received criticism. However, the authors tie them back to economic principles with behavioral arguments.

🔑 Key takeaways:

  • Economic knowledge can correct the irrational behavior governing our lives;
  • We need economics to understand human behavior;
  • Nature and nurture affect how people behave;
  • Freeing yourself from conventional thinking can make you a better economist.

Freakonomics might be a challenging read, but it is worth it if you are anything but traditional.

📖 Favorite quote from the book:

Information is a beacon, a cudgel, an olive branch, a deterrent—all depending on who wields it and how. Information is so powerful that the assumption of information, even if the information does not actually exist, can have a sobering effect.

✍ About the authors: Steven Levitt is an economist who won the John Bates Clark Medal in 2003. He co-founded TGG Group and the Center for RISC at the University of Chicago, where he works as an economics professor.

Stephen J. Dubner is a journalist, author, and host of the Freakonomics Radio podcast. He co-authored several Freakonomics sequels and has received many awards for his journalistic work.

Read key ideas on Blinkist →


The Psychology of Money book cover

3. The Psychology of Money

Timeless Lessons on Wealth, Greed, and Happiness

by Morgan Housel

The Psychology of Money (2020), by Morgan Housel, isn’t your ordinary behavioral finance book targeting investing and practical tips. It focuses on perceptions of money and their impact on financial decision-making.

The book contains 19 short stories that expand our understanding of economics, wealth, greed, success, and happiness, highlighting the impact of psychology on financial decisions.

🔑 Key takeaways:

  • Be reasonable instead of rational;
  • Embrace market volatility;
  • Save money and wait for lucrative opportunities;
  • Leave your ego out of the equation;
  • Carefully plan investments and subsequent steps because not all strategies are winning.

This engaging read also tackles personal finance management, teaching readers the significance of wise spending and saving decisions on the journey to wealth.

📖 Favorite quote from the book:

A genius who loses control of their emotions can be a financial disaster. The opposite is also true. Ordinary folks with no financial education can be wealthy if they have a handful of behavioral skills that have nothing to do with formal measures of intelligence.

✍ About the author: Morgan Housel is a financial journalist and behavioral finance expert who used to write for the Wall Street Journal and the Motley Fool. He is a partner at the Collaborative Fund and has won numerous awards, including the SABEW’s Best in Business Award and the New York Times Sydney Award.

Read key ideas on Blinkist →


Predictably Irrational book cover

4. Predictably Irrational

The Hidden Forces That Shape Our Decisions

by Dan Ariely

Predictably Irrational (2008), by Dan Ariely, is an excellent book for beginner investors and students studying behavioral finance. It reads like a story, containing personal anecdotes from everyday experiences to help readers delve into an investor’s mind.

Ariely uses exciting experiments and real-life examples to show how people act irrationally, even when intentionally trying to make a rational decision.

🔑 Key takeaways:

  • Irrational behavior is systemic and predictable;
  • Emotions stand in the way of good judgment;
  • People assign more value to free products than they are actually worth (the zero price effect);
  • We overvalue owned objects due to emotional biases (the endowment effect);
  • Predicting irrational behavior can help evaluate market sentiment.

Besides its valuable investment lessons, this book contains examples demonstrating that many people don’t use rational thought when comparing prices. For instance, we may expect a more expensive medicine to be more effective than its affordable counterpart, even if both are placebos.

Get your copy if you are looking for a thought-provoking read that might help you become more rational with finances.

📖 Favorite quote from the book:

The danger of expecting nothing is that, in the end, it might be all we’ll get.

✍ About the author: Dan Ariely is a psychology and behavioral economics professor at Duke University, where he runs the Center for Advanced Hindsight research lab. He taught at MIT, co-founded several companies focusing on behavioral science, and has published several best-sellers, including Predictably Irrational and The Upside of Irrationality.

Read key ideas on Blinkist →


Beyond Greed and Fear book cover

5. Beyond Greed and Fear

Understanding Behavioral Finance and the Psychology of Investing

by Hersh Shefrin

Beyond Greed and Fear (1999), by Hersh Shefrin, looks at how psychology influences financial decisions. Shefrin teaches investors how to prevent emotions, bias, and overconfidence from clouding their judgment.

The book delves into how we let mistakes increase our fear without learning the lessons those mistakes can teach. When we think we see profit, we become greedy and make the same mistakes over and over again. Investors can overcome the cycle of greed and fear by recognizing and avoiding cognitive biases and mistakes.

🔑 Key takeaways:

  • The concepts of heuristic biases, frame dependence, and market inefficiency;
  • Even seasoned professionals let emotions and cognitive biases guide their actions;
  • Investment models don’t consider these fundamental elements of human behavior;
  • Ignoring the effects of the psychology of investing leads to errors and financial losses.

You may tie this book’s themes to trading, but its primary focus is investing.

📖 Favorite quote from the book:

One of the most striking claims of behavioral finance is that heuristic-driven bias and frame dependence can cause prices to deviate from fundamental values for long periods.

✍ About the author: Hersh Shefrin is an economist focusing on behavioral finance. He is a finance professor at Santa Clara University, where he previously taught economics, and writes for the Wall Street Journal, Forbes, Vox, and Huffington Post. He has published many research articles and books and won the Albert Nelson Marquis Lifetime Achievement Award in 2019.


The Behavioral Investor book cover

6. The Behavioral Investor

by Daniel Crosby

The Behavioral Investor (2018), by Daniel Crosby, is among the top must-read books for investors. It delves deep into connections between human nature and finance and offers practical tips for better financial behavior and higher returns.

This book can help you increase self-awareness, avoid common investment pitfalls, overcome natural inclinations, improve your portfolio management and build wealth.

🔑 Key takeaways:

  • The psychological, neurological, and sociological factors impacting your investment decisions;
  • The four psychological tendencies (ego, conservatism, attention, and emotion) influencing investment behavior;
  • Practical solutions to these problems and wealth management tips.

The author also proposes a rules-based behavioral investing (RBI) method for moving away from emotional biases and toward rational, evidence-based decision-making.

📖 Favorite quote from the book:

Far from seamlessly assimilating new ideas into our existing belief framework, research shows that we actually tend to get more firm in our cherished beliefs when those beliefs become challenged.

✍ About the author: Daniel Crosby is a psychologist, asset manager, and behavioral finance expert helping companies maximize investment returns and build wealth. He’s a thought leader writing for Investment News and WealthManagement.com and producing the “Irrationality Index,” which measures greed and fear to forecast negative returns.

Read key ideas on Blinkist →


Inefficient Markets book cover

7. Inefficient Markets

An Introduction to Behavioral Finance

by Andrei Shleifer

The efficient market hypothesis (EMH) states that asset prices reflect all available information and have fair fundamental values. It assumes that all investors are rational and that pricing discrepancy is impossible due to arbitrage opportunities.

That is not the case in real-world financial markets. Institutional and psychological evidence challenges the EMH’s assumptions.

In Inefficient Markets (2000), Andrei Shleifer proposes behavioral finance as an alternative, presenting empirical evidence and theories for analyzing actual markets.

🔑 Key takeaways:

  • Real-world financial markets are inefficient;
  • Investors make irrational decisions due to cognitive biases;
  • Behavioral finance helps accurately analyze financial data and forecast asset prices.

This book is an excellent read for beginners in behavioral finance. The author’s easily intelligible explanations of various financial market models help newbies understand and apply the knowledge in real life.

✍ About the author: Andrei Shleifer is a financial, behavioral, and development economist, researcher, and author. He won the John Bates Clark Medal in 1999, co-founded LSV Asset Management, and taught at Princeton University and the University of Chicago. He has been an economics professor at Harvard University since 1991.


Misbehaving book cover

8. Misbehaving

The Making of Behavioral Economics

by Richard H. Thaler

Richard H. Thaler’s titles are among the most valuable behavioral finance books. The Nobel laureate’s Misbehaving (2015) is a gripping book you’ll want to read in one sitting because the wisdom and often humorous illustrations won’t let you put it down.

Thaler explains how people frequently make irrational financial decisions because they let emotional biases guide their actions. They misbehave, and markets move irrationally as a result.

🔑 Key takeaways:

  • Investors are irrational and misbehave because of emotional biases;
  • Misbehavior can have detrimental consequences;
  • Behavioral economics provides a new lens to analyze economic theory.

The author’s personal experiences make the book even more engaging. Grab your copy if you want an entertaining and insightful read that can help you make wiser financial decisions.

📖 Favorite quote from the book:

The purely economic man is indeed close to being a social moron. Economic theory has been much preoccupied with this rational fool.

✍ About the author: Richard H. Thaler is a behavioral economist, author, and columnist for The New York Times. He won the Nobel Prize in Economic Sciences in 2017 for outstanding contributions to behavioral economics. He has been an economics and behavioral science professor at the University of Chicago’s Booth School of Business since 1995.


The Little Book of Behavioral Investing book cover

9. The Little Book of Behavioral Investing

How Not to Be Your Own Worst Enemy

by James Montier

The Little Book of Behavioral Investing (2010), by James Montier, explains how human behavior affects financial markets and offers guidelines on maintaining a profitable investment portfolio.

The author shares insights into investors’ psychological pitfalls, such as emotions, hindsight bias, loss aversion, and overconfidence, and offers solutions to avoid them. He also explains the significance of learning from mistakes and overcoming behavioral challenges.

🔑 Key takeaways:

  • Careful planning helps maintain investment discipline;
  • Guesswork and failing to learn from mistakes can lead to losses;
  • Focusing on low-risk assets helps overcome the fear of making investment decisions; 
  • Monitoring price fluctuations can help you make profitable decisions;
  • Fundamental analysis is crucial to determine fair market values.

This short read is ideal for aspiring investors dipping their toes in financial markets. It offers a useful foundation for making psychology work in your favor and breaking the barriers hindering high returns.

📖 Favorite quote from the book:

Success in investing doesn’t correlate with IQ once you’re above the level of 100. Once you have ordinary intelligence, what you need is the temperament to control the urges that get other people into trouble in investing.

✍ About the author: James Montier is a renowned economist and the author of several books on behavioral investing. He is a fellow of the Royal Society of Arts and a visiting fellow at the University of Durham. He was the co-head of Global Strategy at Société Générale and currently works as an asset allocation manager at GMO.


Dollars and Sense book cover

10. Dollars and Sense

How We Misthink Money and How to Spend Smarter

by Dan Ariely & Jeff Kreisler

Dollars and Sense (2017) is a page-turner that dives into the irrational world of personal finance and provides guidance on overcoming behavioral challenges. It’s a good choice for the reader who is more focused on personal finance than on investing.

The authors explore everyday topics to help us understand how emotions affect our financial choices and may cost us more than we think. They discuss how we perceive money and address the tendencies driving us to spend more than we should, from using credit cards for groceries to overpaying for items on vacation.

🔑 Key takeaways:

  • How psychology affects your financial decisions;
  • How to change your instincts for better financial decision-making;
  • Practical advice on improving financial choices, including spending and saving more wisely.

This enjoyable read explains behavioral finance in an easy-to-understand language, infusing humor into real-life examples to help readers relate. It is a fascinating behavioral finance book for beginners.

📖 Favorite quote from the book:

Happiness too often seems to be less a reflection of our actual happiness and more a reflection of the ways in which we compare ourselves to others.

✍ About the authors: Behavioral economist Dan Ariely has partnered with Jeff Kreisler, a lawyer, author, stand-up comedian, and behavioral science advocate. Kreisler is head of Behavioral Science at J.P. Morgan Chase, founding editor at PeopleScience, and an executive humor coach at Stanford Business School.

Read key ideas on Blinkist →


Nudge book cover

11. Nudge

Improving Decisions About Health, Wealth, and Happiness

by Richard H. Thaler & Cass R. Sunstein

Nudge (2008) is a fantastic read about the behavioral patterns that lead us to poor financial choices, negatively affecting our health, wealth, and happiness. It provides real-world scenarios and tips to improve our financial behavior.

The authors explain how people succumb to biases when making financial decisions and share enlightening examples from extensive research on behavioral science.

🔑 Key takeaways:

  • “Choice architecture” can change consumer behavior by “nudging” people in the right direction without restricting options;
  • Even a minor change can “nudge” people to make better decisions;
  • Using the nudge theory to redesign our environment can help us live better lives.

The authors have rewritten this New York Times best-seller to share new research and personal experiences. Nudge: The Final Edition (2021) provides insights on personal finance, credit card debt, mortgages, retirement savings, climate change, medical care, and other areas where we can make better decisions for a more fulfilling life.

📖 Favorite quote from the book:

If you want to nudge people into socially desirable behavior, do not, by any means, let them know that their current actions are better than the social norm.

✍ About the authors: Behavioral economist and Nobel laureate Richard H. Thaler teamed up with legal scholar Cass R. Sunstein to write Nudge. Sunstein worked for the Obama administration, won the Holberg Prize in 2018, and taught at the University of Chicago Law School. He currently teaches at Harvard Law School as the Robert Walmsley University Professor.

Read key ideas on Blinkist →


The Winner's Curse book cover

12. The Winner’s Curse

Paradoxes and Anomalies of Economic Life

by Richard H. Thaler

The Winner’s Curse (1991) is a fascinating exploration of irrational economic behavior. Thaler tackles many paradoxes and anomalies of everyday economic life, explaining a common phenomenon in financial markets and value auctions: the winner’s curse.

The winner’s curse is an economic anomaly indicating that the winner is also a loser because they overestimate an item’s value and often overpay for it.

Thaler shares many real-world examples to support his argument, including auction bidders and consumers saving money on one product only to spend the savings on another.

🔑 Key takeaways:

  • Why people make irrational financial decisions;
  • Economic paradoxes and anomalies, primarily the winner’s curse;
  • Financial markets are inefficient.

This book offers essential economic wisdom, but it can be challenging for readers with no background in economics. Its density and many mathematical examples make it best suited to academic readers.


The Art of Contrary Thinking book cover

13. The Art of Contrary Thinking

by Humphrey B. Neill

Do you follow the pack as an investor or choose an oppositional course? The former can be risky, according to the late Humphrey B. Neill.

“When everyone thinks alike, everyone is wrong.”

This pearl of wisdom is one of many this renowned economist discussed in The Art of Contrary Thinking (1976). He explained the effects of herd mentality on investment decisions and financial markets and proposed a different thinking philosophy he applied when doing business or investing.

Contrary thinking can help investors capture the growing value from others following in their footsteps. It can also help forecast market shifts and prevent financial losses.

🔑 Key takeaways:

  • The herd is usually wrong;
  • Contrary thinking can help you stay on the right track;
  • An oppositional strategy can help you prevent or minimize losses.

This book gives you a formula for thinking like a successful investor, making better decisions, leveraging value, understanding how people behave, and generating accurate forecasts. It is still a relevant read after nearly half a century.

📖 Favorite quote from the book:

When governments, including our own, carry great debt loads, it behooves thoughtful citizens to recall occasionally the disastrous inflations of the past, the illusions of bootstrap economics—which always are accompanied by crowd hysteria.

✍ About the author: Humphrey B. Neill was an economist, investment thinker, and World War I lieutenant. He dedicated his life to analyzing and writing about the folly of herd mentality and its effects on investment decisions. Always encouraging people to question the consensus and providing mountains of evidence, he earned the title “father of contrary opinion” from Life magazine.


Other Behavioral Finance Books We Considered

There are many behavioral finance books out there. We’ve rounded up the best, but when you’ve finished reading those, you also might like to read some of the following.

Conclusion

These behavioral finance books deserve a spot on every financially focused bookshelf, from investors and asset managers to individuals seeking personal finance advice.

Their wisdom and real-world examples can be valuable tools for profitable investments and better economic decisions devoid of emotions and biases.

What Daniel Kahneman Thinks Investors Should Know

FinMasters | Sun, 20 Dec 2020

Daniel Kahneman gives a great breakdown of 6 cognitive biases and illusions that affect both our everyday lives and our investing success.

The post What Daniel Kahneman Thinks Investors Should Know appeared first on FinMasters.

Daniel Kahneman is a professor of behavioral and cognitive psychology at Princeton, winner of the 2002 Nobel Prize in Economic Sciences, and author of the best-selling book on cognitive biases and heuristics, Thinking, Fast and Slow.

Heuristics are mental shortcuts that we use to solve problems and reach judgments.

On the “Masters in Business” podcast, Daniel Kahneman sat down with Barry Ritholtz to discuss his early work on heuristics and give a unique breakdown of the cognitive biases investors should know.

Key Takeaways

  • Attribute Substitution – we tend to simplify complex judgments with simpler but related heuristics.
  • Availability Heuristic – We often don’t wait for additional details, so mental shortcuts play a huge role in forming opinions.
  • Anchoring Bias – how we view the first piece of information influences subsequent decisions and options.
  • Loss Aversion – we tend to prefer avoiding losses over acquiring gains, which affects our decision-making abilities.
  • Narrow Framing – we tend to view decisions in isolation rather than as part of a broader policy, which distorts the risks we take.

Here are the cognitive biases that Kahneman has identified:

1. Attribute Substitution

📖 Definition: Attribute substitution occurs when an individual has to make a judgment (of a target attribute) that is computationally complex and instead substitutes a more easily calculated heuristic attribute.

“You ask someone a complicated question, like: What is the probability of an event? And they can’t answer it because it’s very difficult. But there are easier questions that are related to that one that they can answer. Such as: Is this a surprising event? That is something that people know right away. Is it a typical result of that kind of mechanism? And people can answer that right away.

So what happens is people take the answer to the easy question, they use it to answer the difficult question, and they think they have answered the difficult question. But in fact, they haven’t – they’ve answered an easier one.

I call it attribute substitution – to substitute one question for another. So if I ask you: How happy are you these days? Now you know your mood right now – so you’re very likely to tell me your mood right now and think that you’ve answered the more general question of ‘How happy are you these days?’”

2. Availability Heuristic (aka “What You See Is All There Is” or WYSIATI)

📖 Definition: The availability heuristic is a mental shortcut that relies on immediate examples that come to a given person’s mind when evaluating a specific topic, concept, method, or decision.

“People are really not aware of information that they don’t have. The idea is that you take whatever information you have, and you make the best story possible out of that information. And the information you don’t have – you don’t feel that it’s necessary.

I have an example that I think brings that out: I tell you about a national leader and that she is intelligent and firm. Now, do you have an impression already of whether she’s a good leader or a bad leader? You certainly do. She’s a good leader. But the third word that I was about to say is “corrupt.”

The point is that you don’t wait for information that you didn’t have. You formed an impression as we were going from the information that you did have. And this is “What You See Is All There Is” (WYSIATI).”

3. Anchoring Bias

📖 Definition: Anchoring describes the common human tendency to rely too heavily on the first piece of information offered (the “anchor”) when making decisions.

“I’ll give you an example. In the example of negotiation, many people think that you have an advantage if you go second. But actually, the advantage is going first. And the reason is in something about the way the mind works. The mind tries to make sense out of whatever you put before it. So this built-in tendency that we have of trying to make sense of everything that we encounter, that is a mechanism for anchoring.”

4. Loss Aversion

📖 Definition: Loss aversion refers to people’s tendency to prefer avoiding losses to acquiring equivalent gains: it is worse to lose one’s jacket than to find one. Some studies have suggested that losses are twice as powerful, psychologically, as gains.

“Losses loom larger than gains. And we have a pretty good idea of by how much they loom larger than gains, and it’s by about 2-to-1.

An example is: I’ll offer you a gamble on the toss of a coin. If it shows tails, you lose $100. And if it shows heads, you win X. What would X have to be for that gamble to become really attractive to you? Most people – and this has been well established – demand more than $200… Meaning it takes $200 of potential gain to compensate for $100 of potential loss when the chances of the two are equal. So that’s loss aversion. It turns out that loss aversion has enormous consequences.”

What is it about losses that makes them so much more painful than gains are pleasurable? In other words, why does this 2-to-1 loss aversion even exist?

“This is evolutionary. You would imagine in evolution that, threats are more important than opportunities. And so it’s a very general phenomenon that bad things sort of preempt or are stronger than good things in our experience. So loss aversion is a special case of something much broader.”

So there’s always another opportunity coming along, another game, another deer coming by but an actual genuine loss – hey, that’s permanent, and you don’t recover from that.

“That’s right. Anyway, you take it more seriously. So if there is a deer in your sights and a lion, you are going to be busy about the lion and not the deer.”

That leads to the obvious question: what can investors do to protect themselves against this hard-wired loss aversion?

“There are several things they can do. One is not to look at their results – not to look too often at how well they’re doing.”

And today, you can look tick-by-tick, minute-by-minute, it’s the worst thing that could happen.

“It’s a very, very bad idea to look too often. When you look very often, you are tempted to make changes, and where individual investors lose money is when they make changes in their allocation. Virtually on average, whenever an investor makes a move, it’s likely to lose money. Because there are professionals on the other side betting against the kind of moves that individual investors make.”
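The 2-to-1 arithmetic Kahneman describes can be written out directly. The sketch below assumes a bare-bones loss-averse utility (losses weighted by a hypothetical λ = 2, ignoring prospect theory's curvature); it reproduces his observation that a coin-flip gamble risking $100 only becomes attractive once the potential gain tops $200:

```python
def gamble_appeal(win, loss=100, lam=2.0, p=0.5):
    """Subjective value of a coin-flip gamble when losses are
    weighted lam times as heavily as equivalent gains."""
    return p * win - (1 - p) * lam * loss

print(gamble_appeal(150))  # -25.0 -> rejected, despite a positive expected value
print(gamble_appeal(200))  #   0.0 -> the break-even point Kahneman cites
print(gamble_appeal(250))  #  25.0 -> now attractive
```

Note that the first gamble has a positive expected value (+$25), yet a loss-averse decision-maker turns it down — exactly the behavior loss aversion predicts.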

5. Narrow Framing

📖 Definition: Framing refers to the context in which a decision is made or the context in which a decision is placed in order to influence that decision.

“If I ask a regular person in the street would you take a gamble that if you lose, you lose $100, and if you win, you win $180 on the toss of a coin… Most people don’t like it… Now when you ask the same people in the street, okay, you don’t want this one [coin toss], would you take ten [coin tosses]?

So we’ll toss ten coins, and every time if you lose, you lose $100, and if you win, you win $180, everybody wants the ten – nobody wants the one. In the repeated play, when the game is repeated, then people become much closer to risk neutral and they see the advantage of gambling.

One question that I ask people when I tell them about that – so you’ve turned down $180, but you would accept ten of those – are you on your deathbed? That’s the question I ask. Is that the last decision you’re going to make? And clearly, there are going to be more opportunities to gamble, perhaps not exactly the same gamble, but there’ll be many more opportunities.

You need to have a policy for how you deal with risks and then make your individual decisions in terms of a broader policy. Then you’ll be much closer to rationality.

It’s very closely related to What You See Is All There Is. We tend to see decisions in isolation. We don’t see the decision about whether I take this gamble as one of many similar decisions that I’m going to make in the future.”

So are people overly outcome-focused to the detriment of the process?

“What they are, we call that ‘narrow framing.’ They view the situation narrowly. And that is true in all domains. So, for example, we say that people are myopic – that they have a narrow time horizon. To be more rational, you want to look further in time, and then you’ll make better decisions.

If you’re thinking of where you will be a long time from now, it’s completely different from thinking about how will I feel tomorrow if I make this bet and I lose.”
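Kahneman's ten-toss example rewards working through the numbers. Under a broad frame — treating the ten gambles as one policy — the aggregate bet has a healthy expected value and only a modest chance of ending in the red. A minimal sketch of that arithmetic:

```python
from math import comb

def repeated_gamble(n=10, win=180, loss=100, p=0.5):
    """Exact outcome distribution of n independent coin-flip gambles:
    returns (expected value, probability of a net loss)."""
    ev, p_net_loss = 0.0, 0.0
    for k in range(n + 1):                        # k = number of winning tosses
        prob = comb(n, k) * p**k * (1 - p)**(n - k)
        net = k * win - (n - k) * loss
        ev += prob * net
        if net < 0:
            p_net_loss += prob
    return ev, p_net_loss

ev, p_net_loss = repeated_gamble()
print(f"Expected value over 10 tosses: ${ev:.0f}")  # $400
print(f"Chance of ending down: {p_net_loss:.1%}")   # 17.2%
```

Seen narrowly, each toss risks a painful $100 loss; seen as a policy, fewer than one run in five loses money at all — which is why, as Kahneman says, repeated play moves people closer to risk neutrality.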

6. Theory-Induced Blindness / Hindsight Bias

📖 Definitions: 
Theory-induced blindness: Once you have accepted a theory, it is extraordinarily difficult to notice its flaws.
Hindsight bias is the inclination, after an event has occurred, to see the event as having been predictable, despite there having been little or no objective basis for predicting it.

Let’s talk about being wrong, and being able to admit that you’re wrong. John Kenneth Galbraith once famously said, “Faced with the choice between changing one’s mind and proving that there is no need to do so, almost everyone gets busy on the proof.” You called this “theory-induced blindness.” So why are we so unwilling to admit when we’re wrong?

“You know you try to make the best story possible. And the best story possible includes quite frequently, “Actually, I didn’t make that mistake.” You know, so something occurred – and in fact, I did not anticipate it – but in retrospect, I did anticipate it. This is called hindsight.

And one of the main reasons that we don’t admit that we’re wrong is that whatever happens, we have a story, we can make a story, we can make sense of it, we think we understand it, and when we think we understand it we alter our image of what we thought earlier.

I’ll give you a kind of example: So you have two teams that are about to play football. And the two teams are about evenly balanced. Then one of them completely crushes the other. Now after you have just seen that, they’re not equally strong. You perceive one of them as much stronger than the other and that perception gives you the sense that this must have been visible in advance, that one of them was much stronger than the other.

So hindsight is a big deal. It allows us to keep a coherent view of the world, it blinds us to surprises, it prevents us from learning the right thing, it allows us to learn the wrong thing – that is whenever we’re surprised by something, even if we do admit that we’ve made a mistake or [you say] “I’ll never make that mistake again”- but in fact what you should learn when you make a mistake because you did not anticipate something is that the world is difficult to anticipate. That’s the correct lesson to learn from surprises. That the world is surprising.

It’s not that my prediction is wrong. It’s that predicting, in general, is almost impossible.”

Summary

There are definitely a lot of parallels between Daniel Kahneman’s research and Warren Buffett‘s and Charlie Munger‘s investment philosophy, such as:

  • Don’t be too active
  • Make your decisions with a long-term perspective
  • Admit your mistakes
  • Don’t try to predict what’s unpredictable
  • Strive to become as rational as possible

What to Read and Listen to Next?

If you want to learn more about cognitive biases, heuristics, and illusions, then be sure to check out Daniel Kahneman’s awesome book Thinking, Fast and Slow.

Finally, if you want to listen to the entire interview between Daniel Kahneman and Barry Ritholtz, click to listen below or check out Ritholtz’s Masters in Business podcast. Ritholtz usually gets some really awesome guests from the world of finance on his podcast, and the discussions are almost always incredibly interesting.



The Ultimate Guide to Value Investing

Do you want to know how to invest like the value investing legend Warren Buffett? All you need is money to invest, a little patience—and this book. Learn more


The post What Daniel Kahneman Thinks Investors Should Know appeared first on FinMasters.

]]>
https://finmasters.com/daniel-kahneman-investors/feed/ 5
Beginner’s Guide to Logical Fallacies (With Examples) https://finmasters.com/logical-fallacy/ Wed, 06 Nov 2019 08:23:54 +0000 https://fallacyinlogic.com/?p=91 Logical fallacies are reasoning errors that weaken your argument. Learn what fallacies are and how understanding them can benefit you.

The post Beginner’s Guide to Logical Fallacies (With Examples) appeared first on FinMasters.

]]>
Logical fallacies are errors in reasoning that can undermine arguments and lead to bad decisions. They are often found in politics, media, advertising, and daily discussions.

Recognizing and understanding these fallacies enhances critical thinking and argumentative skills, helping you to identify flaws in reasoning, construct more persuasive arguments, and make better choices.

Key Takeaways

  • What is a logical fallacy? A logical fallacy is the use of erroneous reasoning that renders an argument invalid or unsound.
  • Understanding fallacies helps you think better. Studying fallacies sharpens your ability to critically analyze information, enabling you to identify, avoid, or challenge misleading arguments in various contexts.
  • Apply your understanding to others and yourself. Spotting fallacies isn’t just about winning arguments. It will help you assess your thinking and avoid inaccurate or unsupportable conclusions.

What Is an Argument?

Before we dive into fallacies, let’s first take a quick look at what exactly an argument is.

As you know, in everyday situations an “argument” refers to people having a (heated) disagreement. In philosophy and logic, however, it means something more precise: a set of statements — premises and a conclusion — made for or against a particular idea, theory, or position.

☔ As an example, consider the following:

“Every time there is rain coming, my joints start aching. My joints started aching. So, there must be rain coming.”

This is an argument that has two premises and a conclusion. If we break it down, it would look like this:

  • Premise 1: Every time there is rain coming, my joints start aching.
  • Premise 2: My joints started aching.
  • Conclusion: There must be rain coming.

In essence, the premises of an argument are meant to provide us with enough reason to accept the conclusion. Most (but not all) arguments fail because they don’t offer premises strong enough to achieve this.

Types of Fallacies


A logical fallacy is the use of erroneous reasoning that renders the argument either invalid or unsound.

As Dave Kemper summarized in his book Fusion: Integrated Reading and Writing:

A logical fallacy is a false statement that weakens an argument by distorting an issue, drawing false conclusions, misusing evidence, or misusing language.

Dave Kemper et al., Fusion: Integrated Reading and Writing. Cengage, 2015.

As mentioned at the beginning, fallacies may be committed unintentionally, through carelessness or a limited understanding of them; often, however, they are committed deliberately in order to persuade someone.

The word “fallacy” comes from the Latin word fallacia, which translates to “deceit,” “deception,” or “trick.” These words describe fallacies quite accurately: they are deceptively persuasive and are frequently used to trick or fool people.

Classifying specific fallacies accurately is challenging because they vary so widely in application and structure. There are, in fact, hundreds of them and more than two dozen types and sub-types. They are mainly divided, however, into two broad categories: formal and informal fallacies.

Origin

The origin of logical fallacies goes all the way back to Ancient Greece and, more specifically, to the well-known Greek philosopher Aristotle (384–322 BC), who laid the foundation by identifying the first thirteen fallacies in his On Sophistical Refutations. In that work, he not only aimed to show how one can win debates by making logically valid and sound arguments but also demonstrated how to refute various claims.

Formal Fallacy

A formal fallacy, also known as a non sequitur or deductive fallacy, is a flaw in the structure of a deductive argument.

Deductive arguments are intended to provide a necessarily true conclusion given that the premises are true. Hence, validity depends on the structure of the argument. A deductive argument can be valid or invalid, and sound or unsound:

A valid deductive argument is one that cannot simultaneously have true premises and a false conclusion. Otherwise, it’s invalid.

A sound deductive argument is one that is valid and all of its premises are true. Otherwise, it’s unsound.

Examples

One common type of formal fallacy is affirming the consequent, and its logical form looks like this:

  • Premise 1: If A is true, then B is true.
  • Premise 2: B is true.
  • Conclusion: Therefore, A is true.

☔ An example would be:

  • Premise 1: If it’s raining, then the streets are wet.
  • Premise 2: The streets are wet.
  • Conclusion: Therefore, it’s raining.

There is a clear error here because the conclusion doesn’t follow from the given premises: wet streets don’t necessarily mean it’s currently raining, since something else (a street cleaner, for example) could have made them wet. The truth of the premises doesn’t logically guarantee the truth of the conclusion, making the argument fallacious.

Another non sequitur would be denying the antecedent. It’s quite closely related to the previous one, but here the mistake arises because it incorrectly deduces the inverse of the conditional statement:

  • Premise 1: If A, then B.
  • Premise 2: Not A.
  • Conclusion: Therefore, not B.

Or:

  • Premise 1: If he’s a human, then he has a brain.
  • Premise 2: He isn’t a human (he’s a dog).
  • Conclusion: Therefore, he doesn’t have a brain.

Similarly, this argument is invalid due to a flaw in its structure: even though both premises are true, the conclusion is false (dogs have brains too).
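Both structural errors can be checked mechanically: an argument form is valid only if no assignment of truth values makes every premise true while the conclusion is false. A brief, purely illustrative brute-force check (the helper names are our own):

```python
from itertools import product

def is_valid(premises, conclusion):
    """Check a two-variable argument form by enumerating every truth assignment.
    Valid means: whenever all premises are true, the conclusion is true too."""
    return all(conclusion(a, b)
               for a, b in product([True, False], repeat=2)
               if all(p(a, b) for p in premises))

implies = lambda a, b: (not a) or b  # "If A, then B"

# Affirming the consequent: If A then B; B; therefore A.
print(is_valid([implies, lambda a, b: b], lambda a, b: a))          # False
# Denying the antecedent: If A then B; not A; therefore not B.
print(is_valid([implies, lambda a, b: not a], lambda a, b: not b))  # False
# Modus ponens, a valid form for contrast: If A then B; A; therefore B.
print(is_valid([implies, lambda a, b: a], lambda a, b: b))          # True
```

The counterexample the checker finds for both fallacies is the same one the rain example illustrates: A false, B true (not raining, yet the streets are wet).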

Informal Fallacy

Informal fallacies deal with the non-structural flaws in arguments; essentially, they cover all the errors that formal fallacies don’t. Although they typically occur in inductive arguments, they may also apply to deductive ones.

An inductive argument is one that is meant to provide strong enough premises to support a probable truth of the conclusion. As such, the success of an inductive argument relies on the evidence supporting the conclusion, that is, on the strength of its premises.

👶 To give you an example, consider the following:

  1. Pregnancy tests are around 98% accurate.
  2. Chloe got a positive result on a pregnancy test.
  3. Chloe is most likely pregnant.

This is a reasonable inductive argument: Since the accuracy rate of the pregnancy test is as high as 98%, it is justified to assume that Chloe is pregnant.
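The strength of this inductive inference can be made precise with Bayes’ rule. A hedged sketch: it assumes the stated “98% accurate” covers both the true-positive and true-negative rates, and the priors are hypothetical choices; neither assumption comes from the example itself.

```python
def posterior(prior, sensitivity=0.98, specificity=0.98):
    """P(pregnant | positive test) via Bayes' rule."""
    true_pos = sensitivity * prior                # pregnant and tests positive
    false_pos = (1 - specificity) * (1 - prior)   # not pregnant, tests positive anyway
    return true_pos / (true_pos + false_pos)

# If pregnancy was as likely as not before testing, a positive result is compelling:
print(posterior(0.5))             # 0.98
# With a much lower prior, the same result supports a weaker conclusion:
print(round(posterior(0.05), 3))  # 0.721
```

The same premises can thus lend more or less support to the conclusion depending on background information, which is exactly the sense in which an inductive argument’s success relies on the strength of its premises.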

🚗 Another example:

“I’ve had my car for 5 years, and it has never broken down. Therefore, I don’t have to worry about it breaking down tomorrow.”

Assuming it’s true that the car has never broken down in 5 years, then it would be unlikely that it will break down tomorrow; the premise is strong enough to warrant a probable truth of the conclusion.

Because there is an almost unlimited number of ways the premises can fail to back up the conclusion, a very large variety of informal fallacies has been identified. They are organized into three sub-categories: fallacies of ambiguity, fallacies of relevance, and fallacies of sufficiency.

Fallacies of Ambiguity

These types of fallacies are caused by a lack of clarity. Some examples include:

  • Accent fallacy — placing unusual stress or emphasis on certain words to change the meaning of a sentence.
  • Composition fallacy — asserting that if something is true of the parts, it must be true of the whole.

Fallacies of Relevance

Fallacies of relevance attempt to persuade by using non-logical means. They often use emotional appeals as evidence for the conclusion. For instance:

  • Appeal to pity — using the feeling of pity to persuade.
  • Appeal to force — using force or threat of force to persuade.
  • Straw man — distorting an opponent’s argument in order to make it easier to attack.

Fallacies of Sufficiency

In essence, fallacies of sufficiency occur when the evidence fails to provide, in one way or another, adequate support for the conclusion.

  • Hasty generalization — drawing a conclusion from an insufficient sample size.
  • False dilemma — presenting only two possible choices when in fact, more alternatives exist.
  • Weak analogy — drawing a connection between two things, even though the connection is insufficient for making any conclusions based on it.

Benefits of Studying Logical Fallacies

Logical fallacies are a common occurrence in debates and discussions everywhere — from politics to media to advertising to philosophical debates. They are an important aspect of argumentation, as well as logical and critical thinking.

Ideally, whenever we express our opinions to other people and, in effect, attempt to persuade them that we are right, we should do so with sound reasoning and relevant facts. In reality, however, this often doesn’t happen; people argue for positions without proper reasons and resort to tactics that bypass logic, whether from a lack of evidence or for personal gain.

This applies to both verbal and written persuasion. As William R. Smalzer explains:

There are three good reasons to avoid logical fallacies in your writing. First, logical fallacies are wrong and, simply put, dishonest if you use them knowingly. Second, they take away from the strength of your argument. Finally, the use of logical fallacies can make your readers feel that you do not consider them to be very intelligent.

William R. Smalzer, Write to Be Read: Reading, Reflection, and Writing, 2nd ed. Cambridge Univ. Press, 2005.

More precisely, some of the benefits you may gain from studying logic and fallacies include:

  • It’ll help you develop your vocabulary and form better, more persuasive arguments of your own, which, in turn, will make you seem more credible and can help you reach your goals.
  • You’ll be better able to evaluate other people’s arguments and spot and counter poor reasoning.
  • It’ll help you defend yourself from people who wish to influence your beliefs, values, or actions in a way that may be against your self-interests.

Examples of Logical Fallacies

Now, let’s take a closer look at some of the most common types of fallacies.

Ad Hominem

Ad hominem occurs when someone attacks the person behind an argument instead of addressing the actual merit of their argument. The attacks may be directed toward the person’s character, morals, background, intelligence, or reputation.

👉 Here are a couple of examples:

  1. “You didn’t even finish high school; therefore, we shouldn’t listen to your opinion about anything!”
  2. Mike: “There are so many Earth-like planets out there that there must be intelligent life on some of them.” 
    Jenny: “What would a moron like you possibly know about this?”

Red Herring

The red herring fallacy happens when someone derails the original issue onto a different, irrelevant one. It’s a deliberate attempt to move the focus away from a topic in order to gain an advantage.

🎣 An example would be:

Joanna: “Why did you buy that new fishing rod? It exceeds the monthly budget that we both agreed upon.” 
John: “Well, because it was on sale. I had to buy it now.”

John commits the red herring here because he tries to distract Joanna from the real issue, which is the fact that he exceeded the budget that they had both agreed upon.

Straw Man

Straw man occurs when an opponent attacks a distorted version of the original argument that they themselves created. More accurately, it’s an intentionally misrepresented or exaggerated version of the issue that better suits the arguer’s agenda.

👉 For Example

  • John: “I believe sport hunting is immoral.”
    Michael: “So you want us all to be vegetarians because animals are more important than people?!”
  • Kim: “I think our company should allocate a larger portion of the budget to customer support because we are struggling in that area.”
    Andy: “We’ll go bankrupt if we spend all our money on customer support.”

Bandwagon

Bandwagon fallacy, also known as “appeal to popularity”, is when something is claimed to be good or true solely because it is popular. In other words, it’s based on the assumption that a majority’s opinion must be correct.

🍔 For Example

  • “Intermittent fasting is the most popular way to lose weight right now. Thus, it must be the right way to do it.”
  • “McDonald’s is the best fast food restaurant in the world, they have served 100 billion people worldwide.”

Slippery Slope

The slippery slope fallacy takes an argument from a relatively small first step to an ultimate conclusion via a number of unwarranted connections. The conclusion is typically some sort of extreme.

🎮 For Example

  • “If I let my child play video games, she will not do her homework, her grades will suffer, and she won’t be able to go to college.”
  • “If we legalize gay marriage, next people will want to legalize polygamy.”

Appeal to Nature

Appeal to nature is based on the belief that if something is natural, it must be good or the right thing to do, and conversely, if something is unnatural, it must be bad and should be avoided.

🌺 For Example

“Herbal medicines are natural, unlike antibiotics and other modern medicines. Therefore, herbal medicines are better for you.”


The post Beginner’s Guide to Logical Fallacies (With Examples) appeared first on FinMasters.

]]>
False Dilemma (Logical Fallacy): Definition and Examples https://finmasters.com/false-dilemma/ https://finmasters.com/false-dilemma/#respond Wed, 12 Feb 2020 09:06:15 +0000 https://fallacyinlogic.com/?p=425 A false dilemma occurs when a limited number of choices, outcomes, or views are presented as the only possibilities.

The post False Dilemma (Logical Fallacy): Definition and Examples appeared first on FinMasters.

]]>
The false dilemma is a logical fallacy that presents choices as binary or mutually exclusive, often distorting reality. It simplifies complex issues, obstructing rational decision-making and debate. The false dilemma appears regularly in contexts ranging from politics to daily discussions.

Key Takeaways

  • The false dilemma is intended to steer the recipient of the message toward the option the speaker wants.
  • It is common in debates, where it’s used to present complex, nuanced issues as overly simplistic, stifling genuine, honest discussions.
  • The false dilemma may be used intentionally or subconsciously. The motive remains the same: sway the audience towards the speaker’s desired choice or outcome.

What Is a False Dilemma?

✍ A false dilemma occurs when a limited number of choices, outcomes, or views are presented as the only possibilities when, in fact, more possibilities exist. As such, it unjustifiably puts issues into black-or-white terms.

Accordingly, it’s also known as the either-or fallacy, all-or-nothing fallacy, and black-and-white thinking.

💔 A simple example would be:

“You either love me or hate me.”

This is a false dilemma as there are other emotions people may feel for each other than just these two extremes.

Essentially, this fallacy can be committed in two ways: by suggesting that there are only two possible options when more exist, or by incorrectly presenting the choices as mutually exclusive (only one of the options can be true). Also, one of the given options is often clearly undesirable, while the other one — which the arguer may want us to choose — seems acceptable and rational.

Furthermore, it’s frequently characterized by “either-this-or-that” type of language, implying that if one of the choices is true, the other one must be false, or if you don’t accept one, the other must be accepted. In reality, however, both of the options may be false or could be accepted at the same time.

Why It Occurs

This fallacy is typically committed because one fails to consider other possible options that apply to the issue. That failure can stem from carelessness or, at times, from a deliberate persuasion strategy by the arguer.

As D. Q. McInerny noted in his book Being Logical: A Guide to Good Thinking:

The fallacy seeks to create a false sense of urgency in an audience, to force them to choose between the alternatives carefully selected by the perpetrator of the fallacy.

👉 Examples

  • America: Love it or leave it.
  • “You are either with us or against us.”
  • “If you are wrong, I must be right.”
  • “I didn’t see you at the charity fundraiser today. I guess you are not a good person after all.”
  • “We either keep euthanasia illegal and show that we value human life, or we legalize euthanasia and thus decide that human life is worthless.”
  • “Either the theory of evolution is correct, or the creationists are right. Those are the only options we have.”
  • “If you are not a Republican, then you must be a Democrat.”
  • “Either you are with us, or you are with the terrorists.” (George W. Bush, 2001)
  • “Would you rather be stuck in your boring job forever or pursue your passion?”
  • “You’re either part of the solution or part of the problem.”

The post False Dilemma (Logical Fallacy): Definition and Examples appeared first on FinMasters.

]]>
https://finmasters.com/false-dilemma/feed/ 0
Money Scripts: Understanding Your Relationship With Money https://finmasters.com/money-scripts/ https://finmasters.com/money-scripts/#respond Mon, 21 Jun 2021 10:00:00 +0000 https://finmasters.com/?p=7370 Money scripts are the unconscious beliefs you have about money. Learn what yours are and how they affect your financial decisions.

The post Money Scripts: Understanding Your Relationship With Money appeared first on FinMasters.

]]>

We all like to think our financial decisions are fully rational, but the truth is that our subconscious beliefs have a dramatic impact on our money decisions[1]. These beliefs are known as money scripts, and it’s important to know what yours are and understand them.

Key Takeaways

  • Money scripts start in childhood. These beliefs and attitudes often develop in childhood and are influenced by experiences or parental examples.
  • There are many types of money scripts: A study identifies four main types of money scripts: Money Avoidance, Money Worship, Money Status, and Money Vigilance.
  • Money scripts affect your financial behavior: Money scripts strongly influence financial behaviors, affecting everything from spending habits and debt management to attitudes toward wealth and self-worth.
  • It’s important to identify your money scripts. Recognizing your money script is crucial for understanding your subconscious beliefs about money and how they drive your financial decisions.
  • You can change your money scripts. If your money scripts are hurting you, consider financial therapy to identify and address deep-seated beliefs affecting financial choices.

What Are Money Scripts?

Money scripts are subconscious beliefs about money that people develop as early as childhood. They can be shaped by your experiences or even passed down to you from your parents’ own beliefs.

Even though you may not be consciously aware of them, they have a powerful influence over the financial decisions you make as an adult.

The term money script was coined in 2011 by financial psychologists Ted Klontz and Brad Klontz in their famous Journal of Financial Therapy study about money beliefs and financial behaviors. During the study, they asked participants how much they agreed with 72 money-related beliefs such as:

  • I do not deserve money.
  • Money would solve all my problems.
  • I should save money, not spend it.
  • Money is what gives life meaning.

Their study demonstrated that people’s financial behaviors and decisions are heavily influenced by their beliefs around money.

👉 For example:
Self-limiting money scripts such as “there will never be enough money” are linked to disordered financial decisions like carrying revolving debt.

The study identified four main types of money scripts, known as the Klontz-Money Script Inventory (or Klontz-MSI).

Money Scripts Types

Learning about the four types of money scripts is the first step to uncovering and understanding your own beliefs about money. Let’s take a look at each one and how it can impact your financial well-being.


1. Money Avoidance

“Money is the root of all evil,” a money avoider is likely to say. Following this belief, money avoiders associate affluence with greed and corruption. They also think they’re undeserving of money themselves, especially when others have less. This causes them to feel guilty for desiring money, as they associate it with being a bad person.

The study found that people who are money avoidant share similar characteristics, including worries and anxiety about money or extreme frugality.

Money avoidance can lead to unhealthy and self-sabotaging financial decisions like living in denial about your financial situation, having a low income, and not budgeting your money.

2. Money Worship

People who worship money believe that it will be their salvation. They associate having more money with freedom, happiness, and a better life in general.

As a result, people who subscribe to this money script often:

  • Carry revolving debt
  • Overwork
  • Seek fulfillment in buying and accumulating more
  • Overspend and make risky financial decisions

Ironically, money worshipers may also have a scarcity mentality that prevents them from attaining this coveted wealth. They will say things like “I will never be able to afford that” or “there will never be enough money to go around”. Because they worship money, they exaggerate its scarcity.

3. Money Status

Do you believe your self-worth is tied to how much money you have?

Those with a ‘money status’ script certainly do. If you identify with this belief, you’re very competitive and you often compare your wealth to those around you. You believe that success is attained by acquiring more money and material possessions.

Klontz found that many people who carry this belief grew up in households with a lower income, and inherited the belief that attaining a higher socio-economic status will increase their sense of self-worth or bring them happiness.

The consequences of this money script include overspending, unhappiness, and anxiety. Money status believers also tend to keep their finances a secret even from their spouses. This is one of the biggest financial mistakes couples make, and it can lead to a lack of trust and communication.

4. Money Vigilance

People who are vigilant about their money resonate with some of the following statements:

  • If you can’t pay cash for something, you shouldn’t buy it.
  • It is important to save your money.
  • You shouldn’t share how much you earn with other people.

If you relate to this money script, you’re on the cautious side when it comes to handling your money. You don’t like to rely on credit cards, you keep savings and an emergency fund, and you believe money should be earned through work, not handed out.

Being vigilant about money is generally good for your financial health.

However, being too vigilant can lead to anxiety and worries about money. It can prevent you from enjoying the money you have and living your life to the fullest.

Do any of these money scripts describe you?

Good. Identifying your money script is the first step toward uncovering your unconscious beliefs about money and how they’ve been driving your decisions. Whatever your financial situation is at this very moment, you got here guided by these scripts.

But just as these limiting beliefs entered your subconscious, they can also be changed to improve your financial situation.

Changing Money Beliefs

There are actionable steps you can take to unlearn your current self-limiting beliefs about money and improve your financial well-being.

Step 1: Identify What They Are

First, identify what your money beliefs are and write them down. You may find yourself identifying with a specific money script category like money vigilance, or you may fall into multiple categories.

Then, observe the financial decisions you usually make and your beliefs. You should start to notice a relationship between the two. You’ll need to decide whether your money script is preventing you from reaching your financial goals. If it is, you may wish to change it.

Step 2: Take Action

Once you’ve identified your beliefs, it’s time to replace them with better ones, and take steps to break the habits that they formed in your daily life.

It’s time to rewrite your money scripts. Here are some examples:

1. Money Avoidance Tips

👎 Don’t say:

  •  “I don’t deserve to be rich.”
  •  “Good people shouldn’t care about money.”
  •  “I don’t deserve money when others have less.”

👍 Do say:

  • “I deserve financial freedom and abundance.”
  • “Good people can be financially secure.”
  • “I can live well and help those less fortunate at the same time.”

Along with rewriting your beliefs, take actionable steps to change your habits:

Your beliefs should not associate money with negative feelings, but with positive ones.

2. Money Worship Tips

People who associate money with happiness and believe that money is the solution to their problems are sorely disappointed when they finally get it and find out it hasn’t made them any happier.

Breaking out of this mentality is not easy, but it’s possible. Here’s how you can start:

Most importantly, detach your happiness from the concept of money. Find the happiness and freedom you think money will bring, in actual meaningful activities. This can include spending time with your loved ones, pursuing a hobby you enjoy, or giving to those in need if you have the means.

3. Money Status Tips

As with money worship, you need to detach your self-worth from how much money you have or don’t have. With that in mind, consider these helpful tips:

  • Work toward financial security, not to ‘prove’ your status.
  • Spend with intention. Don’t spend to impress others, to appear wealthy, or to boost your self-worth.
  • Discuss finances with your partner and be honest.

You should also learn not to define others by how much money they have and how successful you perceive them to be.

4. Money Vigilance Tips

Being cautious with your money is generally associated with healthy money habits. However, being overly cautious takes the joy out of your life. When you become afraid to spend your money, what’s the point of having it in the first place?

Learn to have more of a balance. You can spend money on things you enjoy (like a holiday, a nice dinner, or a new outfit) and still be smart about your money habits.

How Financial Therapy Can Help

If you’re experiencing persistent financial problems, you may need more than just financial advice.

That’s where financial therapy comes in. A financial therapist can help you identify your unconscious self-limiting beliefs about money and how they’ve influenced your poor financial choices and behavior. Financial therapy can teach you the discipline to overcome your self-sabotaging financial habits and make meaningful changes in your life.

Final Thoughts

Henry Ford said something very true: “Whether you think you can, or you think you can’t – you’re right.” Your thoughts create your current reality.

When you continue to have self-limiting money scripts, your actions will always try to confirm those beliefs. It’s like a self-fulfilling prophecy.

You can identify your subconscious money beliefs and empower yourself to make better choices!

The post Money Scripts: Understanding Your Relationship With Money appeared first on FinMasters.

]]>
https://finmasters.com/money-scripts/feed/ 0
No True Scotsman Fallacy – Definition and Examples https://finmasters.com/no-true-scotsman-fallacy/ https://finmasters.com/no-true-scotsman-fallacy/#respond Tue, 31 Mar 2020 03:54:34 +0000 https://fallacyinlogic.com/?p=461 No true Scotsman occurs when someone defends a generalization by redefining the criteria and dismissing examples that are contradictory.

The post No True Scotsman Fallacy – Definition and Examples appeared first on FinMasters.

]]>
No true Scotsman is a logical fallacy, meaning an error in reasoning, in which someone defends a generalization by redefining the criteria and dismissing examples that are contradictory.

It is also known as an “appeal to purity” as it aims to refute any arguments or evidence against a certain ideal by appealing to its “purity”. As such, this argument is used in an attempt to protect various groups from criticism, such as political parties and religious groups.


Definition

👉 No true Scotsman fallacy occurs when someone attempts to defend a universal claim by excluding any counter-examples for not being “pure” enough.

In other words, they reject instances that don’t fit into the category by changing the definition to a more specific one rather than acknowledging the evidence that contradicts the generalization.

Note that in this fallacy, “Scotsmen” can be replaced with any other group.

A typical logical form of a no true Scotsman argument is:

  • All X are Y
  • (It is shown that not all X are Y)
  • All true X are Y

👉 The example this fallacy is named for goes as follows:

Angus: “No Scotsman puts sugar on his porridge.”
Scotty: “But my uncle is a Scotsman and he puts sugar on his porridge.”
Angus: “But no true Scotsman puts sugar on his porridge!”

Here, Angus changes the definition behind his generalization in an ad hoc fashion and simply dismisses Scotty’s counter-example.

Use of No True Scotsman

This type of argument is common and can be made for any group. For instance, it is often used to defend a particular religious group by excluding those who behave in unfavorable ways as not “true” members of the religion.

This can also be seen as an example of cherry-picking, although in reverse; rather than choosing only the examples that are beneficial, one denies all the disadvantageous ones.

Antony Flew, who coined the term “no true Scotsman”, gave the following explanation in his book Thinking About Thinking: Or, Do I Sincerely Want to Be Right?:

Imagine Hamish McDonald, a Scotsman, sitting down with his Glasgow Morning Herald and seeing an article about how the ‘Brighton Sex Maniac Strikes Again’.

Hamish is shocked and declares that ‘No Scotsman would do such a thing’. The next day he sits down to read his Glasgow Morning Herald again; and, this time, finds an article about an Aberdeen man whose brutal actions make the Brighton sex maniac seem almost gentlemanly.

This fact shows that Hamish was wrong in his opinion, but is he going to admit this? Not likely. This time he says: ‘No true Scotsman would do such a thing’.

Not Fallacious

This fallacy does not occur when there is a clear and accepted definition of the group and what it takes to belong to it, and the supposed counter-example violates that definition. For example:

  1. “No vegetarian eats meat.”
  2. “Well, my friend says she is a vegetarian but she still eats meat.”
  3. “But no true vegetarian eats meat.”

This is not a fallacy because being a vegetarian is, by definition, the practice of abstaining from eating meat; if she eats meat, she is not really a vegetarian. Thus, the fallacy can only occur in situations where the definition can be redefined due to a lack of a clear understanding of, or agreement on, the criteria.


Ad Hominem: When Personal Attacks Become Fallacious https://finmasters.com/ad-hominem-fallacy/ Tue, 26 Nov 2019 02:10:45 +0000 https://fallacyinlogic.com/?p=284 Ad hominem fallacy is based on personal and irrelevant attacks against the source of an argument instead of addressing the argument itself.

The post Ad Hominem: When Personal Attacks Become Fallacious appeared first on FinMasters.

Many people, if not most, have at least heard of the ad hominem fallacy. And not for nothing: it is one of the most common logical fallacies – an error in reasoning that weakens an argument, or a trick of thought used as a debate tactic.

Although the name “ad hominem” is widely recognized, the fallacy behind it is perhaps not as well understood. For instance, it is rarely mentioned that there are, in fact, several different types of ad hominem fallacies. Each works quite differently; however, all are based on attacking the person making an argument instead of criticizing the argument itself.

Ad hominem, in all its forms, is an extremely common offender almost everywhere – from disagreements among friends to debates between state leaders. Because it is so widespread, and because it often stems from a lack of argumentation skills, it is clearly worth understanding: learning about ad hominem makes you better able to identify and counter it, as well as avoid committing it yourself.

In this article, we will cover everything you need to know about this reasoning flaw.

What Is Ad Hominem?

👉 Ad hominem, short for argumentum ad hominem, is a logical fallacy that is based on personal and irrelevant attacks against the source of an argument instead of addressing the argument itself.

In other words, the attacker takes aim at their opponent’s supposed failings that are unrelated to the issue at hand rather than focusing on the validity of the argument or position they support.

The attacks can be directed toward someone’s character, background, past actions, intelligence, morals, physical appearance, or credentials. As such, this fallacy tends to appeal to people’s emotions and prejudices instead of intellect.

👩‍🎓 One ad hominem example would be:

Carly: “I think that climate change is the most important issue of our time, and everyone should acknowledge that.”
Jamie: “You didn’t even go to college, so obviously, you have no idea what you are talking about.”

Here, Jamie’s response is not only insulting but also unrelated to Carly’s claim: pointing out that she didn’t go to college proves nothing about the truthfulness of her words. In other words, rather than address Carly’s argument, he simply dismisses it with an offensive comment. This is an example of abusive ad hominem.

There are a number of different types of ad hominem, the abusive and the circumstantial being the most common. And, as mentioned earlier, although all of them are based on criticizing the individual behind a claim, each does so in a different way.

Category

Ad hominem belongs to the broad category of informal fallacies and, within it, to the subgroup of relevance fallacies. Even more precisely, it is a type of genetic fallacy.

  • Informal fallacies refer to arguments containing irrelevant or invalid evidence that renders the conclusion incorrect. They stem from an error in reasoning rather than an error in the argument’s logical structure.
  • Fallacies of relevance occur when the evidence for an argument is not relevant to the conclusion and thus doesn’t provide adequate reasons to believe that the conclusion is valid.
  • Genetic fallacy refers to attacks directed toward the source of an argument instead of addressing the argument itself.

Use of Ad Hominems

This logical fallacy is commonplace in a wide variety of discussions and situations. It is often committed out of desperation when one doesn’t have a decent counterargument or when one wants to avoid the topic at hand.

In the political arena, its use is also referred to as “mudslinging”, and it’s often the meat and potatoes of political campaigns. For instance, calling your opponent offensive nicknames, such as “Lyin’ Hillary” and “Crooked Hillary”, can be seen as fallacious ad hominem when the nicknames are used in an attempt to discredit the opponent’s arguments.

In many cases, criticizing your adversary personally is a powerful (although unethical) strategy if your goal is to pull focus off the real issue. Personal insults tend to have an emotional appeal, which can be effective in manipulating the audience’s opinion and possibly damaging the credibility of the opposing side.

Ad Hominem Fallacies

There are five main types of ad hominems: abusive, circumstantial, tu quoque, guilt by association, and poisoning the well.

1. Abusive

Ad hominem abusive is probably the most frequently occurring type. It occurs when someone attacks a person’s attributes – such as their character, background, morals, physical appearance, or hobbies – rather than their argument. In other words, it’s an attempt to discredit an argument by insulting the arguer.

It is also known as “name-calling” and “damning the source”.

👉 Its logical form goes as:

  • Person A makes argument X.
  • Person A is an idiot.
  • Therefore, argument X is false.

🌎 For Example

Mike: “There are so many Earth-like planets out there that I think there must be life on some of them.”
Jenny: “What could you possibly know about this, you are a moron who spends his nights watching Netflix.”


2. Circumstantial

Circumstantial ad hominem, also known as “appeal to motive”, arises when someone claims that an argument must be invalid because the arguer’s personal circumstances predispose them to make it.

This is logically fallacious because a connection between a person’s circumstances and their claim – even one that could plausibly sway their judgment – doesn’t disprove the logic or validity of the claim; a car salesman may really believe that the car he is selling is an excellent vehicle.

👉 As such, its logical form is:

  • Person 1 makes an argument X.
  • Person 1 has a personal interest in X to be true.
  • Therefore, X is false.

However, note that if there is strong evidence for a conflict of interest and enough reasons to believe that the individual’s position is indeed biased, it is reasonable to call them out on it.

👉 For Example

  • Kate: “Since our student council currently consists mostly of boys, I think it would be good if we get more girls in it and make it more balanced.”
    Jim: “You only say that because you are a girl yourself, so your opinion doesn’t matter.”
  • A politician argues that the country would be better off if it were to increase spending on education. His opponent, however, points out that these words should be dismissed entirely since the politician would benefit personally from such an increase.

3. Tu Quoque

Also called the appeal to hypocrisy, tu quoque (Latin for “you too”) is based on the claim that a person’s argument must be invalid because their past actions or words are not consistent with it.

In essence, rather than trying to refute the logic or evidence the person is using, one responds by pointing out that he or she has acted in the same manner themselves.

It’s considered a flawed line of reasoning because, even though it may expose the opponent’s hypocrisy, it doesn’t address the actual substance of the argument.

The logical form of a tu quoque is:

  • Person 1 makes an argument X.
  • Person 2 points out that X is also true about Person 1.
  • Therefore, X is false.

🚙 For Example

  • Mary: “You should quit smoking, it has been proved many times how dangerous it is.”
    Elise: “Well, you smoke yourself, so you can’t actually believe that.”
  • Jim: “I believe that striving to reduce our carbon footprints on an individual level would have a positive effect on the climate.”
    Ken: “You shouldn’t be preaching about the climate and carbon footprints; you drive an SUV!”

4. Guilt by Association


Guilt by association is a type of ad hominem fallacy in which someone is discredited because of their supposed association with something negative. The characteristics of that negative thing – a bad person or an evil idea – are assumed to carry over to the person associated with it, so the person is viewed as “guilty” too.

👉 The typical form for this argument is:

  • Person 1 supports position X.
  • Person 2, who is evil, also supports X.
  • Therefore, Person 1 is evil too.

When this type of fallacious connection is made in a positive context, it’s called honor by association. The reasoning behind it is the same, only the person or a group is associated with something that is seen as positive.

👉 For Example

  • Jonah: “I’m a vegetarian because vegetarianism has been proven to have many health benefits over diets containing meat.”
    Anna: “Didn’t you know that Hitler was a vegetarian too? You must be like him.”
  • “Stalin was an atheist and an evil man. Therefore, all atheists must be evil.”

5. Poisoning the Well

Poisoning the well is a fallacy that arises when negative information about someone is presented preemptively in order to discredit or ridicule following claims made by that person.

It is also known as a smear tactic; rather than countering a claim in legitimate ways, one resorts to smearing the opponent’s reputation, making their words less credible.

🐍 For Example

  • Carol: “I’m going on a date tonight with Jack.”
    Katherine: “Really? I heard a rumor that he might be a pathological liar; you shouldn’t believe anything he says.”
  • “My opponent is incompetent as a politician and, quite frankly, as a man. Therefore, we have all the reasons to simply dismiss the arguments he will make today.”

6. Ad Feminam

Ad feminam (Latin for “to the woman”) is also a specific form of ad hominem argument, albeit a lesser-known one.

It uses female stereotypes to attack a woman’s position – for example, suggesting that a woman’s claim must be false or irrational because of pregnancy or menstrual hormones.


Not an Ad Hominem

Not every insult or criticism of a person is an ad hominem, or fallacious for that matter. Essentially, the distinctive factor is that, in every fallacious personal attack, the criticism is irrelevant to the actual issue under discussion.

  • Relevant Criticism – An argument against a person is not fallacious when it’s clearly relevant to the discussion, i.e. when a person’s characteristics, credentials, skills, or such are directly related to the topic.
    For example, if someone who is in a position to enforce the law has acted against the law, then pointing it out would be relevant. This, of course, also applies in a case where the actual topic is about someone’s personal attributes.
  • Conflict of Interest – As noted earlier, it is also relevant to point out a clear conflict of interest; if there is reasonable evidence to believe that the arguer is predisposed to take a certain position, calling them out on it may not be fallacious.
  • A Simple Insult – If the attack is not being used as evidence to support the counter-argument, then it’s simply an insult, not a fallacy.
    For example, if someone makes a sound counter-argument and simultaneously throws an insult at the other person, it wouldn’t be seen as fallacious (even though it would be rude and unproductive).

How to Counter

Ad hominem arguments are often committed in the heat of emotion, because one lacks the skills and knowledge to legitimately refute opposing claims. As such, it can be difficult to prevent people from using them; we have little control over other people and how they behave. Still, you may be able to lower the risk to some degree – and, as the saying goes, prevention is better than cure.

In order to achieve this, try to make your point politely and with consideration to your opponent’s point of view. When you come out as respectful and non-judgmental, the odds of your opponent wanting to offend you are lower, even if they disagree with your viewpoints. This applies especially when you are about to criticize something that the other side has a strong interest in.

Now, in a case where your opponent has already launched an ad hominem at you, you have a few different ways to approach the situation:

  • 😢 Don’t Get Emotionally Involved – Avoid getting emotionally involved yourself; keep the conversation polite and constructive (at least on your part), and never respond to an insult with an insult. Also, keep in mind that when someone resorts to personal attacks, they often do it out of desperation – which may be a sign of the strength of your argument.
  • 🎯 Point Out the Fallacy – A good approach in most situations is to point out the use of ad hominem, highlight its irrelevance to your claim, and then steer the attention back to the original issue.
    In particular, when the fallacy touches on your intentions to hold a certain position or is meant to hurt your credibility otherwise, it’s important to call attention to it. If you decide not to acknowledge it, it may inevitably seem as if you agree with it.
    However, if it’s purely an irrelevant insult (“You are a jerk!”), you may choose to ignore it and move on.
  • 🔊 Call Them Out – It may also be effective to make your opponent accountable for their use of the fallacy, challenge them to justify the personal attack, and explain how and why they think it’s relevant to the conversation.
  • 🚶 Leave the Discussion – If continuing with constructive conversation seems to be out of the question, the best option may be to leave the discussion – or at least try to change the topic to a more suitable one.
    The conversation is likely not helpful to anyone if one side chooses to argue irrationally, even after you have pointed out their erroneous reasoning.

Tu Quoque Fallacy – Definition and Examples https://finmasters.com/tu-quoque-fallacy/ https://finmasters.com/tu-quoque-fallacy/#respond Tue, 03 Dec 2019 00:50:58 +0000 https://fallacyinlogic.com/?p=328 Tu quoque fallacy occurs when someone's argument is discredited solely based on the allegation that their past actions or words are not consistent with their views.

The post Tu Quoque Fallacy – Definition and Examples appeared first on FinMasters.

Tu quoque (Latin for “you too”) is a common type of logical fallacy, meaning a flaw in reasoning that weakens an argument or a trick of thought used as a debate tactic. It occurs when someone’s argument is discredited solely based on the allegation that their past actions or words are not consistent with their views.

It is also known as “ad hominem tu quoque” since it’s considered to be one of the different types of ad hominem arguments.

In this article, we’ll explain in detail how this erroneous line of reasoning works, as well as examine a variety of examples.

Tu quoque is often used to shift the focus to the opponent’s weaknesses in debates.

Overview

👉 Tu quoque is a fallacy in which someone asserts that their opponent’s argument must be invalid because it is inconsistent with their past words and actions.

In other words, one points out that the opponent has acted in the same manner themselves and fallaciously uses the (alleged) hypocrisy as evidence to refute their argument.

This reasoning is fallacious because it dismisses the argument solely on grounds of personal shortcomings; it doesn’t disprove the logic of an argument, even though it may show the arguer’s hypocrisy. In fact, such arguments often don’t address the substance of the opposing claim at all, even though they appear as relevant counterarguments.

As such, its logical form is as follows:

  • Person 1 makes argument X.
  • Person 2 points out that X is also true about Person 1.
  • Therefore, X is false.

🚭 For Example

Kate: “Smoking is unhealthy for you; you really should quit.”
Maria: “You have been smoking for 10 years yourself, so there goes your argument.”

Here, Maria commits the fallacy: she uses Kate’s hypocrisy to refute her claim, but hypocrisy doesn’t disprove or even address the claim itself. Whether or not Kate has smoked is irrelevant to the truth value of her point.

As Scott F. Aikin explained in his paper Tu Quoque Arguments and the Significance of Hypocrisy:

The hypocrisy of the arguer is not necessarily evidence of the falsity of what she argues. However, one may feel a gut feeling there is something right about tu quoque arguments in that the acceptability of the view proposed is challenged.

This fallacy is also known as the “appeal to hypocrisy”, the “you too” fallacy, and the “pot calling the kettle black” fallacy. It is an informal fallacy and, more specifically, falls into the subcategory of relevance fallacies.

Pronunciation

Tu quoque is pronounced as “tyoo-kwoh-kwee”.

It typically functions as a noun in the English language, although it may also be used to modify other nouns (for example, “tu quoque argument”).

Use of Tu Quoque Fallacy

Similarly to red herring arguments, appeals to hypocrisy are used as a distraction so that one can avoid dealing with a certain issue or question. It’s quite common to hear “but what about X, look at what they did”-type allegations in all sorts of discussions, among both adults and children.

Furthermore, it tends to carry a strong emotional appeal and thus can be effective in influencing people’s opinions and judgments. This strategy is often employed in the political arena: during a debate, a candidate shifts the focus to their opponent’s “poor” character while seemingly refuting their argument by pointing out that they are being a hypocrite.


Examples

To help you better understand this fallacy, here are a few examples from various situations.

🏛 Example in Politics

Politician 1: “My opponent has almost always failed to deliver his election promises, and everyone should remember that.”
Politician 2: “You didn’t deliver your promise to increase the tax rate for rich people, which was at the center of your election campaign.”

Answering criticism with criticism, like in this example, doesn’t directly address the issue at hand, even though it may seem to do so. It simply shifts the focus to the opponent’s character or actions, which are generally irrelevant to the logic of their argument.

🏡 Example at Home

Parent: “You have to clean your room, it’s too messy.”
Child: “But your room is messy too, so why should I listen to you?”

This is a textbook example. Note, however, that in discussions between a parent and a child, other factors affect the relevance of the claim, such as the parent’s authority and the differing needs that come with the age difference.

🏫 Example in School

Hannah: “I think that global warming is the most important issue of our time and everyone should acknowledge that.”
Mark: “But you drive an SUV, therefore you can’t actually believe that.”

The fact that Hannah drives an SUV doesn’t invalidate her argument or necessarily mean that she doesn’t believe in what she says.

However, note that if Hannah’s claim was that driving an SUV is harmful to the climate and therefore unethical, it would be a very unthoughtful argument from her – even if a tu quoque wouldn’t disprove it.


9 Cognitive Biases You Need to Understand to Master Your Money https://finmasters.com/cognitive-biases-master-money/ https://finmasters.com/cognitive-biases-master-money/#comments Wed, 13 May 2015 11:26:20 +0000 https://www.vintagevalueinvesting.com/?p=1005 The world is very complicated. We are constantly presented with new information - every minute of every day - and we have to subconsciously

The post 9 Cognitive Biases You Need to Understand to Master Your Money appeared first on FinMasters.

The world is incredibly complicated.

We are constantly being presented with new information – every minute of every day – and we have to subconsciously process it all as quickly as possible.

In order to do so, our brain uses shortcuts – which we call heuristics and biases.

While these mental tricks are incredibly useful and accurate 95% of the time, the other 5% of the time (when the situation is slightly changed or unique) the heuristics and biases we use can greatly hurt us.

Guarding yourself against yourself is probably one of the most important things you can do in the world of investing.

If you’ve read Thinking, Fast and Slow by Daniel Kahneman (a book that I highly recommend), then you probably already know about most of the cognitive biases that affect us nearly every day.

In our daily lives, we constantly rely on our instincts to navigate through all kinds of situations, from how to relate to other people to what kind of shampoo we buy. Without those instincts, daily life would be a complete decision overload, making it impossible to even get outside your front door. We have to rely on instinct to make it.

The problem with that is that our instincts aren’t always perfect. For starters, our natural human instincts were developed in our far ancestral past, when we were worried about things like scavenging and hunting for food and about whether another tribe or a group of predators would strike.

Not only that, we have instincts that work really well in some situations and then fall completely flat in other situations.

It’s due to these faulty instincts that we make a lot of common financial mistakes. We overestimate some types of risks and underestimate others. We notice some product features and don’t notice others that are at least as important.

In psychology and behavioral economics, many of these misplaced instincts are referred to collectively as cognitive biases. Cognitive biases are “tendencies to think in certain ways that can lead to systematic deviations from a standard of rationality or good judgment.” In other words, they’re examples of how we use a tool for thinking that works well in some situations and then apply it to other situations where it doesn’t work well and often has a very bad result.

The more you learn about cognitive biases, the more you see them show up in your own life, and when you start to be aware of your own cognitive biases, it becomes easier to correct them and make better decisions for yourself.

Here are nine cognitive biases that have a particularly strong impact on personal finance choices. As you read through these, ask yourself if you’ve ever seen them pop up in your own life.

I’m using Wikipedia as a source below because, on most articles, it does a good job of synthesizing source documents into layman’s terms. It’s not a perfect tool and shouldn’t be relied on as an academic source, but it’s a good place to start and offers lots of links to more reliable sources.


1. Dunning-Kruger Effect

From Wikipedia:

The Dunning–Kruger effect is a cognitive bias wherein unskilled individuals suffer from illusory superiority, mistakenly assessing their ability to be much higher than is accurate. This bias is attributed to a metacognitive inability of the unskilled to recognize their ineptitude. Conversely, highly skilled individuals tend to underestimate their relative competence, erroneously assuming that tasks which are easy for them are also easy for others.

In other words, people who have just a little skill often overinflate that skill and believe that they have a lot of skill. Think of your typical overblown jerk who is moderately skilled but thinks they’re incredibly competent.

On the other hand, those that have a lot of skill assume that many others have the same level of skill. Think of the person at work who’s really good at their job but seems really frustrated with others around him or her.

Two Examples of the Dunning-Kruger Effect

Let’s say you’re a newly hired investment banker at a big investment bank on Wall Street. You just got hired by one of the most prestigious investment firms in the world, so you must have some incredible skills. The thing is, you’re still a new player in the field, but you’ve bought into an illusion that you’re a top dog. Since you believe you have the golden touch, you’re much more likely to invest with overconfidence, which can really cost others.

On the other hand, let’s say you’re a blogger who is writing about frugality and recovery from debt. Your blog suddenly gets popular and you find that a lot of people have had success following the principles that you’ve been sharing on your blog. Because you’ve escaped from debt and you’ve played a small role in helping others escape from debt, you begin to feel confident in your ability to share information on all sorts of financial topics – even ones well outside your actual experience.

How to Beat It

So, how do you avoid the Dunning-Kruger effect in your daily life?

First, take care to avoid being on the receiving end of such an effect. Get a second opinion (or a third or a fourth) before making any major decision in your life. Don’t make a big financial choice or a big career choice or a big health choice based solely on the advice of just one person. Ask around.

Also, make sure you’re not falling prey to the effect in your own life. Be humble in every aspect of your life. Ask for help sometimes, and get second opinions on major decisions. Don’t be afraid to call in extra help just to make sure you get it right. Accept criticism from others because they’re probably seeing things that your cognitive biases aren’t seeing.

Also, when giving tasks to others, don’t fall prey to your own overinflated expectations. The other person might be doing a good job that just happens to not match what you expected.


2. Anchoring

From Wikipedia:

Anchoring or focalism is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the “anchor”) when making decisions. During decision making, anchoring occurs when individuals use an initial piece of information to make subsequent judgments. Once an anchor is set, other judgments are made by adjusting away from that anchor, and there is a bias toward interpreting other information around the anchor. For example, the initial price offered for a used car sets the standard for the rest of the negotiations, so that prices lower than the initial price seem more reasonable even if they are still higher than what the car is really worth.

An “anchor price” is the first price you see in relation to a specific item. That “anchor price” tends to stick in your head and alters how you interpret later prices for the same item (or very similar items) that you might see.

Two Examples of Anchoring

To clarify the example given above, a car dealer might put a $10,000 sticker on an $8,000 car so that you anchor on $10,000 as the car’s value; when $8,000 is then offered, it seems like a deal, even though it really isn’t.

I actually see a lot of anchoring going on at the grocery store. Often, we have a pre-established idea of what each type of item should cost, which gives us a sense of whether or not something is a good deal. This is also why older shoppers sometimes get serious sticker shock at the grocery store: their “anchor price” for goods like a gallon of milk is quite low.

On the other hand, when a new product comes on the market, the initial price for it is often quite high (think Apple products) so that when you see a discount on it, it seems like a good deal even though it’s still somewhat overpriced. (Don’t get me wrong, Apple makes good products, but they’re still overpriced.)

How to Beat It

The most effective way of beating the anchor effect is to research what you buy. If you’re going to buy a used Toyota, do some Kelley Blue Book research before you ever step on the lot so that you have a sense of what kinds of prices you should really expect for a used Toyota. This is a good strategy for any large purchase, so that you know what you should pay before you ever consider opening up the checkbook.

Another good strategy when buying normal grocery items is to examine the prices of the generic item first every single time. I find that I’m more bothered by a higher price than I am tempted by a lower price. If I see a $2 item first and that becomes my “anchor,” a $3 version seems unappealing. On the other hand, if I “anchor” on a $3 name brand item, the $2 generic doesn’t have the same gut impact.


3. Choice-Supportive Bias

From Wikipedia:

In cognitive science, choice-supportive bias is the tendency to retroactively ascribe positive attributes to an option one has selected. It is a cognitive bias. For example, if a person buys a computer from Apple instead of a computer (PC) running Windows, they are likely to ignore or downplay the faults of Apple computers while amplifying those of Windows computers. Conversely, they are also likely to notice and amplify advantages of Apple computers and not notice or de-emphasize those of Windows computers.

This is one that should seem intimately familiar. Think about how people root for their favorite sports team or defend their political stances. They drastically overweigh the positives and underweigh the negatives.

An Example of Choice-Supportive Bias

A good example of this phenomenon when it comes to spending money comes from people who are hung up on their favorite brand of a product. Once a person becomes a regular customer for a particular product, as long as nothing drastically changes about that product, they’ll keep using it and recommending it to others even if better or more cost-efficient competitors come on the market. This is often voiced with the explanation of “it’s always worked for me.”

I do this myself sometimes; for example, I stick by Glad ForceFlex garbage bags because they’ve always met my needs. I can extol the virtues of those bags over and over again, even though there may be more cost-effective bags out there.

How to Beat It

The best strategy is to trust independent reviews. Rather than being loyal to a particular brand and focusing on the positive attributes of that brand, focus instead on trusted independent reviews and use whatever is the best value as identified by those reviews.

This doesn’t just mean looking at such reviews once and always sticking with the best. You should regularly review your buying habits and make sure that the brands you regularly buy are still on top of the pile in terms of bang for the buck. Some products aren’t made as well as they once were and new products come on the market that are simply better bargains, and the only way to be aware of this is to regularly read independent reviews.


4. Confirmation Bias

From Wikipedia:

Confirmation bias, also called myside bias, is the tendency to search for, interpret, or recall information in a way that confirms one’s beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs. People also tend to interpret ambiguous evidence as supporting their existing position.

A great example of this is the tendency of conservative-minded people to obtain their news and opinions from sources that reflect those views, such as Fox News and World Net Daily. If you obtain information and opinion pieces primarily from sources that already mirror your point of view, you’re only going to reinforce your point of view, regardless of whether that happens to be the best solution to the day’s problem.

An Example of Confirmation Bias

How does this kind of thing pop up in personal finance? Well, there are lots of different sources of personal finance information. Some sources heavily promote and focus on entrepreneurship. Others heavily promote and focus on frugality. Still others heavily promote and focus on investing. Generally, the people who focus on just one viewpoint tend to think that their angle is the best angle on surviving and getting ahead financially and will react to other information with disdain.

I often see this in comments, where readers who come for frugality information say that investing isn’t realistic for their financial situation, while investing-oriented readers comment on frugality posts to say that this stuff won’t really help them earn big returns on their money.

It turns out that all of these tools – and more – are useful in getting ahead financially, but when you choose to subscribe to one main perspective, you tend to discard other ones.

How to Beat It

The best approach is always to learn more, especially from different perspectives. If you’re most comfortable and confident with frugality, take the time to learn about investing and entrepreneurship. If you’re an entrepreneur at heart, learn what benefits frugality and investing can bring to your situation. If you’re an investor, learn how frugality can bring you more money with which to invest.

The same is true for other things like political opinions – look for well-reasoned information and arguments from perspectives different from your own and try to learn from them. Why do they see things that way?


5. Bayesian Conservatism

From Wikipedia:

In cognitive psychology and decision science, conservatism or conservatism bias is a bias in human information processing. This bias describes human belief revision in which persons over-weigh the prior distribution (base rate) and under-weigh new sample evidence when compared to Bayesian belief revision.

This is why people tend not to react well to change. If they’re used to a previous standard price on something and it changes, they’re often slow to change their idea of what a standard price is.

An Example of Bayesian Conservatism

My favorite example of how Bayesian conservatism is used against people is when products change size without changing their price. For example, a beverage company might charge $2 for a 20-ounce product, then shrink it to 16 ounces.

Because the sticker price doesn’t change, people aren’t as upset as they should be, even though the price per ounce has gone from $0.10 to $0.125. Customers are much more likely to stay happy with this change because the seemingly identical product (the now-16-ounce bottle) carries the same price it has had for a long time.

The manufacturer doesn’t have to deal with Bayesian conservatism when it comes to price. Most of their customers will still see a bottle of their favorite beverage for the same price on the shelf.

How to Beat It

The best strategy for beating Bayesian conservatism when it comes to products on the shelves is to trust the price per unit. Price per unit means focusing on what an item costs per ounce, liter, or other unit, regardless of the packaging.

For example, if you’re buying a beverage, you care only about the price per ounce (or liter) of that beverage. You can figure that out by looking at the number of ounces (or liters) in the bottle, then dividing the price by that number.
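The calculation described above is a one-liner. Here’s a small sketch using the hypothetical shrinkflation figures from the beverage example (the $2 bottle shrinking from 20 to 16 ounces):

```python
def price_per_unit(price, quantity):
    """Return the cost per single unit (e.g., per ounce) of a package."""
    return price / quantity

# Hypothetical figures from the example above: same $2.00 sticker price,
# but the bottle shrinks from 20 oz to 16 oz.
old = price_per_unit(2.00, 20)  # 0.10 per ounce
new = price_per_unit(2.00, 16)  # 0.125 per ounce
print(f"Old: ${old:.3f}/oz, New: ${new:.3f}/oz")
```

Comparing the two per-unit numbers, rather than the two identical sticker prices, is what exposes the hidden 25% price increase.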


6. Endowment Effect

From Wikipedia:

In behavioral economics, the endowment effect (also known as divestiture aversion) is the hypothesis that people ascribe more value to things merely because they own them. This is illustrated by the observation that people will tend to pay more to retain something they own than to obtain something owned by someone else — even when there is no cause for attachment, or even if the item was only obtained minutes ago.

You love your stuff, right? Since it’s yours, you probably think it’s valuable. You’d rather hold onto your own favorite sweater, for example, than an identical sweater owned by someone else. That’s the endowment effect at work.

An Example of the Endowment Effect

The best example I’ve seen of the endowment effect comes from people who expect to get the “new” value out of an item they own that clearly falls into the “used” category. You’ll see this pop up in yard sale pricing, where people often put outrageous prices on items, or in some of the prices people post on Craigslist. I’ve witnessed people ask more for a used item than the same item costs new.

How to Beat It

If you’re selling something, your best approach is to have others give you an honest appraisal of the item. Find an appraiser you trust and believe their word. If they give you a number lower than you expect, it’s not because they’re untrustworthy; it’s because you’re falling prey to the endowment effect.

On the other hand, if you’re buying something, don’t assume it’s a bargain simply because it’s used. The seller may be overvaluing that item due to the endowment effect. Do some research and make sure that the price you’re going to pay is actually reasonable. This is especially true on larger items.


7. IKEA Effect

From Wikipedia:

The IKEA effect is a cognitive bias that occurs when consumers place a disproportionately high value on products they partially created. The name derives from the Swedish manufacturer and furniture retailer IKEA, which sells many furniture products that require assembly.

If you made it or were at least partially responsible for making it, it’s probably worth more to you than it is to other people.

An Example of the IKEA Effect

My favorite example of the IKEA effect comes from people who stick with a bad business idea far, far past when they should have given up on it. I’ve seen people persist for years in various multi-level marketing businesses because it’s their business and they don’t want to give up on it. One only has to watch an episode or two of shows like Shark Tank to see many examples of the IKEA effect.

How to Beat It

The best strategy for beating the IKEA effect is to trust the numbers, not your gut. Keep track of the time you’re investing and the return it’s getting you, and see whether that’s worthwhile. Returns aren’t always in the form of money, of course, but if you’re barely earning a trickle, even from something you love, you should consider another avenue.

Of course, some endeavors do require a lot of time investment up front. In those situations, trust an outside observer. Sometimes, it’s impossible to overcome cognitive biases on your own, so let someone you trust evaluate it and follow their judgment.
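The “trust the numbers” check above can be made concrete. A minimal sketch (the business figures here are hypothetical, purely for illustration):

```python
def effective_hourly_rate(total_earnings, hours_invested):
    """Return earnings per hour invested -- the number to compare
    against what your time is worth elsewhere."""
    if hours_invested <= 0:
        raise ValueError("hours_invested must be positive")
    return total_earnings / hours_invested

# Hypothetical side business: $600 earned over 400 hours of work.
rate = effective_hourly_rate(600, 400)
print(f"Effective rate: ${rate:.2f}/hour")
```

If the resulting rate is far below what the same hours would earn elsewhere, that’s the signal the IKEA effect may be clouding your judgment.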


8. Social Desirability Bias

From Wikipedia:

Social desirability bias is a social science research term that describes the tendency of survey respondents to answer questions in a manner that will be viewed favorably by others. It can take the form of over-reporting “good behavior” or under-reporting “bad”, or undesirable behavior. The tendency poses a serious problem with conducting research with self-reports, especially questionnaires. This bias interferes with the interpretation of average tendencies as well as individual differences.

We all have a natural tendency to report things in a way that will be received favorably by whoever we’re reporting to. It’s worth noting that this doesn’t always mean a positive report, just one that you feel will be seen favorably by the person you’re talking to.

An Example of Social Desirability Bias

When someone asks a question on Facebook or in a group where they’re seeking recommendations on a particular product type or thoughts on a specific business, people generally tend to respond in a way that matches the perceived mood of the person asking the question. If the questioner seems negative, then they’ll get mostly negative reviews. If the questioner seems positive, then they’ll get mostly positive reviews. This is amplified if the already-existing reviews are all slanting positive or slanting negative.

How to Beat It

If you’re asking for recommendations, ask in a one-on-one environment so that the response isn’t altered by the thoughts of others. I generally ask questions specifically of the people I trust the most on a particular topic, and I do it privately via a call or a visit or a text message.

Furthermore, I try to ask the question in the most neutral way possible so that I don’t give any indication of any ideas I may already have. I usually ask in as few words as possible because the more I write, the more I tend to reveal already existing positive or negative viewpoints, which will sway the person I’m asking.


9. Spotlight Effect

From Wikipedia:

The spotlight effect is the phenomenon in which people tend to believe they are noticed more than they really are. Being that one is constantly in the center of one’s own world, an accurate evaluation of how much one is noticed by others has shown to be uncommon. The spotlight effect was first reported in 1999, when Thomas Gilovich and Kenneth Savitsky coined the term. The reasoning behind the spotlight effect comes from our human tendency to forget that although one is the center of one’s own world, one is not the center of everyone else’s. This tendency is especially prominent when one does something atypical. Research has empirically shown that such drastic over-estimation of one’s effect on others is widely common.

People simply don’t pay as much attention to you as you think they do.

An Example of the Spotlight Effect

Whenever you buy a new car or a new gadget or new clothes because you imagine that item impressing others, you’re falling prey to the spotlight effect.

Whenever you worry about what the neighbors will think or what a random person on the street will think, you’re falling prey to the spotlight effect.

Whenever you take special preparations in order to present an artificially inflated image of yourself to someone you don’t even know, you’re falling prey to the spotlight effect.

How to Beat It

My usual tool for beating the spotlight effect is to ask myself whether I would care at all if someone else had this item. If I ever detect myself considering what other people would think about this purchase, I try to put myself in their shoes. Would I care? At all? Would I honestly notice other people on the street owning or wearing or using this item? The answer is usually no, and thus the spotlight effect is defused.


Final Thoughts

These cognitive biases – and many others – pop up again and again in daily life. They convince us to spend our money and our time in ways that actually go against our larger goals.

Being aware of them is the first step. Taking simple steps to avoid them is the next step. Practicing those steps until they become unconscious and natural is the final step toward putting our worst cognitive biases to rest.

Good luck!

The post 9 Cognitive Biases You Need to Understand to Master Your Money appeared first on FinMasters.

What Is The Loaded Question Fallacy? Definition and Examples
Fri, 03 Apr 2020 – https://finmasters.com/loaded-question-fallacy/

The loaded question fallacy is a question containing an implicit assumption – one that is unverified or controversial.

A loaded question, sometimes called a “complex question”, is a type of logical fallacy – an error in reasoning or a trick of thought used as a debate tactic.

This type of question is an attempt to limit the possible answers to only “yes” or “no”, and choosing either response would end up hurting the respondent’s credibility or reputation. As such, loaded questions are frequently used as a rhetorical tool in various contexts, such as journalism and politics.

In this article, we’ll explain how this fallacy works and examine a variety of examples. But first, here are a few quick facts:

What is a loaded question?
It occurs when someone asks a question containing an unjustified (and often offensive) presupposition.

What is an example of a loaded question?
An example would be: “So, have you always had a gambling problem?”

What is the difference between a leading question and a loaded question?
A leading question is one that suggests the answer desired by the speaker, while a loaded question includes an implicit assumption about the respondent.

Overview

The loaded question fallacy is a question containing an implicit assumption – one that is unverified or controversial – which puts the person being questioned in a defensive and unfavorable position.

It’s a type of trick question: it is designed to imply something that the respondent probably disagrees with and to lead listeners into believing that the implication is true. Moreover, it is typically phrased in a way that protects the person asking it. As Bo Bennett explained in his Logically Fallacious: The Ultimate Collection of Over 300 Logical Fallacies:

A question that has a presupposition built in, which implies something but protects the one asking the question from accusations of false claims.

It is also known as a “complex question” (a closely related form), a “false question”, and the “fallacy of presupposition”.

Not fallacious

It is important to keep in mind that not every assuming question is loaded. This logical fallacy occurs only if the implication being made is not a verified and accepted fact.

For instance, if the respondent in the first example below is known to be an abuser, then the question wouldn’t be fallacious.

Examples

👉 Example 1

A classic example of a loaded question is:

  • “Have you stopped beating your wife?”

This question implicitly assumes that the respondent has been abusing his wife, and whether he answers “yes” or “no”, he appears to admit that the implication is true: “yes” would mean that he beat his wife in the past but has stopped, while “no” would mean that he did and still does.

Note that a good response and a way out of such a question would be to directly address the implication and refute it: “I have never beaten my wife”.

👩‍🔬 Example 2

Another example would be:

  • “So you are one of those science-hating creationists?”

Here, the question assumes that the respondent must hate science if they believe in creationism.

This can also be seen as a leading question because it attempts to force the respondent to agree with the questioner’s views; it not-so-subtly suggests that denying the question would be the “correct” answer, since replying “yes” would mean admitting to hating science.

As such, it’s a manipulative attempt by the questioner to limit the possible replies to only those that would serve their agenda.

🧒 Example in Real-Life

Madeleine Albright, then the U.S. Ambassador to the U.N., was asked a loaded question on 60 Minutes in 1996 regarding the effects of UN sanctions against Iraq, and fell into the trap:

Lesley Stahl: “We have heard that a half million children have died. I mean, that is more children than died in Hiroshima. And, you know, is the price worth it?”
Madeleine Albright: “I think that is a very hard choice, but the price, we think, the price is worth it.”

👉 Other Examples

  • “Why do you hate religious people?”
  • “Where did you hide the gun?”
  • “So, have you always had a gambling problem?”
  • “Why are you so lazy?”
  • “Have you always been an alcoholic?”
