You can protect yourself from misinformation by considering alternative interpretations, being suspicious of repetition, and taking the Pro-Truth Pledge. That's the key takeaway of this episode of the Wise Decision Maker Show, which describes how to defend yourself from misinformation via neuroscience.
Video: “Defending Yourself From Misinformation via Neuroscience”
Podcast: “Defending Yourself From Misinformation via Neuroscience”
Links Mentioned in Videocast and Podcast
- Here’s the article on Defending Yourself From Misinformation via Neuroscience
- The book Pro Truth: A Practical Plan for Putting Truth Back Into Politics is available here
- You are welcome to register for the free Wise Decision Maker Course
Hello, everyone, and welcome to another episode of the Wise Decision Maker Show, where we help you make the wisest and most profitable decisions. Today I'd like to talk about how you defend yourself from misinformation. We usually talk about misinformation in the political sphere, but it applies just as much to the business world, where we get false information about stocks or professional activities, and to health and personal life; there's a lot of health misinformation floating around out there, and all sorts of other misinformation too. So the techniques I'll describe are applicable to misinformation in all arenas. The first step in defending yourself from misinformation using neuroscience-based techniques is to understand how we deal with information in the first place. Our brain is, unfortunately, pretty lazy. That's just the reality, and it's what cognitive neuroscience research shows: we don't like to spend energy processing things. Our brain doesn't feel good about spending that energy, so it prefers the route of easiest processing. Lazy and low-energy feels good to it. That's how you should think about your brain; mine is just as lazy as everybody else's, and the reality is, yours is too. As a result, we intuitively, with our emotions, with our gut, dislike and distrust information that feels uncomfortable. When we get information that feels uncomfortable, we inherently dislike and distrust it. That's just how our brain works. And by contrast, we like and trust information that feels comfortable.
So what does it mean for information to be comfortable? It means it feels good in your gut, your heart, wherever you feel it; the information makes you go, "Yes, this is right, this is good, this is true, this feels right." That's the feeling of information you inherently like and trust. Uncomfortable information, of course, triggers the opposite: "No, this is wrong, that can't be true," your gut tells you. Unfortunately, these feelings and perceptions make us vulnerable: we perceive information we like, information that's comfortable to us, as trustworthy and true, and we perceive uncomfortable information as something we dislike and distrust. That leaves us open to cognitive biases, the dangerous judgment errors that result in us buying into misinformation, because cognitive biases steer us toward information we like and trust that is unfortunately false. This is critically important to realize: there's lots of information out there that we like and trust that's false, and lots of information that we dislike and distrust that's true. That's just the way our brain works, and the way reality works. Cognitive biases are the specific patterns you can use to recognize when you're likely to fall for misinformation. There are over 100 cognitive biases out there; you can learn more about them in my book Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. But there are three cognitive biases I want to talk about that are especially important for addressing misinformation.
You can also just look up the list on Wikipedia if you want a quick reference guide. The three that are most important for addressing misinformation are the illusory truth effect, the confirmation bias, and the narrative fallacy. Those are the three you want to really focus on in order to protect yourself from misinformation. First, the illusory truth effect. What does that refer to? It refers to the fact that when we hear something repeated again and again, it feels more true each time we hear it. When we encounter a piece of information more than once, we trust it more than we did before. That's just the way our brain works: when we hear something again and again, we feel more positive toward it. Our lazy brain registers that we've heard this information before, so it's no longer novel. Lazy brains react strongly to novel information: we're inherently more suspicious of it, we have more of an arousal response in our brains toward it, and that arousal generally brings discomfort with the information. But when we hear the information again and again, the arousal response fades, we're more comfortable with it, and as a result we trust it more. So that's something you really have to watch out for: information that's repeated over and over. The first time it feels new; the second time you have less of a response; by the third time it feels like old news, and therefore old, true news, even though it might be complete misinformation. That's the first thing to watch out for. The second is the confirmation bias. If you've heard of any cognitive bias, you've probably heard of this one; it's very well known. We inherently look for information that confirms our beliefs.
And that's, of course, natural for the lazy brain. The brain doesn't like information that goes against its beliefs, so we inherently look for information that confirms what we already believe, because that feels good, regardless of whether it's true or not. We don't inherently want things to be true; that's not a gut desire. Our gut desire is to look for information that is comfortable to us, that confirms what we already believe: our beliefs, our feelings, our intuitions, our desires. The inverse of the confirmation bias, of course, is that we reject information that goes against those beliefs, feelings, intuitions, and desires. If we don't like something, if we don't want it to be true, we will inherently reject evidence that it is true, even when that evidence is very accurate. So you have to watch out for the confirmation bias, especially with false claims: if you're specifically defending yourself from misinformation, not simply looking for accurate information, then whenever something feels true because it matches your beliefs, you've got to be suspicious of yourself. Those intuitions, feelings, and desires may well lead you in the wrong direction. Third, and last but not least, is the narrative fallacy. We trust narratives; we like stories that sound convincing but may not be true.
And that's why everyone who wants to convince us communicates in stories, whether it's politicians, people presenting and trying to sell us on something in business settings, or, in health settings, someone pushing some kind of homeopathic fake medicine. They tell convincing stories of real people, or people who sound real, and those stories feel true to us. The deeper, emotional, lazier part of our brain really likes to take in information from narratives. It likes easy, convincing explanations of the world, so simple, clear narratives are the ones that sound convincing to us. The reality is that the world is not simple and not clear; it's complex, confusing, and contradictory. So when you hear simple, clear narratives about reality, you ought to be suspicious that you're being fed some grade-A misinformation. Now that we know about the three most important cognitive biases that make us vulnerable to misinformation, how do we actually fix them? How do we fix our brains? There are a number of techniques that research has shown are effective for addressing misinformation and the cognitive biases that lead us to buy into it. The first: whenever you hear a claim or a story, especially something that confirms your beliefs, and especially something you hear more than once, consider alternative possibilities, and be wary of favoring repeated claims that favor your side. You can always consider alternatives. When there's a claim out there that you're hearing, immediately consider what the alternatives could be; consider the opposite. You might hear one thing, but the reality might be the complete opposite.
Or the thing you hear might be an exaggeration of reality. That's very often what happens with stories. Some stories are outright BS, complete misinformation. But others are a subtler form of misinformation, where someone cherry-picks one story from many. Say there's some supposedly homeopathic medication, a fake medication, and there's a story of someone who took it and got better. That's the story that will be repeated time and time again, even though there might be a thousand people who took the same medication and didn't get better, or got worse, and even though the person who did get better might have gotten better for a whole variety of other reasons. There's a lot of research in medicine showing that, over time, our bodies cure a lot of illnesses on their own. Taking a medication and then getting better does not mean the medication is what caused you to get better. You want clear, evidence-based research studies showing that a medication is actually correlated with an improvement in your health, rather than fake homeopathic medicine backed by cherry-picked anecdotes. That's the kind of exaggeration that results in misinformation, so please make sure to question stories in particular. Beyond these strategies, considering alternative possibilities, being wary of repeated claims that match your beliefs (fighting the confirmation bias and the illusory truth effect), and questioning stories (avoiding the narrative fallacy), you can, more broadly, make a personal commitment to the truth at ProTruthPledge.org. Again, that's P-R-O-T-R-U-T-H-P-L-E-D-G-E dot org.
That website lists a set of 12 simple, clear behaviors, including the ones I just talked about: considering alternatives, questioning stories, being wary of repeated claims, and many more. Research has shown that these 12 behaviors are correlated with truthful behavior, so if you follow them, you'll be much more likely to protect yourself from misinformation, and also to protect our society, your colleagues, and everyone you engage with from misinformation. So that's how you protect yourself from misinformation. I hope this has been helpful for you. One thing I want to note is that there's a blog post linked in the show notes with a lot more resources on this topic, so please check it out; it goes into much more depth and includes all the citations, so you can read about this in greater depth. If you liked this episode of the Wise Decision Maker Show, please click Like, and of course follow the show on whatever venue you're getting it. We're in both video and audio form, so you might be hearing the podcast or watching the video; please go check out the other one, which will be in the show notes as well. And leave your comments: I'd love to hear what you think about our shows, since that helps us improve our content and make it better for you going forward. All of these shows are based on a number of books that I wrote. One I already mentioned is Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. It's linked in the show notes; it talks about how to make the wisest decisions in all sorts of settings and addresses cognitive biases. Another one that's really relevant to truthfulness is called Pro Truth: A Practical Plan for Putting Truth Back Into Politics.
And that one talks about all sorts of things relating to truthfulness and how to protect yourself, our society, and our businesses from misinformation. One more thing I want to note: there's a free resource you can take advantage of that's supremely useful, called the Wise Decision Maker Course. It consists of eight video-based modules that provide clear guidance on how to make the wisest decisions and avoid business disasters, and all sorts of other disasters, including by protecting yourself from misinformation and addressing cognitive biases. The first module of the course is an assessment of dangerous judgment errors in the workplace, which helps you understand where these cognitive biases might be harming you, your workplace, your colleagues, and your life in general. Check that out at DisasterAvoidanceExperts.com/subscribe; again, that's DisasterAvoidanceExperts dot com, forward slash, subscribe, and of course it will be linked in the show notes. All right, everyone, I hope you've enjoyed this episode of the Wise Decision Maker Show. And as always, I'm wishing you the wisest and most profitable decisions!
Transcribed by https://otter.ai
Originally Published at Disaster Avoidance Experts
Bio: An internationally-recognized thought leader known as the Disaster Avoidance Expert, Dr. Gleb Tsipursky is on a mission to protect leaders from dangerous judgment errors known as cognitive biases by developing the most effective decision-making strategies. A best-selling author, he is best known for Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters (Career Press, 2019), The Blindspots Between Us: How to Overcome Unconscious Cognitive Bias and Build Better Relationships (New Harbinger, 2020), and Resilience: Adapt and Plan for the New Abnormal of the COVID-19 Coronavirus Pandemic (Changemakers Books, 2020). He published over 550 articles and gave more than 450 interviews to prominent venues such as Inc. Magazine, Entrepreneur, CBS News, Time, Business Insider, Government Executive, The Chronicle of Philanthropy, Fast Company, and elsewhere. His expertise comes from over 20 years of consulting, coaching, and speaking and training as the CEO of Disaster Avoidance Experts. It also stems from over 15 years in academia as a behavioral economist and cognitive neuroscientist. Contact him at Gleb[at]DisasterAvoidanceExperts[dot]com, Twitter @gleb_tsipursky, Instagram @dr_gleb_tsipursky, LinkedIn, and register for his free Wise Decision Maker Course.