Video Transcript: Think Harder
Hi, I'm David Feddes, and my aim in this talk is to help you think harder. When we're dealing with logic and critical thinking, our goal is rationality. Daniel Kahneman, the great psychologist, speaks of the human mind in terms of two systems: System 1 is impulsive and intuitive, and System 2 is capable of reasoning and is cautious, but at least for some people it's also lazy. Now, when we think about rationality, we sometimes equate it with intelligence. But intelligence is just sheer brainpower. Rationality is when you actually avoid mistakes. Rationality works to avoid cognitive biases, recognizing some of the mistakes that intuition can make, and rationality is thinking carefully to avoid those mistakes and to get at the truth more accurately.

In this talk, I'm going to be leaning especially on two books: Thinking, Fast and Slow by Daniel Kahneman and The Art of Thinking Clearly by Rolf Dobelli. Kahneman won the Nobel Prize in economics but is actually Professor Emeritus of Psychology at Princeton University. Rolf Dobelli is a novelist and an entrepreneur; he does have a PhD in philosophy, but he doesn't teach it, and scholarship isn't his career anymore. Still, he has done a lot of thinking about rationality, and The Art of Thinking Clearly is an easy-to-read book that comes in snippets, whereas Kahneman's book is a massive and detailed scholarly work.

When we think about thinking, we have to look at two kinds of knowledge. Dobelli mentions real knowledge, held by those who think deeply, who pursue understanding, and who really have a grasp of things. In contrast to that, there is chauffeur knowledge. Dobelli repeats a story from Charlie Munger about the physicist Max Planck, a great thinker from the early 1900s. In this story, Planck is giving lectures on physics all over Germany. It's a lecture he gives again and again on very complex topics, and his chauffeur is there every time and hears it again and again. One day the chauffeur says to the great physicist, "Sir, aren't you getting a little bored giving that same talk? I could give that talk. Why don't we switch places? You dress up as a chauffeur, and when we go to Munich, I'll give the talk." That sounded good to Planck, so they went through with the plan, and it worked. The chauffeur remembered that talk on physics and recited it perfectly. There was only one problem. At the end, a professor of physics stood up to ask a very difficult question. When the question was finished, the chauffeur standing up front thought for a moment and said, "Sir, I'm really surprised and disappointed that a professor in such an important city would ask such a simple question. Even my chauffeur could answer that." And so he asked Max Planck, sitting in the front row, to answer the question.

Now, I'm not sure I believe that story, but Dobelli tells it to make a point: there is such a thing as real knowledge, where you're the one who thought through the deep theories and made the great discoveries, versus chauffeur knowledge, where the driver just happens to be able to act like a parrot and repeat it all. When we think about a celebrity CEO of a corporation, sometimes they're celebrated as some great genius when in fact they're a good showman who is able to act very well.
News anchors aren't always super reporters who know how to dig into details. You know what they really are: people who know how to read teleprompters and who have great hair, cheekbones in the right place, or the proper kind of chin. So chauffeur knowledge can sometimes be putting on a show, and that can be a difficulty for a star preacher too after a while. If you have a certain kind of showmanship skill, you can learn to talk a certain way. You can absorb a bit of knowledge from somebody else and spout it back and act like you're some kind of great preacher when in fact you're not a person of prayer, you're not seeking to know God better, you're not studying the Scriptures thoroughly. But boy, you can repeat somebody else's sermon and look good and sound good while you're doing it. So again, keep in mind those two kinds of knowledge: real knowledge and chauffeur knowledge.

Now, intuition is a wonderful gift from God, and the marvels of intuition enable us to know some things instantly and adequately. The way our minds are put together, marvelously, by God allows us, by intuition, to draw almost instantly upon many different kinds of knowledge and reach a conclusion just like that. But intuition not only has marvels; it has some biases. Sometimes intuition jumps to false conclusions and prompts bad decisions.

Daniel Kahneman talks about cognitive ease and cognitive strain. When your mind is feeling good, you're in a good mood, you like what you see, you believe what you're being told, you trust your intuitions, and the information you're working with and the setting you're in are comfortably familiar, you feel like you're in a groove. But when you've got cognitive ease, in fact, you're kind of casual and superficial in your thinking. That's the downside of cognitive ease. Cognitive strain also has a downside. Cognitive strain involves being very vigilant. Sometimes you're suspicious and doubtful, and that's always harder than just believing. It takes more effort, and it's less comfortable. It also results in fewer errors. But the downside of cognitive strain is that you're often less creative and less intuitive when you're really digging in, thinking hard, and doubting. So each situation has its advantages and disadvantages: cognitive ease and cognitive strain. Let me just warn you, this kind of talk and this kind of class involve some cognitive strain. They involve some hard thinking. Logic is not always easy. Probability is not always easy. Critical thinking can be very hard, and it's not always an exercise in creativity or in exercising your intuition. It takes strain, but it can also sharpen and strengthen your thinking.

Heuristics is a word sometimes used for a mental shortcut to deal with a hard question. It's related to the Greek word eureka, "I found it." There is this feeling that I know the answer, I found it. Intuition uses various heuristics, various shortcuts, to provide quick, simple answers, and these are often adequate. But sometimes the heuristics of intuition are flawed and misleading; those quick answers, those easy answers, the ones that seem instantly right, aren't. Daniel Kahneman says, "You are rarely stumped. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.
You often have answers to questions that you do not completely understand, relying on evidence that you can neither explain nor defend." You're never stumped. That doesn't mean you're never wrong. It's said of some people that they are often wrong but never uncertain. Well, that's because almost all of us are wired to feel like we have instant answers. We're not stumped; we know the answer, and if we don't know it, we'll make it up almost on the spot, sometimes without even knowing we made it up. We use heuristics, those mental shortcuts. Kahneman says that if a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier to answer and will answer that question instead. When called upon to judge probability, people actually judge something else and believe they've judged probability, because they're not really good at handling real probability questions. So they substitute, and that is one of the things our minds do: we substitute.

Here's a hard question: which person would be the best to hire for this position? Here's an easier question: who makes the best impression during a job interview? You like their face, you like their hair, you like the way they talk. Whether they could actually do the job, well, that's another matter; it's literally another question. But many job interviews, because the interviewers can't really answer the question of who has the skills, the experience, and everything it takes, depend more on the impression made in the interview, which answers the question of who makes the best impression, not who is going to be best at the job.

Here's a hard question: what did this Bible passage mean in its original context? That's a hard question. You've got to dig into all kinds of study; you have to think about the rest of the Bible book in which the passage appears; you've got to think about the original history if you can, and learn more about the ancient Near East. That's a tough question. Here's a much easier question: what comes to mind right away when I read this passage? That was a lot easier, wasn't it? And it saved me a whole lot of work. But I may end up being dead wrong, because my first impression of the passage is a very different thing from what the passage meant in its original context. Finding out what it meant in its original context is called exegesis, and that is not always what it sounded like the first time it reached my ears.

Here's a hard question: what is my overall impact on my church and on my community? It's a hard question to answer; you don't know what's going on in the lives of all the various people in your congregation, or in your town and surrounding area. But you still ask, "Well, what's my mood?" Actually, you don't say it; you just substitute, almost subconsciously, your mood based on a recent compliment or criticism. If you're in a bad mood because somebody criticized you, you say, "Boy, I'm not having a very good impact in my church." Or if somebody paid you a compliment, "Well, I'm really on a roll; I'm having a great impact." But what you're actually doing is substitution. You're answering the hard question about overall impact with a vibe or a mood based on maybe one incident. And we do that very frequently: there's a tough question we can't really answer, so we quickly and unconsciously substitute a different question, and we answer it very easily.
Here's another related bias of our mind, or of our intuition: the affect bias, which is a bias toward how we feel. It's a mental shortcut based on an instant like or dislike; something instantly creates either a feeling of repulsion or attraction. We sometimes even phrase it that way: "I feel like this is the best thing to do," or "I feel that this is true." We equate "I think" with "I feel," and with the affect bias we literally do that: the feeling we have controls the thought we think. A study of 27 different stock exchanges found that if it was sunny in the morning, stocks were more likely to go up than if it wasn't sunny. Is that an objective valuation of various corporations' stock prices? No. But sunshine affected the mood of the brokers and their willingness to purchase stocks. The affect bias is just going with your gut, going with that gut feeling you have and making your decision based on that.

The affect bias skews your evaluation, because if you like something, then the risks of doing it look smaller and the benefits look larger. If you dislike something, then the risks look bigger and the benefits look smaller. Take your feeling of like or dislike for, let's say, genetically modified organisms in your food: right away you may have a feeling about that, and you will not easily evaluate the risks and the benefits. Or vaccines: some people are very opposed to vaccines; others are very opposed to those who oppose vaccines. Once you have that instant like or dislike, it will color nearly everything else you think about the benefits or the risks of something. Daniel Kahneman says System 2 is more of an apologist for the emotions of System 1 than a critic of those emotions, an endorser rather than an enforcer. System 2 is supposed to be a reality check on System 1, but very often it simply dreams up explanations and digs up evidence to support what System 1 has already fed into it. Its search for information and arguments is mostly constrained to information that is consistent with existing beliefs, not undertaken with an intention to examine them. So System 2 just looks for evidence to support what it already believes because of what System 1 feels.

Another closely related bias is association bias: you sense connections where none exist, and it's often rooted in an emotional response. If you're an investor and you put some money in the stock market for the first time and it goes up, you think, "I'm pretty good at this. I'm pretty smart." When in fact you were probably just investing at a time when stocks were going up, and most dummies would have done well. On the other hand, if you invested just when the market plummeted, you say, "Oh man, am I stupid; I'm a terrible investor." You had beginner's bad luck, and you might be gun-shy about investing ever after. When you watch TV ads, you see gorgeous models associated with a product, or you see macho, popular athletes standing beside a car somebody is trying to sell, or a sports drink, or something else being sold. You have an association bias, and TV advertisers know it: in your subconscious you link the beautiful model with the product, and you say, "Boy, I like beauty, and I like that product. I like great athletes, and I like that product."
And you don't even think it through; you don't reason, "An athlete has been paid millions and millions of dollars to peddle this product to me, and that has nothing to do with the quality of the product." Well, nobody ever said advertising uses careful reasoning and logic. Here's another example of association bias: bad news isn't fun to receive, so when you receive bad news, you don't like the person who brought it very much. Sometimes in a corporate environment, if somebody is honest about the pitfalls of a particular plan, or reports to the manager what's going wrong, the manager will dislike the person who brought the bad news. In old times, some kings would kill the messenger who brought bad news. We have to be aware of that, because sometimes what we need more than anything else is somebody who's honest enough and brave enough to bring us some bad news, and we need to be willing to listen. But association bias often creates dislike toward the messenger who brings bad news.

Well, let's move on. Let me just ask you about risk factors. What causes more deaths: accident or stroke? Tornadoes or asthma? Accident or diabetes? Now, you probably can't give the exact percentages or numbers of deaths caused by any of these. But if you had to guess, would you guess accident or stroke, tornadoes or asthma, accident or diabetes? Strokes kill twice as many people as accidents, but 80% of people said accidents kill more. Asthma kills 20 times more people than tornadoes, but most people think tornadoes kill more. Diabetes kills four times more people than accidents do, but accidents were judged 300 times more likely than diabetes to cause death. That's what studies indicated. Now why is that? Well, it might be an example of availability bias. Availability bias means that you map reality using examples that come to mind most easily, and those are often dramatic news stories, powerful experiences, or vivid pictures. A tornado is a lot more sensational than asthma; an accident makes the news; somebody dying of diabetes doesn't make the news. So we overestimate the likelihood of dying of more sensational causes, such as a tornado, which appears on the evening news far more often than a much more subtle and silent killer. That's availability bias.

Kahneman uses the acronym WYSIATI: what you see is all there is. There's a lot more out there, but to your mind, the only thing there is, is what you notice. And you don't know what you don't know. There's an old comedy skit in which the comedian is searching around in a circle of light, and surrounding that circle of light is darkness. While he's searching, a policeman comes along and asks, "What are you looking for?" He says, "I'm looking for my keys. I lost them." So they both start searching around in that circle of light, and after a while the policeman says, "Well, are you sure you lost them here?" "No," says the comedian, "I lost them over there. But there's no light over there." Well, that's availability bias: you look where your light is, even if the answer lies somewhere else.

Now, an availability cascade means that an event which is highly unlikely gets reported in the news, and it's very sensational: terrorists, school shootings. When this happens, sometimes one report turns into another and another, there's a broad panic, and then it leads to government action: something's got to be done.
Yet in fact those kinds of events, though sensational and terrible, are extremely rare compared to the things that afflict and kill people much more commonly every day. But an availability cascade means it's available on all your TV sets, and something's got to be done about it.

Then there's story bias. Story bias is the tendency to shape information into a story or pattern that makes sense to you, and just about all of us are wired that way. The mind seeks a consistent, colorful, simple story, not messy complexity that doesn't really have a plot to it. Stories often invent details even without evidence, and stories also discard details that don't fit the storyline, even though there may be evidence for those details. We like stories that are simple, strong, and clear and that make us feel like we understand the situation.

Here's an example from the time of the coronavirus pandemic. A lot of shelves that were supposed to hold toilet paper were empty, and instantly people knew why: there are evil people, or at least really stupid people, who hoard gobs and gobs of toilet paper for themselves so nobody else can have any. That was a story that made perfect sense of those empty shelves that should have been stocked with toilet paper. A much more boring story is this: with the virus and the economic shutdowns, supply chains were interrupted, and it was harder to deliver toilet paper to stores. Moreover, while people had to stay home, they used 40% more toilet paper at home than they had been using. Meanwhile, the schools and businesses that use toilet paper in huge rolls, manufactured by entirely different companies, had gobs and gobs of excess toilet paper that nobody was using, because those schools and businesses were shut down. It was a supply chain issue, plus the fact that people were using more because they were staying home more. But the story was that there were these evil hoarders. It's a lot more exciting, isn't it, to think of a villain and of hoarding than to think of abstract things like messed-up supply chains and truckers, and one company manufacturing the kind of toilet paper used at work while another manufactures the kind used at home and can't crank enough out of its factories to meet the 40% increase in demand. What a messy story. It happens to be almost certainly the true story, except for a few hoarders here and there; the shelves would have been empty no matter what, because even people who weren't hoarding were using more at home. But we like our stories, and we like them simple, and we like heroes and villains. So we have these stories that sometimes just make up details, because they pop right into our minds; our intuition starts filling in blanks. And at the same time, if there's anything that doesn't fit our story, well, we don't think we need to bother with that. And that's not even a conscious thing. You're not saying on purpose, "I'm discarding this"; you just don't think about it in the first place. So you have story bias. The mind, especially System 1, says Daniel Kahneman, appears to have a special aptitude for the construction and interpretation of stories about active agents who have personalities, habits, and abilities.
And we have a harder time thinking about more abstract things, or things that don't make much sense or seem a bit more random. Story bias is also called narrative bias. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little. Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true. So the next time you hear a story that seems to make a lot of sense of everything, if you slow down and think harder, you may find that there are details that don't match and a few other details that were almost automatically made up, details that are merely supposed.

By the way, speaking of story bias, both Daniel Kahneman and Rolf Dobelli warn against it, and yet an enormous story bias runs throughout each of their books. When they talk about thinking fast and slow, they say thinking fast is something our hunter ancestors had to do during the process of evolution; they really had to react fast in certain situations, and this is where we got our fast intuition and our fast thinking. Oh, so we have researchers back then evaluating speed of thinking in response to the pouncing of a saber-toothed tiger? They're just making up stories, and that story pops up again and again and again all through both books. So let me warn you that even people who write about story bias have their own big stories that they use to make sense of things where no research and no real connection has ever been demonstrated. Christians have a big story too. We believe that God created the heavens and the earth, that humanity was made perfect and then fell into sin. We even believe that perhaps System 1 goes wrong because of the fall into sin, and System 2 is lazy and doesn't want to think harder because of sin's impact, not because of evolutionary developments but because of flaws in our thinking that have crept in. So we all have stories; sometimes we have what's called a metanarrative, a really big story that's meant to make sense of just about everything, or at least everything fits under that umbrella. And evolution is a story that plays that role big time in the thinking of many otherwise excellent scholars.

Another thing we need to be aware of as we think critically is framing. Framing is when phrases that have the same meaning are stated in different ways, and when something is stated in different ways, it's received in different ways. A stock plunge, where the market goes down, down, down, is called a correction. Isn't it nice just to have a little correction of what might be wrong now and then, while you're losing your shirt? A problem is now a challenge, or an opportunity, in the language of many corporate types. Or let's go to consumer language: "98% fat free" in surveys is rated healthier than something labeled "2% fat." It's even rated healthier than items that have 1% fat. Now, an item that's 98% fat free has twice as much actual fat as an item that's only 1% fat. But if you state it as "98% fat free," it's considered healthier than something that has only 1% fat. That's how framing works. Framing has people standing in line for the hamburger that's 75% fat free; none of them want the hamburger that's 25% fat, even though it means the same thing.
Framing in the used car business: you don't call it a used car, you call it "certified pre-owned." You don't fire people, you downsize. You don't kill babies; you terminate, you exercise choice, this is the glory of reproductive freedom. So framing is using different words to describe the same reality, and by using different words, it's received very differently by the one hearing it. You need to recognize when people are framing things as they communicate with you, and you also need to be aware, when you're trying to communicate with others, that the way you say something is as important as what you're actually saying in terms of how it comes across.

Here's a question: what do you think of Alan and Ben? Alan is intelligent, industrious, impulsive, critical, stubborn, envious. Ben is envious, stubborn, critical, impulsive, industrious, and intelligent. When people are asked this in psychological studies, they like Alan, and they have a much higher opinion of him than they have of Ben, even though the exact same six traits are listed for both. Now, it's not presented quite that way; the researchers ask different groups to rate one or the other. But they do it with many, many people, and they find again and again that Alan, with those traits in that order, is much more popular than Ben with the same set of traits in reverse order. What's going on with that? Well, that's an example of the halo effect, where first impressions color everything else that you think: one aspect triggers a feeling, and then that feeling dominates how you see the whole picture.

Sometimes that can have major implications. In some studies, students were given photos and asked to rate them, including which person looked more competent. In the ratings, competence was equated with a confident smile and a strong chin. What the students didn't know was that they were being shown pictures of politicians running in elections, and it turns out that the people with the right faces won the election about 70% of the time. So by more than a two-to-one margin, the people with the better face were the ones who won. Now, actual leadership is unrelated to the strength of your chin or the confidence of your smile.

There's a Bible story where Samuel sees Eliab. Samuel had been told that one of Jesse's sons was going to be the next king. Samuel takes one look at Eliab and thinks, "Surely the Lord's anointed stands here before the Lord." But the Lord said to Samuel, "Do not consider his appearance or his height, for I have rejected him. The Lord does not look at the things man looks at. Man looks at the outward appearance, but the Lord looks at the heart." It's an ancient example of the halo effect: even a great prophet like Samuel took one look and thought, that's a king; he looks like a king. And God said, not so fast. He's tall, he's good looking, but he's not king material. The halo effect has a variety of impacts. Daniel Kahneman talks about his own experience as a professor grading essay tests.
He found, on studying his own behavior, that if the first essay on a test was really good, it affected his grading of all the other essays on that same test. Not on purpose, he would just give the benefit of the doubt on later essays that might not have been so hot: hey, this kid knows what he's talking about. He found that when he changed his practice and graded all of essay number one on all the tests, then went back through and graded all of essay number two, then all of essay number three, the grading was very different than when he graded one student's whole test at a time and the first essay was good. That was the halo effect: the first essay made an impression, and it just carried through to all the other answers. He said that from then on he was a lot less confident in his grading skills, but probably more accurate, even though he had less confidence.

The halo effect shows up when the first person to speak in a meeting sets the tone, and then the whole discussion is influenced and colored by that first speaker. Your first impression of a person often sticks with you; you know the old saying, you don't get a second chance to make a first impression. That's the halo effect: once you've got that impression, it's hard to get beyond it. The halo effect also shows up when your view of Jesus, or your view of the church, is shaped by how you feel about one member of a church, somebody you had a run-in with or somebody you had very positive feelings about, and you think the whole church is wonderful, or the whole church stinks, because of your experience with that one member. That's the halo effect: taking one part and then drawing conclusions about all the rest because of one fact or one experience.

Here's a question: how old was Gandhi when he died? In surveys, one group was asked, "Was Gandhi more than 35 years old when he died?" They answered yes or no, and then the next question was, "How old was Gandhi when he died?" A whole different, large group of people was asked, "Was Gandhi more than 114 years old when he died? How old was Gandhi when he died?" Those who were asked whether he was over 35 tended to underestimate Gandhi's age by quite a bit. Those who were asked whether he was more than 114 tended to overestimate his age. Now, if you actually knew the answer, if you knew that Gandhi died at age 78, this would have zero impact on you; you'd just know. But when you don't know, you guess, and when you guess on matters you don't know, there is what's called anchoring bias: when people consider a particular value for an unknown quantity before estimating that quantity, the estimates stay close to the number they considered. So the anchor of 35 made people underestimate Gandhi's age, and the anchor of 114, though people knew he probably didn't get that old, still tended to bump the estimate higher than the truth. Once the anchor is there, it exerts a pull on the estimate.

This shows up in a whole variety of ways. It affects the asking price of a house and the estimates people make of the house's actual value. A study was done, actually more than one study, about the impact of the asking price on how people would estimate the actual value of a house.
Various students and other people who didn't know much about evaluating houses were asked, and their estimates were always pretty close to the asking price that the researchers gave for the house. Of course, seasoned professional real estate agents wouldn't be affected by an anchor like an asking price, would they? Or would they? The studies found that they were almost as influenced by the asking price as the students who knew nothing, even though they prided themselves on ignoring the asking price and insisted they ran all the comparables and did all the other research that a good real estate agent would do, and that's how they came up with their price. But the study determined that their price was heavily influenced by the asking price, even though it was just a more or less arbitrary number. So anchoring can have a powerful impact even on professionals.

Think of the suggested retail price in a store. They post that suggested retail price, and then they tell you 25% off, or 50% off, and you think you're getting a great deal. But the only reason you think so is that they told you the suggested retail price, and they're the ones who made it up. There's an old story about Sid and Harry, clothing salesmen; one was more the tailor, the other worked the counter and handled sales. Whenever Sid saw a customer looking in a mirror and really enjoying a suit he was trying on, Sid would call over, "Hey, Harry, how much for this beautiful cotton suit?" Harry would call back, "$424," and Sid, acting a bit hard of hearing, would say, "Harry says $224." The customer would think, $224, what a steal, and walk out with the suit in a hurry. Of course, the hardness of hearing was fake, and they wanted $224 for the suit all along. But it's anchoring: you think you're getting a fabulous deal because somebody put a certain number in your head and now you got it for a lot less than that.

Rolf Dobelli tells a similar story in his book about anchoring bias. Research was done on sales promotions in stores: if the store put up a sign saying "Super sale, limit 12 cans per person," customers bought an average of seven cans each. If there was no limit, they bought far fewer. It's just the anchoring. People thought, "Oh man, if they won't let you have more than 12, I'm going to grab a bunch." With no limit, they actually chose fewer.

Anchoring can also shape charitable donations and how people raise funds. Studies were done using fundraising to save endangered animals. People were asked whether they cared about the animals, and they did. Then some were asked, "Would you be willing to give $5?" The answer was yes. The follow-up question was, "How much would you be willing to give?" The answer averaged $20, so they were a lot more generous than that $5. The other arm of the survey asked, "Would you be willing to give $400?" No. "How much would you be willing to give?" $143 was the average contribution. So if the anchor was 5, the gift was 20; if the anchor was 400, the gift was 143. You can see the power of anchoring for fundraising methods.

Sometimes anchoring even figures into our thinking about theology, the things of God, and ethics. What's considered the mainstream can often be an anchor, and the mainstream shifts.
So, if in old times you were considered a theologian if you believed in the absolute authority and rightness of the Bible, nowadays there are extremely liberal theologians who don't believe much of the Bible at all, and people who say, "Well, the Bible has a lot of good stuff in it and a lot of mistakes," are considered pretty mainstream, because they're now the anchor. Or take the field of ethics. For a long while, the Christian emphasis was understood to be that you get married and stay married for life, and that marriage is between a man and a woman. After a while, people got used to divorce, so that became kind of normal, and that moved the anchor for what is right in sexual ethics. Then living together became widely accepted, and that moved the anchor again. Then homosexuality came to be considered healthy and normal, and that changed the acceptable range once more. Then becoming a man if you were born a woman, or becoming a woman if you were born a man, was emphasized and normalized. So continually, as the anchor of what's considered mainstream shifts, what was previously considered extremely aberrant, strange, perverted, or a result of mental illness comes to be considered more and more mainstream. The anchor changes the perception of everything.

There's also familiarity bias: you're anchored by what you've heard before, and when something is repeated often enough, it's believed. "A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. Authoritarian institutions and marketers have always known this fact," says Daniel Kahneman. Research even finds that if somebody is told a lie and knows it's a lie, but later forgets where they heard it and has only a vague recollection of what was said, they'll be more likely to believe it's true the next time they hear it, just because it rings a bell, and stuff that rings a bell feels more likely to be true. So we have to be very aware that something repeated over and over, whether in the media or in the circle of people around us, something that has been made to feel familiar, is not necessarily true. Marketers will fire the same message at you again and again and again, and authoritarian regimes will run propaganda repeating the same lies over and over, and after a while it just feels true, because it feels familiar.

So think harder. Question your shortcuts, your heuristics. When you're tempted to substitute an easier question for a hard one, get back to asking, "What is the question I'm trying to answer?" Avoid substituting an easier one for a harder one, and see if you can do the research needed to answer the harder one. When you're being swayed by a merely emotional reaction, ask yourself, "Am I thinking, or am I just feeling? Am I letting a vibe overrule my rationality?" When availability bias is at work, when things come to you easily from the latest news report or something you've heard somewhere but haven't actually studied, the question to ask is, "What have I overlooked? Am I just looking for my keys in the circle of light, without considering whether the area outside my circle of light might contain the answer?" Maybe you have a story that you like, and it makes a lot of sense to you.
But question your story. Does your story distort reality? Are you leaving out some things? Have you almost automatically filled in some blanks where you didn't have data? Framing: is the wording of an issue swaying me? And am I choosing the right kind of wording to communicate what I'm seeking to communicate? The halo effect is one thing coloring everything else: can you separate out your feelings about that one thing, your reaction to it, and take a hard, level-headed look at all the factors? Anchoring: various suggestions are made in areas where you don't actually know the answers, but are those suggestions coming from knowledgeable people giving you a decent, accurate, fact-based estimate? Or is it something planted, random, or worse than random, a manipulative number or position put in your mind so that it will pull you in that direction?

Well, when should you think hard? You can't think hard all the time. If a belief or a decision is minor, go with your intuition and don't think too hard. But if a belief or decision could have important long-term impact, then put in the time, put in the effort, and do the hard thinking. By the way, it is hard, and it sometimes wears you out. Thinking and deciding take energy, and errors grow when you're tired, when you're weary, when you're hungry. A study was done of judges in parole courts, and it was found that they granted early release to applicants far more often shortly after the judges' mealtime. If they were dealing with a person asking for early release right after mealtime, they granted it about 65% of the time. When it was a few hours after mealtime, and they were further along, more tired, and a little low on glucose, on blood sugar, the rate dropped much closer to zero. The cases were comparable; the research was done objectively, with enough judges and enough different cases to show this, and the judges weren't doing it on purpose. They weren't saying, "I'm hungry, I'm tired, I'm grumpy: denied, denied, denied." They weren't doing it on purpose, but they had decision fatigue. When you're tired, when you're hungry, TV ads have a bigger impact on you. When you're tempted and your energy is depleted, you're more likely to give in; you're more open to suggestion: oh, whatever. So be refreshed and well fed before you do any hard thinking or make big decisions. And by the way, be refreshed and well fed before you take on too many problems in logic and critical thinking. Don't be a slug; think harder. Don't be smug; think humbler. That's the topic of our next talk.