Latest news with #FastandSlow

First Post
5 days ago
- First Post
Air India plane crash: Is flying risky business?
Catastrophic events such as the Air India crash affect us deeply. The thought of going down in an aircraft may feel more frightening than dying in other ways. All this taps into the emotions of fear, vulnerability and helplessness, and leads to an overemphasis on the risks. However, air travel is still arguably the safest method of transport.

Members of the Indian Army's engineering arm prepare to remove the wreckage of an Air India aircraft, bound for London's Gatwick Airport, which crashed during take-off from an airport in Ahmedabad, India. Reuters

On Thursday afternoon, an Air India passenger plane bound for London crashed shortly after takeoff from the Indian city of Ahmedabad. There were reportedly 242 people onboard, including two pilots and 10 cabin crew. The most up-to-date reports indicate the death toll has surpassed 260, including people on the ground. Miraculously, one passenger, British national Vishwashkumar Ramesh, survived the crash.

Thankfully, catastrophic plane crashes such as this are very rare. But seeing news of such a horrific event is traumatic, particularly for people who may have a fear of flying or are due to travel on a plane soon. If you're feeling anxious following this distressing news, it's understandable. But here are some things worth considering when you're thinking about the risk of plane travel.

Dangers of flying

One of the ways to make sense of risks, especially really small ones, is to put them into context. Although there are various ways to do this, we can first look to figures that tell us the risk of dying in a plane crash per passenger who boards a plane. Arnold Barnett, a professor at the Massachusetts Institute of Technology, calculated that in 2018–22, this figure was one in 13.7 million. By any reckoning, this is an incredibly small risk. And there's a clear trend of air travel getting safer every decade. Barnett's calculations suggest that between 2007 and 2017, the risk was one per 7.9 million.
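The per-boarding figures quoted above are small enough to be hard to intuit, so a few lines of arithmetic help put them in perspective. This is a minimal sketch: only the two rates attributed to Barnett come from the article; the one-flight-a-day framing is added purely for illustration.

```python
# Sanity-checking the cited per-boarding fatality risks.
risk_recent = 1 / 13.7e6    # deaths per passenger boarding, 2018-22 (Barnett)
risk_earlier = 1 / 7.9e6    # deaths per passenger boarding, 2007-2017 (Barnett)

# Factor by which the per-boarding risk fell between the two periods.
improvement = risk_earlier / risk_recent   # 13.7 / 7.9, about 1.73x

# At the recent rate, a person taking one flight every single day would
# expect roughly one fatal outcome per 13.7 million days of flying:
years_of_daily_flying = 13.7e6 / 365       # about 37,500 years

print(f"risk fell by a factor of {improvement:.2f}")
print(f"one flight a day -> ~{years_of_daily_flying:,.0f} years per expected fatality")
```

The second figure is a common way of restating such rates: even flying daily, the expected wait for a fatal crash at the 2018–22 rate is tens of thousands of years.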
We can also compare the risks of dying in a plane crash with those of dying in a car accident. Although estimates of motor vehicle fatalities vary depending on how you do the calculations and where you are in the world, flying has been estimated to be more than 100 times safer than driving.

The tail of the Air India Boeing 787 Dreamliner plane that crashed is seen stuck on a building after the incident in Ahmedabad. Reuters

Evolution has skewed our perception of risks

The risk of being involved in a plane crash is extremely small. But for a variety of reasons, we often perceive it to be greater than it is. First, there are well-known limitations in how we intuitively estimate risk. Our responses to risk (and many other things) are often shaped far more by emotion and instinct than by logic. As psychologist Daniel Kahneman explains in his book Thinking, Fast and Slow, much of our thinking about risk is driven by intuitive, automatic processes rather than careful reasoning.

Notably, our brains evolved to pay attention to threats that are striking or memorable. The risks we faced in primitive times were large, immediate and tangible threats to life. Conversely, the risks we face in the modern world are generally much smaller, less obvious, and play out over the longer term. The brain that served us well in prehistoric times has essentially remained the same, but the world has completely changed. Therefore, our brains are susceptible to errors in thinking and mental shortcuts called cognitive biases that skew our perception of modern risks. This can lead us to overestimate very small risks, such as plane crashes, while underestimating far more probable dangers, such as chronic diseases.

Why we overestimate the risks of flying

There are several drivers of our misperception of risks when it comes to flying specifically.
The fact that events such as the Air India plane crash are so rare makes them all the more psychologically powerful when they do occur. And in today's digital media landscape, the proliferation of dramatic footage of the crash itself, along with images of the aftermath, amplifies its emotional and visual impact. The effect these vivid images have on our thinking around the risks of flying is called the availability heuristic. The more unusual and dramatic an event is, the more it stands out in our minds, and the more it skews our perception of its likelihood.

Another influence on the way we perceive risks relevant to flying is called dread risk, a psychological response we have to certain types of threats. We fear risks that feel more catastrophic or unfamiliar. It's the same reason we may disproportionately fear terrorist attacks, when in reality they're very uncommon. Plane crashes usually involve a large number of deaths that occur at one time. And the thought of going down in a plane may feel more frightening than dying in other ways. All this taps into the emotions of fear, vulnerability and helplessness, and leads to an overweighting of the risks.

Another factor that contributes to our overestimation of flying risks is our lack of control when flying. When we're passengers on a plane, we are in many ways completely dependent on others. Even though we know pilots are highly trained and commercial aviation is very safe, the lack of control we have as passengers triggers a deep sense of vulnerability. This absence of control makes the situation feel riskier than it actually is, and often riskier than activities where the threat is far greater but there is an (often false) sense of control, such as driving a car.

Passengers gather in front of the ticket counter of Air India airlines. File image/Reuters

In a nutshell

We have an evolutionary bias toward reacting more strongly to particular threats, especially when these events are dramatic, evoke dread and when we feel an absence of control. Although events such as the Air India crash affect us deeply, air travel is still arguably the safest method of transport. Understandably, this can get lost in the emotional aftermath of tragic plane crashes.

Hassan Vally, Associate Professor, Epidemiology, Deakin University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


The Sun
5 days ago
- General
- The Sun
Lessons from sophomore philosophy class
I took a sophomore philosophy course in formal logic from a professor who was an enthusiastic admirer of Socrates, as all philosophy professors seem to be. He did not give many formal lectures; his excuse was that we could read the assigned texts just as well, at a time and place of our choosing. As a result, his class was lively with discussions and rebuttals as well as questions and answers.

One of his early exercises was to pretend that we were facing a real-life decision. Should I buy a new car, fix the present one, or not buy one at all and depend on public transit? Another was whether I should go on to graduate school or find a job; marry now or wait. He would then have us record our decision immediately, for or against, impulsively as it were. Decades later, Daniel Kahneman would call that 'fast thinking' in his book Thinking, Fast and Slow.

Following that, our professor would begin the discussion and have us list the pros and cons of each option. We would then give a numerical weight to each statement according to its importance to us. At the end, we would total the positives and negatives and compare whether our 'fast thinking' decision, made before the analysis, matched our deliberate 'slow thinking' decision made after it. That was his class exercise in rational decision-making, and also the one lesson I found useful and relevant throughout my life.

As my dilemma was novel for the class (should I remain in Canada for graduate work or return to Malaysia?), it was discussed extensively as an example of serious decision-making. For good measure, it morphed into a discussion of community obligations versus personal aspirations: where the two run parallel and where they are at odds, with our professor guiding and prodding us, Socrates-like. We (especially me) were surprised at how different our decisions were before and after that careful, methodical analysis.
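The professor's exercise boils down to a weighted tally, which can be sketched in a few lines of Python. This is only an illustration: the decision, the items and the weights below are invented; just the method itself, weighting each pro and con and comparing the signed total with the initial gut call, comes from the essay.

```python
# Weighted pros-and-cons tally, as in the class exercise:
# list each consideration, weight it by personal importance,
# then compare the signed total against the initial gut decision.

def weighted_decision(pros, cons):
    """Return the net score: positive favours 'yes', negative favours 'no'."""
    return sum(pros.values()) - sum(cons.values())

# Hypothetical weights (1-10) for "stay in Canada for graduate work?"
pros = {"better training": 9, "valuable experience": 7}
cons = {"away from family": 8, "delayed career at home": 5}

net = weighted_decision(pros, cons)        # 16 - 13 = 3
fast_thinking_answer = "no"                # the impulsive first vote
slow_thinking_answer = "yes" if net > 0 else "no"
print(slow_thinking_answer, net)           # the two answers may well disagree
```

As in the classroom version, the point is not the number itself but that writing the weights down forces the deliberate comparison that a snap decision skips.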
That was also the first time I entertained the thought of not returning home immediately, but of staying back to continue my studies and gain valuable experience. I wanted to return as a seasoned surgeon, not a half-baked one.

Looking back at that class exercise, and after using the technique many times since, what has mattered is not so much the decisions I have made over the years as the process I have forced myself to engage in: deliberate downstream analysis instead of a rushed decision swayed by the impulses and emotions of the moment. Kahneman elaborated on this in Thinking, Fast and Slow. He remains the rare non-economist to have won the Nobel Prize in Economics for his insights on decision-making. Contrary to the prevailing wisdom of the discipline, humans are not the rational Homo economicus we are made out to be, obsessed only with seeking 'maximal utility.' Emotions and other extraneous factors do come into play, often in major roles, in our decisions.

Socrates echoed something similar more than two millennia ago: know thyself! Or, as more famously quoted, the unexamined life is not worth living, reflecting the importance of critical self-reflection. As a physician and a Muslim, I disagree. All lives, being Allah's precious gift, are worth living, examined or not.

My late father used a comparable technique to make us 'think slow.' Before leaving the house for a trip, he would pause and ask, 'Are we all ready?' If we answered with a quick, perfunctory 'yes', he would be more specific: had the back door of the house been locked, and had we left enough water for the cat? The very act of pausing, of slowing our thinking by asking those questions, forces us to mentally recheck things. It is amazing how often we had forgotten to lock the door or switch off a light.

Pausing and thinking, otherwise known as deliberating, triggers many questions: the hows, whys, whats and whens, and most important, the 'what ifs' and the 'are you sure?' queries. Just by posing those simple questions, we are already well on the way to exercising critical thinking and arriving at a more satisfactory and successful solution to our problem, if not a more informed decision. That is also how a child learns: by asking endless 'whys.' That can be exasperating for parents, but in the end it sharpens and enhances the child's learning.

The lessons I learned from my old philosophy class decades ago are still relevant now that I am entering my eighth decade of life. That is: be a child again, and often. Be curious. Keep asking why!


India Gazette
13-06-2025
- India Gazette
News of the Air India plane crash is traumatic. Here's how to make sense of the risk


Time of India
29-05-2025
- Business
- Time of India
6 essential books every professional should read to decode human behaviour and communicate smarter
In today's fast-paced, high-stakes professional environment, understanding human behaviour is more than a soft skill; it's a strategic asset. Whether you're managing teams, negotiating deals, leading change, or building client relationships, the ability to decode why people act the way they do is key to effective communication and sustained influence. While countless theories have emerged over time, a handful of books stand out for their clarity, depth, and real-world application. The six acclaimed titles listed here offer powerful frameworks to help professionals manage complex interpersonal dynamics with greater insight and effectiveness. Whether your goal is to influence ethically, make sounder decisions, or lead with empathy, these books serve as indispensable guides.

1. Influence: The Psychology of Persuasion – Robert Cialdini
Robert Cialdini's Influence introduces six universal principles that drive human decision-making: reciprocity, commitment, social proof, authority, liking, and scarcity. These principles help explain how marketers, leaders, and even cults can shape behaviour. Professionals will learn how to apply these tactics responsibly, while also recognising and defending against unethical persuasion. This book is especially valuable for those in marketing, negotiations, and stakeholder engagement.

2. Thinking, Fast and Slow – Daniel Kahneman
Nobel laureate Daniel Kahneman explores two core modes of thinking in this groundbreaking work: fast, intuitive decision-making and slow, analytical reasoning. Through compelling insights into biases such as confirmation bias and loss aversion, Kahneman helps readers understand how judgment is often flawed, and how to correct it. A must-read for executives, analysts, and decision-makers seeking to improve cognitive clarity and strategic thinking.

3. The Laws of Human Nature – Robert Greene
Drawing from psychology, history, and philosophy, Robert Greene examines why people frequently act irrationally and how to respond with emotional intelligence. The Laws of Human Nature offers tools to identify manipulation, manage egos, and convert adversaries into allies. This book is highly relevant for leaders, consultants, and professionals navigating high-stakes or politically sensitive environments.

4. Predictably Irrational – Dan Ariely
Behavioural economist Dan Ariely reveals the underlying logic behind seemingly irrational behaviour in areas such as productivity, spending, and decision-making. Predictably Irrational shows how human actions, though often flawed, follow consistent, predictable patterns. Entrepreneurs, economists, product managers, and policy professionals will find valuable, research-driven insights into how people truly think and behave.

5. How to Win Friends and Influence People – Dale Carnegie
Dale Carnegie's enduring bestseller remains one of the most influential works on relationship-building. With practical techniques like using people's names, showing genuine interest, and listening actively, How to Win Friends and Influence People helps readers foster trust and rapport, both vital for effective leadership and team dynamics. This book is essential for managers, client-facing professionals, and anyone seeking to strengthen workplace communication.

6. Quiet: The Power of Introverts in a World That Can't Stop Talking – Susan Cain
Susan Cain's Quiet challenges the extrovert-centric model of leadership by showcasing the unique strengths introverts bring to organisations. From thoughtful problem-solving to deep focus and creativity, Cain reveals why introverts are key to building balanced, high-performing teams. This book is particularly insightful for team leaders, HR professionals, and introverted professionals looking to leverage their natural strengths.

Why These Books Matter
Human behaviour is complex, but understanding its drivers is essential for professional success. These six titles offer research-backed, actionable guidance for improving communication, decision-making, leadership, and interpersonal effectiveness. Whether you're leading a team, presenting to stakeholders, or managing client expectations, the insights in these books provide a foundation for stronger performance and more meaningful professional relationships. All six books are readily available through major bookstores and online retailers, making it easier than ever to access powerful tools to better understand and navigate human behaviour in today's evolving workplace.


Forbes
23-04-2025
- General
- Forbes
Slow Down To Speed Up: Problem Identification Drives Transformation
Slowing down to define the problem may feel counterintuitive when urgency is high. Yet, it is the clearest path to long-term success.

Most managers are both excellent and flawed problem solvers, depending on the context. They have remarkable cognitive skills, but being human, they also suffer from a broad range of innate biases and limitations.

There's an old story of a man searching for his keys one night under a streetlight. Eventually, a handful of good Samaritans join the search. After some time with no success, someone finally asks the man, 'Are you sure this is where you lost them?' The man replies, 'No, I lost them in the park.' The helper, confused, asks him why he is looking for them here. The man replies, 'Because this is where the light is.'

This scenario is more common than many people realize. In business, the data we collect and the way we interpret it can sometimes be like the streetlight in the parable. People tend to look for answers in the most familiar or obvious places rather than digging deeper. In organizations, this type of problem-solving can lead to wasted resources, lost time, and mounting frustration.

Consider the case of a food manufacturing company we recently worked with. This company had a specialty line of allergen-free food products, which required rigorous testing to ensure that trace amounts of peanut residue were not present in their facilities. Despite the fact that their workers diligently cleaned and re-cleaned the lines, testing after each cycle, their production lines repeatedly failed quality control tests. Production was backlogged, costs were up, and workers were frustrated. How could even the slightest trace of residue remain? Someone finally asked a key question: 'Have you tested the testing room?' It turned out that the testing room, not the production line, was contaminated with peanut residue.

This type of situational tunnel vision is often heightened when problems arise in routine tasks or familiar environments.
Thinking outside the box becomes increasingly challenging when you spend all your time inside it, whether that box is your role, company, or industry.

Nobel-Prize-winning psychologist Daniel Kahneman introduced the concept of two types of thinking in his book Thinking, Fast and Slow. His theory breaks down human thought processes into two systems.

System 1 thinking is fast, intuitive, and handles routine tasks. It's very efficient but prone to errors and biases because we all tend to jump to conclusions based on patterns we recognize. Within familiar settings, we often rely on mental shortcuts and routine assumptions. This is efficient for everyday tasks but can create blind spots when tackling unique or complex problems.

System 2 thinking is more deliberate and analytical. It requires effort, attention, and reasoning, and we use it for complex problems or unfamiliar situations. It might be more reliable, but it's definitely more taxing and slower.

A simple example is driving a familiar route home. You primarily use System 1, basically operating on autopilot, checking your mirrors or changing lanes without much conscious input. But if something unexpected happens, you're low on gas, or a detour sign forces you to find a new route, your slower and more conscious System 2 thinking kicks in.

Seasoned managers and experienced consultants can fall into the same trap. Familiarity with a topic is not always an advantage for solving complex problems, particularly if your experience leads you to think the problem is not complex. With our own consulting teams, we try to stem this tendency to jump to conclusions by relying on a structured problem-solving methodology designed to reveal problems that are baked into routine operations. The real secret is that we force System 2 thinking on bright people who might otherwise believe they can jump straight to a solution.
This is particularly important at the front end of an engagement, to make sure we properly understand what the actual problems are that we, and the client, need to solve.

System 1 and System 2 thinking is well worth considering for companies rushing to take advantage of the many benefits of artificial intelligence (AI). While AI offers enormous potential, it is often treated as a solution in search of a problem. What we currently observe in many organizations is a rush to apply the solution without carefully and thoughtfully understanding the underlying problems that need to be fixed.

One company invested heavily in an AI tool to speed up customer support response times. The system performed well, but customer satisfaction scores did not improve. It turned out that customers valued resolution accuracy more than speed. The company had wasted considerable resources applying an elegant fix to the wrong problem. Another recent client believed their challenges stemmed from outdated technology. A deeper analysis revealed that the true cause was poor interdepartmental communication. Fixing the miscommunication saved significant time and money, while the assumed 'solution' (investing in new technology) would not have addressed the underlying issues and might have baked the problems deeper into the routine process.

I asked one of our managers, Caleb Emerson, for his thoughts on AI integration. He had three points:
- Whether it's AI or any other tool, solutions are only as effective as the value of the problems they address.
- A key constraint that hinders capturing AI's value is the integrity of the underlying data. Unless that is addressed, automation and increasingly sophisticated algorithms will struggle to deliver meaningful results.
- Slowing down to define the problem may feel counterintuitive when urgency is high. Yet, it is the clearest path to long-term success.
Proper problem identification saves time, money, and frustration by focusing resources on effective solutions instead of misguided assumptions. When you take the time to identify the real problems, you accelerate the pace of meaningful change. Instead of spinning your wheels, you are better equipped to drive progress where it counts most.