At Risk-!n 2025, Pierre Lauquin, risk and business continuity manager for the Swiss Government, urged risk professionals to challenge instinctive thinking and rewire how they respond to uncertainty.
Risk managers pride themselves on rationality. But as Pierre Lauquin argued in a high-impact Risk-!n workshop, even the most experienced professionals fall prey to cognitive bias—and it distorts how they perceive and assess risk.
“We don’t think the way we think we think,” Lauquin said, citing the work of Nobel-winning psychologist Daniel Kahneman. “Your information that’s going through your eyes or your ears or all your senses is restructured, modified, stretched, modified again before it comes to your conscious mind… and you believe what you see.”
Across a fast-paced 90 minutes, Lauquin demonstrated how unconscious bias distorts decision-making. Using simple but disarming exercises, he showed how professionals routinely misjudge probability, underestimate uncertainty, and reject valid opposing views simply because they don’t align with their initial assumptions.
Confirmation, anchoring and overconfidence: the big three
The three cognitive biases Lauquin explored in depth were confirmation bias, anchoring bias and overconfidence.
“Confirmation bias is found absolutely everywhere,” he said. “If you start to study a project and you have one team that comes with one opinion, you have missed the objective… The right way is to ask the right questions and to be ready to listen to the contrary opinions… because otherwise you will not have the big picture.”
Anchoring bias, he explained, occurs when an initial piece of information—however irrelevant—frames how people interpret subsequent data. In one group exercise, participants given different initial figures produced widely different estimates for the same question.
“If I ask for your phone number, and then ask how tall you think a tree is, that number still influences your answer—even though it has nothing to do with trees… I found it absolutely horrible. I don’t like it at all,” he admitted.
Overconfidence bias, meanwhile, is the belief that our knowledge or experience makes us better at predicting outcomes than we actually are. Lauquin warned that risk managers often provide overly narrow estimates of cost, time or likelihood—and then commit to them as though they are facts.
“If you go to senior management and say the project will cost 2.5 million, you will be wrong… Even if it’s pretty good, it will be wrong. So it’s much more efficient to give an interval,” he said. “With the information I have today… we can say probably it will be no less than 2 million and can go to 4 million… but we are relatively confident that 2.5 to 3 million will be okay. That’s risk management.”
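The interval approach Lauquin describes can be illustrated with a minimal Monte Carlo sketch (a hypothetical example, not material from the workshop): instead of committing to a single cost figure, sample project cost from an assumed distribution and report percentile bounds. The triangular distribution and the specific parameters below are assumptions chosen to echo his 2–4 million example.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Hypothetical cost model (millions): minimum 2.0, most likely 2.75, maximum 4.0.
# The triangular distribution is an illustrative assumption, not Lauquin's method.
LOW, MODE, HIGH = 2.0, 2.75, 4.0

# Draw a large sample of plausible project costs.
draws = sorted(random.triangular(LOW, HIGH, MODE) for _ in range(100_000))

def percentile(sorted_draws, p):
    """Return the p-th percentile of a pre-sorted sample."""
    idx = int(p / 100 * (len(sorted_draws) - 1))
    return sorted_draws[idx]

p10, p50, p90 = (percentile(draws, p) for p in (10, 50, 90))
print(f"Median cost:   {p50:.2f}M")
print(f"80% interval:  {p10:.2f}M – {p90:.2f}M")
```

Reporting the interval together with its confidence level, rather than the median alone, makes the uncertainty explicit for senior management—which is the practice Lauquin advocates.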
Changing the conversation about uncertainty
One of the workshop’s key messages was the need to explicitly factor in uncertainty—both in internal discussions and when advising leadership. Lauquin argued that the traditional business approach of demanding single, confident forecasts sets risk managers up to fail.
“We believe that the quality of information we have is greater than the reality… and that’s why we give forecasts that are too confident. Statistically, 66% of projects miss their targets on cost, time or quality. And for big public projects, that number is closer to 90%.”
Lauquin closed by encouraging participants to actively train themselves to spot bias—in themselves and in others—and to build structured countermeasures into decision-making.
“You can’t remove cognitive biases. Never. They are here. They will come all the time. But now we can just try to see where they are coming, and find a way to reduce the impact.”