Science explains why people think they're right even when they're wrong
Research shows that people tend to trust their own judgment, believing they are right even when they have only half the information.

The problem is that people believe they have enough information to form an opinion even when they don't, according to a study published Wednesday in the journal PLOS One.
“Our brains are overconfident that they can draw reasonable conclusions with very little information,” said Angus Fletcher, a professor of English at Ohio State University and co-author of the study.
Fletcher, along with two psychology researchers, set out to measure how people make judgments about situations or people based on their confidence in the amount of information they have – even when it’s not the whole story.
“People tend to jump to conclusions very quickly,” he said.
Researchers recruited nearly 1,300 people, with an average age of around 40. All read a fictional story about a school that is running out of water because the local groundwater source is drying up.
About 500 participants read a version of the story that favored merging the school with another, presenting three arguments in favor of the merger and one neutral argument.
Another 500 read a version that favored keeping the school as it was, with three arguments for staying the same plus the same neutral argument.
The final 300, the control group, read a balanced story containing all seven arguments: three in favor of the merger, three in favor of staying the same, and one neutral position.
After reading, the researchers asked participants what they thought the school should do and how confident they were that they had enough information to make that judgment.
The surveys showed that participants tended to agree with whichever arguments they had read, whether those favored merging or staying the same, and to feel confident that they had enough information to form that opinion. Participants who read only one side were also more confident in their opinions than those in the control group, who read both sides.
Half of the participants in each group were then asked to read the opposing side's arguments, which contradicted the story they had previously read.
Although participants were confident in their opinions after reading only the arguments for one solution, many were willing to change their minds once presented with all the information. They also reported being less confident afterward that they had enough information to form an opinion on the topic.
“We thought that people would actually stick with their initial judgment even when they received information that contradicted that judgment, but it turns out that if they learn something that seems reasonable to them, they're willing to completely change their opinion,” Fletcher said, adding that the study underscores the idea that people fail to consider whether they have all the information about a situation.
However, the researchers noted that this finding may not apply to situations where people already have preconceived ideas, such as in politics.
“People are more open and willing to change their opinions than we think,” Fletcher said. But “this flexibility doesn’t apply to long-standing differences, like political beliefs.”
Todd Rogers, a behavioral scientist at the Harvard Kennedy School of Government, compared the finding to the “invisible gorilla” study, which illustrates the psychological phenomenon of “inattentional blindness,” in which a person fails to notice something obvious because they are focused on something else.
“This study captures that with information,” Rogers said. “There seems to be a cognitive bias to not realize that the information we have is not enough.”
The research also parallels a psychological phenomenon called the “illusion of explanatory depth,” in which people overestimate how well they understand a given topic, said Barry Schwartz, a psychologist and professor emeritus of social theory and social action at Swarthmore College in Pennsylvania.
The idea is that if you ask the average person whether they know how a toilet works, they will likely say yes. But when asked to explain it, they quickly realize that they don't actually know how it works, only how to use it by pressing the lever.
“The problem isn't just that people are wrong. The problem is that they are overconfident even when they are wrong,” Schwartz said.
The cure for this problem, he added, is “curiosity and humility.”
The fact that study participants were willing to change their views when presented with new information, as long as it seemed plausible, was both encouraging and surprising, the researchers and Schwartz agreed.
“This gives reason to be a little optimistic that, even when people think they know something, they are open to changing their views as new evidence becomes available,” Schwartz said.