> [!tldr] Presented with *some* conflicting evidence, we tend to become **more sure** we're right; presented with a lot, we can change our minds
In one study, participants read about fictitious political candidates and cast votes for their preferred one. They were then shown a feed of new information about the candidates; the independent variable was the amount of *negative* information about their chosen candidate. The results showed an interesting trend: roughly 15% of all news stories had to be negative before people were likely to resolve their [[Cognitive Dissonance]] by updating their choice of the "best" candidate... and surprisingly, they were **more** steadfast in their choice if they were given only *some* negative news about their candidate.
> [!warning]
> Despite this result, the best way to change someone's mind is **not** an onslaught of facts. See [[Love and Community Changes Minds]]
My understanding of the phenomenon:
![[Cognitive Dissonance Threshold 2026-04-20 08.57.55.excalidraw.svg]]
%%[[Cognitive Dissonance Threshold 2026-04-20 08.57.55.excalidraw.md|🖋 Edit in Excalidraw]]%%
****
# More
## Source
- [[How Minds Change]]