False claims spread six times faster than the truth, according to a 2018 MIT study. This imbalance makes debunking misinformation an exhausting battle. A concept known as Brandolini’s Law explains why.
“The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.”
In 2013, programmer Alberto Brandolini stated that disproving nonsense takes far more energy than creating it. This principle, also called the “bullshit asymmetry principle,” affects professionals fighting misinformation daily.
Engineers, designers, and fact-checkers face this challenge. False claims demand time and effort to refute, while bad information spreads effortlessly. The digital world amplifies this problem.
Understanding this law helps professionals develop better strategies. The next sections will explore practical ways to counter misinformation efficiently.
Understanding Brandolini’s Law
Debunking false claims requires ten times more effort than creating them, a reality many professionals face daily. This imbalance, known as the bullshit asymmetry principle, highlights why misinformation thrives. Engineers, scientists, and journalists often spend hours dismantling claims made in minutes.
The Bullshit Asymmetry Principle Explained
Alberto Brandolini’s metaphor captures the struggle: “Arguing with a misinformation spreader is like playing chess with a pigeon. It knocks over pieces, struts around, and declares victory.” The energy needed to refute nonsense dwarfs the effort to create it.
Brandolini coined the term in 2013 while observing online debates. Unlike Hitchens’ Razor—which shifts the burden of proof to the claimant—his law emphasizes the energy needed to correct falsehoods.
A 2024 Edelman Trust Report found that 68% of engineers encounter misinformation weekly. Ultracrepidarianism—non-experts opining on technical topics—fuels this trend.
Ultracrepidarianism: an ultracrepidarian—from ultra- (“beyond”) and crepidarian (“things related to shoes”)—is a person who offers opinions on matters they know nothing about. The word is first attested in the English essayist William Hazlitt’s 1819 open “Letter to William Gifford,” the editor of the Quarterly Review: “You have been well called an Ultra-Crepidarian critic.” The editor of Hazlitt’s writings, however, suggests it may have been coined by Charles Lamb instead. It was picked up four years later in Hazlitt’s friend Leigh Hunt’s 1823 satire Ultra-Crepidarius: A Satire on William Gifford. The noun ultracrepidarianism—the act or general practice of speaking beyond one’s knowledge—came into use later. (source: Wikipedia)
Why Fake Information Spreads Faster Than Truth
Neuroscience reveals why people share unverified claims faster than factual corrections. The brain favors cognitive ease—quick, emotionally charged content over complex truths. Social media exploits this bias, turning misinformation into a wildfire.
Fabricating false claims takes minimal effort. Debunking them demands exhaustive research. The Boeing 787 myth claimed composite materials were unsafe. The FAA spent weeks refuting it, while the rumor spread globally in hours.
- 74% of retweets happen without link verification.
- Facebook users average 3 seconds reviewing facts before sharing.
Social Media’s Role in Amplifying Misinformation
Algorithms prioritize engagement, not accuracy. Sensational claims earn clicks, drowning out nuanced corrections. The attention economy fuels this cycle, leaving engineers and designers to pit verified specs against viral fiction—where bullshit travels light, and truth carries baggage.
Recognizing Fake Data in Engineering and Design
Fake data in engineering can lead to costly recalls, as seen in high-profile product failures.
Professionals need tools to spot misleading claims before they escalate. Rigorous validation standards and peer reviews act as the first line of defense.
Common Red Flags in Technical Claims
Seven warning signs often signal fabricated data:
- Missing error margins: reliable studies report statistical uncertainty ranges.
- Sources that are not peer-reviewed.
- Overly simplistic metrics.
- Inconsistent documentation.
- Unverified prototypes.
- A pile of unverified arguments instead of one or two confirmed facts (quantity over quality).
- Assertions presented as universal rules (“it is well known that…”).
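The checklist above can be turned into a rough triage routine. This is a sketch only: the flag names and the two-flag threshold are illustrative assumptions, not an established standard.

```python
# Hypothetical red-flag screen for technical claims. Flag names and
# thresholds are illustrative, not an industry standard.
RED_FLAGS = {
    "missing_error_margins": "No statistical uncertainty ranges reported",
    "not_peer_reviewed": "Source is not peer-reviewed",
    "simplistic_metrics": "Overly simplistic metrics",
    "inconsistent_docs": "Documentation contradicts itself",
    "unverified_prototype": "Results come from an unverified prototype",
    "quantity_over_quality": "Many unverified arguments, no confirmed facts",
    "global_assertion": "Presented as a universal rule",
}

def screen_claim(observed_flags: set[str]) -> str:
    """Return a rough triage verdict from the number of red flags seen."""
    unknown = observed_flags - RED_FLAGS.keys()
    if unknown:
        raise ValueError(f"Unknown flags: {unknown}")
    count = len(observed_flags)
    if count == 0:
        return "proceed"   # no obvious warning signs
    if count <= 2:
        return "verify"    # request sources before accepting
    return "reject"        # burden of proof stays with the claimant

# Example: a vendor datasheet showing three warning signs
verdict = screen_claim({"missing_error_margins", "not_peer_reviewed",
                        "global_assertion"})
```

A screen like this does not prove a claim false; it only decides how much verification effort the claim has earned before anyone spends hours refuting it.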
Tools to Identify Misinformation
Misinformation thrives when critical thinking tools are absent. Engineers and designers face fabricated data daily, making structured verification methods essential. Two principles—Hitchens’ Razor and Occam’s Razor—offer practical ways to assess claims efficiently.
Hitchens’ Razor: The Burden of Proof
This principle states: “What can be asserted without evidence can be dismissed without evidence.” It shifts the burden of proof to the claimant, saving time on unsupported arguments.
SAE International’s FMEA framework applies this idea. Teams analyze failure modes before accepting design specs.
Occam’s Razor for Simplifying Complex Claims
The simplest explanation is often correct. NTSB’s root cause analysis uses this to pinpoint technical failures. Contrast FDA’s 510(k) process with the CE mark’s self-certification (used for simpler, less-regulated products) – fewer verification steps led to more recalls.
TRIZ’s contradiction matrix solves engineering problems by eliminating unnecessary complexity. Canadian climate claims about “natural cycles” collapsed when tested against ice core data. Structured tools turn chaotic arguments into actionable insights.
Debunking Strategies for Professionals
Professionals need structured approaches to dismantle false claims efficiently. Reactive corrections drain resources, while proactive frameworks save time and credibility. The key lies in shifting the workload back to misinformation creators.
Pushing the Burden of Proof Back
Requiring claimants to validate their arguments prevents wasted effort. For instance, Six Sigma’s DMAIC framework forces rigorous validation before accepting design specs, and ISO 17025 checklists standardize verification, ensuring all evidence meets lab-grade criteria.
Claims of an “unbreakable” mobile-phone protective glass faced scrutiny when consumers replicated drop tests. By demanding third-party data, engineers exposed overstated marketing claims.
Using Hypothesis Testing to Challenge Claims
Treat dubious claims as hypotheses needing falsification. ASME’s V&V 40 framework assesses risk by testing assumptions against real-world data. JIRA tracking systems log unverified claims as “technical debt,” prioritizing high-impact debunks.
“Hypothesis testing turns opinions into measurable variables.”
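One way to make this concrete is an exact binomial test, using only the Python standard library. The scenario and numbers below (a claimed 1% breakage rate, 4 failures in 20 independent drops) are hypothetical, chosen to illustrate the method rather than taken from any real product test.

```python
from math import comb

def binomial_p_value(failures: int, trials: int, claimed_rate: float) -> float:
    """One-sided exact binomial test: the probability of seeing at least
    `failures` breakages in `trials` independent drops if the claimed
    failure rate were true. A small p-value contradicts the claim."""
    return sum(
        comb(trials, k) * claimed_rate**k * (1 - claimed_rate)**(trials - k)
        for k in range(failures, trials + 1)
    )

# Hypothetical numbers: the vendor implies at most 1% breakage per drop,
# but independent testers observed 4 breakages in 20 drops.
p = binomial_p_value(failures=4, trials=20, claimed_rate=0.01)
# p is far below 0.05, so the data are inconsistent with the claimed rate
```

The point is the workflow, not the statistics: the dubious claim supplies the null hypothesis, the replicated test supplies the data, and the claimant is left to explain the discrepancy.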
The Cost of Engaging With Bad Faith Arguments
Time spent correcting deliberate falsehoods is time stolen from real work. Professionals face a paradox: engaging with nonsense legitimizes it, while ignoring it risks unchecked spread.
When to Walk Away: The Chess vs. Pigeon Dilemma
Alberto Brandolini’s chess analogy holds: bad actors “knock over pieces and declare victory.”
The lesson? Calculate the time/gain trade-off and let data, not egos, decide when to disengage.
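That trade-off can be sketched as a back-of-the-envelope expected-value check. Every parameter here (audience size, persuadable fraction, value per correction, hourly cost) is a rough estimate the professional must supply; the function only makes the comparison explicit.

```python
def should_engage(audience: int, persuadable: float,
                  value_per_correction: float,
                  hours_needed: float, hourly_cost: float) -> bool:
    """Engage only if the expected benefit of corrections exceeds the
    opportunity cost of the time spent. All inputs are estimates."""
    benefit = audience * persuadable * value_per_correction
    cost = hours_needed * hourly_cost
    return benefit > cost

# A viral post reaching 50,000 people, of whom maybe 2% are persuadable,
# versus 8 hours of engineering time: worth engaging (5000 > 800).
should_engage(50_000, 0.02, 5.0, 8, 100)

# A bad-faith thread with 30 entrenched readers: walk away (7.5 < 400).
should_engage(30, 0.05, 5.0, 4, 100)
```

The numbers are invented, but the asymmetry they expose is Brandolini’s point: when the audience is small and entrenched, even a cheap correction costs more than it gains.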
“Never wrestle with pigs. You both get dirty, and the pig likes it.”
Epistemic exhaustion is real. Allocate effort where it moves the needle—not where pigeons strut.