Collection 7.6

Regression & Statistical Artifacts

Advanced lesson on fallacies arising from statistical phenomena and measurement issues. Students learn to recognize regression to the mean, reification of abstract concepts, the McNamara Fallacy of over-relying on quantifiable metrics, and various forms of data manipulation including overfitting and data dredging.

What to Notice

  • Understand regression to the mean and avoid attributing it to causal interventions
  • Recognize when abstract concepts are being treated as concrete entities
  • Identify over-reliance on measurable quantities at the expense of important unmeasured factors
  • Detect when statistical models are overfitted to noise rather than signal
  • Recognize data dredging and the multiple comparisons problem

Concepts in This Collection

F009

Regression to the Mean

Failing to account for the statistical tendency of extreme values to be followed by values closer to the average, and instead crediting that natural return toward the average to an intervention, treatment, or causal factor. This leads to incorrectly inferring that changes following extreme observations were caused by the actions taken rather than by statistical inevitability.
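
The effect is easy to simulate. A minimal sketch, assuming test scores are a stable underlying skill plus independent measurement noise (all numbers are illustrative): select the worst performers on a first test, and their average "improves" on a retest with no intervention at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
skill = rng.normal(100, 10, n)          # stable underlying ability
test1 = skill + rng.normal(0, 10, n)    # first noisy measurement
test2 = skill + rng.normal(0, 10, n)    # second noisy measurement, nothing changed

worst = test1 < np.percentile(test1, 10)            # bottom decile on test 1
print(f"worst group, test 1: {test1[worst].mean():.1f}")
print(f"worst group, test 2: {test2[worst].mean():.1f}")  # moves back toward 100
```

Any "remedial program" applied to the selected group between the two tests would appear to work even if it did nothing.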

F087

Reification

Treating an abstract concept, statistical construct, or theoretical entity as if it were a concrete, real thing with independent existence and causal power. This involves confusing the map (our conceptual or statistical model) with the territory (actual reality), or mistaking a useful abstraction for a physical entity.

F089

Overfitting

Building a statistical model that is so complex, or so closely tuned to the sample, that it fits random variation (noise) rather than the underlying signal, producing patterns that look meaningful in-sample but fail to generalize to new data. This can happen consciously or unconsciously, and often accompanies practices such as trying multiple analyses until a significant result appears or adjusting hypotheses after seeing the data.
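
A minimal sketch of the overfitting component, using NumPy's polynomial fitting (the degrees, sample size, and noise level are illustrative assumptions): a high-degree polynomial fits a small noisy sample better than a straight line, yet does worse on fresh data drawn from the same process.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample(n):
    x = rng.uniform(-1, 1, n)
    return x, 2 * x + rng.normal(0, 0.5, n)   # the true relationship is linear

x_train, y_train = sample(20)     # small sample used to fit the models
x_test, y_test = sample(1000)     # fresh data from the same process

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```

The flexible model wins on the data it was fitted to and loses on data it has never seen, which is the signature of fitting noise rather than signal.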

F090

Data Dredging

Searching through large amounts of data for patterns and relationships without prior hypotheses, then treating discovered patterns as if they were predicted in advance. This is related to overfitting but emphasizes the exploratory search through data without theoretical guidance, essentially guaranteeing that spurious patterns will be found and mistaken for real discoveries.
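
The multiple comparisons problem behind data dredging can be demonstrated with pure noise. A minimal sketch using scipy.stats.pearsonr (the subject and predictor counts are illustrative assumptions): correlate 200 random "predictors" with a random outcome, and at a 0.05 threshold roughly ten of them come out "significant" by chance alone.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_subjects, n_predictors = 100, 200
outcome = rng.normal(size=n_subjects)                      # pure noise
predictors = rng.normal(size=(n_predictors, n_subjects))   # also pure noise

p_values = np.array([pearsonr(pred, outcome)[1] for pred in predictors])
hits = (p_values < 0.05).sum()
print(f'"significant" predictors at p < 0.05: {hits} of {n_predictors}')
```

Reporting only the hits, as if they had been predicted in advance, is exactly the move this fallacy describes.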

F305

Tautology

Presenting a statement that is true by definition or logical necessity as if it provides empirical information or explanatory power. The statement defines terms in terms of themselves or makes claims that cannot possibly be false, rendering them uninformative for understanding reality.

F306

Taxation is Theft

Categorizing taxation as theft by focusing on surface similarities (involuntary transfer of property) while ignoring essential legal, social, and institutional differences between taxation and theft. This treats categorization as an argument rather than as a starting point for analysis.

F307

Teleological Fallacy

Assuming that natural phenomena, biological features, or historical events exist for a purpose or have inherent goals, when they actually result from purposeless processes like natural selection, physical laws, or causal chains without intentional direction. This involves attributing intentionality or design where none exists.

F308

Tone Policing

Dismissing, derailing, or demanding modification of an argument based on its emotional tone, delivery, or perceived hostility rather than addressing its substantive content. This deflects attention from what is being argued to how it's being expressed, often as a way to avoid engaging with uncomfortable truths or valid criticisms.

F309

Trickle-Down Economics Fallacy

Assuming that policies benefiting wealthy individuals and corporations will automatically lead to widespread prosperity through voluntary redistribution via investment and spending, without requiring evidence of this mechanism's effectiveness or considering alternative causal pathways and empirical outcomes.

F310

Turtles All the Way Down

Attempting to explain or support a claim through a chain of reasoning where each explanation requires another explanation of the same type, continuing infinitely without reaching a foundation or first principle. This creates an explanatory structure that never actually explains anything.

F311

Type-Token Confusion

Confusing properties or statements about a category, class, or type with properties or statements about individual instances, members, or tokens of that category. This involves treating what's true of a general category as necessarily true of specific instances, or vice versa, when the properties in question don't transfer.

F312

Unfalsifiability

Constructing a claim, theory, or explanation in such a way that no possible evidence could disprove it, then treating this as a strength rather than recognizing it as an epistemological weakness. Unfalsifiable claims make no risky predictions and cannot be tested against reality.

F313

Worse Problems Fallacy

Arguing that we cannot or should not address a particular problem until some other allegedly more important or foundational problem is solved first, when the problems are actually independent or can be addressed simultaneously. This differs from relative privation in claiming a required sequence rather than just dismissing lesser problems.

F314

Zero-Sum Thinking

Assuming that one party's gain necessarily equals another party's loss, that resources or benefits are fixed in total amount, or that situations are necessarily competitive when they may actually allow for mutual gain, value creation, or variable-sum outcomes.
