Bias, Inquiry, and the Analyst’s Discipline
Bias is unavoidable, but in many cases simply being aware of how the brain works can help mitigate its impact.
People think of critical thinking as “thinking hard,” but it’s actually evaluating the quality of your thought.
Critical thinking is often misunderstood. Many people assume it means “thinking hard” or exerting intense mental effort. But in practical terms, critical thinking is something very different:
Critical thinking is thinking about your thinking.
It is the evaluation of the quality of your own thought process.
Throughout November, we drew heavily from the work of the Foundation for Critical Thinking, founded by Dr. Richard Paul and Dr. Linda Elder. Their framework outlines ten elements of thought—ten questions we can ask about the information, ideas, and reasoning involved in any analysis or decision.
Below is a quick restatement of those ten elements:

1. Clarity – Can the thinking be clearly understood?
2. Logic – Do the pieces fit together coherently?
3. Accuracy – Is the information factually correct?
4. Precision – Is it specific enough for the decision at hand?
5. Depth – Does it get beneath the surface to underlying factors?
6. Breadth – Does it consider enough variables and perspectives?
7. Relevance – Does it actually bear on the question?
8. Significance – Does it meaningfully affect the outcome?
9. Fairness – Is it impartial in sourcing and interpretation?
10. Sufficiency – Is there enough to justify a conclusion?
One point we emphasized throughout the month is that each element is independent.
- Clarity does not guarantee accuracy.
- Precision does not guarantee relevance.
- Logic does not guarantee fairness.
Any given element can be missing—and when it is, the quality of the entire thought process is at risk.
Now that we’ve defined these elements, let’s discuss how to use them in practice, starting with where each element applies: to your data, to your process, or to both.
Some elements naturally apply to data, while others align more closely with your analytic process. But these tendencies are not boundaries. For example:

- Clarity is important for requirements and for data structure.
- Breadth can describe information and analytic methods.
- Fairness applies to sources and to interpretations.
Thinking this way allows the ten elements to function like a set of lenses—each offering a different view into the quality of our reasoning.
Below is a structured mapping of each element across both data and process:
| Element | Question for Data | Question for Process |
|---|---|---|
| Clarity | Is the data presented in a way that can be clearly understood (definitions, units, formatting)? | Are my requirements, assumptions, methods, and conclusions stated clearly enough that another analyst could follow them? |
| Logic | Do the data points logically fit together, or do inconsistencies need additional explanation? | Does my analytic reasoning follow a coherent chain? Do my conclusions logically follow my method and evidence? |
| Accuracy | Is the information factually correct and from trustworthy sources? | Can I verify that the analytic method was applied correctly (no calculation errors, misapplied formulas, or incorrect assumptions)? |
| Precision | Does the data have sufficient granularity/resolution to support the decision? | Do my results meet the required level of precision (ranges, margins, intervals appropriate for the decision)? |
| Depth | Is my information rich enough, or is it superficial? Do I have enough detail to understand the underlying factors? | Has my analysis gone deep enough? Have I explored underlying causes, assumptions, and complexities? |
| Breadth | Is the information broad enough? Have I included multiple variables, perspectives, or categories? | Have I considered alternative methods, competing hypotheses, or different frames of analysis? |
| Relevance | Is this information actually pertinent to the decision, or does it merely appear interesting? | Is this approach appropriate for the stated goal? Am I focusing on the right problem? |
| Significance | Does this data meaningfully influence the outcome, or is it trivial? | Am I weighting my findings appropriately? Which conclusions actually matter for the decision-maker? |
| Fairness | Is the data unbiased in how it was collected or sourced? Is one viewpoint over-represented? | Have I evaluated perspectives and arguments impartially, or am I favoring certain views? |
| Sufficiency | Do I have enough data to justify a conclusion? What critical information is missing? | Can I justify my conclusions based on requirements, evidence, and reasoning? Do I know when “enough is enough”? |
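One way to keep these lenses close at hand is to encode them as a reusable checklist. Here is a minimal Python sketch along those lines; the code structure and function names are illustrative, and the questions are condensed from the table above rather than quoted from the framework:

```python
# A minimal sketch: the ten elements as a reusable review checklist.
# Questions are condensed from the table above; the structure itself
# is a hypothetical aid, not part of the Paul-Elder framework.

ELEMENTS = {
    # element:       (question for data, question for process)
    "Clarity":      ("Is the data clearly defined (units, formats)?",
                     "Could another analyst follow my assumptions and methods?"),
    "Logic":        ("Do the data points fit together consistently?",
                     "Do my conclusions follow from my method and evidence?"),
    "Accuracy":     ("Is the information correct and well sourced?",
                     "Was the method applied without errors?"),
    "Precision":    ("Is the granularity sufficient for the decision?",
                     "Do my results meet the required precision?"),
    "Depth":        ("Is there enough detail on underlying factors?",
                     "Have I explored causes, assumptions, and complexities?"),
    "Breadth":      ("Does the data cover enough variables and perspectives?",
                     "Have I considered alternative methods and hypotheses?"),
    "Relevance":    ("Is this information pertinent, or merely interesting?",
                     "Am I focusing on the right problem?"),
    "Significance": ("Does this data meaningfully influence the outcome?",
                     "Am I weighting my findings appropriately?"),
    "Fairness":     ("Was the data collected and sourced without bias?",
                     "Am I evaluating perspectives impartially?"),
    "Sufficiency":  ("Is there enough data to justify a conclusion?",
                     "Can I justify my conclusions and stop collecting?"),
}

def review_checklist(scope: str) -> list[str]:
    """Return the data- or process-facing question for every element."""
    idx = 0 if scope == "data" else 1
    return [f"{name}: {qs[idx]}" for name, qs in ELEMENTS.items()]

for line in review_checklist("process"):
    print(line)
```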
You can apply the ten elements in the listed order:
Clarity → Logic → Accuracy → Precision → … → Sufficiency
It’s intuitive and often effective.
But real analysis is rarely linear.
Critical thinking is iterative. You might bounce between elements as new information comes in: a failed accuracy check can send you back to clarify definitions, and widening the breadth of your sources can reopen the question of sufficiency.
Every time you validate an element, you increase the defensibility of your results.
This is not a flowchart.
It’s a disciplined practice.
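To make that iteration concrete, here is a minimal Python sketch; the revisit map and failure model are hypothetical illustrations of the bounce-back behavior, not part of the framework:

```python
# A minimal sketch of the iterative discipline described above: the
# elements are applied in order, but a failed check sends you back to
# earlier elements rather than forward.

ORDER = ["Clarity", "Logic", "Accuracy", "Precision", "Depth",
         "Breadth", "Relevance", "Significance", "Fairness", "Sufficiency"]

# Hypothetical example: which elements a failure sends you back to.
REVISIT = {
    "Accuracy": ["Clarity"],     # bad facts often trace back to fuzzy definitions
    "Sufficiency": ["Breadth"],  # "not enough data" may mean "not wide enough"
}

def review(failures: dict[str, int]) -> list[str]:
    """Walk the elements in order; each failure (element -> number of
    times it fails before being fixed) triggers its revisits."""
    trail, queue = [], list(ORDER)
    while queue:
        element = queue.pop(0)
        trail.append(element)
        if failures.get(element, 0) > 0:
            failures[element] -= 1
            # Recheck upstream elements, then this one again.
            queue = REVISIT.get(element, []) + [element] + queue
    return trail

# Accuracy fails once, so Clarity is rechecked before the review moves on.
print(review({"Accuracy": 1}))
```

Each pass through the loop is one validated element, and, as noted above, each validation increases the defensibility of the result.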
To close out November, here’s a full example showing how the ten elements appear in a real decision:
Clarity – What matters most?
I define my criteria: a moderately priced car with a strong safety record and solid performance.
Logic – Do these priorities fit together?
If price is most important, expecting top-tier performance may not be logical. So I accept the trade-off: I’ll choose the best performance I can get within my price range.
Accuracy – Are the facts correct?
Commercials aren’t enough. I verify accuracy with independent crash-test data and trusted reliability sources.
Precision – Do I need more specific information?
Two cars might have similar overall safety scores, but I compare specific sub-tests to differentiate them.
Depth – Have I explored the issue beyond basics?
I check multiple testing organizations, examine long-term reports, and look for underlying factors.
Breadth – Have I looked widely enough?
I compare many makes and models, alternative trims, and additional criteria such as ownership cost and reliability.
Relevance – Am I focusing on what matters?
Fuel economy or paint colors may be interesting, but they’re not relevant if safety is my top criterion.
Significance – What actually affects the decision?
A car scoring 82/100 vs. 85/100 is relevant—but is the difference significant? Does it materially impact safety? Not always.
Fairness – Is my evaluation impartial?
One review site favors my chosen model, but others agree. Cross-checking helps ensure fairness.
Sufficiency – Do I have enough to decide?
I have safety data from three sources, reviews from two, and general pricing. Once I get actual dealer quotes for my top three choices, I’ll have sufficient information.
This is what critical thinking looks like in practice:
A structured, disciplined, iterative evaluation of both information and reasoning.
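For readers who want the Significance step with explicit numbers, here is a minimal Python sketch. The weights and scores are hypothetical, but it shows why a three-point safety gap can be relevant without being decisive:

```python
# A minimal sketch of the Significance step from the car example.
# All weights and scores are hypothetical illustrations.

WEIGHTS = {"safety": 0.5, "performance": 0.3, "price": 0.2}

CARS = {  # hypothetical 0-100 scores gathered during the review
    "Car A": {"safety": 85, "performance": 78, "price": 70},
    "Car B": {"safety": 82, "performance": 84, "price": 80},
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine criterion scores using the decision weights."""
    return sum(WEIGHTS[criterion] * value for criterion, value in scores.items())

for car, scores in CARS.items():
    print(f"{car}: {weighted_score(scores):.1f}")
# Car A: 79.9, Car B: 82.2 -- the three-point safety gap is relevant,
# but once every weighted criterion is counted it is not decisive.
```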
Over the last three months, we explored Inquiry, Bias, and Critical Thinking. Together, these form the foundation of rigorous analysis.
Next month, we’ll start building on that foundation as we explore structured analytic processes—formal methods supported by inquiry, bias mitigation, and critical thinking working together in a systematic way.
Written by Curtis Wideman, Founder and Lead Instructor in Critical Thinking Fundamentals, Analyst Training