Training That Works — and Why Some Training Doesn’t
An evaluation of why some professional training doesn't work, and why some does.
Many aspects of life, culture, and professional practice involve a balancing act—finding the right equilibrium between competing priorities. Training and education are no different. Organizations across government, the military, nonprofits, and industry all face the same tension: how to deliver training that is effective while minimizing the time and resources taken away from operations.
On its face, this emphasis on efficiency makes complete sense. Educators should not spend more time than necessary on a subject. Thinkers like Albert Einstein and Richard Feynman are often credited with versions of the idea that if you can’t explain something simply, you don’t truly understand it. This is not an endorsement of short training so much as a statement about mastery: true understanding is demonstrated by the ability to distill an idea to its core elements and explain them clearly. At the same time, neither thinker was an advocate of oversimplifying genuinely complex concepts.
This is not a critique of short training, nor an argument that everyone needs a PhD-level understanding of everything they do. As with most things, the answer is balance. Training that works finds the right balance between efficiency and effectiveness. Training that fails does not.
More precisely, one common reason training fails is that it optimizes efficiency without adequately respecting effectiveness.
Most of my work focuses on analysis, inquiry, bias, critical thinking, structure, and—occasionally—psychology, philosophy, and statistics. Here, I’m venturing slightly into education itself, because much of the reason Pelorus exists is to address a gap in how analytical skills are taught and reinforced in professional environments. Because of that, I wanted to examine (you might even say analyze) the issues I’ve seen across a variety of training programs over time.
Two Common Failure Modes
When training becomes efficient but ineffective, it tends to fail in two familiar ways:
- Button-ology: focusing on which buttons to press or steps to follow, without developing an understanding of why those steps work
- Death by PowerPoint: attempting to push too much information into too little time, overwhelming learners rather than equipping them
Button-ology is most often associated with tool-based training. One of Pelorus’ earliest LinkedIn posts featured Grady Booch’s observation that “a fool with a tool is still a fool.” The underlying fallacy is the belief that “I can use the tool” is close enough to “I understand the analysis.”
In these cases, nuance, theory, and context are stripped away in favor of procedural familiarity. Learners are taught what to do, but not why they are doing it. This does not automatically doom training to fail. When used as part of a broader learning continuum, button-level instruction can be valuable. It gets people productive quickly, allows them to build experience, and gives future training something concrete to anchor to.
The problem arises when button-ology is treated as sufficient on its own.
Death by PowerPoint, by contrast, is usually a failure of time and cognitive load rather than intent. It’s not necessarily the wrong information—just too much of it, delivered too quickly. Slides become dense “eye charts,” interaction is minimized, comprehension checks disappear, and learners are rushed from topic to topic. Participants often leave feeling overwhelmed, confused, or as though they were drinking from a firehose.
And, of course, it’s entirely possible to make both mistakes at once: stripping training down to procedures while still failing to allocate enough time for learners to absorb even those basics.
What Good Training Gets Right
Effective training respects both sides of the equation. Procedures matter—but people perform better when they understand what makes those procedures work, why they matter, and how they should adapt them when conditions change.
The challenge is providing enough underlying theory to build genuine understanding without turning training into a purely academic exercise. Too little theory, and learners are brittle—unable to adapt. Too much, and they disengage.
In many domains, success depends on balancing effectiveness with efficiency. Analysis is no different. Sometimes time horizons are compressed, requiring rapid analysis—but stakeholders still expect that analysis to be correct, defensible, and useful.
Pelorus is currently developing its first Analysis 101 Seminar. One of the guiding principles behind it is this balance. The goal is not to overwhelm participants, nor to reduce analysis to a checklist. Instead, we aim to provide practical tools grounded in the theories and frameworks that make them effective—so that what people learn actually sticks.