Quick Answer
A step-by-step guide to conducting a training needs analysis. Identify skill gaps, prioritize training investments, and avoid building content nobody needs.
A training needs analysis (TNA) is the disciplined process of figuring out whether a performance problem is really a training problem—and if so, what people must learn, who needs it, and how you will know it worked. Without a TNA, L&D becomes a content factory that ships beautiful courses to the wrong audiences. With a TNA, you invest in learning that moves metrics the business already cares about.
Training Needs Analysis Defined
A TNA connects organizational goals to human performance. It answers:
- What should people do differently on the job?
- What knowledge or skill gaps block that behavior?
- What non-training factors (tools, incentives, process) also matter?
Training is only the answer when lack of skill or knowledge is a binding constraint.
When to Conduct a TNA
Run (or refresh) a TNA when you see triggers such as:
- New product, market, or operating model
- Compliance or regulatory change
- Sustained performance dips or quality spikes
- Reorgs and role consolidation
- Major technology rollouts
We’ve found that the best L&D teams schedule a lightweight TNA quarterly for high-change functions—even if the output is “no new curriculum needed.”
The Three-Level Framework
- Organization — Strategic priorities, budget, risk, culture of learning.
- Job / role — Tasks, standards, tools, error patterns by role.
- Individual — Competency gaps, career paths, onboarding needs.
Skipping organization-level alignment is how you end up with perfect courses that nobody prioritizes.
Step-by-Step Process
1. Clarify the business outcome with your sponsor (e.g., “reduce rework hours by 15%”).
2. Document critical tasks per role; validate with managers and top performers.
3. Gather evidence: interviews, surveys (use sparingly), performance data, QA samples, support tickets.
4. Observe work where possible—surveys alone miss environmental barriers.
5. Distinguish training vs. non-training fixes (bad UI, unclear SOP ownership, conflicting KPIs).
6. Prioritize gaps (see matrix below).
7. Define success measures before you commission content.
For how training fits into broader programs, see employee training programs.
Prioritizing Gaps: Impact × Urgency
| | Low urgency | High urgency |
|---|---|---|
| High impact | Plan next quarter | Build now |
| Low impact | Defer or job aid only | Quick win or delegate to self-serve |
High-impact, high-urgency gaps deserve curriculum design with clear objectives—often using ADDIE or SAM depending on change velocity.
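The matrix above can be made operational with simple numeric ratings. This is a minimal sketch, assuming 1–5 stakeholder scores and a cutoff of 4 for “high” (both scales and the example gap names are illustrative assumptions, not prescribed by any framework):

```python
def prioritize(impact: int, urgency: int) -> str:
    """Map 1-5 impact/urgency ratings onto the four matrix quadrants."""
    high_impact = impact >= 4   # assumed cutoff for "high"
    high_urgency = urgency >= 4
    if high_impact and high_urgency:
        return "Build now"
    if high_impact:
        return "Plan next quarter"
    if high_urgency:
        return "Quick win or delegate to self-serve"
    return "Defer or job aid only"

# Hypothetical gaps rated in a working session with business owners
gaps = {"escalation handling": (5, 5), "new CRM fields": (2, 4)}
ranked = {name: prioritize(i, u) for name, (i, u) in gaps.items()}
```

Running the ratings through a shared function keeps the working session honest: everyone argues about the scores, not the quadrant labels.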
From Analysis to Action
Translate prioritized gaps into learning assets: courses, videos, practice labs, coaching guides, performance support.
When source material already exists (policies, decks, SOPs), document-to-video tools like Knowlify can compress the path from approved text to narrated video—so you spend review cycles on accuracy, not on rebuilding media from scratch.
Pair video decisions with our training video complete guide.
Data Collection Methods: Choosing the Right Mix
No single method gives you the full picture. Effective TNAs triangulate across at least three sources:
- Structured interviews. One-on-one conversations with top performers, struggling performers, and managers. Keep interviews to 30 minutes with a consistent question set so you can compare across participants. Record (with permission) so you capture exact language—learners trust scenarios and content that sound like their real work.
- Job shadowing and observation. Spend time watching the actual work. Surveys tell you what people think they do; observation shows what they actually do. Even a half-day of shadowing in a call center or warehouse reveals environmental barriers (noisy workstations, missing reference materials, workaround habits) that no survey captures.
- Performance data mining. Pull QA scores, error logs, handle times, rework rates, and customer feedback. Look for variance—differences between sites, shifts, or tenure bands often point directly to training gaps. If one location outperforms another on the same process, study what their top performers do differently.
- Document and artifact review. Read the SOPs, job aids, and knowledge base articles people are supposed to use. Compare them against observed practice. Outdated or contradictory documentation is a non-training problem that no course can fix—flag it early so the TNA output includes process fixes alongside learning recommendations.
- Focus groups. Useful for surfacing shared pain points quickly, but moderate carefully—dominant voices can skew results. Limit groups to 6–8 participants from the same role level to encourage candor.
The best TNAs weight behavioral data (observation, performance metrics) more heavily than self-report data (surveys, focus groups) because what people say they need and what they actually need often diverge.
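The variance check described under performance data mining can be sketched in a few lines. Assuming you have per-site QA error rates (the site names and figures below are made up for illustration), flag sites that sit well above the overall mean:

```python
from statistics import mean, stdev

# Hypothetical QA error-rate samples per site (assumed data)
error_rates = {
    "site_a": [0.04, 0.05, 0.06],
    "site_b": [0.12, 0.14, 0.11],
    "site_c": [0.05, 0.04, 0.05],
}

site_means = {site: mean(rates) for site, rates in error_rates.items()}
overall = mean(site_means.values())
spread = stdev(site_means.values())

# Sites more than one standard deviation above the overall mean:
# large between-site variance on the same process often points to a
# training or knowledge-transfer gap worth studying in person.
outliers = [s for s, m in site_means.items() if m > overall + spread]
```

A flagged site is a lead, not a verdict: pair the numbers with observation at both the outlier and the best-performing site before concluding the gap is trainable.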
Stakeholder Alignment During TNA
A technically excellent TNA fails if stakeholders do not trust the findings or act on them. Build alignment throughout the process, not just at the end:
- Kick off with a shared problem statement. Before collecting data, get the sponsor, functional leaders, and L&D aligned on what business outcome the TNA serves. Write it in one sentence and reference it in every update.
- Share interim findings early. Do not disappear for six weeks and return with a 40-page report. Share emerging themes at the halfway mark so stakeholders can redirect data collection if the initial scope was off.
- Present non-training findings with equal weight. Stakeholders respect L&D teams that say "this is a process problem, not a training problem." It builds credibility for the recommendations that are training solutions.
- Co-prioritize gaps with business owners. Use the impact-urgency matrix collaboratively in a working session rather than presenting a pre-baked priority list. When leaders participate in ranking, they own the outcome and are more likely to allocate budget and employee time.
Common TNA Mistakes
- Solution before problem — “We need a course on X” before confirming X is the lever.
- Survey-only research — Low response rates, self-report bias, and shallow answers.
- Ignoring managers — They own reinforcement and capacity.
- One-time event — Needs drift; TNAs should be recurring where change is constant.
Our team has observed that the fastest failures come from skipping job observation in operational roles—slides cannot replace seeing the actual workflow.
Sample TNA Questions That Actually Work
Use open-ended prompts in stakeholder interviews:
- “What does ‘good’ look like this quarter vs. last quarter?”
- “Where do new hires stall in their first 30 days—and what do they do instead of the documented process?”
- “Which errors cost us the most money or risk?”
- “What do people Google or Slack-search for because the official resource is missing?”
Pair qualitative answers with three numbers wherever possible: error rate, handle time, conversion, rework hours. Numbers turn debates into prioritization.
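Pairing each qualitative theme with its numbers can be as simple as a shared table sorted by cost. A minimal sketch, with every theme, metric name, and figure invented for illustration:

```python
# Each interview theme carries the metrics that quantify it (assumed data).
evidence = [
    {"theme": "new hires skip the documented escalation process",
     "error_rate": 0.14, "rework_hours_per_week": 22},
    {"theme": "agents cannot find the current refund policy",
     "error_rate": 0.06, "rework_hours_per_week": 9},
]

# Rank themes by rework cost so the prioritization debate starts from numbers.
ranked = sorted(evidence, key=lambda e: e["rework_hours_per_week"], reverse=True)
```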
RACI for TNA Outputs
Clarify who owns the analysis vs. who contributes. A simple RACI prevents L&D from becoming the dumping ground for every performance problem. Legal owns regulatory interpretation; Ops owns SOP accuracy; L&D owns learning strategy and asset design—collaboration without role confusion.
Key Takeaways
- Start with business outcomes and job tasks, not with content format
- Use organization, role, and individual lenses so you do not miss structural issues
- Prioritize with impact and urgency; say “no” to low-value training requests
- Separate training fixes from tooling, process, and incentive problems
- Plan measurement when you plan the analysis—not after launch
