Last edition, I asked you what’s hardest to get right in your day-to-day work. The replies came in fast — and the most common answer, by a distance, was this: measuring learning impact beyond who clicked “complete.”
So that’s where we’re starting Issue #14. Not with a framework or a checklist — those exist, you’ve read them. But with an honest look at why smart L&D teams are still measuring the wrong things in 2026, and what it’s quietly costing them at the executive table.
We’ve also got a genuinely exciting announcement for anyone thinking about making AI a real part of their ID workflow.
Let’s get into it.
Your dashboards look healthy. Your learners aren’t improving. Here’s why.
There’s a quiet crisis running through L&D right now. Completion rates are high. Satisfaction scores are fine. Training hours are logged. And yet business performance isn’t shifting. Managers are still making the same decisions. Teams are still hitting the same walls.
The problem isn’t the learning. It’s what we’re choosing to measure. A striking finding from Learning Pool: only half of organisations have a structured process for measuring L&D effectiveness. The other half make investment decisions based on activity data alone — completions, hours, logins. Easy to collect. Almost entirely useless for proving impact.
The shift being asked of us in 2026 is genuinely hard. Building measurement into design from day one. Negotiating with stakeholders to define success in operational terms before a single slide is made. Being willing to report on things that don’t look great yet — because honest data builds more trust than polished activity reports.
AI-powered learning analytics are making this more accessible than ever. The capability barrier is lower. What remains is the will to ask harder questions — and then design for that, not for completion.
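To make "measuring beyond completion" concrete, here is a minimal sketch of the difference between activity data and impact data. All names and numbers are invented for illustration; the idea is simply to join learning records (completions) with a business KPI measured before and after training, in the spirit of an xAPI-style pipeline:

```python
# Hypothetical sketch: activity metrics vs. impact metrics.
# All field names and figures are invented for illustration.

# Activity data -- what most dashboards report.
learning_records = [
    {"learner": "a", "course": "negotiation", "completed": True},
    {"learner": "b", "course": "negotiation", "completed": True},
    {"learner": "c", "course": "negotiation", "completed": False},
]

# Performance data -- what the business cares about: a KPI per learner,
# measured before and after the programme.
performance = {
    "a": {"kpi_before": 60, "kpi_after": 75},
    "b": {"kpi_before": 55, "kpi_after": 57},
    "c": {"kpi_before": 62, "kpi_after": 63},
}

def completion_rate(records):
    """The usual activity metric: share of learners who clicked 'complete'."""
    done = sum(1 for r in records if r["completed"])
    return done / len(records)

def avg_kpi_lift(records, perf):
    """An impact metric: average KPI change among learners who completed."""
    completers = [r["learner"] for r in records if r["completed"]]
    lifts = [perf[p]["kpi_after"] - perf[p]["kpi_before"] for p in completers]
    return sum(lifts) / len(lifts)

print(round(completion_rate(learning_records), 2))   # activity: 0.67
print(avg_kpi_lift(learning_records, performance))   # impact: 8.5
```

The first number is easy to collect and says nothing about impact; the second requires negotiating a KPI with stakeholders up front, which is exactly the hard design work described above.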
Read the full piece on Upside Learning

The AI skills gap isn’t an engineering problem. It’s a design brief.
DataCamp’s 2026 report: 88% of leaders say data literacy is essential, yet 60% report a significant gap. And 59% say their organisation has an AI skills gap — even though most are already investing in AI training. We’re spending money. The capability isn’t arriving. That is, at its core, a curriculum design failure.
Read the DataCamp 2026 Report

Immersive learning is getting easier to build. That’s exactly when design judgment matters most.
Columbia Southern University notes that VR-based learning works — but only when structured with clear goals, reflection opportunities, and meaningful assessments. As tools get cheaper, design quality becomes the differentiator.
Read on Columbia Southern

Instructional design is no longer a production function. It’s a strategic differentiator.
Training Industry argues that the era of treating L&D as an order-taking function is ending. The organisations getting this right treat ID as capability architecture, not content production.
Read on Training Industry

Six tools. Honest takes. No fluff.
Analytics and measurement — tools that help you do more than track completions.
- xAPI-based learning record store. Connects learning data to performance data. (Pro)
- Maps learning to capability frameworks. Best for skills-first L&D infrastructure. (Pro)
- Daily reinforcement, spaced repetition, and performance dashboards managers use. (AI-Powered)
- A resource library, not a course library. Pull-based learning in the flow of work. (Free Trial)
- ROI calculators, L3/L4 tracking, and dashboards non-L&D people can read. (Pro)
- Draft objectives, evaluation frameworks, scenario branches, and SME note summaries. (Free Tier)

AI Masterclass for Learning Developers
Not another “intro to AI” course. A 7-session live masterclass for practising L&D professionals — walk away with real AI systems running inside your actual workflow. Claude Workspace configured to your frameworks. A content pipeline generating storyboards, scripts, and facilitator guides on demand. AI Avatar Narrator in 65+ languages. Games and simulations — without writing code.
Cohort 1 is open now. Every Saturday, 6:00–7:30 PM IST, May 16 – July 4.
We are no longer just delivering training. We are managing a high-stakes capability crisis where work is changing faster than traditional training cycles can ever hope to keep up with.
SweetRush: What L&D Strategy Actually Looks Like When It’s Working
Most L&D writing stops at “here’s what to track.” SweetRush asks the harder question: once you’ve fixed measurement — what does a programme genuinely aligned to business outcomes actually look like?
If the cover story is the diagnosis, this is the treatment plan.
Read on SweetRush

When your leadership asks “is the training working?” — what do you actually show them?
May 16 – July 4, 2026. Partnering with The Learner Co.
Register Now

Explore skill assessments, personalized insights, learning guidance, career tools, and more — all in one place.
Explore Marketplace