Fortnight #14 · 14 May 2026

Stop measuring the wrong things.

What’s actually worth tracking in L&D, why immersive design needs a rethink, and the AI literacy gap most training programmes are quietly ignoring.

6 min read · 7,750+ Members · Issue #14
A note from the editor

Last edition, I asked you what’s hardest to get right in your day-to-day work. The replies came in fast — and the most common answer, by a distance, was this: measuring learning impact beyond who clicked “complete.”

So that’s where we’re starting Issue #14. Not with a framework or a checklist — those exist, you’ve read them. But with an honest look at why smart L&D teams are still measuring the wrong things in 2026, and what it’s quietly costing them at the executive table.

We’ve also got a genuinely exciting announcement for anyone thinking about making AI a real part of their ID workflow.

Let’s get into it.

This Issue · By the Numbers
50%
of organisations have no structured process for measuring L&D effectiveness. Activity tracked. Impact — not so much. (Learning Pool, 2025)
70%
of course development time saved when IDs use GenAI-assisted authoring — but only when prompts are designed with instructional intent. (GenAI LXD Trends, 2026)
64%
of employees say their company provides AI tools — yet only 25% strongly agree their employer has a clear vision for using them. (The Learning Network, 2026)
This Edition's Big Idea
Cover Story

Your dashboards look healthy. Your learners aren’t improving. Here’s why.

There’s a quiet crisis running through L&D right now. Completion rates are high. Satisfaction scores are fine. Training hours are logged. And yet — business performance isn’t shifting. Managers are still making the same decisions. Teams are still hitting the same walls.

The problem isn’t the learning. It’s what we’re choosing to measure. A striking finding from Learning Pool: only half of organisations have a structured process for measuring L&D effectiveness. The other half make investment decisions based on activity data alone — completions, hours, logins. Easy to collect. Almost entirely useless for proving impact.

The real cost: When L&D can’t demonstrate impact in the language the business uses — productivity, error reduction, retention, revenue — it loses credibility in budget conversations. L&D leaders who cannot speak the language of outcomes will always be fighting for relevance. Those who can, rarely have to.

The shift being asked of us in 2026 is genuinely hard. Building measurement into design from day one. Negotiating with stakeholders to define success in operational terms before a single slide is made. Being willing to report on things that don’t look great yet — because honest data builds more trust than polished activity reports.

AI-powered learning analytics are making this more accessible than ever. The capability barrier is lower. What remains is the will to ask harder questions — and then design for that, not for completion.

Read the full piece on Upside Learning
Three Reads Worth Your Time
01
AI Literacy

The AI skills gap isn’t an engineering problem. It’s a design brief.

DataCamp’s 2026 report: 88% of leaders say data literacy is essential, yet 60% report a significant gap. And 59% say their organisation has an AI skills gap — even though most are already investing in AI training. We’re spending money. The capability isn’t arriving. That is, at its core, a curriculum design failure.

Read the DataCamp 2026 Report
02
Immersive Learning

Immersive learning is getting easier to build. That’s exactly when design judgment matters most.

Columbia Southern University notes that VR-based learning works — but only when structured with clear goals, reflection opportunities, and meaningful assessments. As tools get cheaper, design quality becomes the differentiator.

Read on Columbia Southern
03
ID Strategy

Instructional design is no longer a production function. It’s a strategic differentiator.

Training Industry argues that the era of treating L&D as an order-taking function is ending. The organisations getting this right treat ID as capability architecture, not content production.

Read on Training Industry
Tools Worth a Serious Look
Toolkit · This Edition

Six tools. Honest takes.
No fluff.

Analytics and measurement — tools that help you do more than track completions.

📊
Watershed LRS
Learning Analytics

xAPI-based learning record store. Connects learning data to performance data.

Pro
🧠
Degreed
Skills Intelligence

Maps learning to capability frameworks. Best for skills-first L&D infrastructure.

Pro
🎯
Axonify
Frontline + Analytics

Daily reinforcement, spaced repetition, and performance dashboards managers use.

AI-Powered
🔎
Looop by 360Learning
Performance Support

A resource library, not a course library. Pull-based learning in the flow of work.

Free Trial
📈
Intellum
Learning Operations

ROI calculators, L3/L4 tracking, and dashboards non-L&D people can read.

Pro
🤖
Claude for Work
AI Design Assistant

Draft objectives, evaluation frameworks, scenario branches, and SME note summaries.

Free Tier
Guild Spotlight
Guild Spotlight · AI Masterclass

AI Masterclass for
Learning Developers

Not another “intro to AI” course. A 7-session live masterclass for practising L&D professionals — walk away with real AI systems running inside your actual workflow. Claude Workspace configured to your frameworks. A content pipeline generating storyboards, scripts, and facilitator guides on demand. AI Avatar Narrator in 65+ languages. Games and simulations — without writing code.

Cohort 1 is open now. Every Saturday, 6:00–7:30 PM IST, May 16 – July 4.

📅 May 16 – July 4, 2026 · ⏰ Saturdays, 6:00–7:30 PM IST · 💻 7 Live Sessions on Zoom · 🎓 Cohort 1 Open Now
Register Now
"

We are no longer just delivering training. We are managing a high-stakes capability crisis where work is changing faster than traditional training cycles can ever hope to keep up with.

— Ellie Beach, Liberate Learning · Training Industry, January 2026
This Edition's Pick
📘
One read. Genuinely worth it.

SweetRush: What L&D Strategy Actually Looks Like When It’s Working

Most L&D writing stops at “here’s what to track.” SweetRush asks the harder question: once you’ve fixed measurement — what does a programme genuinely aligned to business outcomes actually look like?

If the cover story is the diagnosis, this is the treatment plan.

Read on SweetRush
Talk to Us
One question for you

When your leadership asks “is the training working?” — what do you actually show them?


Insights Worth Knowing
May 16
AI Masterclass for Learning Developers — Cohort 1 Begins
Live on Zoom · Every Saturday · 6:00–7:30 PM IST

May 16 – July 4, 2026. Partnering with The Learner Co.

Register Now
LIVE
🚀
LXD Marketplace is Officially Live
AI-Powered Career Growth Platform

Explore skill assessments, personalized insights, learning guidance, career tools, and more — all in one place.

Explore Marketplace