HOW TO PROVE THE VALUE OF YOUR L&D TEAM

Struggling to prove the value of your L&D team to stakeholders? Here are some practical pointers to help you with that.


Being in L&D is a challenge. You’re responsible for the development of everyone in the organization, and yet no one takes you seriously.

Your budget is probably the lowest of any department, and yet you’re expected to deliver the highest value. Well, maybe that last part is not true.

What L&D teams around the world struggle with most is proving the value of their work – showing that they are not a mere cost center in charge of keeping the LMS populated with courses, but a critical division that’s worth its weight in gold. If they do their job well, that is.

If you’re in L&D, you know these struggles already. You are probably living them on a daily basis. In this article, we offer some pointers on how to overcome them.

But here’s the thing: if even one person on your team believes that value = the number of courses churned out, or the latest technology implemented, or the number of people who completed your courses, then I’m sorry to tell you: you’re going to have a hard time convincing anyone in your organization of your team’s value. Now, onto the pointers.

Value, when it comes to the L&D department, is two-pronged:

1. Do the employees (i.e., your audience, your learners) see you as valuable?

It doesn’t matter if you’re giving them a bunch of courses. It doesn’t even matter if these courses are highly engaging. What matters is, are you solving their problems? Are you making their job a little bit easier? Are you opening their eyes to something they’ve not seen before?

If you answered yes to even one of these questions, then your audience sees your team as a treasured partner on their professional journey.

Here are a few things you can do to improve your team’s value from the employee’s point of view:

a) Critically review your offerings from the perspective of the learners.

Ask yourself: if I were to take time away from my work to go through this program, what would I get from it? Then tweak your program so that it answers this question, loud and clear. This applies to all your programs. Yes, even the compliance courses that everyone hates.

In L&D terms, we call this the WIIFM, or What’s-In-It-For-Me. You already know this, of course. But when you review your courses, check that the WIIFM is written from the learner’s perspective, not the organization’s. Here is an example of a not-so-good WIIFM, and how it can be flipped and presented from the learner’s point of view:

  • Not so good: It’s important to know these sales techniques.
  • Good: These techniques will help you have better conversations with customers, which in turn can lead to quicker conversions.

When writing a WIIFM, make sure it promises a specific benefit to the learner. In the next not-so-good example, the WIIFM is decent, but only barely so: it does not say what type of meeting the course helps the learner improve, or how exactly it does so.

  • Not so good: This course will help you learn all about conducting effective meetings.
  • Good: This course will help you conduct meetings that are focused on helping your team improve their chances of growing their business.

Did you notice the difference? We have taken an ambiguous WIIFM that vaguely talks about conducting effective meetings, and turned it into something that the learner really cares about – helping their team to grow their business. The learner, a sales manager in this case, would immediately be interested to see what the course has to offer, because their team’s success = their success.

When communicating WIIFMs (or anything for that matter), honesty and transparency go a long way in establishing trust with your audience. You do not want to present your course as a panacea that will make all their problems go away. Instead, you want to acknowledge any limitations of what you’re promising. For example:

  • Not so good: On going through this course, you will become a time management pro!
  • Good: This course may not solve all your time-related problems, but it will give you some solid techniques to shave at least a few hours off your busy week.

Here’s another example.

  • Not so good: This course will teach you all you need to know about online fraud and how to handle it.
  • Good: Mandatory compliance courses are boring, right? Well, we have good news for you. While you spend an hour ticking off a box, you will learn a powerful thing or two about how to protect your customers from online fraud. Imagine how it can improve your customer conversations, and the trust it can help you build with your customers.

What we’re trying to do here is to use the power of persuasion to inform learners of the benefits of the course, in a way that’s important to them.

Of course, the flip side (and challenge) of a great WIIFM is that the course has to follow through on it. There is no greater letdown for a learner than reading a WIIFM, starting the course expecting it to be useful, and finding that it doesn’t deliver on its promise.

b) For every course you plan to develop, decide which pieces of content need to go into the course, and which ones the learner can refer to in their moment of need.

Granted, not every piece of content should be made available at the learner’s fingertips, to be consumed in the exact moment of need. But make this question part of your analysis:

Is it okay if the audience finds this information contextually when the need arises, and then forgets all about it once the need is fulfilled? Or will they benefit from building an understanding about the content that they can then retrieve quickly from their mind when needed? Sometimes, it could be a combination of both.

For example, pilots undergo extensive training on how to fly a plane, which includes hundreds of hours of practice. At the same time, they rely on a series of checklists while flying the plane in real time. There are checklists for all kinds of activities – starting the engine, take-off, landing, and so on. The initial training and practice build the reflexes that let them fly the plane ‘automatically’, while the checklists (on-the-job performance support) are additional aids that make sure they don’t miss any important steps.

At other times, it may be sufficient to create only a learning program, or only performance support. The important thing is to keep an open mind in the initial discovery phase, and not default to one approach or the other.

There are countless other things you can do to improve the design of your courses, but the two points above are, in my opinion, the most critical to improving learners’ perception of your offerings. If you take care of these, you are clearly communicating to them that you value their time – their most precious commodity. And, when they perceive this, they will respond in kind with their attention.

2. Does the organization’s leadership see your team as valuable?

What are you measuring? Number of courses rolled out? Course completion stats?

These metrics are fine to track, but they are nowhere near enough to prove your team’s worth to your organization’s leadership.

What leadership is really interested in is the change your team has brought about in the workforce’s performance.

So basically, while you can measure the number of courses and completions, they are not sufficient metrics by themselves. You will need to prove the impact brought about by these programs. You can do this by measuring:

  • Behavior before the program, and the change that happened afterwards. A simple example I like to use is workplace safety. If, historically, employees have been averse to wearing PPE (Personal Protective Equipment) when walking into a safety zone, you want to measure the before and after numbers.

    Remember that there is more than one way to get the numbers you need. If you don’t have the means to obtain the number of people who wore PPE before and after the program, you could get the number who were spotted not wearing PPE, if a register is available to track this.
  • Impact brought about by the program. While behavior is tracked from the perspective of employees, here, we measure the result of this change in behavior from the organization’s perspective. In other words, we want to know the consequence of the behavior change in business terms. To take the same example of safety PPE, when we first tracked behavior, we looked at how many employees wore PPE before vs. after the program. To see the impact of this change in behavior, we want to see if the number of safety incidents has come down, and by how much.

To take another example, let’s say we’re talking about customer service training for your support staff. The training is about applying a drill-down questioning process to arrive at the problem quickly and accurately. What your leadership really values are the following metrics:

  • Behavior: After the training, what percentage of your support staff were able to effectively apply the questioning process? You can obtain this data either from call recordings, or from manager feedback.
  • Impact: All other things being equal, did the change in behavior observed above lead to improved customer satisfaction scores?
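The behavior and impact metrics above boil down to simple before-vs.-after arithmetic. Here is an illustrative sketch, using the earlier PPE example, of how you might report them. All of the counts are hypothetical, purely to show the calculation:

```python
def rate(part: int, whole: int) -> float:
    """Return part as a percentage of whole, rounded to one decimal."""
    return round(100 * part / whole, 1)

# Behavior: PPE compliance before vs. after the program (hypothetical counts)
employees = 200
wearing_ppe_before = 120   # employees spotted wearing PPE before the program
wearing_ppe_after = 180    # employees spotted wearing PPE after the program

behavior_before = rate(wearing_ppe_before, employees)  # 60.0
behavior_after = rate(wearing_ppe_after, employees)    # 90.0

# Impact: safety incidents per quarter before vs. after (hypothetical counts)
incidents_before = 25
incidents_after = 10
impact_reduction = rate(incidents_before - incidents_after, incidents_before)  # 60.0

print(f"PPE compliance: {behavior_before}% -> {behavior_after}%")
print(f"Safety incidents down {impact_reduction}%")
```

The same two-step pattern (behavior delta, then its business consequence) applies to the customer-service example: questioning-process adoption rate first, satisfaction-score movement second.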

Again, measurement is a nuanced discipline, and there is much more to it than this. But simply observing these two metrics will give you a lot of insight, and they are a good place to start.

To sum up:

  • Learners find the L&D team valuable when they see that you’re actively taking steps to not waste their time, and to add value to their lives
  • The two parameters your organization’s leadership is most interested in:
    • The change in behavior brought about by the program
    • The impact brought about by this change in behavior

So, there you go… these are the ways that I think you can immediately increase the value of your L&D team in the eyes of your stakeholders.

What would you add to this?


Written by Srividya Kumar, Co-Founder @ Learnnovators

