THE SNIFF TEST: WAS THAT REALLY A DESIGN SOLUTION?

The way vendors prepare proposals, and even the topics they address, varies widely. So how do you compare effectively and tell whether a vendor has given you a thoughtful proposal with an actual solution?


Some time back, we talked about how to give your e-learning vendor a strong design briefing. In response to that brief, you would typically receive a proposal from the vendor. And if you’ve done this a few times, you know that there are proposals… and proposals!

Each vendor provides a different type of proposal document. The level of detail varies, the manner of presentation varies, the balance of selling versus designing varies. Even in RFPs where vendors are told to submit their responses using a provided template, you’re unlikely to see uniformity in the ‘solution’ or ‘approach’ section of your template. So how do you compare different vendors’ submissions one-to-one?

This post is to help you do that.  

Terminology Check

First of all,

Proposal = Approach (or solution) + pricing + terms

You must be told clearly what you’re getting (i.e., the approach or solution) and what it costs. Typically, most of us also include the basic contractual terms that would apply, as well as the turnaround time for executing the described solution.

What Does Our Sniff Test Do?

The sniff test tells you whether the vendor is demonstrating that they’re capable of clear design thinking to ideate solutions, or whether they’re only capable of producing buzzwords and colorful screens that may not serve much purpose.

Conditions Assumed

We’re going to assume you don’t have any ID experience yourself, but it is your responsibility to evaluate the proposal on behalf of your company. We’ll also take it that you’ve asked for a ‘proposal’ (i.e., with a solution or approach) and not merely a ballpark costing (i.e., a reply to something like ‘how much does an hour of e-learning typically cost’).

Realistically, How Strong An Evaluation Can You Perform?

In the assumed conditions, you may not be able to verify whether the instructional approach is technically sound, i.e., is the link between the problem and the solution strong enough, and is it supported by empirical evidence? (That would be a technical review, which requires domain expertise.)

And this is perfectly fine! You’re calling in specialist vendors; why assume that you as an individual must be able to do everything that their team of specialists (with years of practice!) can collectively do? Your job is to find the right people for it.

What we’re going to do is make ourselves vastly unpopular and offer you an effective sniff test you can use for just this purpose. Use this sniff test to gauge if the vendor seems to know their design stuff… or if you’re being taken for a scenic ride!

9 Critical Sniffs In The ‘Solution Or Not’ Sniff Test!

1. Promises about future states

  • “Your learners will be engaged!”
  • “This will be a highly interactive experience!”
  • “This learning will change behavior and impact business metrics!”

There has to be elaboration and justification for how these promises will be kept. Because that’s all they are: promises, not approach elements or solution features.

2. Media design

  • “Highly animated module”
  • “Interactive, game-like experience”

Did you contact Pixar or EA? No. You contacted an e-learning vendor because the focus of your ask is on teaching. So, if you’re being sold just entertainment… welp.

There’s also the more sober version of this, in which media design masquerades as instructional strategy:

  • “Content will be presented using text and audio.”
  • “Animated introductions will be used.”

Such descriptors are also not enough in themselves. If you’re trying to evaluate a movie, you don’t judge its quality by whether the camera zooms or pans, or whether it’s in black and white or has surround sound. All of those may be used to tell a good story or a rotten one – they say nothing about the quality of the story itself.

So, what is being animated, and why? What about the nature of the content or the teaching strategy selected indicates that text and audio are the best choices for representation? You need to see those reasons, or else it’s a half-baked attempt at a solution.

3. Processed understanding of needs

The approach or solution must explicitly capture the needs you shared, and ideally do more than simply repeat verbatim what you said in your brief. The understanding, elaboration, or reframing of those needs tells you whether the vendor has really put effort into creating the solution. Shortcuts here won’t lead to strong solutions.

4. Reliance on buzzwords and trends

Naturally, a solution relying on whatever is the flavor of the moment isn’t likely to be a strong one, especially if that’s the bulk of the idea. Examples:

  • “Learning designed for generation Z” (or generation-what-have-you)
  • “Brain-based learning”
  • “Learning by doing”

If you’re not given a reason why a particular training technique or conceptual framework is recommended, that’s really dubious, especially if the recommendation happens to be in vogue.

How do you check if it’s in vogue? See what training folks on LinkedIn are talking about, and see what’s featured on a few e-learning vendors’ own websites. Over time you’ll get an idea of which vendors let trends dominate their discourse, because each of us has a ‘voice’, just like in any other industry.

5. BS and myths

Pardon the bluntness, but it’s simply the most crisp and accurate way to caption debunked claims presented as established fact.

Since we assumed you’re not a practicing instructional designer, how do you check whether an asserted concept is true? Google it. There are tons of blogs and websites dedicated to this stuff, so searching for <term> + “myth” should take care of it.

6. Presence of theory mumbles

Theory is essential for an ID to really know their stuff and to generate strong solutions. But knowing theory well enough to use it properly is different from mumbling names like incantations in a proposal.

If it’s a case of theory mumbling, you’ll find that the cited theory is not connected with reasonable clarity and detail to the context at hand. (This is important: the connection should be detailed, not simply a description of what the theory is about – that you can get off any random website!) So it will sit as a separate chunk: typically part of the introduction, and then never referenced or tied in with the actual solution.

A solution hiding behind an ID textbook isn’t a safe bet.

7. Promises about tools, processes and delivery formats

  • “Storyline course”
  • “Captivate module”
  • “Agile methodology”
  • “Microlearning”
  • “Nugget-sized”

These are not instructional design solutions. They are means to implement a solution, similar to what we discussed regarding media. The proposal must justify why a tool, process, or delivery format has been recommended or is a good choice.

Now. If you noticed, all of what we’ve discussed so far concerns things that are typically present in proposals. Here are 2 sniffs for things that will often be missing, and that we consider the most important to have.

8. Curriculum design indications

A solution should give some indication of the curriculum (if you haven’t already provided one). This is what links the approach to the implementation, making the ideation of the solution complete*. Even an indicative flow as a placeholder (until discussions with SMEs can be had) is good. What’s iffy is when there’s no indication of the curriculum or of the assumed types of content to be taught, and yet instructional models and media are categorically prescribed.

*I’m a purist myself, so I will argue that a curriculum must not be designed in isolation from the approach. Whatever starting structure is suggested, the designer ideating the approach must still work on the curriculum. Otherwise, the pedagogical clarity behind the curriculum is meh.  

9. Teaching methodologies and strategies

This is really the essence of a solution. A teaching strategy is not focused on media, delivery formats, or broad models and abstractions. It concerns the actual techniques used to teach the content. So you should see a clear indication of how concepts will be taught and clarified, how procedures will be made easier to retain, how practice will be made meaningful, how real-life triggers to use the knowledge will be taught, what support will be given to apply the learning in real-life situations, how practice exercises will be levelled up to build to realistic performance standards, and so on.

When you check for these 9 things in a proposal, you can also discern the stronger design offerings. Here are 2 more ‘sniffs’ for that:

Spotting the better vendors

Better reframing of the problem

Some vendors will evolve the objectives you gave, add to the needs you explicitly provided, or include additional outcomes to target, because they can identify ways to deliver even more value with the solution. This is not always linked to a higher cost; many times it’s simply more expert thinking.

Hot tip! – If you’re interviewing vendors, ask them to describe (even if they cannot show you an actual document) some cases where they’ve done this.

Insight demonstrated in the understanding and analysis

This can relate to the work done by the target audience, performance aspects, the knowledge required to perform a task, or the relevant complexities of learning. Even if some details are not exactly right because they haven’t had a chance to check something about your company’s particular way of working, you will see a strong grasp of the basic situation and content.

You will see that they have done their homework to understand and analyze sufficiently to generate a strong solution. You will see coherence and clarity in the design rationale provided.

Hot tip! – It’s not the most obvious place to look, but always check the caliber of the working assumptions made.

The expertise and exposure of the designers will show in the kind of hypotheses or informed guesses they can generate for your validation (versus needing every single detail spelled out from your side).

There it is! A handy little sniff test with 9 things to look at to see whether what you’ve got is a design solution or not. And then, 2 additional sniffs to ferret out the better vendors from the herd.


Written by Mridula R., Principal Learning Consultant @ Learnnovators

