
Using Assessments to Gain Insights on Learning Programs

By Grace Chang, Ph.D. - August 30, 2017

In my last blog, I discussed why assessments aren’t advancing insights about your learning programs. People in Learning and Development and related fields get excited when they hear the words "advanced analytics" because of their potential uses, but analytics can only lead to meaningful insights when the foundation of data used in the analysis is sound.

So, what kind of assessments are needed?

Organizations that go beyond utilizing smile sheets often rely on brand-name, well-established off-the-shelf assessments to evaluate their learners. These can be very informative in some situations, but if your learning and development training is focused on a specific group of learners (e.g., senior managers in the audit practice), they tend to be too broad to target that group’s unique needs and capabilities. And although some challenges are common across organizations, we have found that each organization possesses its own unique culture, language, and philosophy that should be considered when assessing learners. Off-the-shelf assessments also tend to measure general audience characteristics or competencies, rather than the critical thinking and the behaviors that directly affect the performance of a team, division, or organization.

This is where custom assessments can help.

While custom assessments are not a panacea, they offer some distinct advantages. In addition to addressing the precise needs of the learners within the organization, custom assessments can be developed to address the data gaps you have identified throughout the learning program. Custom assessments can fill different kinds of gaps, including:

  • Evaluating different levels of learning to better understand the impact of professional development training programs on participants. For example, rather than just examining whether a learner’s behaviors and decisions have changed after a program as a demonstration of learning, it is also important to examine the mental models that are the underlying drivers of those behaviors. You can train someone to change how they behave, but that learning does not necessarily transfer to new situations; in order to drive this adaptive transfer, you need to impact the mental models that drive behavior.
  • Supplying ongoing data about learners. This involves thinking holistically about how assessments fit within the entire learning journey rather than using them as one-offs that only provide information about learning at a single point in time. For example, rather than just delivering assessments before and after a program to determine whether participants have acquired learning, also consider what other types of assessments are needed at other points in time to determine whether people are transferring their learning into the field and maintaining it long after the learning and development training program is done (a simple sketch of combining these time points follows this list).
  • Providing more than evaluative value. Although assessments are typically viewed simply as tools to evaluate whether learning has occurred, custom assessments can be developed for other useful purposes as well; they can be used to prime program participants to prepare them for learning, route learners to the materials they need to focus on most, help them to actually acquire learning, and also help to reinforce the learning over time. Even though the assessments are fulfilling primarily non-evaluative purposes in these cases, the data being collected through these assessments can be used to glean additional insights about both the learners and the learning program. I’ll dive deeper into this topic in a future blog post.
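To make this more concrete, here is a minimal, hypothetical Python sketch (not from the original post) of how scores from a pre-program assessment, a post-program assessment, and a 90-day follow-up might be combined to estimate learning gain and retention. The cohort data and field names are invented for illustration.

```python
# Hypothetical sketch: combining assessment scores collected at several
# points in the learning journey. Cohort data and field names are invented.
from statistics import mean

# Invented scores (0-100) for one cohort at three points in time.
cohort = [
    {"learner": "A", "pre": 55, "post": 82, "followup_90d": 78},
    {"learner": "B", "pre": 60, "post": 75, "followup_90d": 61},
    {"learner": "C", "pre": 48, "post": 88, "followup_90d": 85},
]

# Learning gain: how much scores improved from before to after the program.
avg_gain = mean(p["post"] - p["pre"] for p in cohort)

# Retention: how much of that gain still shows up 90 days later.
avg_retention = mean(
    (p["followup_90d"] - p["pre"]) / (p["post"] - p["pre"])
    for p in cohort
    if p["post"] > p["pre"]  # skip learners with no measurable gain
)

print(f"Average pre-to-post gain: {avg_gain:.1f} points")
print(f"Average share of gain retained at 90 days: {avg_retention:.0%}")
```

Even a simple roll-up like this only works if the pre, post, and follow-up assessments were planned together so that their scores are comparable.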

All this talk about developing custom assessments to fill data gaps may sound like I’m suggesting that we inundate learners with evaluation. I’m not.

It’s not about assessing the heck out of people; it’s about thoughtfully inserting appropriate, targeted assessments when they are needed to obtain the data you require to conduct the analyses that will be valuable to you. In some cases that may mean supplementing your current off-the-shelf assessments with custom tools as needed. In other cases, that may mean reconsidering your entire assessment approach.

It can be overwhelming to tackle all this, so what can you do to make this process easier for yourself? First, after you take a deep breath, start with the end in mind and then work backward from there. Here are a few key questions to ask yourself as you think through your data collection approach:

  1. What are my big picture analytics goals? What do I want to understand about the data?
  2. Looking across the holistic learning program, what analyses and data are needed to achieve those goals?
  3. What existing data sources (e.g., consumption data, data from current assessments, and performance data) do I already have or can I get?
    • Do they meet my data needs?
    • If so, how will I use data from those sources to draw insights?
    • If not, what are the data gaps?
  4. How can I best fill data gaps?
    • What types of custom assessments are appropriate?
    • When in the learning journey should they be used?
    • How will I use data from these custom assessments in combination with data from other sources to draw meaningful insights? (A rough sketch of this gap-mapping follows the list.)
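As a rough illustration of questions 2 through 4, the hypothetical Python sketch below works backward from two example analytics goals to the data each goal needs, then flags whatever the existing sources don’t cover as candidate gaps that a custom assessment could fill. The goal names and data source names are assumptions made for illustration, not part of any real program.

```python
# Hypothetical sketch: working backward from analytics goals to the data each
# goal needs, then flagging what existing sources don't cover (questions 2-4).
analytics_goals = {
    "Did learning transfer to on-the-job behavior?": {
        "post_program_assessment", "manager_observation", "performance_data",
    },
    "Which modules drive the biggest gains?": {
        "pre_program_assessment", "post_program_assessment", "consumption_data",
    },
}

# Data sources assumed to already exist (question 3).
available_sources = {
    "pre_program_assessment", "post_program_assessment", "consumption_data",
}

# For each goal, the gap is whatever it needs that no current source supplies
# (question 4) -- the candidates for a custom assessment.
for goal, needed in analytics_goals.items():
    gaps = needed - available_sources
    status = ", ".join(sorted(gaps)) if gaps else "none (covered by existing data)"
    print(f"{goal} -> gaps: {status}")
```

The point is not the code itself but the discipline it encodes: every custom assessment you build should trace back to a named analytics goal and a named data gap.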

Although the word “assessment” may not be the attention-grabber that “advanced analytics” is, a solid assessment plan is critical for getting the right data to power your analytics methods. And there is so much you can do to improve your assessment approach by considering how custom assessments can meet your needs.

I’m excited by this topic and look forward to expanding on it in future posts. Stay tuned!
