Tips for Significantly Enhancing Your Learner Surveys
The odds are good your learner surveys aren’t so good. A harsh pronouncement, but probably true, unless you’re already following the advice of Will Thalheimer, founder of Work-Learning Research. He literally wrote the book on learner surveys: Performance-Focused Learner Surveys: Using Distinctive Questioning to Get Actionable Data and Guide Learning Effectiveness.
At the 2024 Learning Business Summit, Will shared tips for improving learning surveys. If you didn’t attend, here’s some of his advice for designing performance-focused learner surveys that deliver the data you need to improve the impact of your education programs.
The problems with traditional program evaluations
In Will’s research, 65% of learning teams aren’t happy with their learning measurement tools and want to make modest or substantial improvements. In his poll of session attendees, 45% said their learner survey data was somewhat useful in improving programs and 35% said their data was not very useful.
Why is this happening? When you measure learning at the end of a program, you’re mostly measuring comprehension. Learners haven’t had enough time to forget anything yet. In the session chat, attendees confirmed this, saying their evaluations measure reaction more than learning. Association leaders would rather know whether learners liked the program than whether they learned anything.
Researchers say traditional program evaluations (aka smile sheets) have no correlation with learning. High marks could mean anything. Learners are overconfident about how much information they’ll retain, and that bias skews their evaluations.
Will told a story illustrating another learner bias. In sales training, tough instructors received low evaluation scores but produced the most high-performing salespeople. This finding reminded me of the learning science research presented by Brian McGowan in his Summit session: learners don’t always know what kind of instruction is good for them. They prefer easy, ineffective instructional methods over challenging, effective methods.
Traditional evaluation methods, like the Likert scale—with its “strongly agree,” “agree,” etc. options—don’t provide useful data. If you get an average rating of 3.8, what does that really mean?
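A quick illustration of why a single average is so uninformative: two hypothetical courses can share the same 3.8 mean while telling very different stories. (The ratings below are invented for the example.)

```python
# Hypothetical Likert ratings (1-5) for two courses with identical means
course_a = [4, 4, 4, 4, 3]  # consistent, mildly positive experience
course_b = [5, 5, 5, 2, 2]  # polarized: some delighted, some dissatisfied

mean_a = sum(course_a) / len(course_a)
mean_b = sum(course_b) / len(course_b)

# Both print 3.8 -- the average hides the difference in distributions
print(mean_a, mean_b)
```

The 3.8 alone can't tell you whether you have a solid program or a polarizing one, which is exactly the problem with reporting Likert averages.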
What good learner survey questions look like
Learner surveys must ask unbiased questions focused on how people learn. Here’s an example from Will of a good learner survey question:
Question: HOW ABLE ARE YOU to put what you’ve learned into practice in your work? CHOOSE THE ONE OPTION that best describes your current readiness.
- A. My CURRENT ROLE DOES NOT ENABLE me to use what I learned.
- B. I am STILL UNCLEAR about what to do and/or why to do it.
- C. I NEED MORE GUIDANCE before I know how to use what I learned.
- D. I NEED MORE EXPERIENCE to be good at using what I learned.
- E. I CAN BE SUCCESSFUL NOW even without more guidance/experience.
- F. I CAN PERFORM NOW AT AN EXPERT LEVEL in using what I learned.
Why Will’s learner survey questions are more effective than traditional questions
What did you notice?
The options aren’t what the learner expects, so they attract more attention.
They elicit more valuable information. They give the impression you’re taking the learner’s experience more seriously, especially since the options range from negative to positive impact.
The questions are about the learner, not the program, instructor, or venue. The impact on the learner is what matters.
The options have more granularity. Will said he adds an over-the-top choice, like option F, to slow the learner down so they’ll more carefully consider all options.
Notice the lack of jargon, like “learning objectives.” Learner-friendly language helps respondents make the right choice, and the uppercase phrases help them home in on the gist of each option.
The resulting data is more useful. You can see the percentage of learners who chose each option and decide if those results are acceptable. If 30% of learners need more guidance before using what they learned (option C), your program is failing them. If 10% are still unclear about what to do (option B), what’s going on there?
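Tallying this kind of question is straightforward: count how many learners chose each option and convert to percentages. A minimal sketch, assuming responses are recorded by option letter (the sample responses are invented):

```python
from collections import Counter

# Hypothetical responses to the readiness question, one letter per learner
responses = ["D", "C", "D", "E", "B", "C", "D", "C", "E", "F"]

counts = Counter(responses)
total = len(responses)

for option in "ABCDEF":
    pct = 100 * counts.get(option, 0) / total
    print(f"Option {option}: {pct:.0f}%")
```

With this sample, option C (needs more guidance) comes out at 30% and option B (still unclear) at 10%, the thresholds the article flags as warning signs worth investigating.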
Good surveys focus less on learner satisfaction and course reputation, which aren’t correlated with learning impact. Instead, you need to find out if learners:
- Comprehend the material
- Remember it
- Are motivated to apply it
- Have follow-up support
Craft questions that send messages to your learners and instructors
Will suggests adding nudging questions to your surveys. The example below nudges the learner to follow through with what they’ve learned.
Question: After the course, when you begin to apply your new knowledge at work, which of the following supports are likely to be in place for you? (Select as many items as are likely to be true.)
- MY MANAGER WILL ACTIVELY SUPPORT ME with key supports like time, resources, advice, and/or encouragement.
- I will use a COACH OR MENTOR to guide me in applying the learning to my work.
- I will regularly receive support from a COURSE INSTRUCTOR to help me in applying the learning to my work.
- I will be given JOB AIDS like checklists, search tools, or reference materials to guide me in applying the learning to my work.
- Through a LEARNING APP or other means, I will be PERIODICALLY REMINDED of key concepts and skills that were taught.
- I will NOT get much direct support but will rely on my own initiative.
You can see how these options prompt your team to think about the support you could give learners, and prompt learners to think about what they can do to retain what they’ve learned.
Here’s another example of a nudging message.
Question: Compared to most webinars, how well did the session keep YOUR attention? Select one choice.
- I had a HARD TIME STAYING FOCUSED.
- My attention WANDERED AT A NORMAL LEVEL.
- My attention RARELY WANDERED.
- I was very much SPELLBOUND throughout the session.
You can ask questions to nudge a positive perception of your brand, like these questions about accessibility, belonging, and barriers:
- How well did the design and organization of the learning program enable you to fully participate?
- How much did you feel a valued and respected member of the group during the learning?
- How well did the learning experience prepare you to deal with the barriers that you may face as you use what you learned in your work?
Three questions to add to your learner surveys
Will suggests adding these open-ended questions to the end of your survey to elicit valuable insight.
- What aspects of the training made it MOST EFFECTIVE FOR YOU? What should WE DEFINITELY KEEP as part of the training?
- What aspects of the training COULD BE IMPROVED? Remember, your feedback is critical, especially in providing us with constructive ideas for improvement.
- Is there anything else we should have asked about? Is there anything else you want to tell us?
Notice the use of “we” and “us,” which makes these questions sound like a real person talking.
You need more useful learner data to improve your programs and rise above the competition. What’s the point of using the same old learner evaluations if they’re not eliciting the most essential information: did the program make the intended positive impact?
Debbie Willis
Debbie Willis is the VP of Global Marketing at ASI, with over 20 years marketing experience in the association and non-profit technology space. Passionate about all things MarTech, Debbie has led countless website, SEO, content, email, paid ad and social media marketing strategies and campaigns. Debbie loves creating meaningful content to engage and empower association and non-profit audiences. Debbie received a Bachelor of Business Administration in Marketing Information Systems from James Madison University and a Masters of Business Administration in Marketing from The George Washington University. Debbie is a member of Sigma Sigma Sigma sorority, American Society of Association Executives and dabbles in photography.