10 tips to improve your language course evaluation questionnaires

In language training we tend to hand out our end-of-course evaluation questionnaires routinely, yet when done correctly they offer a real opportunity to gather a wealth of quality-control and marketing information.

Don Kirkpatrick

Back in 1959, the US Training and Development Journal asked Don Kirkpatrick to write an article on training evaluation, based on the ideas he had written about in his Ph.D. dissertation. He asked if he could write four instead, because he wanted to write one article per level of his 4-level model.

He wrote the articles and thought that was the end of it, but word spread, and bit by bit his 4-level model became gospel, and now training departments and schools around the world use this model when it comes to training evaluation. In 2010, his son Jim and daughter-in-law Wendy brought out the New World Kirkpatrick Four Levels® to expand and deepen the model.

Their work is dedicated to professional training, but even if your language courses are not for business people, there are some principles that I think are very interesting for any school or teacher. I encourage you to learn more about it on their website http://www.kirkpatrickpartners.com.

Some of their ideas are included in the following tips that can help you improve your questionnaires for quality and marketing purposes:

1. Make the questionnaire learner-centered and not teacher-centered

A teacher-centered questionnaire is misleading because it makes the learner want to please the teacher rather than give honest feedback on how useful the course actually was. For example, instead of asking “How good was your teacher?” on a 10-point scale, offer the statement “My learning was enhanced by the teacher” for the learner to rate. By focusing on the learner, a truer picture of the quality of the course comes out.

2. Ask how likely the learner would be to recommend the course to others

The Net Promoter Score (NPS) is a score that companies use to find out how likely people are to recommend their company to others. Developed by Fred Reichheld, it is a useful alternative to simple customer satisfaction, and works well as the first question in any training survey. Tracked over time, it becomes a useful indicator.
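
As a rough illustration, here is a minimal sketch in Python, using made-up ratings, of how an NPS is typically worked out from the standard 0–10 “How likely are you to recommend us?” question: respondents scoring 9–10 count as promoters, 0–6 as detractors, and the score is the percentage of promoters minus the percentage of detractors.

    def net_promoter_score(ratings):
        """Compute an NPS from a list of 0-10 'would you recommend us?' ratings."""
        if not ratings:
            raise ValueError("no ratings to score")
        promoters = sum(1 for r in ratings if r >= 9)   # 9-10 = promoter
        detractors = sum(1 for r in ratings if r <= 6)  # 0-6 = detractor
        return 100 * (promoters - detractors) / len(ratings)

    # Made-up end-of-course answers: 5 promoters, 2 detractors out of 10
    print(net_promoter_score([10, 9, 8, 7, 9, 6, 10, 5, 9, 8]))  # -> 30.0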

3. To satisfy or not to satisfy – go beyond “Happy Sheets”

Satisfaction is of course important, but you can go further (including recommendations, see above). Equally important are levels of confidence, commitment, knowledge and attitude. How much are you tracking what the learner feels they’ll be able to do once back home or at work? For example, tracking levels of confidence in language training is very important. People often refuse to speak another language because they’re afraid of making mistakes and because they lack confidence. The learners who don’t lack confidence are the ones who succeed the most. Language learning is not just about content; it’s about the ability to get a message across to somebody in another language, and mistakes are allowed when doing that. So confidence is a key element to evaluate.

4. Make the teacher walk out of the room after handing out the questionnaire

The presence of the teacher can severely affect how a questionnaire is filled in. To make it fairer, the teacher should hand out the questionnaires, walk out of the room and leave the learners to it.

5. Add in useful marketing questions

The end-of-course questionnaire is also a great place to ask marketing questions, either for research or for generating more business. You can have a whole page of them if you like (as long as they are easy to answer). For example:

  • Where did you hear about us?
  • Would you like to come back and learn with us again?
  • If so, what would you like to do?
  • 90% of our business comes from word of mouth. Could you please write down the names and email addresses of three people you think would be interested in following a course with us? Somebody from our school will follow this up with you later.
  • Have you heard about this other course / service / product that we also offer?
  • Could you please give us your social network details so we can invite you to join our lists?
  • Could you please write a few words about your time here so we can add them to the testimonials on our website?

Not everybody will fill in everything, but there is always a lot of useful data to capture.

6. Ask open-ended questions too

Ratings are obviously important for creating indicators as proof of quality, but open-ended questions are just as important. They allow the learner to tell you things you might not have thought of.

7. Hand out the questionnaires in the native language of the learner

This involves a bit more work (mainly translation and then photocopying), but whatever the end level of the learner, having them fill in the questionnaire in their native language will get you honest answers and a lot more information. Obviously it becomes a little harder to analyse the open-ended questions if they’re written in a language that nobody at your school understands!

8. Analyse and implement

Analysing the questionnaires is a lot easier if they are filled in electronically (one day we’ll all be filling them in on iPads), but paper is no excuse for not doing it. Ideally, you need someone to collect the data and spot the trends and indicators that come out per teacher, per room, per course, per season and per language. And of course if you DO see a trend coming through (e.g. complaints about the quality of the materials) and you DON’T implement anything, then you only have yourself to blame!
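
If your results end up in a spreadsheet export, even a few lines of Python can surface those trends. This is a minimal sketch under assumed column names (“teacher”, “course”, “rating”), not a standard format:

    import csv
    from collections import defaultdict

    def average_by(rows, key):
        """Average the 1-10 ratings per teacher, course, room, etc."""
        totals = defaultdict(lambda: [0, 0])  # value -> [sum of ratings, count]
        for row in rows:
            totals[row[key]][0] += int(row["rating"])
            totals[row[key]][1] += 1
        return {value: s / n for value, (s, n) in totals.items()}

    # Assumed CSV export with columns: teacher, course, rating
    with open("questionnaires.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    print("Average rating per teacher:", average_by(rows, "teacher"))
    print("Average rating per course:", average_by(rows, "course"))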

9. Follow up with the learner afterwards

An underlying theme of this article, as you can see, is getting the truth from the learner. Another way is to contact the learner a couple of months after the course has finished. You can ask other questions to see how things have really changed, and also more marketing questions (for example, making sure they have linked up to your social network pages).

10. Create indicators

Indicators are there to help you compare and analyse, but they’re also there as proof that you do what you say you do. Quantitative data always brings up unexpected things, because we humans are irrational and tend to act on intuition, so we are surprised when the proof is there. Maybe a teacher is performing a lot better than you thought. Maybe the learners don’t mind the state of the common room but hate the lousy wifi. Maybe you’re getting a lot more renewals from Germany than from Spain. It takes work and time, but it’s always worth it in the end.
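
To take that last example, an indicator can be as simple as a renewal rate per country. A minimal sketch, using entirely made-up follow-up data:

    from collections import defaultdict

    # Hypothetical follow-up records: (country of origin, renewed?)
    renewals = [
        ("Germany", True), ("Germany", True), ("Germany", False),
        ("Spain", True), ("Spain", False), ("Spain", False),
    ]

    counts = defaultdict(lambda: [0, 0])  # country -> [renewed, total]
    for country, renewed in renewals:
        counts[country][0] += int(renewed)
        counts[country][1] += 1

    for country, (renewed, total) in counts.items():
        print(f"{country}: {100 * renewed / total:.0f}% renewal rate")  # Germany 67%, Spain 33%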

Conclusion

I loved writing this article. I absolutely love training evaluation (I’m a Kirkpatrick Bronze facilitator myself), and I’m 100% certain that good evaluation can lead to good training. It’s important not to go overboard, but making small changes can sometimes lead to big surprises.

I would love to hear your comments. Don’t be shy! It would be good to get your feedback. It would also be great if you shared this article with your networks by clicking on one of the buttons below. Thank you!
