Yesterday, we looked at allowing students to see what others have created for their assignment. How about extending this to having students set quiz questions for each other?

Today, we have a guest post by Dr. Carolina Kuepper-Tetzel about that.

An evidence-based strategy that promotes retention of taught material and fosters understanding is retrieval practice. Retrieval practice means that students actively bring information to mind – without looking at their notes. The idea is to test oneself regularly on previously-taught material. Frequent quizzing is the key to long-term retention of knowledge. Engaging in retrieval practice shows what you know and what you still have not mastered. However, it is much more than that: successfully retrieving information strengthens the memory of that information! It is a real memory booster, so to speak. One way for lecturers to make use of this strategy is to include in-class quizzes and have students generate answers. Another way is to provide students with practice questions that they can work on at home. But there is yet another option that I explored last semester in one of my classes: student-generated questions.

At the end of the semester and before the final exam, I tasked the students with a challenge: to create two questions per lecture and submit them the following week. The key here is that students should already have prior knowledge of the material. It is impossible to generate good questions if you don’t have a basic understanding of the material. Thus, giving students the question-generating task after having provided input is a good idea. Students submitted their questions through Assignments on MyDundee as a Word document. The challenge component was that if more than 50% of the class submitted their questions, I would release all questions to them (alongside my own questions).

This requirement was imposed as an incentive and out of fairness. First, it would be unfair to make student questions available to everyone if only a few students did the hard work. Second, I wanted as many students as possible to engage with the task. This challenge component certainly made a huge difference: compared to the year before, when only 2 (!) out of 100 students submitted questions, this year 27 out of 106 students submitted theirs. Many of the student-generated questions were brilliant – clearly showing that those students had obtained a good understanding of the material. Unfortunately, not enough students contributed to unlock access to the large pool of questions, but more students engaged with the task than before.

Still, I wanted all students to benefit from having good practice questions. For that reason, I made representative questions (questions generated by many students) available to them during my revision lecture.

Possible improvements for the future:

  • Use a better way to collect the questions. Is there good free software that can support this strategy?
  • Prepare students during the semester. Explain to them that generating questions should be a weekly revision activity. This way they are already using this strategy continuously during the semester.
  • Find a way for students to work collaboratively on the questions, swapping questions, and – most importantly – generating answers and giving each other feedback. Again, is there software that can support this in a smooth and nonintrusive way?
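For anyone tempted to script the collection step the first bullet asks about, here is a minimal sketch in Python of the 50%-release rule described above. It assumes one submitted file per student in a folder (file names, the folder layout, and the `release_questions` helper are illustrative assumptions, not the workflow actually used on MyDundee):

```python
from pathlib import Path

def release_questions(submission_dir: str, class_size: int, threshold: float = 0.5):
    """Count one submission file per student and decide whether the
    pooled questions should be released (the post's 50% rule)."""
    files = sorted(Path(submission_dir).glob("*.txt"))
    submitted = len(files)  # assumption: one file per student, named by student id
    release = submitted / class_size > threshold
    # Pool all non-empty question lines only if enough students took part
    pool = []
    if release:
        for f in files:
            pool.extend(line.strip() for line in f.read_text().splitlines() if line.strip())
    return submitted, release, pool
```

With 27 submissions out of 106 students, as in the post, this would report `release` as `False`; past the 50% mark it would return the combined question pool.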

In part 2 of today’s post, we’ll look at getting students to peer-mark each other’s work, and how they might develop their own grading criteria.