Anonymous Course Feedback: 20 Questions That Actually Improve Teaching
Anonymous course feedback that actually improves your teaching. 20 questions that work, mid-term vs end-of-term, what to read, and how to close the loop with students.
Hushwork Team
End-of-term course evaluations are mostly useless. Students fill them out the night before they're due, recency bias takes over, and the comments are short and vague. The teacher reads them once, internalises the worst comment, and changes nothing structural.
Anonymous course feedback can be much better than this. Two changes: run a mid-term survey when you can still fix things, and ask the questions that actually map to teaching changes. Here's the playbook.
Mid-term beats end-of-term
The end-of-term survey arrives after the course is over, so anything you learn helps only the next cohort. A mid-term survey runs while you can still adjust.
Run both:
- Week 5-6 of a semester course: a 10-question mid-term survey, focused on what to change in the second half
- Week 14-15: the full 20-question end-of-term survey, for next year's design
The mid-term improves this cohort's experience. The end-of-term improves the course design. Different surveys, different jobs.
The 20 questions that work
Mix Likert scales and open text. Hush AI groups the open-text answers into themes for you so you don't read 200 individual responses.
Content (4 questions)
- The course content was at the right level for me. (Likert)
- The pace of the course was appropriate. (Likert)
- I'd recommend this course to a friend in my programme. (Likert)
- What part of the content was most valuable? (Open)
Teaching (4 questions)
- The instructor explained ideas clearly. (Likert)
- The instructor seemed prepared for each lecture. (Likert)
- The instructor responded well to questions. (Likert)
- What teaching method worked best for you? (Open)
Assessment (4 questions)
- The assignments helped me learn the material. (Likert)
- The grading criteria were clear in advance. (Likert)
- Feedback on my work was useful. (Likert)
- Was anything graded that you didn't feel prepared for? (Open)
Engagement (4 questions)
- I felt comfortable asking questions in this course. (Likert)
- The course environment was respectful. (Likert)
- I learned from other students in this course. (Likert)
- What would have made you more engaged? (Open)
Improvement (4 questions)
- What's one thing about this course you'd keep exactly the same? (Open)
- What's one thing you'd change for next year? (Open)
- What was missing that you wished was covered? (Open)
- Any other feedback you'd want the instructor to know? (Open)
The full set takes students eight to ten minutes to complete. Hush AI summaries cut your reading time from a full afternoon to ten minutes.
What to read
In a typical end-of-term batch, here's where the signal lives:
The Likert scales: aggregated trends. Use them to spot dimensions that scored low. Don't read individual responses.
The themes Hush AI surfaces from the open text: this is the gold. Repeated phrases ("the third week was confusing", "the textbook didn't match the lectures", "I learned more from the projects than the readings") are where you can act.
Anything specific and actionable: "the rubric for Project 2 was unclear" is more useful than "the assignments were okay."
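The "aggregated trends" step above can be sketched in a few lines. This is a hypothetical illustration, not a Hushwork feature: the dimension names and the 3.5 cutoff are assumptions you'd tune to your own scale.

```python
# Sketch: aggregate Likert responses (1-5) per survey dimension and flag
# dimensions whose mean falls below a cutoff. Dimension names and the 3.5
# threshold are illustrative assumptions, not part of any Hushwork API.
from statistics import mean

responses = [
    {"content_level": 4, "pace": 2, "clarity": 5},
    {"content_level": 5, "pace": 3, "clarity": 4},
    {"content_level": 4, "pace": 2, "clarity": 4},
]

def low_dimensions(rows, cutoff=3.5):
    """Return each dimension averaging below the cutoff, with its mean."""
    dims = rows[0].keys()
    return {d: round(mean(r[d] for r in rows), 2)
            for d in dims
            if mean(r[d] for r in rows) < cutoff}

print(low_dimensions(responses))  # → {'pace': 2.33}
```

The point is the shape of the analysis, not the tooling: you look at means per dimension, not at individual rows, which is exactly the "don't read individual responses" advice above.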
What to ignore
The single-comment outliers: one person hated the course, one person loved it. Both are noise without a supporting pattern.
Comments about you personally: anonymity makes these less filtered. Read them once. Move on. The professional growth feedback is in the patterns, not the digs.
The Likert outliers: a stray 1 or 5 on a single dimension doesn't tell you anything. The central trend does.
Closing the loop with students
This is what separates a feedback-informed course from a feedback-collected course:
Mid-term feedback (next class): take 5 minutes. "Here's what came up in the survey. Here's what I'm changing." Specific. Limited to two or three things. Don't promise everything; promise what you'll actually do.
End-of-term (next year's first class): open the new semester by acknowledging last year's feedback. "Last year's class said the third week was rushed. We've moved a topic and added a problem set. Tell us if it works."
This simple loop changes student behaviour: students fill out the next survey because they saw their answers change something.
What students don't tell you (and how to surface it)
Three things students rarely volunteer:
- What's confusing in the prerequisites. They came in with gaps and felt embarrassed. Ask: "Was there material from a prerequisite course that wasn't as solid as we assumed?"
- The team-project free-rider problem. They don't want to name names. Ask: "How fair was the workload distribution in your group? What would have made it fairer?"
- The pace mismatch. Saying the course was too slow feels arrogant; saying it was too fast feels like admitting they couldn't keep up. Ask both: "Were there topics you wished we'd spent more time on? Less time on?"
Hush AI helps you cluster these into themes if you ask the questions specifically.
A simple template
Hush AI will draft this in 30 seconds. Send the link via your LMS, set a 7-day window, and send one reminder on day 4. Read the themes the day after closing.
What good feedback looks like over time
Three signals that the feedback loop is working:
- Response rates climb (students see action and trust the process)
- The "what's missing" question gets shorter answers (because you're closing the gaps)
- The "what to keep" answers get longer (because you've identified what works)
If responses feel sparse year-over-year, the issue is usually that students don't believe the feedback gets used. Closing the loop visibly fixes that.
Get started
Sign in to Hushwork and ask Hush AI for a course feedback template. Customise for your course, send the link, and read themes by Monday. Free for any class size.