Course evaluations provide valuable student feedback

With the spring semester coming to a close, course evaluation emails have been sent to DMACC students asking for their input on their courses and for comments on how instructors can improve them.

Joe DeHart is the Executive Director of Institutional Effectiveness and Assistant to the DMACC President. He oversees the course evaluation process and makes sure that evaluation data is properly distributed to faculty and their supervisors.

According to DeHart, the class evaluations were started by DMACC's Dean and Provosts group. Course evaluations were previously done on paper, a process DeHart said made it difficult for DMACC faculty to tally responses, re-type comments, and track evaluations in the system.

Two years ago, DeHart launched a new electronic course evaluation system to simplify the process and make it more anonymous for students. The electronic system is managed by a third-party company called Smartevals.

DeHart gave more information on how the data is distributed to faculty:

“The evaluations close the week before finals. Two weeks after finals, all of [the] grades go into the system and get resolved. Once the grades are in, then the [evaluation] data is released to the faculty and to the faculty supervisors. We make sure that the grading system and the evaluation system do not conjoin at any point and are always kept separate,” he said.

DeHart said that the course evaluations are not used as job evaluations for professors; that is a separate process handled by faculty supervisors.

“The supervisors use the class evaluations as more of a coaching tool. More than anything else, it has been reinforcing the good faculty,” he said.

Anonymity is an important factor in how DeHart oversees the evaluations, and he assures students that their responses are completely anonymous.

“Faculty only see the aggregate and numerical rating and they see the unidentified comments. So the faculty will see every student’s input, but they will always be de-identified. They won’t ever see even which students took or did not take the survey,” he said.

Another member of the course evaluation staff is Chelli Gentry, Director of Assessment. Starting next year, Gentry will take over as primary supervisor of the course evaluations.

Gentry emphasized the importance of the course evaluations:

“I think any time you get any opportunity to have a say in what faculty are doing in the classroom, I think you should take that opportunity. If you want student learning to improve, then you need to tell people what’s wrong or what’s right with it. The only way we can improve our process is if students speak up and take advantage of that opportunity,” she said.

Chad Davidson, Professor of Philosophy, Religion, and Ethics, skims through his evaluations and looks for things that he can improve on as a professor.

“I don’t go through every detail of the evaluation, but I certainly try to catch patterns in the comments, and if certain items are continuously brought up in the evaluation, I try to make some changes,” he said.

Davidson also thinks that students’ written comments are the most helpful part of the evaluation.

“If a student makes a comment, those are helpful and probably the most helpful because it details a little more from the student’s own words what they liked or disliked about the class,” he said.

Davidson also believes that professors should be able to evaluate the students as a whole in return.

“I think it will be helpful. The class dynamics is different. You have a different [batch], a different group each time, so the dynamics of the group play a huge part in determining the overall atmosphere and the cohesiveness, how well students are learning. Right now my two ethics classes are completely opposite classes. I can see the strengths and weaknesses of both. One group is quiet and the other is a little more talkative, which is helpful for discussion. It will be helpful for that, only to give the students constructive criticisms to help their own learning and to engage more in certain areas and not be afraid to speak more,” he said.

According to DeHart, the average response rate for the course evaluations is around 40% of DMACC students.

The staff's goal is a response rate of around 60%. DeHart and Gentry are currently looking for new ways to increase the response rate in the future.
