AI poses new challenges and opportunities at DMACC

Photo illustration created by Bing AI

Over the past year, AI, or artificial intelligence, has taken the world and the classroom by storm. In higher education, AI can be seen as either a beneficial resource or a means of cheating. Different professors have different policies regarding AI, which complicates the matter.

One popular AI tool is ChatGPT, which can produce almost any kind of text, such as an essay, research paper, or even a poem, when given a prompt in the chat box.

Another example is Grammarly, an AI-powered tool that fixes grammar within writing software such as Microsoft Word and Google Docs.

ChatGPT and Grammarly can be seen as resources for students to use when writing, but some professors may view them as a form of cheating, especially when their output is passed off as a student's own work.

Professor Perspectives 

Many professors at DMACC are embracing the use of AI and integrating websites like ChatGPT for certain assignments. 

Some professors accept the use of AI while urging caution, including DMACC English Professor Bethany Sweeney, who has a Ph.D. in literature.

Sweeney said she worries that if students rely too much on AI, they won't develop the skills to approach and complete class work on their own: "I find that AI has a lot of potential for being a useful tool for students and for me, and [there are] some pitfalls that we need to look out for."

Sweeney explained that programs such as Grammarly and ChatGPT can be useful for small things like fixing grammar and coming up with prompts for discussion posts, but she worries that students might turn to AI because they lack the skill set to do the work on their own.

Sweeney said that she has noticed different reactions from her colleagues: "I have some colleagues who are a part of that 30 percent of people who never in any way helped a friend on a paper or they have really rigid ideas about cheating and honesty." Sweeney said that most of her colleagues do not mind the idea of AI or are at least trying to embrace it.

One of Sweeney’s colleagues, Colin Hogan, is an example of a DMACC professor who is trying to embrace AI, but still expresses concerns about the rapidly changing technology. Hogan, who has a Ph.D. in American literature, teaches composition and literature classes at DMACC and has seen assignments that have used ChatGPT, Snapchat AI, and Grammarly.

“I’ve seen it used in some good ways. But mostly students seem to be either not using it or using it in simple and not engaged ways, or they are relying on them, and I worry about the reliance part,” said Hogan. 

Hogan also described his policy on AI: “Students can use them. They have to cite them, and then I also want to see their chat record so that I can see the process and understand how we got to the final product.” 

Hogan expressed his concerns about using AI too, explaining that he worries about students not retaining information and how future generations could be negatively affected.  

Hogan said he would like to see a college-wide AI policy to relieve some of his and his colleagues' concerns. So far, such a policy has remained in the discussion phase because of its large scope.

Hogan said, “We should have had a policy last summer. We should have already had a policy in place. I have seen comparable policies from other institutions. This technology has changed a lot, even in the past year. We have to figure out how to respond to [rapid advancements in AI].” 

Andy Langager, a journalism professor at the Ankeny campus, said he has encountered a number of students handing in AI-generated work as their own:

“Out of 180 or so students in the past year, I have had about 10 who have submitted AI work. If a student hands in work written by someone else — whether a friend wrote it, they copied it from Wikipedia or ChatGPT, my policy is an automatic zero and a deduction of two letter grades for the final grade in the class.”

Langager said he does allow for some AI in the classroom. “In the syllabus I have a list of things they can use AI for, including brainstorming topic ideas, helping to create an outline or finding sources,” Langager said.

Langager added, “I think students will need to be fluent in the use of AI in the future, but I don’t think there is any learning going on if they let it do all the thinking and work for them. Learn the core skills, then you’ll be able to recognize when and how to use AI.”

Langager said that AI has also added work when it comes to grading. “If a student turns in suspicious writing, now I have to spend extra time cross-checking it with what ChatGPT would write. It’s definitely added to my time spent grading and giving feedback,” Langager said.

Student Perspectives 

DMACC students have stepped forward to share their thoughts on AI usage at DMACC as well, including communication major and Campus Chronicle writer Dashae Engler. Engler said she does not mind software such as Grammarly, but she does not think sites such as ChatGPT should be used by students because it is cheating and does all the work. Engler also mentioned that she thinks AI can be used as a tool for learning if it isn’t coming up with every idea for the student. 

Lynn Bousman, a sophomore and theater major, said that she believes AI could be useful for generating discussion and project topics, but anything more would count as cheating and plagiarism. 

“As long as you are able to prove what you are doing you should be allowed to use it for certain things,” Bousman said. She added that most of her professors have a zero-tolerance policy for AI on essays, but some allow it for prompting ideas.

Judicial Perspectives

DMACC Judicial Officer Deborah McKittrick oversees plagiarism and academic dishonesty cases. 

Academic dishonesty policies vary from professor to professor; some might make a student rewrite a paper, while for others a violation results in a zero and affects the student's final grade. Regardless, students can be sanctioned for academic dishonesty, which McKittrick said she sees frequently.

In an email interview, McKittrick said, “Sanctions are determined by the Instructor. What I have typically seen is an award of an ‘F’ for the assignment for a first violation. Failure for the course for any form of a second academic misconduct violation.” 

McKittrick added: “One of the reasons students are held accountable for cheating is to preserve the integrity of the course and the degree they are seeking. While AI can be helpful, it is not a replacement for doing the work and gaining the knowledge that comes from doing the work.”

McKittrick explained that she encourages both faculty and students to communicate the usage of AI in the classroom to create transparency about the expectations and consequences of using AI. 

McKittrick noted that even unintentional plagiarism is subject to a sanction. “A student who does not disclose their use of AI could be accused of an unauthorized collaboration, and thus a conduct violation. Using another’s words, works, or thoughts, whether it be from Google, AI, another student’s work, or even the student’s own previous work, without citing the source and giving it credit, is plagiarism,” McKittrick said.

“Students should also be aware that AI is not without its flaws and often is wrong, therefore I would caution them depending on it,” McKittrick said.
