Gen AI When Used Honestly Offers Potential for Advancing Students’ Critical Thinking Skills

A recent sampling of Minnesota State Mankato students showed that 59 percent of responding students never used ChatGPT to assist with their assignments. On the other end of the spectrum, 7 percent said they used it more than 20 times.  

With 2024 predicted to be a watershed year for the integration of Generative Artificial Intelligence in education, faculty are divided: one camp's concerns surrounding ChatGPT focus on academic cheating and plagiarism among students, while another camp of instructors trusts that, with proper guardrails, it can be a progressive and dynamic learning tool.

In the second installment of a two-part series, Dr. Rajeev Bukralia, an AI expert, shares how Generative Artificial Intelligence can be used to advance students’ critical thinking skills.

By LENNY KOUPAL, CSU Communications Coordinator

Dr. Rajeev Bukralia

For Dr. Rajeev Bukralia, associate professor and graduate coordinator for Computer Information Science at Minnesota State Mankato, the use of Generative AI in higher education demands proper training and transparency. 

“It can definitely enhance student learning if it is used properly–and properly means keeping academic integrity intact and having transparency,” he said. “If those two things happen, I am OK with it.”

Still, he admits that many faculty members object to AI.  

“Many faculty members will say, ‘Well, this is terrible. Students are going to just plagiarize and cheat.’ There is some truth to that. They already do that,” Bukralia said. “When a student goes out and copies something from the web and they don’t cite it, it is just one piece of writing that a lot of people are copying.”

The dynamic capabilities of Gen AI make detecting such dishonesty a main faculty concern.  

“With ChatGPT, you change the prompt a little bit and the result is different. And that is what, I think, is challenging from a plagiarism detection perspective,” he added. 

To combat that, Bukralia outlines a two-prong approach to the positive use of Gen AI that starts with well-defined and implemented policies at the institutional and classroom levels. 

“Students should be aware of the policies up front,” Bukralia explained. “That’s very important: that the students are fully aware of how a professor will address the questions related to academic honesty.”

The second piece is adopting a teaching philosophy that views the academic application of Gen AI as an opportunity for growth. Faculty may permit the use of ChatGPT as a credited source or to gain a better understanding of something.  

“I see a definite educational purpose for ChatGPT. It could be a good catalyst for learning,” he said. “Think about this; when students don’t understand something, they come to the professor. They talk to their classmates to get more insights. Couldn’t they do the same thing with ChatGPT?” 

Rather than stem the tide, faculty may decide to go with the flow provided they set the parameters. Once those guardrails are established, faculty should view Gen AI as a tool for enhanced learning.   

For Bukralia, ChatGPT use in his classes is a way to tap into critical thinking among his students.  

“I want to ask students to use (ChatGPT) because they’re going to use it anyway,” he added. “However, I will tell them, ‘Look, here’s the answer you get from ChatGPT. Tell me, how would you improve that answer? How would you ensure that the answer ChatGPT has given you is a quality answer?'” 

Knowing that Gen AI such as ChatGPT is littered with biases and inaccuracies, Bukralia believes critical thinking comes from not believing everything you read or hear.  

“You need to verify your sources. You need to cross-examine things. ‘So ChatGPT is giving you the answer? Why do you believe that is the correct answer?'”  

Rather than opposing the use of Gen AI, requiring students to defend their AI-gained information provides an opportunity for academic interaction and lifts the fear factor. 

“You cannot push against technology and make it disappear. That cat is out of the bag,” he said. “Students should be taught how to use ChatGPT in a transparent manner, disclosing if they used it. So attribution is important. Attribution can only work if there is no retribution, if there is no punishment.”

That means advancing ChatGPT as a tool, not as a shortcut.  

“If students just copy-paste directly and pretend that it is their own work, that is dishonesty,” he said. “They should look at ChatGPT as a guide by their side, so to speak. When it becomes an authentic learning tool where students are not only trained to use it, but are also trained to go through the answer and verify information, to enhance ChatGPT’s responses, and to cite them properly, then it is not a bad thing.”

Click here to view the first installment of this two-part series.
