Op-ed: AI does not belong in academia


A student navigates OpenAI’s ChatGPT platform, where the homepage reads “What can I help with?” Feb. 23. Northeastern acknowledged faculty concerns around AI and issued a curriculum guide to help address its academic application and potential misuse.

The views expressed in this piece are the personal views of the author, a faculty member at Northeastern, and do not reflect those of his department or the institution.

On the first day of class each semester, instead of students asking, “What will be on our exams?” their new opening question has become, “Can we use AI?” 

Artificial intelligence, or AI, is oxymoronic. Intelligence rests solely in a sentient being, while “artificial” denotes that which is made or produced by human skill rather than occurring naturally.

Yet, many students and educators remain oblivious to this intrinsic contradiction, with one Digital Education Council survey finding that 86% of students and 61% of faculty use AI in their courses. Swimming against the current push for AI, I have specifically prohibited AI use within my courses. I do not use it, and students may not use it. End of story. 

Students, overwhelmed with their other commitments and responsibilities, often use AI to look for a shortcut or as a way to save time on classwork. Students wanting to cut corners is not a new phenomenon. Thinking back to my primary and secondary school education, I remember many of my classmates lamenting, “Why do I need to learn math if I will never use it after I am out of school?” 

Students may not understand that the value lies in the process of learning itself. To paraphrase astrophysicist Neil deGrasse Tyson, it is not the subject itself, but the methods, tools and tactics you develop through learning it that will help you solve problems throughout your lifetime. 

Without this process, how can students make their educational journey their own? I strongly believe in author and inspirational speaker Simon Sinek’s assertion that, “As you gain experience, you lose fear.” By relying on AI, are students just trading away this everlasting benefit for immediate gratification?

College courses build the foundation for a lifetime of learning, especially for students who are studying a discipline for the first time. It is critical that professors impart knowledge by encouraging students to work through complex problems — problems which do not include deciding on the best AI prompt to use.

As experts in our field, our own use of AI raises the possibility that our course materials may not be up to our own rigorous standards. In an internal document made available to faculty in August 2023, Northeastern acknowledged the issues that AI may pose for its faculty. The university first recognized that AI may “create new questions about what constitutes academic integrity.” Northeastern then advised faculty that, in cases where we suspect inappropriate use of AI, we should “collect more information by talking with the student about their process in creating the work.” 

Given that many of us have more than 120 students each semester, how much time will each professor end up investing in this? Even if only 15 students are suspected of using AI to cheat on an assignment, the meetings and follow-ups with each of them become a logistical nightmare, easily adding up to dozens of hours. I would rather assist students who need help than become an AI policeman.

AI use in higher education may have pervasive societal implications. Once students leave college and enter the workforce, they will encounter unusual problems that require creative solutions. For students to become future leaders, they must know how to reason and think. Technology is already encouraging students to act reflexively and without thought, as demonstrated when I ask a class what 10% of 100 is, and the majority of students pull out a smartphone or calculator. In almost every context, students are terrified not to have their “comfort tool” at hand. When given a writing assignment, students should not need a ready-made “AI prompt” just to get started and generate ideas. 

As professors, we owe it to our students to teach them to think, not search the internet. We owe them the respect of providing error-free course materials and the human element at their graduation ceremony. 

I remind and challenge my peers to be mindful of the future that we are helping shape. From Aldous Huxley’s “Brave New World” to the movie “The Matrix,” the science fiction genre has already warned us about the dangers of machines wielding unrestrained power. If I’ve taken anything from this genre, it is this: Once we outsource our ability to think critically, it is just a matter of when, not if, we humans will become obsolete.

My concerns may seem hyperbolic, but the stakes of using AI in education are incredibly high. I would much rather err on the side of developing critical thinkers than help create a society tempted by flashy technology and mired in groupthink. We owe it to ourselves, but more importantly, we owe it to posterity.

Ronald C. Zullo is a senior lecturer of accounting and taxation. He can be reached at [email protected]

If you would like to submit a letter to the editor in response to this piece, email [email protected] with your idea.
