Three Tips for Ethical AI Use in Higher Education

In the last post, What AI Can (And Can’t) Do for Colleges and Universities, we explored the positive impact AI is having on higher education. From accurate admissions predictions to clearer financial aid communication, AI is leaving its mark on historically traditional institutions.

However, there’s one area that remains a bit gray: the use of AI by students. Generative AI tools like ChatGPT are undoubtedly useful, but can college students be trusted to use them ethically? And how should educators ensure accountability – especially when AI detection isn’t always reliable? While the answer is layered, here are three tips to encourage ethical AI use in college classrooms.

Consider the real-world value of teaching students how to use AI responsibly  

According to an article in The Harvard Crimson, Assistant Dean of Science Education Logan S. McCarty encourages Harvard University students to use generative AI as long as they’re transparent about it. His other requirement is that students must substantially edit any AI-generated content rather than use it verbatim. “Part of what we’re doing, in all of these cases, is trying to model how people in the professional world, in the academic world, are going to be using these tools in a realistic and responsible and ethical way in the future, and try to establish norms about that now,” he said.

McCarty brings up a good point. AI use will only become more widespread. Ultimately, college students should be prepared for modern work environments. Part of that preparation must involve educational objectives around the technologies graduates will end up using. A rigid “no AI allowed” policy does a disservice to students, the institution, and the workforce as a whole.

This isn’t to say that every educator will feel comfortable adopting a policy like McCarty’s right away. Even in the legal field where AI use has spiked internationally, law schools remain divided on whether to fully embrace the technology. Adoption will take time, and it will likely look different across institutions.

Be open to the idea that essay assignments will change – in more ways than one

To the chagrin of many, some educators question whether the college essay is becoming obsolete. This may be partially due to concerns about an “AI takeover” of sorts. Still, there are valid points of consideration. In an article for The Atlantic, Stephen Marche wrote: “The essay, in particular the undergraduate essay . . . is the way we teach [students] how to research, think, and write. That entire tradition is about to be disrupted from the ground up.”

Marche goes on to reveal a silver lining: AI’s rise in academia is an intriguing invitation for humanities and scientific scholars to connect in new ways. Computer scientists and engineers will need more knowledge of language, history, philosophy, and the like. Humanists and their literary counterparts will need more knowledge of machine learning, automation, and their underlying technologies. This could potentially reshape the way students in different departments work and cross paths with one another. The overlap between these two worlds “will be essential in determining the ethical and creative use of [AI],” said Marche.

In practice, educators may indeed assign fewer essays in favor of other projects. In a response to Marche’s article, higher education expert Christopher Rim wrote for Forbes, “While an essay or dissertation is often the product of learning . . . it is not the core substance. Seminar discussions, theoretical inquiries, stages of peer reviewing, oral defenses—these are the foundation upon which essays are constructed.”

In other words, if students don’t truly understand the material, AI-generated essays won’t save them. As Rim pointed out, “If students . . . cannot formulate a nuanced and original argument based on primary source material, how are they to judge whether the output of ChatGPT offers them a more compelling paper than what they have (or could have) written themselves?” It would be helpful for educators to consider alternative assignments that can accurately assess whether students have mastered a subject.

Know that tough standards and AI innovation can coexist

Educators must balance a strict set of standards with a willingness to engage students. In the context of AI, this means pairing firm, integrity-based rules with the kind of creative freedom that garners student interest. It’s up to individual institutions and instructors to determine what that balance looks like. In fact, a growing number of educators have experimented with AI-based assignments or introduced courses focused solely on ChatGPT. Educators who are still skeptical about AI should look to their colleagues for real-life examples of what’s working (and what’s not).

As colleges and universities continue to adopt AI in various forms (because ignoring it or banning it outright aren’t viable options), these institutions must clearly communicate campus-wide policies regarding academic integrity. Once instructors decide to bring AI into their classrooms, they’ll also need to communicate their individual policies – and remain open to those policies evolving over time. While the future of AI in higher education remains uncertain, it’s an exciting time for educators who are willing to venture into the unknown and generate new possibilities.

