AI is having a positive impact on many areas of higher education – from enrollment to graduation rates. In the last blog post, The Rise of Ethical AI in Higher Education, we touched on the idea that some higher education professionals are wary of using AI in college classrooms. And rightfully so. While generative AI tools like ChatGPT can help students develop outlines or beat a bout of writer’s block, some students have already taken it too far and submitted essays generated entirely by AI.
Fortunately, there are ways for students to use AI ethically. More on that later. In the meantime, higher ed leaders who are skeptical of AI can get more comfortable with this new technology by looking at how it’s being used outside the classroom. Here are some real-life examples of what AI can (and can’t) do for colleges and universities:
AI can streamline the applicant review and admissions process
According to an article in Higher Ed Dive, institutions such as Rutgers University and Rocky Mountain University have used an AI tool called Student Select to predict admissions decisions. The algorithm analyzes a university’s admissions rubric alongside its historical admissions data, then sorts applicants into three tiers based on their likelihood of admission. Student Select also scans applicant essays and interview transcripts for keywords that could indicate critical thinking or certain personal characteristics. Although applicants in the top tier tend to be approved more quickly, staff members still review every application, no matter which tier it lands in.
AI can communicate directly with students through two-way text messaging
The College of the Desert used a two-way text messaging system from Ocelot to cut financial aid application processing time by about a month. The tool’s reminder system also increased the response rate for verification requests, so students could get answers sooner. Likewise, a community college in California used AI texting to alert a subset of students to a fund that could help pay off their outstanding tuition balances. As a result, thousands of students whose accounts had been on hold were able to re-enroll.
AI can act as a virtual tutor to increase student success
Impressively, the Georgia Institute of Technology was an early adopter of AI. The virtual teaching assistant affectionately known as Jill Watson has tutored Georgia Tech students since 2015. Since then, the school has developed two more promising AI tools: AskJill and Agent Smith. AskJill serves as a social facilitator, promoting online interaction and community building among students and instructors. Agent Smith is a web-based tool that lets instructors “train a Jill” so each course can have its own customized AI assistant.
AI can free up valuable time for faculty and staff
Nova Southeastern University uses the Aible tool to identify which students are most likely to leave the school. Because the tool streamlines data gathering, staff members have more time to focus retention efforts on those at-risk students. In general, AI’s core benefit is efficiency: by automating routine tasks, faculty and staff alike can devote more time to improving the student experience.
AI can never fully replace humans
Despite AI’s potential, New York University’s Julia Stoyanovich urges higher ed leaders to be cautious about using it in areas where bias may exist, like admissions. For example, unless it is monitored and adjusted, an AI algorithm trained on past admissions data will repeat any biases embedded in those decisions. It’s up to educators to watch for this and make meaningful changes on the diversity, equity and inclusion front. “I would be very careful to understand how the tool works, what it does, how it was validated. And I would keep a very, very close eye on how it performs over time,” said Stoyanovich.
This leads to another critical point: AI will always lack a human touch. It can review information, feed data back into itself, and get “smarter” at an impressive speed, but it can’t fully replace interpersonal interactions. Too much reliance on AI could devalue the faculty-student relationship and make things feel too transactional. As long as it remains supplemental and doesn’t dominate the college experience, the future is bright for institutions that innovate through AI.
In the next post, we’ll address the elephant in the room: how to encourage ethical AI usage in college classrooms. Until then, read part one: The Rise of Ethical AI in Higher Education.