How AI is transforming higher education: FT
The use of generative AI tools like ChatGPT raises concerns over plagiarism, misuse, and the need for new academic policies and approaches.
Students take part in a summer math boot camp on August 1, 2023, at George Mason University in Fairfax, Virginia (AP)
More than a quarter of the prompts submitted by college-age users in the US are for educational purposes, according to internal research by OpenAI, the company behind ChatGPT, the world’s leading large language model.
While chatbots can support research and learning, many educators worry that students are using them to shortcut assignments. A recent survey by the Digital Education Council, a global university network, flagged plagiarism as one of the top concerns, alongside ethical issues, the erosion of academic standards, and the difficulty of integrating AI tools into existing institutional systems.
In Lynn Rogoff’s business communications class, final assessments go beyond crafting entrepreneurial pitches. Students must also defend their ideas in a live, on-camera Q&A session with professors and classmates.
'Perfect exercise for the real world'
Rogoff calls it a "perfect exercise for the real world," explaining that the challenge isn't just about presentation skills; it's also a way to gauge how deeply students have engaged with the material and whether they've relied too heavily on artificial intelligence. “I discovered that the more novel and unique the proposition is, the harder it is for [them] to use AI,” she says.
Rogoff’s strategy reflects a broader response among educators to the rapid rise of generative AI and its disruptive potential. While some professors are integrating AI tools into their teaching, others are moving to restrict or ban their use entirely. Still, finding the right approach is complicated by one key fact: students are already using AI extensively.
Dodging detection through word spinners
As institutions wrestle with plagiarism, ethical concerns, the devaluation of degrees, and the complexity of integrating AI into existing data and tech systems, some students are becoming more resourceful at evading detection, turning to tools such as word spinners, software that reorders or alters words, to bypass plagiarism checks and AI detectors.
“The online AI detectors are useless... The easiest to spot is when students reference things that didn’t happen in the text,” says a graduate teaching assistant at a US college who spoke on condition of anonymity. Their personal policy is strict: if AI use is blatantly obvious, the paper fails.
In response, some professors have started requiring students to submit the prompts they enter into AI tools alongside their essays or assignments. Still, while academic institutions continue to grapple with how to detect AI usage and draft policies around it, tech companies have been quick to fill the gap with tailored solutions for education.
Rethinking learning: AI as a companion, not a crutch
Several firms are now refining the AI models behind chatbots to better support learning. Their goal: capture a share of the educational market by appealing to the next generation of professionals. For instance, OpenAI has released ChatGPT Edu, a version specifically designed for academic use.
Anthropic, a San Francisco-based company, is offering an educational version of its Claude AI assistant. The company says this version is built to guide students with gentle nudges, rather than supplying direct answers or essay templates. OpenAI is also reportedly testing a similar tool aimed at encouraging student learning.
According to tech leaders, AI should be seen as a learning partner, not a shortcut. “Before calculators, how math tests were scored looked really different [and we] expect that we’re going to see something really similar with the advent of AI,” says Daniela Amodei, co-founder and president of Anthropic.
“It’s really much more about explaining, like, ‘How did you come to this conclusion? What’s the supporting evidence? What are the citations?’ And I think that that’s a place where AI can be a powerful companion versus sort of a replacement for actually doing critical thinking work.”