News | Education
20 Feb 2026 10:17

    The greatest risk of AI in higher education isn’t cheating – it’s the erosion of learning itself

    Automating knowledge production and teaching weakens the ecosystem of students and scholars that sustains universities, raising existential questions about their mission.

    Nir Eisikovits, Professor of Philosophy and Director, Applied Ethics Center, UMass Boston, Jacob Burley, Junior Research Fellow, Applied Ethics Center, UMass Boston
    The Conversation


    Public debate about artificial intelligence in higher education has largely orbited a familiar worry: cheating. Will students use chatbots to write essays? Can instructors tell? Should universities ban the tech? Embrace it?

    These concerns are understandable. But focusing so much on cheating misses the larger transformation already underway, one that extends far beyond student misconduct and even the classroom.

    Universities are adopting AI across many areas of institutional life. Some uses are largely invisible, like systems that help allocate resources, flag “at-risk” students, optimize course scheduling or automate routine administrative decisions. Other uses are more noticeable. Students use AI tools to summarize and study, instructors use them to build assignments and syllabuses, and researchers use them to write code, scan literature and compress hours of tedious work into minutes.

    People may use AI to cheat or to skip out on work assignments. But the many uses of AI in higher education, and the changes they portend, raise a much deeper question: As machines become more capable of doing the labor of research and learning, what happens to higher education? What purpose does the university serve?

    Over the past eight years, we’ve been studying the moral implications of pervasive engagement with AI as part of a joint research project between the Applied Ethics Center at UMass Boston and the Institute for Ethics and Emerging Technologies. In a recent white paper, we argue that as AI systems become more autonomous, the ethical stakes of AI use in higher ed rise, as do its potential consequences.

    As these technologies become better at producing knowledge work – designing classes, writing papers, suggesting experiments and summarizing difficult texts – they don’t just make universities more productive. They risk hollowing out the ecosystem of learning and mentorship on which these institutions are built and depend.

    Nonautonomous AI

    Consider three kinds of AI systems and their respective impacts on university life:

    AI-powered software is already being used throughout higher education in admissions review, purchasing, academic advising and institutional risk assessment. These are considered “nonautonomous” systems because they automate tasks while a person remains “in the loop”, using them as tools.

    These technologies can pose a risk to students’ privacy and data security. They also can be biased. And they often lack sufficient transparency to determine the sources of these problems. Who has access to student data? How are “risk scores” generated? How do we prevent systems from reproducing inequities or treating certain students as problems to be managed?

    These questions are serious, but they are not conceptually new, at least within the field of computer science. Universities typically have compliance offices, institutional review boards and governance mechanisms that are designed to help address or mitigate these risks, even if they sometimes fall short of these objectives.

    Hybrid AI

    Hybrid systems encompass a range of tools, including AI-assisted tutoring chatbots, personalized feedback tools and automated writing support. They often rely on generative AI technologies, especially large language models. While human users set the overall goals, the intermediate steps the system takes to meet them are often not specified.

    Hybrid systems are increasingly shaping day-to-day academic work. Students use them as writing companions, tutors, brainstorming partners and on-demand explainers. Faculty use them to generate rubrics, draft lectures and design syllabuses. Researchers use them to summarize papers, comment on drafts, design experiments and generate code.

    This is where the “cheating” conversation belongs. With students and faculty alike increasingly leaning on technology for help, it is reasonable to wonder what kinds of learning might get lost along the way. But hybrid systems also raise more complex ethical questions.

    If students rely on generative AI to produce work for their classes, and feedback is also generated by AI, how does that affect the relationship between student and professor? Eric Lee for The Washington Post via Getty Images

    One has to do with transparency. AI chatbots offer natural-language interfaces that make it hard to tell whether you are interacting with a human or with an automated agent. A student reviewing material for a test should be able to tell if they are talking with their teaching assistant or with a robot. A student reading feedback on a term paper needs to know whether it was written by their instructor. Anything less than complete transparency in such cases alienates everyone involved and shifts the focus of academic interactions from learning itself to the technology of learning. University of Pittsburgh researchers have shown that these dynamics provoke uncertainty, anxiety and distrust among students.

    A second ethical question relates to accountability and intellectual credit. If an instructor uses AI to draft an assignment and a student uses AI to draft a response, who is doing the evaluating, and what exactly is being evaluated? If feedback is partly machine-generated, who is responsible when it misleads, discourages or embeds hidden assumptions? And when AI contributes substantially to research synthesis or writing, universities will need clearer norms around authorship and responsibility – not only for students, but also for faculty.

    Finally, there is the critical question of cognitive offloading. AI can reduce drudgery, and that’s not inherently bad. But it can also shift users away from the parts of learning that build competence, such as generating ideas, struggling through confusion, revising a clumsy draft and learning to spot one’s own mistakes.

    Autonomous agents

    The most consequential changes may come with systems that look less like assistants and more like agents. While truly autonomous technologies remain aspirational, the dream of a researcher “in a box” – an agentic AI system that can perform studies on its own – is becoming increasingly realistic.

    The growing sophistication and autonomy of technological systems mean that scientific research can increasingly be automated, potentially leaving people with fewer opportunities to build skills by practicing research methods. NurPhoto/Getty Images

    Agentic tools are anticipated to “free up time” for work that draws on more human capacities like empathy and problem-solving. In teaching, this may mean that faculty still teach in the headline sense, while more of the day-to-day labor of instruction is handed off to systems optimized for efficiency and scale. Similarly, in research, the trajectory points toward systems that can increasingly automate the research cycle. In some domains, that already looks like robotic laboratories that run continuously, automate large portions of experimentation and even select new tests based on prior results.

    At first glance, this may sound like a welcome boost to productivity. But universities are not information factories; they are systems of practice. They rely on a pipeline of graduate students and early-career academics who learn to teach and research by participating in that same work. If autonomous agents absorb more of the “routine” responsibilities that historically served as on-ramps into academic life, the university may keep producing courses and publications while quietly thinning the opportunity structures that sustain expertise over time.

    The same dynamic applies to undergraduates, albeit in a different register. When AI systems can supply explanations, drafts, solutions and study plans on demand, the temptation is to offload the most challenging parts of learning. To the industry that is pushing AI into universities, this work may look “inefficient”, something students would be better off letting a machine handle. But it is precisely that struggle that builds durable understanding. Cognitive psychology has shown that students grow intellectually by drafting, failing, trying again, grappling with confusion and revising weak arguments. This is the work of learning how to learn.

    Taken together, these developments suggest that the greatest risk posed by automation in higher education is not simply the replacement of particular tasks by machines, but the erosion of the broader ecosystem of practice that has long sustained teaching, research and learning.

    An uncomfortable inflection point

    So what purpose do universities serve in a world in which knowledge work is increasingly automated?

    One possible answer treats the university primarily as an engine for producing credentials and knowledge. On this view, the core question is output: Are students graduating with degrees? Are papers and discoveries being generated? If autonomous systems can deliver those outputs more efficiently, then the institution has every reason to adopt them.

    But another answer treats the university as something more than an output machine, acknowledging that the value of higher education lies partly in the ecosystem itself. This model assigns intrinsic value to the pipeline of opportunities through which novices become experts, the mentorship structures through which judgment and responsibility are cultivated, and the educational design that encourages productive struggle rather than optimizing it away. Here, what matters is not only whether knowledge and degrees are produced, but how they are produced and what kinds of people, capacities and communities are formed in the process. In this version, the university is nothing less than an ecosystem that reliably forms human expertise and judgment.

    In a world where knowledge work itself is increasingly automated, we think universities must ask what higher education owes its students, its early-career scholars and the society it serves. The answers will determine not only how AI is adopted, but also what the modern university becomes.


    The Applied Ethics Center at UMass Boston receives funding from the Institute for Ethics and Emerging Technologies. Nir Eisikovits serves as the data ethics advisor to MindGuard, a startup focused on integrating AI into companies’ workflows.

    Jacob Burley receives funding from The Applied Ethics Center at UMass Boston.

    This article is republished from The Conversation under a Creative Commons license.