How professors can use AI without encouraging cheating

Feb 14, 2026

— Romain

Learning


When ChatGPT launched in late 2022, higher education panicked. Within months, universities rushed to implement AI detection software like Turnitin, fearing widespread cheating and diminished critical thinking. But two years later, the data tells a different story: 38% of faculty at USC Marshall School now use AI in their classrooms, and 86% of professors globally believe AI will significantly transform teaching.

The question is no longer whether AI belongs in education, but how professors can harness it to support pedagogy without compromising academic integrity.

If you are a program director or instructional designer, you are likely caught between faculty resistance and institutional pressure to "do something about AI." We wrote this guide to help you bridge that gap, with concrete ways to integrate AI into platforms like Moodle while keeping professors firmly in control of the learning experience.

Why banning AI doesn't work (and what actually does)

Banning AI tools might feel like the safest option, but it is not.

In reality, over 80% of students at elite colleges already use generative AI, mostly to augment learning: getting explanations, feedback, and summaries rather than outsourcing work. When institutions ban AI, they do not stop its use; they drive it underground and lose touch with their students.

More critically, 63% of professors say recent graduates are unprepared to use generative AI at work, and 71% report that graduates lack an understanding of AI ethics. By banning AI in academic settings, institutions are graduating students into an AI-first workforce without the skills to navigate it.

Here is what we believe works instead: policy frameworks that balance encouragement with guardrails. The World Economic Forum reports that 71% of teachers and 65% of students view AI assistants as essential for learning and workforce preparation (link), and we think campuses are an important place for students to start learning how to make the most of AI.

AI assistance vs. AI replacement: the critical distinction

Faculty anxiety often stems from a blurred line: Is AI helping students learn, or doing the learning for them?

AI assistance means the technology augments the learning process:

  • Providing personalized explanations of complex concepts

  • Offering formative feedback on drafts (not final submissions)

  • Generating practice questions for self-assessment

  • Summarizing lengthy readings to improve comprehension

By contrast, AI replacement means the technology circumvents learning:

  • Writing essays or completing assignments without student input

  • Solving problem sets automatically

  • Bypassing critical thinking requirements

The difference lies in pedagogical intent. When AI handles routine cognitive tasks, like formatting citations or generating basic summaries, it frees students for higher-order thinking. When AI replaces analytical work, it undermines the learning outcomes professors are paid to deliver.

A key insight: students with higher AI proficiency actually perceive less threat to their future jobs than those with lower proficiency. Familiarity reduces fear, and our objective is to move professors from anxiety to competence through structured, pedagogically sound AI integration.

How to support pedagogy with AI tutors inside Moodle

Moodle remains the dominant LMS in higher education and has never stopped evolving. In particular, Moodle 4.5 introduces native AI capabilities through an AI subsystem that manages providers (Azure AI, OpenAI) and placements (where AI actions appear in the interface).

Here's how purposeful AI integration inside Moodle can enhance your pedagogy:

  1. Personalized learning at scale: AI teaching assistants can analyze individual student performance and adapt content pacing in real time. For program directors managing large introductory courses, this means identifying at-risk students weeks earlier than traditional grade-book reviews.

  2. Automated administrative tasks: AI can handle routine questions about deadlines, assignment requirements, and resource locations, freeing up instructor time while ensuring students get immediate answers. This does not replace teaching; it removes friction from the learning environment.

  3. Formative feedback loops: Students receive instant AI-generated suggestions on drafts instead of waiting days for instructor feedback. The professor remains the final arbiter of grading, but students get more revision cycles, which improves the quality of their final work (see the sketch after this list).

  4. Content transformation: AI assistants can convert existing materials (SCORM packages, PDFs, even handwritten notes) into interactive learning modules without instructional designers having to rebuild courses from scratch.
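To make the formative feedback loop concrete, here is a minimal sketch of how a draft-review assistant could be wired up. It is an illustration, not a Moodle 4.5 component: it assumes the OpenAI Python SDK with an OPENAI_API_KEY in the environment, and the model name, rubric text, and `review_draft` helper are all placeholders you would adapt to your own provider and course.

```python
# Minimal sketch of a formative feedback loop (illustrative only).
# Assumes the OpenAI Python SDK and an OPENAI_API_KEY environment variable;
# the rubric, prompt wording, and model name are placeholders.
from openai import OpenAI

client = OpenAI()

RUBRIC = """
- The thesis is clearly stated in the introduction
- Each claim is supported by a cited source
- The conclusion synthesizes rather than repeats
"""

SYSTEM_PROMPT = (
    "You are a formative writing assistant. Give concrete, revision-oriented "
    "suggestions measured against the rubric. Do NOT rewrite the draft and do "
    "NOT assign a grade; the instructor remains the final arbiter."
)

def review_draft(draft: str) -> str:
    """Return AI-generated revision suggestions for a student draft."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": f"Rubric:\n{RUBRIC}\nDraft:\n{draft}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(review_draft("Climate policy matters because..."))
```

The system prompt is what keeps this on the "assistance" side of the line drawn above: the assistant suggests revisions but never rewrites or grades.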

Practical teaching use cases (cheat-resistant)

Based on current adoption patterns among ELA and science faculty (who use AI tools at nearly double the rate of math faculty), here are proven use cases that can enhance learning without compromising integrity:

  1. The Socratic AI tutor: Configure AI assistants to ask guiding questions rather than provide answers. When students struggle with a concept, the AI engages in Socratic dialogue, prompting critical thinking instead of delivering solutions (a minimal sketch follows this list).

  2. Transparent AI drafting: Require students to submit AI-generated first drafts alongside their final work, with reflection on what they changed and why. This teaches AI literacy while maintaining accountability.

  3. Scenario-based learning: Use AI to generate infinite variations of case studies or problem scenarios. Students can't copy answers because each iteration differs, but they get personalized practice matching their skill level.

  4. Peer review augmentation: AI assistants can guide students through structured peer review protocols, ensuring feedback quality without replacing human judgment.

  5. Accessibility support: For students with learning differences, AI can provide real-time transcription, simplified language versions of complex texts, or audio explanations of visual content.
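As a sketch of the first use case, most of the "Socratic" behaviour comes down to how the assistant is instructed. The snippet below is an illustration rather than a Moodle plugin: it assumes the same OpenAI Python SDK as above, and the system prompt, model name, and `tutor_session` helper are placeholders.

```python
# Minimal sketch of a Socratic tutor configuration (illustrative only).
# The pedagogy lives in the system prompt: the assistant is told to respond
# with guiding questions instead of answers. Model name is a placeholder.
from openai import OpenAI

client = OpenAI()

SOCRATIC_PROMPT = (
    "You are a Socratic tutor. Never give the final answer or solve the "
    "problem for the student. Respond with at most two short guiding "
    "questions that push the student one step forward in their own reasoning."
)

def tutor_session() -> None:
    """Run a simple command-line dialogue that keeps the full conversation history."""
    messages = [{"role": "system", "content": SOCRATIC_PROMPT}]
    while True:
        student_input = input("Student> ")
        if not student_input.strip():
            break  # empty line ends the session
        messages.append({"role": "user", "content": student_input})
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=messages,
        )
        answer = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": answer})
        print(f"Tutor> {answer}")

if __name__ == "__main__":
    tutor_session()
```

Keeping the full message history matters here: it lets the tutor build each new guiding question on the student's previous reasoning instead of restarting the dialogue every turn.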

Addressing the real faculty concerns

If you are presenting AI integration to skeptical professors, we recommend acknowledging their legitimate concerns:

  1. "AI will make students lazy thinkers" → 81% of faculty rank facilitating critical thinking as the most essential skill in the digital age. AI integration must explicitly preserve and enhance critical thinking. Socratic AI tutors can help here, as can transparent AI-use policies and assessments that require synthesis beyond AI capabilities.

  2. "I don't have time to learn new technology" → 68% of faculty say their institutions haven't prepared them to use AI. Embedding AI features into existing workflows (such as Moodle integrations) is a good way to make the transition smooth before larger-scale deployment.

  3. "How can I prevent cheating if AI can write essays?" → 54% of faculty believe current evaluation methods are inadequate in the AI age. We believe this is an assessment design challenge rather than an AI problem. Shift toward process-oriented evaluation, oral exams, and in-class synthesis tasks.

Getting started: a roadmap for program directors

  • Phase 1: Audit current AI usage (month 1). Survey faculty and students on current AI use. You'll likely find that 80%+ already use generative AI, whether sanctioned or hidden.

  • Phase 2: Establish an AI literacy baseline (months 2-3). Keep in mind that 40% of faculty are just beginning their AI literacy journey, and only 17% consider themselves advanced. Invest in faculty development before platform deployment.

  • Phase 3: Pilot in low-stakes environments (months 4-6). Test AI integration in formative assessments and student support before high-stakes implementation.

  • Phase 4: Scale with governance (month 7+). Develop clear policies on AI use in coursework: 93% of teachers and 79% of students believe regulation is needed on this matter.

The strategic advantage

We strongly believe that institutions that thoughtfully integrate AI now will differentiate themselves over the next three years. Those that ban or ignore it will graduate students unprepared for an AI-augmented workforce.

For program directors and instructional designers, we see a clear opportunity here: be the internal champion who bridges faculty concerns with institutional innovation. Teachers may not hold procurement budgets, but they champion tools internally. When professors see AI as a pedagogical ally rather than an integrity threat, they become your strongest advocates.

Good luck!

All rights reserved ©2025 Raison, SAS.
