
The Calculus of the Shortcut: Lessons from a Conversation with a College Junior

January 30, 2026 | Author: Barbara Kenny | Read time: 5 min

Learn how AI can be used to promote productive struggle.

I recently spoke with a college junior who offered a painful, yet essential, truth about learning today.

She confessed that for some assignments, the work felt “painfully tedious.”

Not because the material was difficult, but because the assignment itself was a high-effort, multi-step task designed only to achieve a surface-level check for understanding. In her mind, the effort was completely misaligned with the reward. She didn’t want to use AI, but she recognized the pressure to take the shortcut.

This student didn’t want to cheat; she wanted to learn. But the task design made the shortcut irresistible.

This is the hidden crisis of generative AI in higher education: It’s not just an academic integrity problem; it’s an assignment design problem. The threat isn’t plagiarism; it’s that AI has exposed how often assignments weren’t asking students to engage in meaningful productive struggle in the first place. Now we have an opportunity to design tasks that truly spark curiosity, effort, and intellectual growth.

The Effort-Reward Matrix: Why Students Opt for the Shortcut

To solve this, we must first understand the student’s calculus. Learning, especially complex critical thinking, is work. Students are only compelled to do that work when the reward is worth the effort.

The student conversation reveals a spectrum along which students decide whether to engage in the work or take the shortcut:

  • When the Reward is Small, Keep the Effort Small: For a frictionless, completion-based assignment like a quick multiple-choice check for understanding, the output is surface-level. Students aren’t compelled to use AI because the low effort required doesn’t motivate it.
  • When the Reward is High, Demand Productive Struggle: For assignments that are hard, challenging, and meaningful, the student gains something from the task and wants that reward. This intrinsic motivation means they won’t use AI.

The real danger lies in the misalignment between effort and reward. This is the Painstaking Task: a complex assignment that demands high effort but is designed only to gauge surface-level understanding. When the effort is high but the reward is low, the student knows they can use AI to spit out the answer without seeing it as a risk, because the task isn’t challenging them to do anything besides demonstrate understanding.

Integrating AI: From Suspicion to Shared Responsibility

We can no longer design policies and assignments around the assumption that students want to cheat, as this will erode trust between students and teachers, which is foundational to any sort of learning environment.

Instead, we must shift from a framework of suspicion and detection to one of trust and integration.

This requires three strategic shifts for higher ed leaders:

1. Teach AI Literacy Through Shared Vulnerability

AI literacy is not about citing sources or defining policy—it’s about discernment and critical thinking.

  • Transparency is Key: AI integration without transparency will breed distrust. Faculty should model responsible AI use by openly sharing when they used AI, when it worked, and when it failed.
  • Embrace Uncertainty: Instructors don’t need to be AI experts. They can navigate the uncertainty with their students, inviting open conversation about the power and limitations of the models. Clarify which skills must remain distinctly human, and why.

2. Redesign the Learning Process: AI as a Voice, Not a Vault

We must bring AI into existing structures to elevate the human conversation, not replace it.

Use cooperative learning strategies like “Think, Pair, Share” to weave AI literacy into the core curriculum.

  • Think: Students can prompt AI on their own.
  • Pair: Students partner to compare and critique the AI output, analyzing what perspectives are missing and how the model mirrors the user’s framing.
  • Share: The class shares its findings and critiques, positioning the AI output as just another voice in the class that needs to be evaluated.

This approach forces students to engage in critical thinking to analyze the answer, not just receive it.

3. Lead with the Belief That Students Want to Learn

The ultimate advice for higher education leaders is to never let go of the belief that students want to learn.

If an assignment demands a lot of effort, make sure it is meaningful. Make it personal, tie it to current events, and give students agency and choice in how they demonstrate their understanding.

The rise of AI elevates the need for productive struggle. Our job is to use Instructional AI deliberately, not universally, ensuring that while AI may gain a voice in the classroom, the human voices remain ten times more important. Learning is a social act that must be kept human-centered.

The threat of generative AI is the erosion of the critical skills necessary for student success. The shift from a reactive strategy of suspicion and detection to a proactive strategy of assignment design is essential.

We must make the reward worth the effort.

Ready to see how Instructional AI can transform your assignment design and prevent student skill decay?
Watch Our On-Demand Webinar: Designing for Depth: Assignment Strategies That Prevent Cognitive Offloading