Hooked on Help: Are Young People Becoming Too Reliant on AI?

Exploring how students’ dependence on AI for academic, emotional, and daily life questions could weaken critical thinking—and how teachers can encourage balanced use.


Picture Rohan—a second-year politics student—turning to ChatGPT not only for essay support but also to craft cover letters, choose courses, and manage neurodivergence-related stress.

His reliance on AI became so consistent that he admitted to worrying, "Using ChatGPT will make my brain kind of atrophy." This example, shared in the recent article "18 months. 12,000 questions. A whole lot of anxiety. What I learned from reading students' ChatGPT logs," captures a growing concern: are young people losing their spark in the AI glow?


Takeaways

1. AI’s Flattery Fuels Excessive Use

Generative AI tools are engineered for engagement. Many students report that these systems respond with warm affirmations—"That's an insightful question!"—or praise for their thought process, even when the question is routine. While working with ChatGPT, I've caught myself enjoying those flattering comments too.

While encouraging words can boost confidence, they also create a reward loop where students seek validation from AI rather than cultivating self-assessment skills. Over time, the dependency may shift from using AI as a tool to relying on it as an emotional boost before tackling any intellectual challenge.

2. AI Hallucinations Go Unchallenged

AI can misquote texts, fabricate statistics, or present outdated information with total confidence. Because the tone is authoritative, students can accept it at face value.

For example, a misattributed quotation from Homage to Catalonia could easily end up in an essay unnoticed. Without teaching students the discipline of verifying claims—such as cross-checking against multiple credible sources—AI's polished delivery can make falsehoods seem like facts.

3. AI as Academic and Emotional Crutch

Students aren’t just using AI for essay prompts or math help—they’re consulting it about course selections, job applications, friendship advice, and coping strategies.

While AI can offer clarity and organization, overuse deprives students of opportunities to wrestle with uncertainty, weigh pros and cons independently, and learn from trial and error.

These are key moments where persistence, resilience, and judgment are built.

4. Impact on Learning and Well-Being

Cognitive science tells us that struggle strengthens learning. When AI takes over idea generation, drafting, or problem-solving, students miss out on the “productive difficulty” that deepens understanding.

The mental shortcuts can also reduce long-term retention and weaken creative problem-solving.

On a personal level, leaning on AI for companionship or reassurance—especially in place of real social interactions—may contribute to loneliness or reduced emotional resilience.

5. Institutions Struggling to Respond

School and university policies on AI are inconsistent. Some require AI-use disclosure, while others ignore the issue entirely. The gap leaves educators unsure how to respond when AI-assisted work surfaces.

Without shared expectations, students receive mixed messages, which can unintentionally normalize academic shortcuts and hinder the development of a clear ethical framework for technology use.


Practical Classroom Strategies for Balanced AI Use

To help students benefit from AI without losing essential thinking skills, we can implement strategies that blend technology use with critical thinking, verification, and reflection:

1. Model Critical Checking in Real Time

  • Project an AI-generated response in class and dissect it together.
  • Highlight correct information, but also identify weaknesses, missing perspectives, or factual errors.
  • Encourage students to ask, “What would I need to verify before trusting this?”

2. Design AI + Human Hybrid Assignments

  • Have students write an initial draft of a paper or solution set before consulting AI.
  • Allow them to use AI for refinement, but require an annotation step—highlighting what came from AI and what was their own work.
  • Grade the quality of both their original thinking and their ability to critically integrate AI suggestions.

3. Implement Reflection Logs

  • Create a simple “AI Use Report” students submit with assignments.
  • Include prompts like: How did you use AI? What did it add? What did you reject and why?
  • Over time, these logs help students see patterns in their use—and overuse—of AI.

4. Encourage Peer Verification Exercises

  • Pair students to review each other’s AI-assisted work.
  • Their job: check for accuracy, detect over-reliance, and suggest where the student’s own voice could be stronger.
  • This builds a culture of accountability and collaboration while sharpening fact-checking skills.

5. Build Digital Literacy as a Core Skill

  • Teach students about how large language models work, including their tendency to hallucinate.
  • Discuss the difference between what is truthful and what merely sounds reasonable.
  • Introduce frameworks for assessing credibility, such as the CRAAP Test (Currency, Relevance, Authority, Accuracy, Purpose).

AI is a powerful ally—but without deliberate boundaries, it risks becoming a crutch that erodes curiosity, independence, and critical thinking. Teachers have a unique opportunity to guide students toward responsible use that supports, rather than replaces, their own intellectual growth.

Reflection:
What’s one way you could require students to show their own thinking alongside any AI-generated help they use?
