Study finds AI tools may be eroding our cognitive abilities through “offloading”
A growing body of research suggests AI tools may be eroding our cognitive abilities through excessive “cognitive offloading,” in which people outsource mental tasks to technology. One Wall Street Journal reporter’s experience of language deterioration after heavy ChatGPT use illustrates how AI dependency might be harming rather than helping our intellectual capacities.

The big picture: A tech journalist discovered his French language skills noticeably deteriorated after relying on ChatGPT to handle his communication, highlighting broader concerns about AI’s potential negative impacts on cognitive functioning.

  • Sam Schechner, a WSJ reporter living in Paris, found himself “grasping for the right words” after habitually using AI to draft texts and emails in French.
  • His experience aligns with emerging research showing how outsourcing thinking to AI systems may weaken critical cognitive skills.

What experts are saying: Psychologists and neuroscientists warn that overreliance on AI for cognitive tasks follows patterns seen with other technologies that diminish human capabilities through disuse.

  • “With creativity, if you don’t use it, it starts to go away,” cautioned Robert Sternberg, a Cornell University psychology professor.
  • Neuroscientist Louisa Dahmani explained that “tools like GPS and generative AI make us cognitively lazy,” drawing parallels to her 2020 research showing GPS dependency impairs spatial memory.

The supporting evidence: Recent scientific studies suggest AI dependency correlates with declining cognitive performance across multiple domains.

  • Research from Microsoft and Carnegie Mellon found that critical thinking skills weakened as subjects increasingly relied on and trusted AI responses.
  • A separate study found that heavy ChatGPT use among students was associated with both memory loss and declining academic performance.

Why this matters: Language models specifically target language processing—a fundamental component of human thinking—making their cognitive impact potentially more profound than previous technologies.

  • While cognitive offloading can be beneficial when it frees mental capacity for more important tasks, language models blur the line between outsourcing tedious work and outsourcing thought itself.
  • The tendency to follow “the path of least resistance” makes mindful, balanced AI use challenging for most people, even when they recognize the cognitive risks.