Artificial intelligence tools like ChatGPT have quickly become part of daily life for millions of people worldwide. From drafting emails to writing essays, these tools promise speed, convenience, and flawless grammar. But a new study from researchers at the Massachusetts Institute of Technology (MIT) suggests that over-reliance on AI might come with an unexpected cost: weakening our ability to think for ourselves.
In a first-of-its-kind experiment, the MIT Media Lab monitored brain activity in people while they used ChatGPT for writing tasks. The results were both surprising and unsettling.
What the Study Discovered
In the study, 54 adults aged between 18 and 39 were divided into three groups. One group wrote essays using ChatGPT, another used Google Search, and the third relied only on their own thinking. Throughout these tasks, researchers measured brain activity using electroencephalography (EEG) sensors.
They found that those who used ChatGPT showed the lowest brain engagement, especially in areas connected to memory, focus, and creativity. By comparison, participants who worked without digital tools showed the highest levels of mental activity. Notably, people using ChatGPT struggled to remember what they had just written and often felt less ownership over their work.
When the AI users were later asked to write without assistance, their brain activity remained low, suggesting that even a short period of heavy reliance on AI tools could dull mental sharpness, at least in the short term.
Essays That Sound Smart but Lack Soul
While essays produced with AI were generally clear and grammatically perfect, human judges described them as bland, generic, and lacking originality. The MIT researchers noted that ChatGPT tends to generate “safe” and predictable responses, making the user’s work feel robotic.
By the final session, most AI-dependent participants were simply copying and pasting what ChatGPT generated, with little editing or critical thought. Around 83% of them couldn’t even recall what their essay was about just a few minutes after completing it.
Experts Say: AI Isn’t the Problem — It’s How We Use It
Cognitive scientists warn that the issue isn’t the AI itself, but how people choose to interact with it. The study’s authors compared the trend to past concerns about calculators making students worse at mental math or GPS apps weakening our sense of direction.
“If used passively, AI can easily become a mental shortcut that reduces effort and attention,” said one of the lead researchers. “But when paired with active thinking and reflection, it can be a helpful partner rather than a crutch.”
Some educators now suggest teaching “AI literacy” — lessons on how to use AI tools critically, question their outputs, and make independent choices alongside technology.
What This Means for Everyday Users
Though the MIT study is still awaiting peer review and was conducted on a relatively small group of participants, its findings raise important questions about the future of human-AI interaction. As AI becomes more powerful and accessible, people may need to be more mindful about when and how they use these tools.
The researchers recommend a balanced approach: use ChatGPT for brainstorming or checking grammar, but write first drafts independently, reflect on ideas, and revise AI-generated content critically.
This study serves as an early reminder that while AI can make writing easier, it shouldn’t replace the human mind’s ability to think, reason, and create.