
The Skill Was Never Essay Writing

Brandon Gadoci

January 26, 2026

A new MIT Media Lab study found that people using ChatGPT to write essays showed weaker brain connectivity than those writing without AI assistance. The LLM users couldn't quote their own work minutes after finishing. Teachers called the output "soulless." The conclusion being drawn: AI creates "cognitive debt."

But this framing asks the wrong question.

The study measures whether AI reduces cognitive engagement during essay writing. The more interesting question is whether essay writing was ever the point.

What We Actually Learned From Essays

Nobody writes essays to write essays. Essay writing is a vehicle for developing something else: clarity of thought, the ability to structure an argument, the discipline of making an idea land for a reader. The act of wrestling with blank pages and forcing scattered thoughts into coherent paragraphs builds these underlying capabilities.

The same is true for doing math by hand. We didn't make students work through long division because long division is intrinsically valuable. We did it because the process develops logical reasoning, pattern recognition, and the mental discipline to work through problems systematically.

The vehicle is not the skill. The vehicle develops the skill.

The Calculator Parallel

When calculators became widespread, critics worried they would make us worse at math. And in one narrow sense, they were right. Students who use calculators show less brain activity during computation than those doing it by hand. They often can't explain the underlying concepts as fluently.

But here's what actually happened: mathematical reasoning didn't disappear. It moved upstream.

Instead of spending cognitive effort on arithmetic, mathematicians and engineers spend it on knowing which operations to apply and interpreting what the results mean. The grunt work of calculation got abstracted away. The higher-order skill of mathematical thinking found new expression at a higher level of abstraction.

Nobody argues we should go back to slide rules. The calculator freed us to do more sophisticated mathematics. The underlying capability, the thing that arithmetic was actually developing, just found a new vehicle.

What's Happening Now

The MIT study shows that offloading essay writing to an LLM reduces cognitive engagement during the writing task. This should not surprise anyone. When you delegate work to a machine, your brain does less of that work. That's the point.

The question is: what happens to the underlying skill that essay writing was developing?

If someone uses AI to bypass thinking entirely, they won't develop clarity of thought. They'll produce output without engaging the cognitive muscles that essay writing was meant to build. That's a real risk, and the study provides evidence for it.

But if someone uses AI as part of a more sophisticated cognitive process, they can develop the same underlying skill through a different vehicle. Directing an AI toward a clear outcome requires knowing what you want to say. Evaluating whether the output captures your thinking requires the ability to recognize good argumentation. Refining and pushing back on AI drafts requires the same clarity of thought that essay writing develops.

The skill abstracts up a level. Just like it did with calculators.

The New Vehicle

What does this look like in practice?

Consider someone who prompts an AI with "write an essay about climate policy." They copy the output, submit it, and move on. This person is not developing clarity of thought. They're skipping the cognitive work entirely.

Now consider someone who spends twenty minutes articulating exactly what they want to argue, prompts the AI with specific direction, reads the output critically, identifies where it missed their point, pushes back with more precise guidance, and ultimately produces something that reflects their thinking but was assembled with AI assistance. This person is doing the cognitive work. The vehicle changed, but the underlying skill development is still happening.

The second person will be able to quote their essay. They'll understand their own argument. They'll have developed the clarity of thought that essay writing was always meant to build. They just did it through a different process.

The Abstraction Pattern

This is a pattern we've seen before, and we'll see it again.

When writing moved from handwriting to typing, we worried about losing something. We did lose beautiful penmanship. We gained the ability to produce and revise text faster, which enabled more sophisticated written communication.

When research moved from library stacks to search engines, we worried about losing deep engagement with sources. We did lose some of that. We gained the ability to synthesize across vastly more information, which enabled more comprehensive analysis.

Each abstraction trades one form of engagement for another. The question is never whether the old engagement disappears. It always does. The question is whether the underlying capability finds adequate expression in the new vehicle.

Sometimes it doesn't. There are genuine losses in each transition. But the response isn't to refuse the abstraction. The response is to be intentional about how we develop the underlying capability in the new environment.

What Organizations Should Focus On

This reframing has practical implications for how we think about AI adoption.

Stop asking whether people are using AI. Start asking whether they're developing the underlying capabilities that matter for their role.

For knowledge workers, that means clarity of thought, sound judgment, the ability to evaluate quality, the skill of directing work toward good outcomes. These capabilities can be developed through AI-assisted processes or AI-free processes. The vehicle matters less than whether the cognitive engagement is actually happening.

Training programs should focus less on "how to prompt" and more on "how to think clearly enough that your prompts produce good results." The latter requires the same underlying skill that essay writing was always developing. It just expresses through a different vehicle.

The Real Risk

The MIT study points to a real risk, just not the one the headlines suggest.

The risk isn't that AI will make us dumber. The risk is that some people will use AI to skip the cognitive work that builds capability, while others will use it as a new vehicle for the same underlying development.

Over time, this creates divergence. People who engage deeply, even through AI-assisted processes, will continue developing clarity of thought and sound judgment. People who delegate without engagement will plateau, or watch those capabilities atrophy.

This divergence already exists. Some people used essays and math homework to develop real capability. Others did the minimum to get a grade and never built the underlying skills. The vehicle was always less important than the engagement.

AI doesn't change this dynamic. It just makes the choice more visible.

What Was the Skill?

The skill was never essay writing.

The skill was thinking clearly enough to communicate, reasoning through problems systematically, developing judgment about what constitutes quality work.

Those capabilities still matter. They'll always matter. The vehicles we use to develop them will keep evolving.

The MIT study shows that one particular vehicle, writing essays with heavy AI assistance, produces less cognitive engagement than writing essays unaided. This is roughly as surprising as finding that calculators produce less cognitive engagement than doing arithmetic by hand.

The interesting question isn't whether the old vehicle still works. It's what the new vehicle looks like, and whether we're being intentional about using it to develop what actually matters.

The skill was never essay writing. It was always something deeper. Our job now is to make sure we're still developing it.
