Keep Doing Your Own Writing

Holland & Hart - Your Trial Message

I’m old enough to feel okay claiming full “curmudgeon” status when it comes to A.I. as a writing tool. I know some will say that puts me behind the times, and others will say that I’m missing out on opportunities. But the early social science on the effects of A.I. on writing and writers suggests that I may be right. Composing drafts with large language models may offer a convenience gain that comes at a steep cognitive cost. Based on a number of studies reviewed in a recent article in The New Yorker, “A.I. Is Homogenizing Our Thoughts,” reliance on composition aids like ChatGPT appears to limit the variety of ideas and to constrict the intellectual development that comes from generating them on our own.

For me, the concern is not just the risk of the euphemistically named “hallucinations,” or a distrust of composition based purely on probability algorithms run by an entity with no regard for (or even concept of) “the truth.” It is that, in neglecting and outsourcing our creative ability to compose text, we are endangering something deeply important. Older readers will remember a time when we could not only read a paper map but use one to navigate our way across town or around the country. Now, just a short time later, those habits and skills seem mostly lost because it is so much easier to simply follow turn-by-turn instructions from our phones. What if the same thing happens to our ability to write?

The Research: What Does A.I. Do to Our Writing?

The New Yorker article, by Kyle Chayka, points to several early and troubling research findings. In studies looking at both the products and the process of writing (picture a room full of students writing either with or without ChatGPT while electrodes monitor their brain activity), researchers have found several effects:

Less Engagement

In a study conducted by the MIT creativity lab (Kosmyna et al., 2025), researchers measured brain activity in participants writing either on their own, with the use of a search engine, or with the use of ChatGPT. They noted a “dramatic discrepancy” in the latter group, measuring less brain activity, fewer connections across different regions of the brain, and reduced engagement of the subjects’ working memory.

Less Creativity

The same MIT study also measured less of what is called “alpha connectivity,” which is associated with creativity. A Cornell study (Agarwal, Naaman & Vashistha, 2025) backed that up by looking at the variety of content in the written products of those using or avoiding A.I. tools. Participants from India and America were asked to write about their favorite food or holiday, and researchers found that the use of an A.I. aid produced greater homogeneity: answers were more similar to each other and geared more toward Western cultural norms (e.g., those using the tool were significantly more likely to offer “pizza” as their favorite food). An analysis from Santa Clara University (Kreminski, 2024) similarly compared creative-thinking tasks, noting that the ideas generated by the ChatGPT-using group were more homogenized and semantically similar. The New Yorker article concluded, “when people use A.I. in the creative process they tend to gradually cede their original thinking.”

Average Expression

The reduced creativity shows up in the written product, and the problem could accelerate if a greater proportion of the text on the internet becomes A.I.-generated. The MIT study referenced above notes that for the ChatGPT-using research subjects, “the output was very, very similar for all of these different people.” For example, participants were given a prompt on philanthropy (“Should people who are more fortunate than others have more of a moral obligation to help those who are less fortunate?”), and all of the ChatGPT-aided responses ended up supporting philanthropy for similar reasons, while some of those written without that aid included critiques of philanthropy. With the A.I., “you have no divergent opinions being generated,” the researchers noted. The Santa Clara research (Kreminski, 2024) also argues that A.I. use in writing pulls the user “toward the center of mass for all the different users that it’s interacted with in the past.”

Less Ownership and Knowledge

Finally, the MIT study measured the sense of ownership or responsibility that participants felt for the written product and observed that the ChatGPT users felt “no ownership whatsoever” over the resulting work, with fully 80 percent unable to quote from their own essays. The New Yorker concludes, “With A.I., we’re so thoroughly able to outsource our thinking that it makes us more average, too.”

All of these are effects that lawyers should seek to avoid. In response to these concerns, experienced writers might think, “It is just a tool; I still decide what goes into the writing and what doesn’t. I am still the writer in the sense that I am the one ‘curating’ the content.” That, however, may be wishful thinking. One of the researchers in the Cornell study, Aditya Vashistha, compares the use of A.I. as a writing aid to a teacher sitting with you and constantly reminding you that “this is the better version.” Over time, it is likely that new writers, especially, will simply start to defer to that voice. “Through such routine exposure,” Vashistha argues, “you lose your identity, you lose the authenticity. You lose confidence in your writing.”

Ultimately, our ability to express ourselves is so bound up with our ability to think that any displacement of authorship seems profoundly dangerous. In legal writing in particular, courts and clients are looking not just for knowledge but for judgment and perspective. At present, A.I. can do a passable-to-good job of creating communication that seems like it could have come from an at least moderately informed and thoughtful person. But when we rely on A.I., we directly reduce our own chances of being that person.

I write frequently for this blog; I also author other articles and, of course, generate frequent reports as part of my work. Like many of us, I spend a good part of the day composing emails. Through all of that, and despite the nagging from something new called “Copilot,” I have not succumbed to any temptation to ask A.I. to write anything for me. No posts, no articles, no reports, no emails, not even a topic list or a brainstorm.

At present, powerful commercial interests are aligned behind pushing the working public to rely more and more on A.I. tools for our basic expression. For now at least, I plan to resist.

© Holland & Hart - Your Trial Message
