About the Author: Hello, readers! My name is Loki, and I’ve been writing on Write the World for a few years now. I’m 14, and I started doing fantasy writing in first grade, though now my genre is typically nonfiction. I enjoy imagery and unconventional poem styles, as well as sculpting and birdwatching. My goal as Write the World's teen AI Liaison is to educate, not for the purpose of discouraging all artificial intelligence in writing, but to help you understand the effects of such tools and safe ways to use them.
Without further ado, I hope you find this first post helpful.
--
Welcome to the age of AI, my fellow writers. We’re all hooked into the threads of what if, should I, how, why. It’s what some may think of as the age of anti-inquisition, the age of anti-intellectualism. But are we facing the end of critical thinking, or an era of endless literary potential?
Artificial intelligence has made its way into every aspect of the modern teen's life, or at least the parts relevant to academic success. I've been an English tutor, an editor, a poet, and now Write the World's teen AI Liaison, which is to say that I've watched LLMs grow and change into a fearsome beast. Along the way, after a long look at other authors' takes on AI and the science behind it, I've reached some conclusions about artificial intelligence in writing, the roots of anti-intellectualism, and how we can use these tools responsibly and ethically.
Perhaps with these in mind, it can be an age of AI for us writers as well?
How AI Can Harm Critical Thinking
“When will we ever use this?”
This popular refrain sometimes seems, from my perspective as a high school student, like a fair enough argument against many topics covered in school. Elementary school covers foundational skills, and college gives you specialization in the field you're going into. But the classes that bridge the two don't always feel immediately relevant.
And so, many of us ask: Will we ever use this? Why would we need to, when there's a machine that can replace the function of our brains to complete the very same task? Perhaps we never really needed to learn how to light up those neurons; perhaps it's best to streamline poems into prompts for a generator?
Is that what the new writer's merit should be?
If you're a writer, artist, or creative, I certainly hope you didn't blindly agree with the above statements. As someone who is critical of many academic systems, I feel obligated to let you know that there is still a benefit to doing your own work. It's enrichment; it's training yourself to learn. While mathematics and English once seemed niche to me in the grand scheme of careers, these subjects lay a strong foundation for your future after all. Not only for knowing basic math facts or how to write an op-ed, but for learning how to learn.
While the common calculator handles your grocery prices or taxes far better than the average mental calculation could, using the tool does not strengthen your mind. In fact, such dependency can reduce your capacity for creating, analyzing, and thinking for yourself. We can apply the same thinking to AI. Artificial intelligence has become commonplace in the writing of essays, both the short and long responses assigned in the classroom. I see the appeal. However, total surrender to artificial intelligence in place of authentic human writing costs more than a letter grade: it compromises the ability to write, research, and form opinions. Caution and moderation are key.
AI and the Risk of Reducing Independent Thought
I encourage readers to do their own research, beyond this blog post, into how convenience, though enticing, has a cost. Do not let boredom and procrastination turn into compliance with whatever answer or statement AI doles out. Remember that writing is not only about perfect grammar or formatting, despite what many classes might teach you. The content of your writing matters. LLMs can never triumph over human thought and effort.
Knowing this, that storytelling is uniquely important and powerful, it is imperative that other young writers be aware of anti-intellectualism and how AI can foster it. Anti-intellectualism refers to a general distrust of intellectual thought and of those who engage in it, and it is far more dangerous than it may at first sound.
A significant majority of students now use AI tools for essays and papers: in the UK, 92% of students use AI in some form (up from 66% the previous year), and 88% use generative AI for assessments. Depending on how students use AI (to generate essays, say, versus to receive feedback on their own drafts), they may develop a reliance on these machines for critical thinking, for research, and even for completing entire assignments. Although this may sound convenient, it defeats the purpose of the assignment: teaching analytical thought and genuine engagement with the topic at hand.
What's more, having so much information readily available makes it easier to take things at face value: easier to skip your own research, to read biased information and absorb it uncritically. In the end, AI tools are products, and if they can sense your bias, they will lean into it. (Or worse, the tool itself can be biased by its training data toward a political or scientific agenda, and so perpetuate those thought patterns.)
Another cornerstone of anti-intellectualism is believing, as a population, that practical experience and intuition are somehow "better" than formal education and the arts. This might mean dismissing scientific papers as pretension, or viewing fields like history, literature, and philosophy as irrelevant or unproductive. Dismissing these topics is a kick in the face to collective knowledge and evidence-based reasoning. When a culture condemns the act of study and the merits of being well-read or formally educated, it devalues the people who dedicate their lives to research and to sharing it for the greater good. Their findings and expert statements are brushed aside, and we may miss urgent and important facts and discoveries.
This phenomenon leads us into a state where we dismiss ethical discussions, empathy, and our capacity to understand history and science in a nuanced way. If these patterns dominate, we could see a time when the general population places little value on thinking for themselves and making informed, objective decisions.
With these concerns in mind, AI could prove a slippery slope: information at your fingertips, summarized and possibly biased (or outright false), eliminating the "need" to put effort into your own papers and research.
But this doesn’t have to be the case.
Responsible Uses of AI Tools
Now, given that I've shared some of the worst-case scenarios related to AI in our communities, I should clarify that AI itself is not inherently evil, nor are the people who use it. It is a tool, and there are use cases that pose far less risk and that offer genuinely helpful, productive assistance.
Some examples include adapting content for people who otherwise couldn't access it, such as AI-assisted translation and transcription: think text-to-speech, or the recent tools on major social media platforms that add closed captioning to a much higher percentage of posts. It is, of course, important to note that these scenarios involve AI, but not necessarily large language models.
Regarding the use of AI in schools, there are ethical applications, such as personalized practice tools and helping teachers quickly build visualizations of instructional concepts. In writing, while it is imperative for students to learn how to draft, research, and summarize information themselves, some AI tools, like Write the World's Socratic AI writing companion, Clara, can provide assistance in ways that do not compromise the human voice. Clara, for example, can collaborate with you to come up with prompts and themes, but only with your direct engagement, ensuring you have a jumping-off point for your piece while the writing remains yours.
Why We Write
An important question to ask before engaging with AI for any work is: Why do we write in the first place?
We write to learn, perceive, and change our worlds. We write to share ideas, information, thoughts, and emotions. This is writing: the strongest tie between art, humanity, and information. Artificial intelligence is here to stay, but we are humans, and AI is not here to shape us into its form. It is here for us to wield as a tool, one that should enhance our ability to learn rather than strip it away.
If what you've learned about AI has discouraged you from making art, poetry, or essays because a machine can do it faster and more efficiently, realize now: your voice is needed. Despite what the world seems to say, despite how LLMs may seem to blow your skills out of the water, understand that your unique skills and voice are our first line of defense against anti-intellectualism, blind obedience, and illiteracy. I see it, and so do your fellow writers. Artificial intelligence is ours to use. Not the other way around.
I leave you with this: Create. Learn. Think. And Write.