Perils and Opportunities of ChatGPT: A High School Perspective
By Katherine McKean, Junior and President of my High School AI Club
ChatGPT jumped into the spotlight so fast it felt like someone hit “fast forward” on school life. Suddenly, it was in group chats, in last‑minute homework panics, in coding club Slack threads, and even in our class discussions about academic honesty. As the president of my school’s AI club, I’ve watched ChatGPT become both a lifeline and a lightning rod — helping some students stay afloat while making others question what “doing your own work” even means anymore.
This article is my snapshot of how ChatGPT fits into high‑school life right now: what it offers, what it risks, and how we — students, teachers, parents — might navigate it together.
when ChatGPT works like a helpful study buddy
One of the simplest perks of ChatGPT is how it breaks down complex ideas into digestible pieces — like a tutor who never tires of questions at midnight. I’ve seen classmates paste dense biology paragraphs or history readings into ChatGPT and get back summaries that make sense. For students juggling sports, after‑school jobs, and heavy course loads, that clarity can be a godsend. It doesn’t replace reading or thinking — but it’s a tool that helps bridge comprehension gaps.
Some students use it to draft outlines for essays or projects. Others ask it to help with grammar, tone, or structure. For English‑language learners, it feels like a patient editor who catches awkward phrasing or suggests more natural wording. For students with busy schedules, it sometimes transforms a late‑night scramble into a manageable revision session.
According to recent data from College Board, around 69% of high school students reported using generative AI tools like ChatGPT to help with schoolwork in 2025. [College Board Newsroom](https://newsroom.collegeboard.org/new-research-majority-high-school-students-use-generative-ai-schoolwork) That shows how widespread the use has become — not some niche club experiment, but a part of daily student life.
when helpful becomes risky
But ChatGPT doesn’t always get it right. Sometimes it gives wrong facts, mis‑dates historical events, or invents sources that don’t exist. If a student trusts those mistakes without verifying them, the result can be worse than no help at all. It looks polished — but it might be misleading. That’s a real risk when reliance on AI replaces genuine research or critical thinking.
Because ChatGPT‑generated text often escapes traditional plagiarism detection tools, it can feel tempting to pass off the output as one’s own. That’s caused a wave of concern among educators. According to a 2023 article by the National Education Association (NEA), more than one‑quarter of teachers surveyed said they had caught students using ChatGPT to cheat. [National Education Association](https://www.nea.org/nea-today/all-news-articles/chatgpt-enters-classroom-teachers-weigh-pros-and-cons) For many schools, that has triggered new policies, stricter grading rubrics, or outright bans.
academic integrity: is ChatGPT just the latest cheat method?
There’s a long history of students seeking shortcuts. Essay mills, crib notes, copying from friends — it’s nothing new. ChatGPT just makes those shortcuts easier to disguise. If you turn in an AI‑generated essay that looks polished and passes originality checks, how can a teacher tell if you wrote it or not?
One concern is that assignments designed decades ago — five‑paragraph essays, take‑home essays, out‑of‑class research — are suddenly easy to outsource completely. That weakens the integrity of assessments meant to test student thinking and learning. It becomes harder for teachers to see what the student truly understands.
unequal access, unequal opportunity
But using ChatGPT isn’t equal for everyone. Some students have fast home internet, personal laptops, and a quiet space to work. Others rely on school computers, slow Wi‑Fi, or don’t have consistent online access at home. That gap can widen inequalities, especially if some students depend heavily on AI while others don’t have the option.
Then there’s digital fluency. Knowing how to prompt ChatGPT effectively — phrasing questions well, checking output carefully, verifying facts — requires a certain comfort level with technology and critical thinking. Students without that background may misuse or over-rely on AI without getting good results.
teachers under pressure: changing rules, changing expectations
For teachers, ChatGPT adds complexity. Not only do they need to worry about detecting misuse, but they also face pressure to redesign assignments and evaluations. Some teachers are increasing in-class writing, oral presentations, or creative projects that require personal insight — hard for a chatbot to fake. Others are requiring process‑logs: outlines, drafts, reflections that show how the student got from prompt to final answer.
That extra layer means more grading, more oversight, more time. For a teacher already juggling assignments, lesson plans, and student support, asking them to track AI use feels like a heavy lift. Many simply avoid policing it altogether — which can lead to inconsistent enforcement from one classroom to the next.
what ChatGPT can teach us about learning — if we let it
Used responsibly, ChatGPT can teach valuable skills: how to phrase good prompts, when to question an answer, and how to fact‑check. It can show students the difference between explanation and understanding. It can offer practice for rewriting, paraphrasing, or refining ideas.
It can also support students who struggle with traditional formats: students whose first language isn’t English, students who need help organizing thoughts, students balancing school with jobs or family obligations. For them, ChatGPT can be a supplemental resource — not a replacement for learning, but an aid to access it.
what educators are saying about the dilemma
“ChatGPT is a crutch that will prevent students from actually needing to learn content.” — K‑12 teacher, as reported by the National Education Association. [National Education Association](https://www.nea.org/nea-today/all-news-articles/chatgpt-enters-classroom-teachers-weigh-pros-and-cons)
That warning isn’t coming from technophobes. It comes from educators who care about how students learn. It’s a call to consider not just what students turn in — but what they walk away knowing.
moving beyond bans: adapting learning and assessment
Some schools are banning ChatGPT, but more are exploring alternative approaches. For example, a recent article’s analysis of GenAI integration suggests that “forward‑thinking schools … focus on a few vetted tools rather than chasing every new product.” [Edutopia](https://www.edutopia.org/article/how-forward-thinking-schools-are-shaping-the-future-of-ai-in-education/) That might mean defining a small set of approved AI tools, training teachers and students on ethical use, and redesigning assignments to emphasize critical thinking, creativity, and personal insight.
Suggestions include more in‑class, timed writing; oral or project‑based assessments; portfolios; and creative assignments that demand personal reflection or original thought. These formats shift the emphasis from regurgitation to reasoning and creativity — areas where AI still struggles.
what students really think (and do)
In my school’s AI club, we’ve asked dozens of classmates how they actually use ChatGPT. Many say they use it only for brainstorming or grammar help. A few admit to using it for first drafts when they’ve got several deadlines and little sleep. A small group treats it like a cheat engine — a way to get by with fewer late nights.
Most agree that if they do use ChatGPT, they should be honest about it — mark work as “AI‑assisted,” note where it helped, and revise and polish the output themselves. That approach keeps it a tool — not a shortcut.
adapting for equity, ethics, and real learning
The key question isn’t whether ChatGPT should exist — it already does. The question is how we handle it. Banning it may slow down misuse, but it doesn’t help students who actually need help. Ignoring it doesn’t make the problems go away. Teaching responsible use, building new kinds of assignments, rethinking how we evaluate — that might actually move things forward.
Teachers need support: training, time, and resources to redesign curricula. Students need guidance: what counts as acceptable use, how to fact‑check, and how to use AI as a partner — not a shortcut. Schools need policies that are transparent, flexible, and enforceable.
And parents? I’d say: talk to your kids about what they’re using, ask them to show you their process, encourage honesty. If AI changes school, let it change how we think about learning — not just what we turn in.