AI in the Classroom: Friend or Foe?

By a Bay Area High School Junior and AI Club President

When my AP English teacher caught a student using ChatGPT to write a personal narrative, there were gasps. Not because it was a scandal — honestly, some kids kind of expected it — but because it raised a question most of us hadn’t thought about clearly: Is using AI cheating, or is it just a new kind of tool?

As president of my high school’s AI Club, I think about this a lot. We live in a district that encourages innovation. Our teachers use digital whiteboards, online quizzes, and tools like Desmos and Khan Academy regularly. But artificial intelligence — especially tools that generate text or solve problems — still seems like the wild west. And with all the discussion around what counts as “learning” versus what counts as “cutting corners,” it’s become one of the most interesting ethical debates in school right now.

How AI Showed Up in Class

It started quietly. Some students used AI to brainstorm essay ideas. Others asked it to summarize a novel before a quiz. Then came the wave of auto-generated math explanations and Google Docs comments that sounded… a little too helpful. Eventually, we all started asking: Where’s the line?

Teachers noticed too. One of our science teachers now has a policy where if you submit an essay, you have to explain your process — what sources you used, how you organized your ideas, and what role AI tools played (if any). It’s not to shame anyone. It’s to encourage reflection and transparency.

And honestly, that might be the future of AI in education. Not banning it. Not blindly adopting it. Just being real about how we use it — and why.

When Help Starts to Feel Like Cheating

The tension comes down to this: If I ask ChatGPT to outline a historical argument, and then I write the essay myself, is that cheating? What if I ask it to write a draft, and then I rewrite half of it? What if I turn in the draft as-is?

Different students draw the line in different places. One friend of mine uses AI for practice questions before a test, like digital flashcards. Another says even asking it to suggest phrasing for a thesis feels wrong.

Some teachers have tried locking things down — banning phones during tests, blocking access to AI sites on school Wi-Fi, or adding plagiarism detection software. But that often misses the point. If students are turning to AI because they’re overwhelmed, burned out, or unsure how to start, then the root problem isn’t technology. It’s about support, trust, and how schools help students learn to manage complexity.

Teaching AI, Not Fearing It

In our AI Club, we’ve talked with teachers who want to introduce AI lessons but worry about making things worse. Some feel they don’t know enough. Others worry about sending mixed messages — like, “AI is cool, but don’t use it for homework!”

So we’ve been helping. This semester, our club partnered with a history teacher to do a week-long activity where students used ChatGPT to compare different historical interpretations of a primary source. Then, students had to critique the AI’s output, find errors, and rewrite a better version using real sources.

It was eye-opening. Some students were surprised at how confidently wrong the AI was. Others appreciated seeing what bias and oversimplification looked like in action. It sparked more engagement than most textbook readings ever did.

AI as Tutor, Editor, or Coach?

One of the coolest uses we’ve seen is using AI as a low-stress practice buddy. A lot of students in our school are multilingual. For them, having a chatbot to practice English conversation or revise grammar is huge. It’s like having a patient tutor on demand.

Other students use it like an editor. You can paste in an essay and ask, “Where are the weak spots?” Or ask it to rephrase a paragraph in simpler words. It’s not perfect, but it gives you feedback when a teacher isn’t available.

Still, not every student agrees this is learning. One classmate says if you rely on AI to find your flaws, you’ll never develop your own eye for writing. Another says it’s like training wheels — helpful at first, but eventually you need to ride solo.

AI Isn’t One Thing

There’s no single “AI” that schools are dealing with. Some tools just autocomplete sentences. Others generate essays, solve math problems, or create slide decks. What they have in common is that they simulate fluency — they sound smart, even if they’re not actually reasoning.

That’s why some of us joke that AI doesn’t know stuff — it just fakes it better than we do. But that makes it tricky to spot when it’s helping or hurting your thinking. You might feel like you understand something because you read the AI version, but then freeze up when you try to explain it yourself.

Teachers say the same thing: AI can make it harder to tell when a student is actually learning or just mimicking. So now, some teachers are shifting toward project-based assessments, in-class discussions, and oral explanations — anything that makes it clear the student has thought things through.

What High Schoolers Really Think

There’s definitely no single student opinion. Some students use AI like a search engine. Others treat it like a secret weapon. And some just ignore it because they’re not sure how to start.

At our last AI Club meeting, we had a mini debate: “AI is more helpful to high schoolers than Google ever was.” Half the room said yes, because AI can explain things in plain English, not just give you links. The other half said no, because you never really know if it’s right.

The takeaway? We’re still figuring it out. But we’re more curious than scared. And that’s probably a good thing.

Teacher Perspective: Cautiously Optimistic

I interviewed three teachers from my school. One said she thinks AI will become like calculators — not allowed on every test, but accepted once you’ve learned the basics. Another said he sees AI as a tool for equity: “Some students can afford tutors. Others can’t. AI levels the field.”

The third teacher was more skeptical. She worries that students might lose confidence in their own writing. “If you never struggle through the process, you don’t build resilience,” she told me. But even she admits there’s potential if it’s used with intention.

We’re Learning About Learning

Maybe the biggest shift is how AI has made us ask deeper questions: What counts as effort? What’s the goal of schoolwork? Is it the product or the process?

And while we’re all still working through those questions, one thing is clear: AI isn’t going away. So pretending it doesn’t exist doesn’t help anyone.

Some students will always push boundaries. Some teachers will always experiment. But most of us — students and educators alike — are somewhere in the middle, trying to adapt without losing what matters. And that means having honest conversations, testing ideas, and maybe even co-writing a few essays with a bot… just to see what happens.

If anything, AI in the classroom is less about replacing human thinking and more about rethinking what learning looks like in 2025 and beyond.
