What AI Looks Like Right Now (And No, It’s Not Just ChatGPT)

By Katherine McKean, Junior and President of the AI Exploration Club at my high school

If you live in the Bay Area, chances are someone you know works in AI. Maybe it’s your dad. Maybe it’s your dentist’s cousin. Maybe it’s the barista who casually mentioned large language models while handing you an oat milk latte. Either way, it’s everywhere. But knowing that AI exists and understanding what it’s actually doing are two different things.

It’s not all robots or smart assistants that accidentally call your ex. A lot of the current action is in tools students already use without thinking about it—writing help, art generation, and voice models that somehow know when you’re mumbling. Here’s a look at where things are in 2025 from someone who still has to take pre-calc at 8 a.m.

Models That Talk, Draw, And Guess Your Homework Struggles

The most visible AI tools are the ones that talk back. ChatGPT from OpenAI, Claude from Anthropic, and Gemini from Google all work like conversation partners that can also summarize articles, explain tough topics, or generate haikus about mitochondria. OpenAI's GPT-4o (the “o” stands for omni, not “oh no”) handles text, voice, image, and video in one place. It can read a photo of your handwritten notes and suggest edits. It can talk in real time without sounding like a robot from 2007. It can even translate your math homework into something that resembles English.

On the visual side, tools like DALL-E and Midjourney turn descriptions into images. Sometimes they get weird about hands. But most of the time, they can generate decent visuals for slide decks or random creative projects. A few students use these tools to brainstorm poster designs or mock up yearbook graphics. Others just try to get them to draw a penguin eating ramen on the moon. Both approaches seem valid.

Video Generators That Make People Say “Wait, That’s Not Real?”

One of the newer tools that gets a lot of attention is Sora, also from OpenAI. It creates short video clips from written prompts. Think movie trailers made out of sentences. The videos look surprisingly real, which is both cool and kind of unsettling. You could type “a golden retriever riding a skateboard through downtown San Jose” and get something that makes you do a double take. It’s not perfect, but it’s a lot closer than anyone expected two years ago.

Voice Tools That Sound Way Too Human

AI voice tools are also getting more advanced. Some can mimic different accents, tones, and even emotions. Others translate what you say into another language while keeping your voice. These tools show up in language learning apps, accessibility tech, and a few very ambitious group projects. One student used a voice AI to narrate a slide deck with different characters. It was both impressive and slightly distracting, but it worked.

Everyone Wants To Build Their Own Version

A lot of companies are developing their own chatbots and models. Meta, Microsoft, and even startups with names that sound like space agencies are launching tools. Some are open-source, some are designed for research, and others just exist to compete with ChatGPT. The Bay Area has a front-row seat to all of it, which means students here get to hear about the updates before they even finish rolling out. It also means some students end up testing tools their parents are working on, which makes for strange dinner conversations.

Where AI Shows Up In Daily Life

Even if you’re not asking chatbots to write limericks about Shakespeare, AI shows up in apps you probably use. Spotify recommends songs. Google sorts your emails. Instagram decides what reels show up first. Tools like Grammarly and Notion AI help with writing and organization. Some apps analyze your workouts. Others schedule your week based on due dates and chaos levels. Whether or not students call it AI, they’re using it.

Teachers are using it too. Some schools have started experimenting with AI for creating reading quizzes, lesson plans, and practice problems. It doesn’t always go smoothly. One AI-generated quiz included three answers that were technically correct but none that matched the teacher’s key. But it’s being used, and students are starting to expect that teachers at least know what these tools are.

Bay Area Bonus: Everyone Has Opinions

Living in the Bay Area means being surrounded by people who either build AI or think about it constantly. That makes things interesting. Some people are excited. Others are worried. Most are somewhere in the middle. Students here often hear both sides and then go back to asking AI to write a skit about a confused history teacher and a time-traveling squirrel. Not every use has to be deep.

It’s also easier to try new tools here. A lot of companies test locally. There are free trials, school programs, and club projects. Some libraries and community centers even host workshops for students. If you’re curious, there’s usually a way to explore without needing a supercomputer or a coding background.

What AI Feels Like Right Now

In 2025, AI doesn’t feel like one big thing. It’s a lot of small things that keep changing. Some tools are polished. Others feel like science experiments. Students use what works, ignore what doesn’t, and share links that are helpful, weird, or both. Whether it’s explaining calculus, creating soundtracks, or helping with club flyers, AI is just one of the many tools people reach for during the day.

Want to bring the power of AI to your school? Check out this step-by-step guide on How to Launch a High School AI Club in 10 Easy Steps.