AI in My Classroom
by John Kim, Assistant Professor of Finance
As AI continues its global takeover (politely, for now), I kicked off last semester determined to bring more of it into the classroom, this time with a clearer plan and more structure. Some ideas worked. Some flopped spectacularly. All were educational.
Last year, I wrote Pulse posts warning everyone to be both excited and afraid. Last semester, I dove in headfirst. I rolled out AI-generated chapter notes, study guides, study questions, and even chapter-summary podcasts. I booked a guest speaker. I built an AI-based case study. I was ready. Here are a few things I learned along the way, some expected, some surprising, and some that made me adjust my thinking.
1. Should I give students AI-generated study materials?
In theory, variety helps different learners. In practice, I accidentally created a study buffet that no human could finish. My overachievers tried, bless them, but even they begged for something simpler, shorter, or just… survivable.
And yes, every AI-generated item had to be checked by me, which meant I was studying more than the students. Podcasts were fun but useless for exam prep unless I made them longer, which then made them unlistenable. Note to self: podcasts should stay short and only handle bite-sized topics.
AI also struggled to write reasoning-based questions. If I needed logic tied to specific lecture nuances, the model shrugged. Quant questions? Great. Critical thinking? Not so much.
2. “Do you actually know how AI works?”
Before diving in, I surveyed the students. For the first time ever, several said they were worried about using AI. Not because the robots are coming, but because it was making them feel dumber. A few had even quit using AI altogether after realizing they were retaining almost nothing.
When I asked whether they understood how AI works, no one even pretended to. And that explained everything. If you think AI is just Google with a gym membership, of course your brain is taking early retirement.
3. Can you craft a proper prompt?
Previously, when I assigned AI-based financial projections, students typed things like:
“Project sales for the next 5 years.”
And that was it. No context. No assumptions. Nothing.
As anyone who uses AI knows, a one-line prompt gets you a one-line answer, and usually a bad one. So I taught them the structure of a real prompt. They still struggled. Then I handed them a good prompt, nearly a page long, that they could adapt.
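To give a flavor, here is a heavily condensed, hypothetical version (not my actual page-long prompt; the company details are invented):

“You are a financial analyst preparing a five-year sales projection for a mid-sized retailer. Use the historical income statements provided. State every assumption explicitly: base revenue growth on the three-year historical trend, hold gross margin constant, and assume no new store openings. Present the projection as a year-by-year table, then explain in two or three sentences which assumption the result is most sensitive to, and why.”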
Suddenly everything clicked. Students began prompting like people who understood what AI actually does. It was like watching someone upgrade from a flip phone to a smartphone in real time.
4. What’s really at stake?
Between news stories and our guest speaker, students understood that learning AI is not optional. They just did not know how to use it beyond “ask it for something.” Once they saw the difference between a vague prompt and a well-crafted one, the transformation was instant. It became obvious that AI is not magic; it is a tool that performs exactly as well as the instruction it is given.
Students understand what is at stake; they just do not yet know what to do about it. And more than most tools, AI has to be learned hands-on.
Last semester reminded me that bringing AI into the classroom is not about flashy tools or endless content. It is about teaching students how to think with AI, not instead of it. They do not need more materials; they need better ones. They do not need to fear AI; they need to understand it. And they certainly do not need ten-page prompts, though apparently one good one can change their entire approach. If nothing else, this experience proved that AI can make us better teachers and learners, as long as we pause long enough to ask not just what the technology can do, but what we actually want it to do for us.
Image created with Google Gemini