Charleston School of Business Faculty & Staff Updates

AI in the Classroom, AI in the Boardroom: How Business Students Are Learning to Use Artificial Intelligence Responsibly

by Debby Marindin, Ed.D., PMP
Project Management Instructor, School of Business, College of Charleston

This past term, I had the pleasure of teaching a small but delightful class: three Professional Studies students who were completing, or working toward completing, both their bachelor’s degree and Project Management certification in a capstone course. As working professionals, these students brought maturity, curiosity, and refreshing honesty to discussions about how Artificial Intelligence (AI) is shaping their academic work and professional lives.

Because the class was so small, we were able to have rich, candid conversations—not just about schedules and risk registers, but about how AI is actually being used in practice. What emerged was not fear or overreliance, but something far more encouraging: intentional, ethical, and highly strategic use of AI. Each student completed a bonus activity in which they reflected on their use of AI throughout the course. Their thoughts align closely with current research, which suggests that AI can enhance productivity, writing quality, and decision-making when used thoughtfully, but can undermine learning when used as a shortcut.

One student described using AI as a productivity partner in the classroom. Rather than asking AI to write or think for them, they relied on it for templates, outlines, frameworks, and formatting. For example, when building a Work Breakdown Structure (WBS), the student provided task categories and criteria and asked AI to generate the initial structure. They then revised the content, refined the organization, and improved clarity. A similar approach was used for a project review assignment. What might normally take 20 minutes took closer to eight—freeing up time for higher-order thinking instead of formatting. As the student explained, “I don’t use AI to think for me or write for me. I use it for structure so I can think better and faster.”

A second student applied AI in a very different—and higher-stakes—context: a real corporate project involving financial and operational analysis across organizations under the same private equity firm. Balancing full-time work with coursework, this student used Microsoft Copilot, ChatGPT, and specialized AI agents connected to an enterprise data warehouse to organize thinking, validate assumptions, and generate financial metrics for executive-level discussions. The key takeaway was not efficiency, but governance. The student emphasized that AI requires precise prompting and rigorous human review, and that it must never replace human decision-making. Ungoverned AI use, they noted, carries real legal and reputational risks. Today, that same student is leading additional automation initiatives within their organization—a clear example of course learning translating directly into workplace innovation.

A third student framed AI as a thinking partner for risk and stakeholder management, borrowing advice from a previous professor: “Treat AI like an unpaid intern.” Early in the capstone, AI was used to simulate project risks, time constraints, and potential disruptions. Over time, back-and-forth dialogue with AI prompted the student to explore new perspectives, including community engagement risks and stakeholder resistance. They likened the relationship to a “Holmes and Watson” partnership—one that sparked deeper reflection rather than replacing judgment.

Together, these three experiences highlight complementary models of responsible AI use: tactical productivity support, strategic enterprise decision support, and cognitive assistance for risk and stakeholder analysis. All three align with PMBOK® 7 principles and with emerging AI literacy frameworks that emphasize human-centered design, ethical judgment, critical evaluation, and risk awareness.

For working adult learners—especially those in Professional Studies—responsible AI use is not an abstract academic issue. It is a career mobility tool. These students did not use AI to cut corners, but to sharpen thinking, expand perspective, and strengthen decision-making in real workplaces with real consequences. Their stories remind us that when learners are trusted with meaningful tools and guided by strong ethical frameworks, they do more than complete a degree—they build momentum for the next chapter of their professional lives. And that is the promise of business education at its best.

Image created with Google Gemini

GenAI Subcommittee

Erika LeGendre • January 8, 2026

