Vince Castillo is an assistant professor of logistics at the Max M. Fisher College of Business. His research interests include last mile logistics, supply chain sustainability, simulation modeling and artificial intelligence. He holds a PhD in Supply Chain Management from the University of Tennessee.
What made you want to start using AI in your course?
To be frank, my initial motivation stemmed from academic integrity concerns. Seeing early GenAI tools like LEX easily predict and complete text highlighted the potential for misuse. Then ChatGPT’s rapid adoption solidified these concerns but also made me realize the possibilities for new and reimagined assignments. I realized that students would need new skills to thrive in this evolving landscape, and I decided to help them discover and develop those skills. Ultimately, I believe education in the age of GenAI is about learning to lead the AI, not being led by it.
Can you share specifics on how you have students use it?
I’ve created new assignments to leverage GenAI’s capabilities. For example, in my Logistics and Supply Chain Analytics course, students build interactive data visualizations using AI to generate the code — something previously inaccessible to them as business students without a computer science background. This allows them to focus on the principles of data visualization and, importantly, become better “boundary spanners” who can communicate effectively between technical and business teams. The goal is not to turn them into coders; rather, it’s about understanding the possibilities and limitations of data visualization. We also spend significant time analyzing AI output, emphasizing the importance of critically evaluating and verifying results. I’ve even developed a custom AI platform, SupplyChainBrutus.com, to act as a virtual tutor, providing personalized support and guidance. Ultimately, these assignments aim to equip students with the skills to leverage AI tools effectively and critically in their future careers.
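As a rough illustration — not an actual course assignment — the kind of visualization code a student might prompt GenAI to produce could look like the sketch below. The route names and on-time percentages are hypothetical sample data, and matplotlib is assumed as the charting library; the point is that students read, run, and critique code like this rather than write it from scratch.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt

# Hypothetical last-mile delivery data a student might ask the AI to chart
routes = ["Route A", "Route B", "Route C", "Route D"]
on_time_pct = [94.2, 88.7, 91.5, 79.3]

fig, ax = plt.subplots(figsize=(6, 4))
bars = ax.bar(routes, on_time_pct, color="steelblue")
ax.axhline(90, linestyle="--", color="gray", label="Target (90%)")  # service-level target
ax.set_ylabel("On-time delivery (%)")
ax.set_title("Last-mile on-time performance by route")
ax.bar_label(bars, fmt="%.1f")  # annotate each bar with its value
ax.legend()
fig.tight_layout()
fig.savefig("on_time_by_route.png")
```

Evaluating output like this is where the learning happens: does the chart choice fit the question, are the axis labels and units honest, and did the AI actually compute what was asked?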

What were you hoping students would get out of it?
My primary goal is for students to experience the limitations of GenAI firsthand — to understand its “jagged frontier.” While it’s great for brainstorming and quick reference, it often struggles with nuanced tasks requiring real-world context. I want them to learn to critically evaluate AI output, recognize that it’s not a replacement for domain expertise, and understand the importance of human oversight. Ultimately, they need to learn when to trust AI and what tasks are appropriate to outsource, while maintaining responsibility for the final outcome.
What’s the most important thing you’ve learned since starting to teach with AI? Have there been any surprises?
I was surprised by how readily students treat GenAI like a traditional search engine, simply inputting keywords. But it requires a different approach — communicating with it conversationally and providing sufficient context. I’ve realized GenAI is best thought of as a personal assistant: it excels at tasks you already know how to do, but requires a “human-in-the-loop” to ensure accuracy and validity. It’s about enhancing capabilities, not replacing them.
Is there a moment when you realized AI was actually working better than you expected in your class? What happened?
Yes! A turning point was when students stopped using AI altogether on certain assignments. They realized that correcting the AI’s errors was often more time-consuming than completing the task manually. This showed they truly understood the technology’s limitations — and, crucially, were learning when not to use it. That realization, I believe, is the first step toward effectively leveraging AI in the future.
Some people are nervous about AI, thinking it might hold students back from developing critical thinking or creativity. What’s your take on that?
That’s a valid concern. I allow students freedom to explore AI — I call my classroom “Amsterdam for AI” — but I’m seeing signs that some are simply using it to “check the box.” That’s why I designed Supply Chain Brutus to act as a virtual mentor, employing the Socratic method to encourage critical thinking and guide discovery. The challenge is ensuring students recognize AI as a tool for augmenting education, not a shortcut.
Will embedding AI in the curriculum actually help students prepare for their careers after graduation?
Absolutely. Embedding AI is essential for preparing students for their careers. While many companies are still exploring AI’s potential, our students need to understand both its capabilities and limitations before entering the workforce. By emphasizing responsible, ethical, and transparent usage, we can instill those norms and equip them for long, meaningful careers in the age of generative AI.