Larry Hurtubise, a curriculum and instruction consultant at the Michael V. Drake Institute for Teaching and Learning, is at the forefront of a wave of change in education, driven by generative artificial intelligence (GenAI). Since joining the Drake Institute in the spring of 2021, Larry has tapped into his 20-plus years of instructional design experience to make this emerging technology feel less intimidating and more empowering. Whether he’s helping faculty enhance teaching and learning with tools like Microsoft Copilot or leading a GenAI community of practice, he works to make the use of AI in education practical and meaningful, cutting through the hype to enhance student learning.
“Traditional” AI versus Generative AI
Non-generative AI, also called traditional or predictive AI, is programmed or trained to do specific, narrow tasks. For example, traditional AI might be good at recognizing pictures of cats, but it can’t create a new picture on its own. It needs someone to give it the rules or tell it what to do.
With generative AI, the user enters prompts, and the tool can create new things like drawings, stories or even music all by itself. It’s like an artist robot that can think up ideas and make something from scratch. In the same example, generative AI can not only recognize a cat but also create a new image of a cat that has never existed before, based on the images it learned from.
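The distinction above can be sketched with a toy example: a tiny "predictive" lookup that can only label inputs it was already given, next to a tiny Markov-chain "generator" that produces new word sequences it never saw verbatim. This is a deliberate simplification for illustration, not how tools like Copilot actually work.

```python
import random

# "Predictive" model: maps known inputs to labels it was given.
# It can only recognize what it has seen; it creates nothing new.
LABELED_EXAMPLES = {"whiskers, fur, meow": "cat", "bark, tail, fetch": "dog"}

def predict(features: str) -> str:
    return LABELED_EXAMPLES.get(features, "unknown")

# "Generative" model: a tiny Markov chain that learns word-to-word
# transitions from example text, then produces sentences that need
# not appear verbatim in its training data.
def train(corpus: list[str]) -> dict[str, list[str]]:
    transitions: dict[str, list[str]] = {}
    for sentence in corpus:
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            transitions.setdefault(a, []).append(b)
    return transitions

def generate(transitions: dict[str, list[str]], start: str, length: int = 5) -> str:
    words = [start]
    for _ in range(length - 1):
        options = transitions.get(words[-1])
        if not options:
            break
        words.append(random.choice(options))
    return " ".join(words)

model = train(["the cat sat on the mat", "the cat chased the dog"])
print(predict("whiskers, fur, meow"))  # -> cat
print(generate(model, "the"))          # e.g. "the cat chased the dog"
```

The predictive side can only answer "what label matches this input?"; the generative side composes output it was never explicitly given, which is the core of the contrast described above.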
The rise of generative AI is fueled by three key factors: massive amounts of data available online, the concentrated power of cloud computing, and advancements in neural networks, which mimic the human brain’s learning patterns.
Learn more about AI, approved tools for Buckeyes and how AI can be used ethically and responsibly at ai.osu.edu.
How do you help faculty get started with GenAI?
I start by gauging their experience with AI, then introduce them to tools like Copilot with simple activities, such as typing "O-H" to see its predictive responses. This helps faculty understand how GenAI generates responses based on prompts. I also guide them through helpful tasks, like prompting the system to write letters or generate learning objectives. To develop prompting skills, we introduce the TRACI model, which considers the task, role, audience, criteria and intent. It helps faculty create effective prompts and begin to understand how AI can assist with teaching activities. The Drake Institute offers a variety of programs on topics of interest to educators, such as prompting, designing assignments and academic integrity.
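As one way to make TRACI concrete, here is a minimal sketch of a prompt template built from its five elements. The function name, field wording and example values are illustrative assumptions, not an official Drake Institute template.

```python
# Hypothetical sketch: assemble a prompt from the five TRACI elements
# (task, role, audience, criteria, intent) described above.

def traci_prompt(task: str, role: str, audience: str, criteria: str, intent: str) -> str:
    """Combine the five TRACI elements into a single prompt string."""
    return (
        f"Act as {role}. {task} "
        f"The audience is {audience}. "
        f"Criteria: {criteria}. "
        f"Intent: {intent}."
    )

prompt = traci_prompt(
    task="Draft three measurable learning objectives for an introductory statistics course.",
    role="an experienced instructional designer",
    audience="first-year undergraduates",
    criteria="use action verbs from Bloom's taxonomy; one sentence each",
    intent="help students see what they should be able to do by midterm",
)
print(prompt)
```

Spelling out each element separately, rather than writing one long request, makes it easier to revise a single dimension (say, the audience or the criteria) and compare how the tool's responses change.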
Why use GenAI as a teaching assistant?
GenAI can significantly enhance efficiency and effectiveness by quickly drafting the documents educators routinely write. This streamlines the process, allowing them to reflect, refine and iterate more quickly. While teachers certainly apply evidence-based practices without AI, integrating the technology makes these processes more efficient, freeing time for instructors to engage with students.
What GenAI initiatives are you leading at the Drake Institute?
Our new AI-Infused Course Design Institute is one of our key programs. This institute teaches faculty to use GenAI to support course-design tasks such as generating learning objectives, creating assignments and developing rubrics. Participants use prompts to guide the tool in supporting these tasks and have opportunities to reflect on and critique the AI-generated responses. We know that there’s a lot of interest in this area, so it was a natural fit for the Drake Institute to help faculty learn more about teaching and learning with GenAI.
Additionally, we collaborate with various units across the university, including the Center for the Study of Teaching and Writing, University Libraries and the Office of Technology and Digital Innovation. These partnerships allow us to integrate various perspectives and expertise into our programs, ensuring a comprehensive approach to GenAI in education. For example, the Committee on Academic Misconduct (COAM) provides valuable insights into ethical considerations, helping us promote a responsible use of GenAI. By working together, we can offer faculty robust support and resources to navigate the complexities of this technology in their teaching practices.
How do you address ethical concerns regarding AI in education?
Ethical concerns are a significant part of the conversation. We promote a transparent model where faculty consider their learning goals and how AI tools impact student learning. COAM has developed a set of icons to clearly communicate what's permitted and what's not, encouraging ongoing conversations between faculty and students. I use an analogy from Amy Webb, a futurist, who compares engaging with GenAI to driving on ice: just as you need to turn into a skid to regain control, we must lean into AI to ensure our voices and perspectives are heard. Additionally, tools like Copilot provide security safeguards, such as protecting student data and intellectual property, ensuring everyone has access to a secure AI tool for educational use.
What are your thoughts on the future of AI in teaching and learning?
GenAI’s ubiquity means it’s something educators need to engage with thoughtfully. Faculty, staff and students will need to develop fundamental knowledge, skills and attitudes to use AI effectively, ethically and sustainably. Students will also need to be prepared to use AI creatively to solve problems in their future careers. As Ohio’s flagship institution, Ohio State should lead the way in educating future leaders with and about GenAI. By diligently exploring teaching and learning with AI, Ohio State is preparing its faculty and students for the future of thoughtful, ethical and sustainable AI integration.