Q&A with Molly Downing

Molly Downing is an assistant professor of practice in the Division of Pharmacy Education and Innovation. She received Ohio State’s Provost’s Award for Distinguished Teaching by a Lecturer in 2023. She earned her PhD in Pharmacology from Vanderbilt University in 2010.

What made you want to start using AI in your course?

In autumn 2024 and spring 2025, I piloted a module to help students learn and practice using AI tools in a responsible and appropriate manner. The module helps students understand our course policies around AI use, which focus on idea generation, brainstorming, and improving grammar and mechanics without altering meaning. During spring 2024, I had noticed that including an AI syllabus statement was not enough: I needed to help students understand what appropriate and responsible AI use looked like in my specific courses, and then give them opportunities to practice applying those guidelines.

Can you share specifics on how you have students use it?

Students complete a 30-45 minute interactive learning activity during Week 1 that teaches them appropriate and responsible AI use in higher education. They then apply that learning in two weekly discussions. As the course progresses, I may emphasize or make connections to the AI learning activity, such as when students complete the checkpoint exercises for the final project that involve brainstorming ideas or developing a project outline.

What were you hoping students would get out of it?

The module’s learning outcomes are:

  1. Explain what generative AI (GenAI) is and how it works
  2. Analyze ethical AI use
  3. Recognize Ohio State policy requirements around AI use
  4. Describe how to effectively leverage AI as an educational tool

We completed a research study evaluating the module’s impact, which indicated students made several positive learning gains on both objective and subjective measures. Anecdotally, students shared that the “best part of the module” included learning how to use AI appropriately and responsibly in a course-specific manner, engaging with the practice exercises, and the interactive learning design of the module itself.

What’s the most important thing you’ve learned since starting to teach with AI? Have there been any surprises?

I’ve been surprised that some students are hesitant to use AI. In our research, some students shared an appreciation for learning how to appropriately and responsibly use AI, but they hoped that AI wouldn’t be required in assessments. Others were grateful that a course took the time to teach them this content, as it increased their confidence that they could ethically use AI tools for other academic purposes.

It’s really important for our university to creatively and strategically think through how we are going to support faculty, staff and students toward becoming AI fluent. We should not expect students to achieve these skills on their own, and we should not expect faculty and staff to absorb the effort required to help our students graduate with AI fluency.

Is there a moment when you realized AI was actually working better than you expected in your class? What happened?

Yes, in spring 2025, we deployed the research study across the course’s seven sections. Anecdotally, all instructors felt that there were fewer concerns around inappropriate AI use, with noticeably fewer Committee on Academic Misconduct submissions. This observation was not investigated with any scientific rigor, but it did allow me to reflect on the benefits of the AI module both for our faculty and students.

Some people are nervous about AI, thinking it might hold students back from developing critical thinking or creativity. What’s your take on that?

I hear this concern and initially held similar thoughts, but my thinking evolved as I dove into the research and gained more experience with generative AI tools. Appropriate and responsible AI use will likely evolve for our students throughout their educational and professional journeys. At each point, students must apply their critical thinking skills to evaluate the accuracy, reliability, relevance and effectiveness of AI outputs. Therefore, AI use is not a replacement for critical thinking: we should still help learners develop these skills through a variety of methods so they can critically evaluate AI outputs and recognize the ethical considerations around AI use throughout their careers.

Will embedding AI in the curriculum actually help students prepare for their careers after graduation?

Absolutely. I worked with a College of Pharmacy task force to identify ways to help our graduates become AI fluent. Colleagues in the field inform our work, sharing examples of how AI technology is impacting healthcare, research and pharmacy practice. We learned that our graduates should have a technical understanding of these tools, skills to critically evaluate AI outputs, and practice using AI tools to accomplish relevant tasks. Colleagues also encouraged us to create space for students to explore the capabilities of AI tools while developing an understanding of their limitations. Lastly, our research study asked students their thoughts related to this question. Below are our findings:

Because of the GenAI instructional module and practice discussion:

  • 81-82% agreed or strongly agreed with the statement: “I will utilize the knowledge and skills I developed around responsible/appropriate GenAI use in other coursework or in my future professional career.”
  • 80-90% agreed or strongly agreed with the statement: “I feel prepared to comply with Ohio State’s guidance on Artificial Intelligence in other coursework.”
  • 83-88% agreed or strongly agreed with the statement: “I feel prepared to responsibly use GenAI tools in other coursework or in a future profession.”