Assessment Conference

2025 Conference

The Future Is Now: Assessment in the Age of Technology

Sponsored by the Office of Academic Affairs

Friday, September 26 at the Blackwell Inn

Registration for the conference is full.


Schedule at a glance

Opening Plenary 

Next Steps in AI Fluency and the Opportunities (and Challenges) for Assessment and Student Learning

Conversations around AI have been elevated by Executive Vice President and Provost Ravi V. Bellamkonda's announcement of Ohio State's AI Fluency initiative. This panel will provide an overview of that vision, focusing on how programs have incorporated AI into their curricula and assessment, followed by a discussion of the challenges posed by students' use of AI in the classroom.

Shereen Agrawal, Melissa Beers, Nicole Kwiek, Jennifer Whetstone, Norman Jones (moderator) 

Drake Institute for Teaching and Learning 

Concurrent sessions 1

HIPs Don’t Lie: A Dashboard Approach to High-Impact Practices

High-Impact Practices (HIPs) have been widely advocated as an effective strategy for enhancing active learning, student persistence, and engagement. In conjunction with the High-Impact Practices Advisory Committee (HIPAC), the Office of Institutional Research & Planning (IRP) has created an interactive dashboard that tracks HIP participation rates and outcomes across all campuses over multiple years. This session will demonstrate the HIP Dashboard, allowing for a deeper understanding of HIP participation longitudinally, the scaffolding of HIPs, and the differential impacts on student success across a range of student demographics and subpopulations.

Michele Hansen, Ola Ahlqvist, Katie Smillie, Kristen Rupert Davis

Where there’s a WIL there’s a way: Finding Consensus in Assessing Writing and Information Literacy Across Departments

The Center for the Study and Teaching of Writing will discuss some of the results of departments’ assessment of the Writing and Information Literacy (WIL) GE Foundation, and representatives from the Departments of English and Engineering Education will compare their process for assessing their WIL courses. While the two departments built different kinds of infrastructure for assessing their writing courses, their work reflected common goals for teaching and assessing writing. These goals can inform how departments can balance the local and institutional needs for assessment.

Chris Manion, Scott Dewitt, Lynn Hall, Ashleigh Hardin, Jennifer Herman

Round up those data! Leveraging assessment tech to close the loop in student learning evaluation

Are you a faculty or staff member new to assessment and interested in leveraging technology for program-level student learning assessment? Please join the College of Pharmacy’s Office of Assessment for a session on strategically implementing available technology such as CarmenCanvas, Qualtrics, and PowerBI to support a program-level assessment plan. Using our annual data roundup events as a model, we will offer tips for supporting your assessment cycle and creating the conditions for data-informed decision making. 

Mary Higginbotham, Katherine Kelley

Driving with a GPS: Using data visualization and assessment to improve student learning

(I collected assessment data, now what?)

Assessment is essential for understanding student learning, but it all comes down to what you DO with the data that matters! This session shares how the GE Bookends team and IRP collaborated to create visualizations of assessment data, and how those data drove course revisions that centered student learning. 

Mitsu Narui, Melissa Beers

Concurrent sessions 2

The Future Is Now: Assessment in the Age of Technology 

Assessing, reporting, and analyzing student learning outcomes in a course is often seen as an onerous, time-consuming process. Usually, individual or composite grades are too general to qualify as direct assessment of a specific student learning outcome. Learn how CarmenCanvas can help you easily evaluate learning outcomes using a rubric and report your assessments automatically into Nuventive for quick, easy analysis. With a little work on the planning side, you can spend most of your time considering improvements and implementing meaningful changes that will help your students learn better.

Kate Halihan, Aaron Carpenter, Jennifer Herman 

Assessing the Learning Outcomes of Natural Sciences GE Foundations Courses

This session introduces the proposed assessment rubric for Natural Sciences (NS) foundations courses, which will be used in the upcoming General Education (GE) programmatic review. Participants will gain an understanding of what will be required from them in the review process. The session will also feature case study examples that demonstrate how the rubric can be applied across a wide range of courses within the College of Arts and Sciences. Feedback from the GE NS community will be invited to help refine and strengthen the rubric’s implementation.

Lindsay Westraadt, Jennifer Ottesen

Analyzing the reading complexity of assessment questions and student performance

Pandemic education disruptions impacted student success, as seen in a 10-percentage-point decrease in upper-level pharmaceutical sciences lab exam scores three years after the pandemic lockdown and an increased proportion of students self-reporting exam struggles. To analyze the intertwining facets of learning assessed in exams, the researchers will introduce participants to technologies for question tagging and Gunning Fog readability scoring to determine whether student reading anxiety and Bloom's comprehension level are causal factors in the decrease in exam proficiency (Free Text Complexity Analysis Tool Online | Lumos Learning, n.d.).

Brittney Mize, Nicholas Denton

Next Steps: Strategic Assessment Planning for AI Fluency 

As AI reshapes disciplinary expectations, how can academic units ensure their assessment plans evolve accordingly? This session will guide participants through facilitated discussions and planning prompts to help them align program-level assessment with anticipated curricular revisions. Participants will identify meaningful, practical ways to gather evidence of student learning and will leave with a set of guiding questions to take back to their units.

Anika Anthony, Larry Hurtubise, Shari Beck

Closing Plenary

HLC reaffirmation of accreditation

Accreditation of colleges and universities through the Higher Learning Commission happens on a 10-year cycle. In the closing plenary, presenters will provide an overview of and update on the upcoming HLC reaccreditation process, including the purpose, timeline, and any expectations for the campus community.

Randy Smith, Mitsu Narui