Assessment Conference and Series 2022-23

Celebrating the past with eyes to the future

Sponsored by the Office of Academic Affairs

November 4, 2022

This year's conference will be held in person at The Blackwell on November 4. Our theme is "Celebrating the past with eyes to the future."

The table below contains the session details and presenter information for the 2022 Assessment Conference.
Time | Session Information | Presenters | Room
9 a.m. | Opening remarks and welcome | W. Randy Smith | Blackwell Ballroom
9:20-10:30 a.m. | Opening plenary: Looking forward | George Rehrey (Indiana University); Kim Arnold (University of Wisconsin-Madison) | Blackwell Ballroom
10:40-11:40 a.m. | Concurrent Session: General Education (GE) | Melissa Beers; Alan Kalish; Chris Manion | Pfahl 302
10:40-11:40 a.m. | Concurrent Session: Graduate Program | Anna Meurer; Courtney Thiele; Marcia Ham; Andrew Blatter |
10:40-11:40 a.m. | Concurrent Session: STEP | Susie Whittington | Pfahl 330
10:40-11:40 a.m. | Concurrent Session: Drake Institute | Jonathan Baker; David Sovic | Pfahl 340
12:00 p.m. | Lunch plenary: Big Ten Assessment | Dennis Groth (Indiana); Wayne Jacobson (Iowa); Mearah Quinn-Brauner (Northwestern) |
1:20-2:20 p.m. | Concurrent Session: General Education (GE) | Melissa Beers; Alan Kalish; David Sovic | Pfahl 302
1:20-2:20 p.m. | Concurrent Session: Chase New Skills Network | Shanna Jaggers | Pfahl 330
1:20-2:20 p.m. | Concurrent Session: FAES | Warren Flood; Jeanne Osborne |
1:20-2:20 p.m. | Concurrent Session: Intercultural Competencies | Janice Aski; Cindy Jiang | Pfahl 340
2:30 p.m. | Closing plenary: Assessment Pioneers | Alexis Collier; Steve Fink; Linda Martin |

Assessment Basics and plenary descriptions

10:30 a.m.-12 p.m.

048 Derby Hall

Many faculty members take on a role in program assessment as part of their departmental service. If you are new to assessment or want a refresher, join us for a brief session on the basic process and vocabulary.

Alan Kalish, PhD                 

Assistant Vice Provost, Office of Undergraduate Education

Alan Kalish, PhD, assistant vice provost and adjunct associate professor of educational studies, supports faculty efforts on academic program assessment, implementation of a revised general education program and institutional accreditation. Previously, he worked in educational development for 25 years, directing the University Center for the Advancement of Teaching at Ohio State for 18 years. His research includes transitions from graduate school to faculty life, teaching and learning in higher education, and course and curriculum design.

Teresa A. Johnson, PhD            

High-Impact Curriculum Expert, Office of Academic Enrichment

Teresa A. Johnson, PhD, is the High-Impact Curriculum Expert at OSU's Office of Academic Enrichment. She earned a doctorate in Microbial Ecology at the University of Illinois at Urbana-Champaign and has taught in the sciences at Butler University and at the College of Wooster. Her pedagogical research has focused on classroom assessment techniques and the impact of prior knowledge on student learning in the sciences. Her current interests are course and curriculum design, articulation of learning outcomes, and the support and evaluation of high-impact teaching strategies.

Perspectives on maximizing assessment with learning analytics

Learning analytics has rapidly emerged as one of the most promising opportunities to better understand our students and the choices they make on their individual pathways toward college success. We will provide a brief overview of learning analytics in higher education today and how it can be integrated with assessment efforts currently underway. We will also discuss a number of ways that learning analytics can provide new perspectives about student success, help us answer different questions, and form a foundation for challenging existing data paradigms.

George Rehrey, Principal Instructional Consultant
Indiana University Center for Innovative Teaching and Learning

Kim Arnold, Director, Learning Analytics Center of Excellence
University of Wisconsin-Madison

Big Ten Assessment

Dennis Groth
Indiana University

Wayne Jacobson
University of Iowa

Mearah Quinn-Brauner
Northwestern University

Assessment Pioneers and awards

  • Randy Smith
  • Alan Kalish
  • Steven Fink
  • Alexis Collier
  • Linda Martin

Breakout session descriptions

Intercultural Competencies

In this presentation, we will provide examples of direct and indirect measurement tools to assess key aspects of intercultural competence (IC) at the course and program levels. Examples will be drawn from research on the development and assessment of IC in Italian language courses and world language programs, but the assessment tools presented are applicable to any field.

Drake Institute

In this session, the Drake Institute for Teaching and Learning will share updates on programs and services that support assessment and evaluation of instructional practice. Presenters will also share strategies and lead discussions and activities on generating and leveraging course data to guide decisions and revisions in instruction.

Graduate Program

This session will highlight data resources available at OSU and the exciting ways these resources can be used to improve programs and program offerings for our students. A panel of program representatives will provide insights on their own program assessment efforts.

Panelists for the session are:

  • Andrew Blatter, PhD, Data Analyst, Graduate School
  • Kathleen Hallihan, PhD, Assistant Dean, Glenn College of Public Affairs
  • Marcia Ham, PhD, Learning Analytics Consultant, Office of Technology and Digital Innovation
  • Courtney Thiele, JD, MA, Director, Bioethics Graduate Programs

General Education (GE)

As the new General Education begins this term, several teams of stakeholder instructors and staff have begun to plan the specific processes for collecting direct assessment data on student achievement of the Goals and ELOs of several elements of the program. Groups have started developing the rubrics that will allow instructors of the Writing and Information Literacy (WIL) foundation, the Mathematics, Quantitative Reasoning, and Data Analysis (MQRDA) foundation, and the Bookends: Launch Seminar to map their assignments to the GE Expected Learning Outcomes next year and to report on their students' aggregate achievement of these outcomes. These teams are also considering additional types and sources of data to enrich the picture of how well these program elements are working.

Assistant Vice Provost Alan Kalish (Undergraduate Education) and Team leads will report on the first planning and development steps that these groups have made this autumn.

Chase New Skills Network

The health care and information technology fields demand a more diverse and skilled workforce that can appropriately fulfill the needs of the varied communities they serve. Diverse students – including racially minoritized students, low-income students, first-generation college students, adult learners, and students with disabilities – are more likely to start at a community college due to these institutions' open-access mission. This study focuses on students transferring from Columbus State Community College into Health and IT pathways at Ohio State. We leverage qualitative interviews with students and staff to understand "what works" and "what needs to be improved" in the IT and Health transfer pathways between the two institutions, including barriers to transfer or program progression for diverse student populations.

STEP

Throughout STEP's 10 years of student engagement, the Center for the Study of Student Life has been an important assessment partner. Our journey began with a commissioned study to determine the foundational components of the newly envisioned program. We then partnered for many years on descriptive studies to make informed programming improvements. Today we are introducing an exit survey strategy and an alumni engagement questionnaire.

FAES

Exciting possibilities exist when we integrate a comprehensive assessment process into curricular planning and the delivery of designed learning experiences. Aligning learning goals and outcomes across a program's curriculum, knowing when and how each outcome is addressed within the program, and gathering evidence of student attainment of the program's expected learning outcomes all play a role in the continuous quality improvement of educational programs. Using the six components of student learning assessment identified in the National Institute for Learning Outcomes Assessment (NILOA) Transparency Framework, this session will examine which assessment elements currently exist, how they have been employed over the past 10 years, and what potential lies ahead for greater integration of the assessment process and use of collected student learning evidence.

Intended Takeaways

  1. Learn

    Learn about the latest tools and assessment strategies employed by experts at Ohio State. 

  2. Connect

    Build relationships with colleagues across the university working on assessment.

  3. Engage

    Continue conversations around assessment in the coming weeks and months.