Assessment at Ohio State

The goal of the assessment process is to use assessment evidence to improve the student learning experience and ensure students are graduating with the knowledge, skills, and abilities necessary for success in their future careers or studies.

The role of the Institutional Effectiveness and Assessment team within the Office of Institutional Research and Planning (IRP) is to partner with faculty and administrators at all levels of the University to oversee and support assessment activities for undergraduate and graduate degrees, certificates, and General Education. The assessment team also supports undergraduate disciplinary and multi-campus programs in their efforts to align programs taught at multiple Ohio State locations. Our goal is to provide guidance that will result in assessment activities that are meaningful and useful to all stakeholders.

Frequently Asked Questions

What is academic program assessment?

Academic program assessment is a multi-faceted strategy to improve student learning in the educational environment. It involves:

  1. Making clear what students need to know, what they need to be able to do, and the mindset they should acquire from a course of study. These should be the outcomes submitted during the program approval process.
  2. Collecting information about whether the knowledge, skills, and perspectives students are expected to learn are indeed acquired.
  3. Evaluating and using collected information to improve learning in an ongoing improvement cycle.

Who should be involved in the assessment process?

While we recommend that each unit designate an assessment coordinator, all faculty should be invested in the assessment process. This could include helping to gather data or participating in discussions about how the data will be used to improve student learning.

Why we assess

Assessment is a powerful strategy for producing graduates who meet the expectations and learning outcomes of an academic program. In addition, it is required by the University and by our regional accrediting body, the Higher Learning Commission. 

The Office of Academic Affairs (OAA) is committed to the process of assessment as a critical component of continuous academic improvement. OAA’s academic program assessment process asks programs to articulate program goals and learning outcomes, to complete two full assessment cycles over a six-year period, and to submit annual assessment reports documenting assessment findings, plans, and actions.

Active, non-accredited, undergraduate degree, graduate degree, and for-credit certificate programs are asked to participate. Professionally accredited programs that meet assessment standards for OSU and HLC do not need to report separately through this internal process. Instead, these professionally accredited programs should upload their annual report into Nuventive. Please contact Mitsu Narui if you have questions about this process. 

Effective academic program assessment is a team effort! However, to streamline the process, we recommend that each degree and for-credit certificate program designate an assessment leader. One individual can serve as the assessment leader for multiple programs, and individual programs may have multiple assessment leaders who share responsibilities. Their role is to distribute assessment-related information and updates to faculty in their program(s), and they are responsible for submitting the annual assessment report for any programs they represent.

If you are new to assessment or to OAA’s assessment reporting process, we recommend that you review our Assessment Quick Start Guide and take advantage of IRP’s many self-service assessment resources (to be developed) and the online Learning Outcomes Assessment Handbook (to be developed). You can also reach out to IRP staff for an introduction to the process. IRP can work with you one-on-one and provide workshops for larger groups on topics such as crafting high-quality goals and learning outcomes, assessment methods, closing the assessment loop, and assessment reporting.

Annual assessment reports are due each year on July 1 (program reports) and July 15 (college executive summaries)

It is important to think of assessment as a cycle: as you finish one cycle, you can use what you learned to inform the next. The timeline below offers a rough guide to the assessment process throughout the year. In addition, the "Nuventive reporting" entries list more specific requirements for entering assessment plans and activities into the Nuventive system.

Additional communication on deadlines will be provided via email. 



Summer

For previous year’s assessment

  • Complete analysis and interpretation of data collected.
  • Disseminate assessment reports to program stakeholders.

Nuventive reporting (previous year’s assessment)

  • July 1: Enter Results, Dissemination, and Interpretation and Activity Summaries into Nuventive for the previous year (Summer to Spring).
  • July 15: Colleges submit Executive Summaries of assessment activities to Nuventive.

For upcoming year’s assessment

  • Update assessment plans and data-collection instruments for the upcoming year.
  • Collect data for next year’s reporting cycle if using Summer-term data.

Nuventive reporting (upcoming year’s assessment)

  • August 15: Update assessment plans in Nuventive, in particular Method summaries and the assessment schedule.

Autumn

For previous year’s assessment

  • Develop an action plan after discussing results with program stakeholders.

Nuventive reporting (previous year’s assessment)

  • December 15: Enter Use and Action statements into Nuventive.
  • Optional: Add Follow-up information if applicable.

For current year’s assessment

  • Collect data for the current year’s assessment if using Autumn data.

Spring

For current year’s assessment

  • Collect data for the current year’s assessment if using Spring data, finalizing the annual data set.
  • Start analyzing and interpreting data gathered during the year.

For previous year’s assessment

  • Optional: Add Follow-up information if applicable, for instance, whether proposed actions were completed.


First and foremost, the process of reporting should provide your program faculty and instructors with the opportunity to think critically about the learning that is happening in your program. It should help you identify areas for improvement and opportunities to evaluate the impacts of any previous action plans.

Each submitted report is reviewed by IRP staff, who provide feedback highlighting strengths and outlining areas for improvement. This feedback focuses on the assessment methods and is not a critique of the program or its learning outcomes. Reports submitted by July 1 receive feedback in Nuventive by September 1.

IRP also aggregates assessment data across the institution and publishes an annual Assessment Report (to be developed) that provides an overview of program assessment at the University. Deans, directors of academic affairs, and associate deans of undergraduate and graduate education can access reports for their academic programs in Nuventive and are provided regular updates on the status of their programs. Assessment reports may also be provided to OSU’s accreditor, the Higher Learning Commission, and its representatives as part of OSU’s accreditation reporting process. Overall, this information is used to evaluate OSU’s culture of assessment and to set plans for its continued development.

Please contact Mitsu Narui if you have additional questions.

Through the Council on Academic Affairs (CAA), programs are asked to submit assessment plans during the program proposal and approval processes. However, at the start of each year, program directors will need to confirm that the details of the plan are ready to implement. The steps listed below provide a simple guide to planning your program’s annual assessment efforts—an activity that ideally occurs before the academic year starts, perhaps even at the end of the previous academic year. This guidance identifies the most basic considerations in a concise format.

Step 1: Identify the outcome(s) to be assessed for the year.

Each program is asked to assess at least one outcome each year, making sure to assess all program outcomes over a span of three years. Consequently, annual assessment plans should also account for outcomes assessed in previous or subsequent years, unless the program covers all outcomes annually.

Additional considerations: Over the three-year assessment cycle, direct methods are expected: at least one direct assessment of each outcome within the three-year period. Moreover, to meet the annual requirement, a program reporting only one outcome in a given year should use a direct method of assessment (see more below). Programs are encouraged to report other forms of assessment that further document student and program achievements.

Step 2: Confirm (and refine as needed) the method(s) of assessment to be used.

Each program provides an assessment plan as part of its approval process. In planning the upcoming year’s assessment activities, program directors must confirm or adjust the methods originally proposed, taking into consideration their validity, as well as the resources available for conducting the assessment. Methods vary in how much training will be required for those producing, gathering, or processing the assessment data; methods also vary in the timing of data collection and the degree to which the data addresses program-related questions beyond those associated with the achievement of outcomes.

Additional considerations: Keep in mind that methods may also differ in how well they reflect the program’s student population. For direct assessments intended to meet the minimal reporting described above, programs should aim for data reflecting a representative sample of students significantly advanced in the program, for instance, graduating seniors or enrollees in required upper-division classes. When gathering data from classes, moreover, it may be necessary to filter out results from students not enrolled in the program.

Step 3: Develop data-collection instruments and training materials.

Some of the most common forms of data collection are surveys, assignment harvesting (or “embedded artifacts”), reader ratings, and interviews. The associated instruments—questions, evaluation rubrics, and so on—should be designed to capture information directly related to one or more program outcomes, even if used to gather other data as well (e.g., details about student sentiments, student background, etc.). Besides creating the instruments themselves, program administrators must consider how they will be disseminated: with direct emails to particular students? in LMS templates for particular courses? by advisors meeting with students? Each approach has pros and cons, depending upon resources, methodological validity, and larger programmatic goals.

Additional considerations: When training those who will use data-collection instruments (e.g., instructors, raters, etc.), your program may achieve more representative results and response rates if the purposes of the assessment are explained to these stakeholders, noting as well what they can get out of the assessment process. Also, recognize that data gathered for the purposes of assessment should be protected like other student information, which for some methods may necessitate additional training in privacy protocols, such as data deidentification and security.

Step 4: Update Assessment Plans and Methods in Nuventive.Improve.

Once you have an assessment plan for the upcoming year, please enter the details into Nuventive.Improve, or update them as necessary. By entering these plans into Nuventive at the beginning of the year, you’ll get a head start on completing some of the year-end reporting tasks, and potentially even reporting tasks for subsequent years (assuming the program doesn’t change its assessment methods). At this stage, it’s most important that all the program’s outcomes and methods are entered into the system to confirm that all outcomes will be assessed with a direct method over a three-year assessment cycle.

Additional considerations: Besides filling out the “Method” forms in Nuventive, we recommend uploading your supporting documents as well (timelines, courses serving as sites for data collection, rubrics, questions, etc.). Altogether, these entries and uploads serve as a knowledge repository for the program, facilitating continuity through staffing turnover and role reassignments, even if they occur during the middle of the year.