Procedure 255: Academic program review

Download Procedure 255: Academic program review

This PDF is the official text of this policy. If there are any discrepancies between the text of the HTML version (below) and the text of the PDF file, the PDF is considered accurate and overriding.

University Procedure 255

Section 1. Policy

Policy 2550 sets the standard for planning and conducting academic program reviews, creating action plans based on those reviews, and reporting progress against the respective action plans at Metropolitan State University.

The purpose of a program review is to facilitate a high quality and meaningful learning experience for our students by evaluating program relevance, effectiveness and alignment with Metropolitan State University’s mission and strategic goals.

The program review provides input into a Program Action Plan that identifies goals, actions and needed resources. Continuous Improvement Reports describe progress against the Program Action Plans. This procedure describes the actions and documents associated with that process.

Academic Program Review (herein referred to as “Program Review”) at the University is a continuing program improvement process that is consistent with the University's participation in the Academic Quality Improvement Program (AQIP) to maintain its Higher Learning Commission accreditation. Programs shall view this process as informing the development of goals and objectives for continuous improvement.

Section 2. Authority

This procedure is issued pursuant to the authority granted under the Rules and Regulations of the Minnesota State Colleges and Universities System and consistent with Board of Trustees Policy 3.36 Academic Programs and Procedure 3.36.1 Academic Programs.

Section 3. Effective Date

This procedure becomes effective upon signature by the president and remains in effect until modified or expressly revoked.

Section 4. Responsibility

The responsibility for implementation of this procedure is assigned to the program directors, department chairs, deans, and provost.

Section 5. Procedure

A Program Review covers award(s) (degrees, minors and/or certificates) or any other cohesive part of the curriculum as determined by an academic unit (e.g., college, school, department). The scope of an Academic Program Review is specified in the Program Review Plan by listing all award(s) (if applicable), courses, and any other activities covered by that review. Deviations from this procedure’s statement of responsibility (such as who is responsible for, and who approves, the Program Review documents) must be approved by the Provost before beginning the Program Review Plan.

A 5-year schedule of planned program reviews will be submitted to the IFO meet-and-confer process no later than October of each year. Each Dean will ensure that all courses and all degrees in his or her unit are covered in at least one Program Review during that 5-year period. With the endorsement of the Dean, a Department Chair or Program Director may submit a request to the Provost to delay the review.

Program Reviews will begin in the Spring Semester preceding the academic year in which the program review is scheduled for completion. There are four stages of the Program Review, each of which lasts approximately one semester:

Stage 1: The Program Review Plan is completed during Spring Semester and due by May 15.

Stage 2: Most of the data is collected during Summer Semester.

Stage 3: The Program Review is completed during Fall Semester and due by December 15.

Stage 4: The Program Action Plan is created through a consultative process and due by May 15.

Continuous Improvement Reports reflect progress against the Program Action Plan and will be submitted periodically, as specified in the Program Action Plan.

An Academic Program with external accreditation or review requirements will be scheduled for program review on a cycle that matches the external accreditation or review process (but no less often than every 5 years). The Program Review Plan will document the additional sections needed (if any) to complete the Program Review requirements. Additionally, if approved by the Provost, an Academic Program with external accreditation or review requirements may substitute reports to its external bodies for its Continuous Improvement Reports.

Section 6. Program Review Process

Stage 1: Program Review Plan

  • Purpose: To reach consensus between the submitter, the Dean and the Provost on the resources needed to conduct the Program Review, and the data and content sections that will be included in the final Program Review (see Stages 2 and 3).
  • Completed and approved by: May 15
  • Output: Approved Program Review Plan
  • Submitted by: Department Chair or Program Director
  • Approved by: Dean, Provost

Topics Covered:

  • Scope of Program Review (all award(s) (if applicable), courses, and any other activities covered by that review).
  • Categories and topics that will be covered in the Program Review document (see Stage 3 for a list of suggested topics). In the final Program Review document, each category should be addressed in some way, with at least one programmatic data point or a relevant narrative.
  • People who will be involved in the review, and the roles and expectations of each.
  • Data that will be collected, and by whom. During the process of creating the Program Review Plan, the department chair/program director will meet with Institutional Research to discuss the data needs for each Program Review, and verify the majors, minors, concentrations and courses included in the review.
  • Budget required to conduct and write the review, including faculty reassigned time and other resources needed (such as to conduct a focus group)
  • Schedule (with intermediate milestones) for completing the review

Stage 2: Data Collection

It is recommended that the data listed in Table 1 be analyzed as part of the Program Review, but this list can be modified to meet the needs of a specific Program Review, as justified in the Program Review Plan. The planned data list must be included in the approved Program Review Plan. By September 1 of the academic year in which the program review is initiated, Institutional Research is expected to provide the Internal Data agreed upon in the Program Review Plan for the previous five academic and fiscal years. Assessment Data should be based on student learning outcomes. External Data will be collected using the methods described in the Program Review Plan (e.g., through focus groups, advisory boards, surveys, or other methods relevant to the specific program review).

Stage 3: Program Review Analysis and Documentation

  • Purpose: To evaluate program relevance, effectiveness and alignment with Metropolitan State University’s mission and strategic goals
  • Completed by: December 15
  • Output: Program Review document
  • Submitted by: Department Chair or Program Director
  • Approved by: Dean, Provost

Table 1: Program Review Contents

(The approved Program Review Plan lists the Table of Contents for that specific review.)

Introduction

  • Specification of courses and award(s) (if applicable) covered by this review
  • Statement of student learning outcomes associated with this Program Review
  • Course offerings – periodicity, formats, locations
  • Brief history: status at last review
  • Significant actions taken since last review
  • Description of collaborative efforts with other colleges, community, industry, etc.

Relevance to External Stakeholders

  • Analysis of labor market history and projections
  • Employer feedback on current needs
  • Community partner feedback on needs

Data Source(s): secondary sources and/or data collected as specified in the Program Review Plan; may be qualitative and/or quantitative data. All of these are expected by HLC.

Relevance to Internal Stakeholders

  • Analysis of enrollment levels - historical and current
  • Analysis of student credit hours generated

Data Source: Institutional Research

1. Student Population by Academic Year (based on enrollment at the end of each of the three semesters: Summer, Fall, Spring) for each major, minor, or certificate included in the Program Review.

  1. Active declared majors, minors, certificates (as appropriate) (have taken a class in at least one of the last 9 semesters)
  2. Enrolled declared majors, minors, certificates (as appropriate) (have taken at least one class in the last year)
  3. Enrolled pre-majors (as appropriate)

2. Demographic Information for Enrolled Majors by Academic Year (and optionally for Enrolled Minors)

  1. Gender (Female, Male, Unknown)
  2. Race/Ethnicity (International Student (non-resident alien), White, Student of Color or American Indian (one or more indication), Unknown)
  3. Underrepresented students served (any combination of the following: student of color or American Indian, low-income, first-generation college student)
  4. Age as of September 1 (mean, standard deviation, range, distribution using MnSCU age ranges)

3. Course Enrollment by Semester

  1. Enrollment for each course (based on end-of-term enrollment for each semester: Summer, Fall, Spring)
  2. Average fill rate of course sections for FDIS, online, hybrid, and on-ground delivery modes

Effectiveness: Quality of Program Processes

  • Assessment of Program Learning Outcomes - historical data
  • Actions taken over the reporting interval based on assessments
  • Comparison of assessment data against internal goals
  • Comparison of assessment data against external benchmarks

Data Source: Assessment Data

Effectiveness: Quality of Program Outcomes

  • Analysis of time to degree completion - historical and current (compared against internal goals and external benchmarks)
  • Analysis of job placement data
  • Description of alumni admission to advanced degrees
  • Report on graduate satisfaction

Data Source: Institutional Research

Student Success (clock starts in the first semester matriculated, except that summer rounds “up” to fall):

  1. Time to degree completion and comparison to the university rate
  2. Number of graduates by fiscal year (July 1 through June 30)
  3. Graduate satisfaction (will need to be added to the graduate survey)
  4. Job placement data
  5. Alumni admission to advanced degrees

All of these are expected by HLC.

Effectiveness: Faculty Resources

  • Evaluation of sufficient faculty resources

Data Source: Institutional Research

For each Academic Year:

  1. Average credits taught by resident faculty (for all courses covered in the Program Review)
  2. Average credits taught by community faculty (for all courses covered in the Program Review)
  3. Average credit hours taught by resident faculty (for all courses covered in the Program Review)
  4. Average credit hours taught by community faculty (for all courses covered in the Program Review)

Alignment

  • Alignment with the mission of the University and its strategic goals
  • Contribution to institutional reputation
  • Degree to which the program is "mission critical" or "core" to the educational experience
  • Alignment with established professional certifications or national standards, if appropriate

Summary

  • Key strengths
  • Key weaknesses
  • Key threats to the current program

Opportunity Analysis

  • New program opportunities

Any of the following:

  • Opportunity to realign or strengthen program
  • Potential for interdisciplinary programs
  • Alternate delivery mechanisms or locations
  • Other potential net revenue

Stage 4: Program Review Action Plan

  • Purpose: To facilitate a consultative process with faculty, Dean, and Provost to agree on goals, actions, and needed resources.
  • Completed and Approved by: May 15
  • Output: Program Review Action Plan
  • Submitted by: Department Chair or Program Director
  • Approved by: Department Chair or Program Director, Dean, and Provost

Topics Covered:

  • Student learning outcomes for the Program, including internal targets
  • Other goals for the Program, such as collaborations, enrollment, retention, etc.
  • Actions and expected completion dates associated with the above goals.
  • 5-year assessment plan
  • Timing of future Continuous Improvement Reports
  • Resources needed for the stated actions.
  • This serves as input into the budgeting process for the fiscal year that starts one year later; it may be possible to deploy some additional resources sooner.

Section 7. Continuous Improvement Report

  • Purpose: To describe progress against the goals, actions and resource needs in the Program Action Plan
  • Completion dates: as outlined in the respective Program Action Plan
  • Output: Continuous Improvement Report
  • Submitted by: Department Chair or Program Director
  • Comments provided by: Dean, Provost

Exceptions: Upon approval by the Provost, Academic Programs with external accreditation or review requirements may substitute reports to their external bodies for their Continuous Improvement reports, although some augmentation may be required to cover some of the topics below that are not contained in their external report.

Topics Covered:

  • Assessment data for learning goals, including insights gained and actions taken.
  • Evaluation and evidence of progress against other goals stated in the Program Review Action Plan
  • Any changes to the assessment plan or goals, including new internal targets if appropriate.
  • Opportunities that could be pursued in following years
  • Actions to be taken based on the assessment and progress data.
  • Update on resources needed to accomplish the stated goals.

Section 8. Review

This procedure will be reviewed every three years or as needed.

Section 9. Signature

Issued on this day of

Carol Bormann Young, Interim Executive Vice President and Provost

Virginia Arthur, President