University of Idaho

Program reports on assessment plans and activity are entered into Anthology Planning annually by October 1. A holistic report on learning outcomes achievement for each major, degree and/or certificate is entered into Anthology Planning and supported by direct evidence.

Institutional Assessment and Accreditation reviews these reports using the Quality Assessment Rubric (adapted from James Madison University’s 2015 APT Assessment Rubric), assigns a quality score, and provides detailed feedback. The feedback includes individual comments, commendations, and recommendations in line with best practices and accreditation standards.

Individual programs, colleges and U of I can use the quantitative scores to better understand our strengths and weaknesses, as well as trends and improvements made individually and collectively.

Sections of the Quality Assessment Rubric are described below with links to informational resources that support each section.

Meta-Assessment Rubric Section 1.A & 1.B

Section 1.A and 1.B: This section looks at the student learning outcome statements and evaluates them for student-centered language and adequate description of who will be assessed, the action that will be assessed (verb precision), and the content or domain that will be assessed.

Creating Learning Outcomes
Boston University

Degree Qualifications Profile (DQP)
Learner-centered framework for what college graduates should be able to do at differing degree levels.

Approaches for Developing Outcomes
University of Nebraska-Lincoln

Writing Effective Learning Objectives
Johns Hopkins Sheridan Libraries: The Innovative Instructor

Bloom’s Taxonomy, Action Speaks Louder
Johns Hopkins Sheridan Libraries: The Innovative Instructor

Bloom’s Taxonomy of Measurable Verbs
California State University Northridge

SLOs, Bloom’s Taxonomy, Cognitive, Psychomotor, and Affective Domains (pages 3-5)
Crafton Hills College

To Imagine a Verb: The Language and Syntax of Learning Outcomes Statements
National Institute for Learning Outcomes Assessment (NILOA)

Meta-Assessment Rubric Section 2

Section 2: This section looks for a related activity for each student learning outcome. This could include listing courses where an outcome is addressed/assessed or uploading a curricular mapping document.

Mapping Learning Outcomes: What You Map is What You See
National Institute for Learning Outcomes Assessment (NILOA)

Opportunity to Learn (planning worksheets)
University of Nebraska-Lincoln

Meta-Assessment Rubric Section 3.A, 3.B, 3.C, 3.D, & 3.E

Section 3.A: This section refers to how well a tool measures what it is supposed to measure, or the extent to which an assessment task produces a student work product that represents the domain of the outcome(s) you intend to measure. Things you might report in your plan include: steps used to develop the instrument; the relationship or correlation between the measure and the student learning outcome; or details on training for raters/observers, instructions for test-takers, and instructions for scoring. Types of validity to address include face validity, construct validity, and formative validity.

Why Should Assessment, Learning Objectives, and Instructional Strategies be Aligned?
Carnegie Mellon University

Evaluating Student Learning Assessment of Outcomes
Cosumnes River College

A Primer on the Validity of Assessment Instruments
Journal of Graduate Medical Education

Assessment Techniques
Cañada College

Section 3.B: This section refers to whether the student learning outcome is assessed using a direct measure.

Choose a Method to Collect Data/Evidence
University of Hawaii – Manoa

Direct and Indirect Measures of Student Learning
Indiana University – Purdue University Indianapolis

Using Indirect vs. Direct Measures in the Summative Assessment of Student Learning in Higher Education
Journal of the Scholarship of Teaching and Learning

Kansas State University: Office of Assessment

Assessment Method
University of Nebraska-Lincoln

Section 3.C: This section refers to the desired level of student achievement. The target score should be justified; possible ways to do this include comparing students’ performance against peers, against an established standard, or against prior years’ results.

Performance Indicator (Criteria for Success)
Central Michigan University: Curriculum and Assessment

Section 3.D: This section asks for information about data collection: who took the assessment (or a description of the sample) and how it was administered, including important testing conditions and factors such as student motivation.

Section 3.E: This section asks about the reliability of the measure. This might include discussion of inter-rater reliability exercises, calibration of rubrics, or rubric norming. Ultimately, this is about what is being done to ensure consistency between raters across sections or over time. Keep in mind that some subjectivity exists in how psychometric terms are used in student learning outcomes assessment and in many of the available resources. The materials presented here provide information on the type of activity or reporting that is relevant to our work.
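To illustrate the kind of consistency check this section asks about, here is a small sketch (in Python, using entirely hypothetical rubric scores, not any U of I data) that computes two common inter-rater reliability statistics: simple percent agreement and Cohen's kappa, which corrects agreement for chance.

```python
# Illustrative sketch only: percent agreement and Cohen's kappa for two
# raters who scored the same set of student artifacts on a shared rubric.
from collections import Counter

def percent_agreement(rater_a, rater_b):
    """Share of artifacts on which the two raters gave the same score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)          # observed agreement
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Expected chance agreement from each rater's marginal score distribution.
    p_e = sum((counts_a[s] / n) * (counts_b[s] / n)
              for s in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores (1-4 rubric scale) from two raters on ten artifacts.
rater_a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
rater_b = [3, 4, 2, 2, 3, 1, 4, 3, 3, 4]
print(percent_agreement(rater_a, rater_b))        # 0.8
print(round(cohens_kappa(rater_a, rater_b), 2))   # 0.71
```

In practice, low agreement on a first pass would be the cue for a norming session: raters discuss discrepant artifacts, refine the rubric language, and re-score until agreement stabilizes.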

Calibrating Multiple Graders
Johns Hopkins Sheridan Libraries: The Innovative Instructor

Quick Guide to Norming on Student Work for Program-Level Assessment
Washington State University, Office of Assessment of Teaching and Learning

Calibration Protocol for Scoring Student Work
Rhode Island Department of Education

The Use of Scoring Rubrics: Reliability, Validity, and Educational Consequences
Education Research Review

Meta-Assessment Rubric Section 4.A, 4.B, & 4.C

Sections 4.A, 4.B, & 4.C: These sections focus on how the assessment results were reported: Are the findings clear? Do they relate to the student learning outcome(s)? Are prior findings discussed in relation to current findings? How are the data/results interpreted?

Analyzing, Interpreting, Communicating and Acting on Assessment Results
Ball State University

Meta-Assessment Rubric Section 6

Sections 6.A & 6.B: These sections ask for information about the changes that will be made based on your assessment activities, whether to your curriculum, your assessment process, or both.

Guidelines for Using Results and Interpretations
University of Nebraska-Lincoln

Showing an Impact: Using Assessment Results to Improve Student Learning
National Institute for Learning Outcomes Assessment (NILOA)

Physical Address:
Associate Director
Assessment and Accreditation
Admin, Room 325
Moscow, ID 83844-3163

Mailing Address:
875 Perimeter Drive MS 3163
Moscow, ID 83844-3163

Phone: 208-885-5962


Web: Assessment and Accreditation