
Assessment Resources

Meta-Assessment Rubric Section 1.A & 1.B

Section 1.A and 1.B: This section looks at the student learning outcome statements and evaluates them for student-centered language and an adequate description of who will be assessed, the action that will be assessed (verb precision), and the content or domain that will be assessed.

Approaches for Developing Outcomes
University of Nebraska-Lincoln

Writing Effective Learning Objectives
Johns Hopkins Sheridan Libraries: The Innovative Instructor

Bloom’s Taxonomy, Action Speaks Louder
Johns Hopkins Sheridan Libraries: The Innovative Instructor

Bloom’s Taxonomy of Measurable Verbs
California State University Northridge

SLOs, Bloom’s Taxonomy, Cognitive, Psychomotor, and Affective Domains (pages 3-5)
Crafton Hills College

To Imagine a Verb: The Language and Syntax of Learning Outcomes Statements
National Institute for Learning Outcomes Assessment (NILOA)

Meta-Assessment Rubric Section 2

Section 2: This section looks for a related activity for each student learning outcome. This could include listing courses where an outcome is addressed/assessed or uploading a curricular mapping document.

Mapping Learning Outcomes: What You Map is What You See
National Institute for Learning Outcomes Assessment (NILOA)

Opportunity to Learn (planning worksheets)
University of Nebraska-Lincoln

Meta-Assessment Rubric Section 3.A, 3.B, 3.C, 3.D, & 3.E

Section 3.A: This section refers to how well a tool measures what it is intended to measure, or the extent to which an assessment task yields a student work product that represents the domain of the outcome(s) you intend to measure. Things you might report in your plan include: steps used to develop the instrument; the relationship or correlation between the measure and the student learning outcome; or clarity of training for raters/observers, instructions for test-takers, and instructions for scoring. Types of validity to address include face validity, construct validity, and formative validity.

Why Should Assessment, Learning Objectives, and Instructional Strategies be Aligned?
Carnegie Mellon University

Evaluating Student Learning Assessment of Outcomes
Cosumnes River College

A Primer on the Validity of Assessment Instruments
Journal of Graduate Medical Education

Assessment Techniques
Cañada College

Section 3.B: This section refers to whether the student learning outcome is assessed using a direct measure.

Choose a Method to Collect Data/Evidence
University of Hawaii – Manoa

Direct and Indirect Measures of Student Learning
Indiana University – Purdue University Indianapolis

Using Indirect vs. Direct Measures in the Summative Assessment of Student Learning in Higher Education
Journal of the Scholarship of Teaching and Learning

Kansas State University: Office of Assessment

Assessment Method
University of Nebraska-Lincoln

Section 3.C: This section refers to the desired level of student achievement. The target score should be justified. Possible ways to do this include comparing students' performance against peers, against an established standard, or against prior years' results.

Performance Indicator (Criteria for Success)
Central Michigan University: Curriculum and Assessment

Section 3.D: This section asks for information about data collection: basically, who took the assessment (or information on the sample) and how it was administered, including important information about the testing conditions, such as student motivation.

Section 3.E: This section asks about the reliability of the measure. This might include discussion of inter-rater reliability exercises, calibration of rubrics, or rubric norming. Ultimately, this is about what is being done to ensure consistency between raters, across sections, or over time. Keep in mind that some subjectivity exists in how psychometric terms are used in student learning outcomes assessment and in many of the available resources. The materials presented here provide information on the type of activity or reporting that is relevant to our work.

Calibrating Multiple Graders
Johns Hopkins Sheridan Libraries: The Innovative Instructor

Quick Guide to Norming on Student Work for Program-Level Assessment
Washington State University, Office of Assessment of Teaching and Learning

Calibration Protocol for Scoring Student Work
Rhode Island Department of Education

The Use of Scoring Rubrics: Reliability, Validity, and Educational Consequences
Education Research Review

Meta-Assessment Rubric Section 4.A, 4.B, & 4.C

Sections 4.A, 4.B, & 4.C: These sections focus on how the assessment results were reported: Are the findings clear? Do they relate to the student learning outcome(s)? Are prior findings discussed in relation to current findings? How are the data/results interpreted?

Analyzing, Interpreting, Communicating and Acting on Assessment Results
Ball State University

Meta-Assessment Rubric Section 6.A & 6.B

Sections 6.A & 6.B: These sections ask for information about the changes that will be made, based on your assessment activities, to your curriculum, your assessment activities, or both.

Guidelines for Using Results and Interpretations
University of Nebraska-Lincoln

Showing an Impact: Using Assessment Results to Improve Student Learning
National Institute for Learning Outcomes Assessment (NILOA)

Opening Doors to Faculty Involvement in Assessment
National Institute for Learning Outcomes Assessment

Degree Qualifications Profile
Lumina Foundation & National Institute for Learning Outcomes Assessment

Stuck in the Assessment Swamp?
The Chronicle of Higher Education

Rubric Tool Box
Marymount University

Collection of Rubrics (examples)
DeAnza College

Sample Assignments and Rubrics for Top Ten Courses (examples)
Wallace Community College

VALUE Rubrics
Association of American Colleges and Universities (AAC&U)

Poetry Speaking and Performance Rubric (example)
International Reading Association

Science Rubrics (example)
Cornell University

Fine and Performing Arts Rubric (example)
Wyoming Arts Coalition

Aesthetic Responsiveness Rubric (example)
Los Angeles Mission College

Aesthetic Skills Rubric (example)
Delgado Community College

Aesthetics and Creativity Rubric (example)
Long Beach City College

Project Work Rubric (example)
Carnegie Mellon University

Statistics 101 Rubric (example)
Iowa State University

Flash Animation Rubric (example)
Middle Bucks Institute of Technology

What Is a Generally Educated Person?
Peer Review / Association of American Colleges and Universities (AAC&U)

On Solid Ground: Value Report 2017
Association of American Colleges and Universities (AAC&U)

VALUE Rubrics
Association of American Colleges and Universities (AAC&U)

Physical Address:
Admin, Room 208
Moscow, ID 83844-3163

Mailing Address:
875 Perimeter Drive MS 3163
Moscow, ID 83844-3163

Phone: 208-885-7995

Fax: 208-885-7998


Web: Institutional Effectiveness and Accreditation