Tools and Resources

Search Results

This is a 2-page document describing the main statistical indices provided as part of an item analysis. It offers information about how to interpret each index.

This PowerPoint presentation describes how to use item analysis to determine the efficacy of multiple-choice questions.

A weblog tutorial for instructors and test writers interested in gaining a better understanding of how to use item (or test question) analysis. Item analysis provides useful information about how well test items ‘performed.’

This page introduces the Schreyer Institute's four self-paced modules, which address important teaching and learning topics: learning outcomes assessment (program assessment), item analysis (a method for analyzing the effectiveness of multiple-choice tests), working with teams, and best practices for PowerPoint. Instructors can access these modules on their own time and at their own pace.

These PowerPoint slides accompanied a presentation by Linda Suskie delivered via Zoom on Tuesday, Apr. 25, 2017. Multiple-choice tests can have a place in many courses. If they’re well designed, they can yield useful information on student achievement of many important course objectives, including some thinking skills. An item analysis of the results can shed light on how well the questions are working as well as what students have learned. Viewers will be able to use principles of good question construction to develop tests, develop test questions that assess thinking skills as well as conceptual understanding, and use item analysis to understand and improve both test questions and student learning. Be sure to open the handouts file listed below as you view the presentation!

These handouts (minus quizzes, withheld for test security) accompanied Linda Suskie's Zoom presentation of Tuesday, Apr. 25, 2017, on designing multiple-choice tests, writing questions that assess thinking skills as well as conceptual understanding, and using item analysis to improve both test questions and student learning.

This sample score report is generated by our paper exam scanning system. The score report is an important tool that will help you evaluate the effectiveness of a test and of the individual questions that make it up. The evaluation process, called item analysis, can improve future test and item construction. The analysis provides valuable information that helps instructors determine which are the “best” test questions to secure and continue to use on future course assessments; which items need review and potential revision before the next administration; and which are the poorest items, which should be eliminated from scoring on the current administration.
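The indices on such a score report can be illustrated with a short sketch. This is not the scanning system's actual code; it is a minimal example of two standard item-analysis statistics, assuming a 0/1 correctness matrix with one row per student and one column per item: the difficulty index (proportion answering correctly) and an upper-lower discrimination index (difficulty in the top-scoring group minus difficulty in the bottom-scoring group).

```python
def item_difficulty(responses, item):
    """Proportion of students answering the item correctly (higher = easier)."""
    scores = [row[item] for row in responses]
    return sum(scores) / len(scores)

def item_discrimination(responses, item, top_frac=0.27):
    """Upper-lower discrimination: difficulty among the top-scoring group
    minus difficulty among the bottom-scoring group (27% groups by default)."""
    n = max(1, round(len(responses) * top_frac))
    ranked = sorted(responses, key=sum, reverse=True)  # rank students by total score
    upper, lower = ranked[:n], ranked[-n:]
    p_upper = sum(row[item] for row in upper) / n
    p_lower = sum(row[item] for row in lower) / n
    return p_upper - p_lower

# Example: 6 students x 3 items (1 = correct, 0 = incorrect)
data = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]
print(item_difficulty(data, 0))      # 0.5 (half the class got item 0 right)
print(item_discrimination(data, 0))  # 1.0 (item 0 cleanly separates high and low scorers)
```

An item with moderate difficulty and high discrimination, like item 0 here, is the kind a score report flags as worth keeping; an item with discrimination near zero or negative is flagged for review.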

This site from the University of Oregon's Teaching Effectiveness Program offers helpful suggestions, examples, and templates for developing higher level multiple choice items.

This document describes the process of question sampling (item banking).

Faculty sometimes find it difficult to respond to the written comments that accompany SRTEs (aka SETs). This document provides a template for sorting students' comments into themes. The themes provided are common ones, but your ratings may include other themes. If a student's comment touches on several themes, we recommend splitting it into separate comments about the different topics. After all of the students' comments are sorted, order the themes from those with the most comments to those with the fewest. This can help faculty recognize that not all students agree with the student who wrote one or two particularly hurtful comments. Typically there is a natural break around the third or fourth theme, and we recommend focusing on the themes most frequently mentioned by students.
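The tally-and-rank step the template supports can be sketched in a few lines. The comments and theme names below are invented examples, not the template's actual categories:

```python
from collections import Counter

# Each comment has been hand-sorted into a theme (invented examples).
comments = [
    ("Lectures were clear", "organization"),
    ("Exams felt fair", "assessment"),
    ("Loved the group projects", "activities"),
    ("Well organized course", "organization"),
    ("Slides posted on time", "organization"),
    ("Exams matched the homework", "assessment"),
]

# Count comments per theme, then list themes from most to least mentioned.
theme_counts = Counter(theme for _, theme in comments)
for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
# organization: 3
# assessment: 2
# activities: 1
```

Seeing that, say, three students praised organization while one comment stood alone in its theme is exactly the perspective the template aims to give.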

This document provides guidelines for presenting your student ratings (aka SRTEs, SETs) for review by a department or program head or a review committee. It recommends sections to include in a one-page summary of your ratings for a particular offering of a course. It can be accompanied by a thematic analysis of students' written feedback (see "Template for Analysis of Student Comments"). Some faculty find that this helps them clarify what happened in the course and guides them to focus on particular aspects of the course to retain and to improve.

This is a faculty peer evaluation form (peer observation, classroom observation). It has a "checklist" format rather than a scaled-rating (Likert scale) format. The form asks faculty peer reviewers to note the presence of teaching activities and behaviors that have already been established as indicative of high-quality teaching. It is intentionally designed to be shortened by the faculty in an academic unit so that it reflects the unit's teaching values and priorities. It should not be used "as is" because the full form is too much to expect reviewers to evaluate; fewer items per section will make the form easier for faculty to use.

The form was created in January 2006 based on information in: Chism, N.V.N. (1999). Chapter 6: Classroom Observation. Peer Review of Teaching: A Sourcebook. Bolton, MA: Anker Publishing.

This file is used by scanning operations to match questions on different test forms.

This checklist includes a list of items that Penn State requires be included in all syllabi, per Faculty Senate Policy 43-00 Syllabus. It also includes links to example syllabus statements and lists items that the Schreyer Institute recommends be included in every syllabus.

A teaching portfolio provides materials associated with the experience of teaching and learning. This PDF describes items that might be included in a teaching portfolio, categorized by source: personal material, material from others, and products of good teaching.

This file contains a list of "item-writing rules" that will help you write multiple-choice questions in a way that keeps the test focused on the content and prevents students from guessing the correct answer without knowing the material. The rules were developed by experts in psychometrics, such as those who write questions for the SAT or GRE.

Form for assigning weights to exam items when items are worth different numbers of points. Used for scanning multiple-choice tests that use bubble sheets.
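What per-item weighting means for scoring can be shown in a brief sketch. This is a hypothetical illustration of the arithmetic, not the scanning system's implementation; the answers and weights are invented:

```python
def weighted_score(correct_flags, weights):
    """Total exam score: each item's 0/1 correctness times its point value."""
    return sum(flag * w for flag, w in zip(correct_flags, weights))

answers = [1, 0, 1, 1]          # 1 = item answered correctly, 0 = incorrect
weights = [2.0, 1.0, 3.0, 1.0]  # invented point values per item
print(weighted_score(answers, weights))  # 6.0 (2 + 0 + 3 + 1)
```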