Rubric Development
This section presents workshop-style sheets that can be used in professional development activities. The handouts define different types of rubrics and provide examples. Several performance tasks with accompanying rubrics show how to align tasks with standards and assessments.
What is a Rubric?
A rubric is a set of scoring guidelines for judging student work on performance-based tasks. The rubric answers the question: What does proficiency (and varying degrees of proficiency) at a task look like?
IU8/IU11 Assessment Consortium 1995
Thoughts on Rubrics
Adapted from "Designing Rubrics for Authentic Assessment" by Homestead, McGinnis, and Pate
IU8/IU11 Assessment Consortium 1995
Rubric Attributes
IU8/IU11 Assessment Consortium 1995
Rubric Types
HOLISTIC: A holistic rubric contains multiple categories and descriptions within each category. The assessor views the work being assessed as a whole.
PURPOSE-Gives the "Big Picture"
ADVANTAGES-Allows a large amount of work to be evaluated; efficient
DISADVANTAGES-Lacks specificity; best used for benchmark assessments or program assessments
ANALYTICAL: An analytical rubric looks at specific aspects of the work being assessed. The assessor judges the work by examining its elements.
PURPOSE-Provides specific feedback on level of performance of each element or component
ADVANTAGES-Analyzes each component; identifies needs and strengths
DISADVANTAGES-Time needed for use
IU8/IU11 Assessment Consortium 1995
Holistic Rubric Scoring Example
[Looks at Big Picture]
(Adapted from article by Sabra Price and George Hein, "Scoring Active Assessments," Science and Children, 1994.)
TASK # 1: Measure the height of two seedlings and record results.
TASK # 2: Explain recorded measurements of growth pattern.
Advanced: All criteria are met and the work exceeds the assigned task. Work contains additional unexpected or outstanding features.
Proficient: Results meet the criteria.
Basic: Results meet some of the criteria.
Below Basic: Does not complete the task. Shows no comprehension of the activity.
IU8/IU11 Assessment Consortium 1995
Analytic Scoring: Provides precise diagnostic information.
Reasons for use:
IU8/IU11 Assessment Consortium 1995
Steps in Creating a Rubric
1. Design learning performance assessment:
2. Determine the criteria for the assessment:
3. Determine the essential categories in terms of performance behaviors, for example:
4. Write descriptors for each of the categories in terms of performance behaviors:
5. Write descriptors for scale levels in terms of performance behaviors, for example:
IU8/IU11 Assessment Consortium 1995
A Rubric for Rubric Development
Is closely aligned to the performance task.
Attends to the following aspects:
Clearly defines criteria and attributes for various qualities of work. For example:
Uses format and descriptors which are:
Is accompanied by instruction which is:
Adapted from work by Anne W. Kozik in A Guide for Implementing the Chapter 5 Regulations into Home Economics/Life Management Programs in Pennsylvania
Tips for Teachers in Developing Rubrics
IU8/IU11 Assessment Consortium 1995
Analytical Rubric Scoring Example
[Breaks Assessment Task Into Parts]
(Adapted from article by Sabra Price and George Hein, "Scoring Active Assessments," Science and Children, October, 1994.)
TASK # 1: Measure the height of two seedlings and record the results.
Advanced: Measurements are accurate. Data is systematically recorded.
Proficient: Approximate measurements are recorded.
Basic: Results are not recorded, but approximate measurements were used.
Below Basic: Results are not recorded. Inaccurate measurement procedures were used.

TASK # 2: Explain recorded measurements of growth pattern.
Advanced: More than reasonable explanations are provided.
Proficient: Explanation for growth pattern is provided.
Basic: Explanation relates to unit activities, but does not explain growth patterns.
Below Basic: No explanation is given, or one that makes no sense.
IU8/IU11 Assessment Consortium 1995
ANALYTICAL SCORING SCALE FOR PROBLEM SOLVING
STUDENT________________________________________________GROUP______________________________
SECTION________________________ DATE_______________________________
PROBLEM____________________________________________________________________________________
UNDERSTANDING THE PROBLEM
3: Complete understanding of the problem.
2: Minor misunderstanding of the problem.
1: Major misunderstanding of the problem.
0: Complete misunderstanding of the problem.
COMMENTS: _______________________________________________

MAKING A PLAN
2: Plan is appropriate for the problem.
1: Partially correct plan, or a plan that could have worked if carried out properly.
0: No attempt, or a totally inappropriate plan.
COMMENTS: _______________________________________________

SOLVING THE PROBLEM
3: Correct answer.
2: Copying or computational error; partial answer.
1: Incorrect answer based on an inappropriate plan.
0: No answer.
COMMENTS: _______________________________________________

LOOKING BACK ON THE PROBLEM
2: Checks and extends answer; able to generalize results.
1: Checks and correctly labels answers.
0: Does not check for reasonableness of answer.
COMMENTS: _______________________________________________
RATING_______/10 ________%
Adapted from: Charles, R., Lester, F., & O'Daffer, P. (1987). How to Evaluate Progress in Problem Solving. Reston, VA: NCTM.
IU8/IU11 Assessment Consortium 1995
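For readers who want to tally the scale electronically, here is a minimal sketch (in Python, not part of the original handout) of how the four sub-scores combine into the overall rating out of 10 and a percentage. The function name, dictionary, and example scores are illustrative assumptions; only the maximum points per criterion (3, 2, 3, 2) come from the scale above.

    # Minimal sketch: totaling the analytic scoring scale for problem solving.
    # Maximum points per criterion mirror the scale above; names and example
    # scores are illustrative only.
    MAX_POINTS = {
        "understanding the problem": 3,
        "making a plan": 2,
        "solving the problem": 3,
        "looking back on the problem": 2,
    }

    def overall_rating(scores):
        """Return (total, percent) for a dict mapping criterion -> points earned."""
        for criterion, points in scores.items():
            if points > MAX_POINTS[criterion]:
                raise ValueError(f"{criterion}: {points} exceeds maximum {MAX_POINTS[criterion]}")
        total = sum(scores.values())
        possible = sum(MAX_POINTS.values())          # 10 points possible
        return total, round(100 * total / possible)  # e.g., 7 -> 70%

    # Example: minor misunderstanding (2), appropriate plan (2), partial answer (2),
    # checked and labeled answer (1) earns 7/10, or 70%.
    print(overall_rating({
        "understanding the problem": 2,
        "making a plan": 2,
        "solving the problem": 2,
        "looking back on the problem": 1,
    }))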
ASSESSMENT TASK FOR STUDENT LEARNING STANDARD
TITLE_______________________________________________________________________________________
GOAL AREA_________________________________________WRITER__________________________________
STANDARD:___________________________________________________________________________________
______________________________________________________________________________________________
INTENDED LEVEL:________ REVIEWER:___________________________________
E=Exemplary; M=Meets Standard; D= Does Not Meet Standard
1. There are a variety of solutions/correct answers.
Exemplary: Each student's response would be unique. There is an emphasis on divergent thinking and student creativity.
Meets standard: There is no one right answer. The correct responses, however, are limited and there would be a great deal of difference between student responses.
Does not meet standard: There is only one right answer and that is the one the teacher has in mind.

2. This type of task is likely to occur in the "real" world.
Exemplary: The students are working on a real problem or activity and the result of their efforts has a high likelihood of being used.
Meets standard: The task involves a simulation of an activity that is actually done for some practical purpose.
Does not meet standard: This is a "school" task, done only to meet the needs of assessment.
3. The task is intrinsically interesting and developmentally appropriate.
Exemplary: The students are working to solve interesting problems that appeal to students of this age. Given the opportunity, students would pursue the task on their own.
Meets standard: The task involves an interesting activity that students enjoy.
Does not meet standard: The task is boring or too difficult for many students.

4. There is a rubric or checklist which defines acceptable performance, and the criteria are sufficiently specific that various raters would achieve similar results.
Exemplary: The rubric has various levels with descriptions explicit enough that students know precisely what is to be done.
Meets standard: The rubric has various levels with descriptions that define general expectations.
Does not meet standard: The rubric does not provide the student (or rater) with sufficient information to know what is expected.

5. The task is directly related to a student learning standard, transition standard, or course standard and makes it possible to clearly differentiate between students who have achieved the standard and those who have not.
Exemplary: Completion of the task provides clear evidence that the learner has mastered the standard.
Meets standard: There is a clear relationship between the task and the standard. Assessment is done on the individual student's work.
Does not meet standard: It is not clear how this task relates to the standard, and/or successful completion does not necessarily assure that the individual student has mastered the standard.

6. Students have input in setting the standards and criteria.
Exemplary: Students have a high degree of involvement in determining all aspects of the assessment.
Meets standard: Students have a high degree of involvement in determining some aspects of the assessment task.
Does not meet standard: All elements of the assessment were determined by the teacher.

7. The task can be repeated or improved until satisfactorily accomplished.
Exemplary: The task is complex and significant learning takes place each time the student does it.
Meets standard: Continued practice with the task leads to an overall increase in skill level. The task is sufficiently complex to benefit the learner through repetition.
Does not meet standard: The task is simple enough that it is obvious when it has not been done right initially. Repeating the task is not likely to improve the student's general skill level.

8. The task includes self and/or peer assessment.
Exemplary: Self and/or peer assessment is integrated as a major element.
Meets standard: Includes some self or peer assessment.
Does not meet standard: Does not include self or peer assessment.
COMMENTS:
IU8/IU11 Assessment Consortium 1995
PERFORMANCE ASSESSMENT CHARACTERISTICS
Differences Between Traditional Testing and Authentic Assessment:
Traditional Testing (e.g., standardized, multiple choice) | Authentic Assessment (e.g., performance, portfolio) |
Given annually, one shot | Ongoing, cumulative |
Based on a single setting | Based on a variety of settings |
One correct response | Open-ended, multiple possibilities |
Norm-referenced | Student-centered, criterion-referenced |
Test/teacher-driven | Student-driven |
"Teacher proof" | Teacher-mediated |
Paper/pencil | Performance |
Narrow measure of skill | Real-world, integrated application that measures capacity for constructing and using knowledge |
Separate from curriculum and instruction | Integral to curriculum and instruction |
Comparisons to others | Comparisons to self and goals |
Produces undesirable anxiety | Produces confidence in ability to self-assess and self-correct |
Performance Assessment-Electricity (Grade 6)
Test questions to students: You are a scientist working for a large computer company. Your assignment is to investigate electrical conduction through a circuit.
1. Make an electrical circuit using all the items on the table (battery, wire, light bulb, switch).
2. Quickly draw a simple picture of your circuit in the space below.
3. Open "Bag A." Use the clip and lead to make an electrical tester. Test each of the items in "Bag A" with your circuit. Place an X on the chart under the appropriate column to show what happened when each item was tested.
"Bag A" Items | Conducts Electricity | Does Not Conduct Electricity | |
4. Did you build a
complete circuit?
Yes____ No____ |
|||
Plastic Spoon | |||
5. Explain how you know. | Steel Washer | ||
String | |||
Penny | |||
Nail | |||
Rubber band |
6. How are the items that do conduct electricity alike?
7. How are the items that do not conduct electricity alike?
8. Examine the item in "Bag B." Do you think it will conduct electricity? Why or why not?
Draft Scoring Rubric
The following draft rubric was developed to assist in scoring student responses to the Grade 6 Performance Field Test in science.
4 = Gives complete and acceptable answers to all questions; provides acceptable rationale; includes a complete and accurate diagram of a circuit with supporting evidence; demonstrates understanding of the concept of circuits and conductivity; uses descriptive terms (conductor, flow, current, etc.).
3 = Gives fairly complete and acceptable answers to most questions; provides adequate answers, but rationale may be vague; includes a complete diagram of a circuit; shows understanding of the concept of electricity and conductivity; responds to questions #4 or #8 in an acceptable manner.
2 = Several incomplete or unsatisfactory answers; rationale is very limited; shows some understanding of the concept of circuitry, but not conductivity; diagram of a circuit may be missing or incomplete.
1 = Very little response (diagram only or few answers); partial answers to a small number of questions; no rationale; does not include a diagram of a circuit; contains at least one correct answer other than question #3.
Source: California Assessment Program. (1990). Science Performance Field Test: Grade 6. Sacramento: CA State Department of Education.