Presentation Transcript

Slide1: 

MCAS Alternate Assessment (MCAS-Alt) 2006 Guidelines for Scoring Student Portfolios. Massachusetts Department of Education, June 2006

Slide2: 

This document was prepared by the Massachusetts Department of Education. Dr. David P. Driscoll, Commissioner of Education.

Copyright © 2006 Massachusetts Department of Education. Permission is hereby granted to copy any or all parts of this document for non-commercial educational purposes. Please credit the "Massachusetts Department of Education."

Massachusetts Department of Education
350 Main Street, Malden, Massachusetts 02148-5023
(781) 338-3000
mcas@doe.mass.edu
www.doe.mass.edu/mcas

Slide3: 

Commissioner's Foreword

Dear Educators:

I am pleased to present the MCAS Alternate Assessment (MCAS-Alt) 2006 Guidelines for Scoring Student Portfolios. This publication will be used to train qualified individuals selected by the Department to score student portfolios submitted for the 2006 MCAS-Alt. This manual is used to ensure that scores for each portfolio are accurate and that standards for scoring are applied consistently.

Students with significant disabilities who are unable to take MCAS tests, even with accommodations, must participate in MCAS by submitting an alternate assessment portfolio. It is important to include these students in MCAS to measure their performance in relation to the state's learning standards, to improve their instruction, and to demonstrate that their educational needs matter.

Thank you for taking part in this important component of MCAS.

Sincerely,
David P. Driscoll
Commissioner of Education

Slide4: 

Table of Contents

Slide5: 

Introduction

MCAS-Alt has been administered annually since 2001 in Massachusetts. According to state and federal laws, all students with disabilities are required to participate in statewide assessments, either by taking standard MCAS tests, with or without accommodations, or by taking alternate assessments. Decisions on how each student will participate in MCAS are made by the student's IEP or 504 team and must be documented in the student's IEP or 504 plan.

Participation Guidelines

A student with a significant cognitive disability should be considered for alternate assessments by an IEP or 504 team when the student:
- receives instruction in which the content and level of instruction have been modified well below the expectations of non-disabled students enrolled in the same grade; and
- receives intensive, individualized instruction across all settings in which a subject is taught; and
- does not adequately demonstrate knowledge and skills in the subject being assessed on a standardized, paper-and-pencil test such as MCAS, even when accommodations are provided.

Students with other complex and significant, though not necessarily cognitive, disabilities should also be considered for alternate assessments when those disabilities present the student with unique and significant challenges that may not allow the student to fully demonstrate knowledge and skills on a paper-and-pencil test such as MCAS, even with accommodations.

Portfolio Contents and Structure

The MCAS-Alt portfolio consists of a structured collection of products, compiled throughout the school year, that document the student's performance of skills and understanding based on the Curriculum Framework in the content area being assessed. Evidence is organized in a portfolio according to the standards specified for assessment in each content area, and includes the following products and information:
- data charts showing the student's performance over time on tasks based on the learning standard being assessed
- work samples, video/audio clips, and/or photographs showing the student's performance on tasks based on the learning standard being assessed
- descriptive notes provided by the teacher, examples of materials and tools used by the student, reflection sheets, and other supporting documentation at the discretion of the teacher

Creation of portfolios is guided by information in the Department publication entitled the Educator's Manual for MCAS-Alt, which is updated annually, distributed at Department-sponsored training events, and posted to the Department's Web page at www.doe.mass.edu/mcas/alt.

Scoring MCAS-Alt Portfolios

Once portfolios are completed and submitted to the Department each May, they are reviewed and scored by licensed Massachusetts educators at a three-week summer scoring institute sponsored by the Department of Education. The Rubric for Scoring Portfolio Strands, found in Appendix A of this publication, is used as the basis for scoring all student portfolios. Detailed information on scoring portfolios can be found on the following pages. This publication, the 2006 Guidelines for Scoring Student Portfolios, is also posted at www.doe.mass.edu/mcas/alt.

Slide6: 

General Guidelines for Scorers

Thank you for your interest in scoring MCAS Alternate Assessment portfolios. Please review the following general guidelines carefully, and review each step of the scoring process in this booklet, including all scoring rules and appendices.

Be objective and impartial. Try not to let opinions or personal feelings influence your scoring. Put aside your opinions about the appropriateness of the student's placement, program, services, or the reason for his or her participation in alternate assessment. Avoid the tendency to base your scores on any of the following:
- overall presentation and organization of the portfolio
- neatness of student (or teacher) work
- handwritten versus typed products
- "electronic" versus "paper" portfolios
- black-and-white versus color
- quality of photos or videotapes (provided all images are recognizable and well-labeled)

Respect student and teacher confidentiality. Do not use the names of teachers or students when discussing the contents of any portfolio. Do not score any portfolio if you are familiar with the student or teacher who submitted it. Do not review or consider any IEP information provided in the portfolio.

Respect the contents of the portfolio. Maintain the order of all contents in the portfolio. Keep food and drinks away from the portfolio. The portfolio must be returned in the same condition in which it was submitted.

Review all evidence in a strand before scoring the strand. Score only what you see in the portfolio. Do not make inferences or assumptions about what the student or teacher may have intended. Use actual evidence, rather than the work description, as the basis for determining the score.

Score each rubric area separately for each strand. Do not let the score in one rubric area influence the score in another. Do not raise the student's score in one area to overcome or compensate for a lower score in another, or lower a score across several rubric areas without first examining all of the evidence.

Do not rush through scoring, but do not spend too much time reviewing evidence either. Ask for assistance if you get stuck. On average, the review of a strand should not exceed about twenty minutes.

Complete all score forms neatly and legibly. It is important to print neatly and clearly on all score forms, particularly those being returned to teachers. You will be asked to recopy any forms with information that is crossed out or illegible.

Slide7: 

Content Areas Assessed by MCAS-Alt

The content areas assessed by the 2006 MCAS-Alt in each grade are shown below.

Slide8: 

Required Portfolio Contents

The Portfolio: Portfolios in each content area will consist of either two or three "strands," according to the table on page 3, plus required forms (shown below), organized in a three-ring binder for each student taking an alternate assessment. Guidelines for assembling the portfolio are provided in the 2006 Educator's Manual for MCAS-Alt, posted to the Department's Web page at www.doe.mass.edu/mcas/alt.

Contents of Each Portfolio Strand: The following products ("evidence") must be included in the portfolio for each strand assessed. A strand must include at least one data chart documenting the student's performance of one skill, and two pieces of primary evidence documenting the same skill. The data chart must show performance of the skill on at least five different dates. Additional primary and secondary evidence of other skills in the strand may be submitted, at the discretion of the teacher.

Forms: If one or more of the required forms is missing, the score will not be affected. Scorers should provide a numbered comment on the Portfolio Feedback Form, selected from the Comment Key.

Slide9: 

Types of Evidence

Each portfolio strand will be scored separately. A strand may consist of any of the following portfolio products, some required and others optional, as described below.

Allowable Portfolio Evidence

Primary Evidence (required) - Clearly labeled* products that document student learning directly, such as:
- data charts (field data chart, bar graph, line graph)
- work samples
- video (3 minutes or less)
- photographs that clearly show a work sample, the end product of instruction, or steps in a sequence leading to the end product
- audiotapes of an oral presentation or performance, or used as an accommodation (if applicable)

Secondary Evidence (optional) - Products that either support primary evidence or illustrate the context in which the learning occurred, such as:
- photographs that show setting, instructional approach, materials, assistive technology, etc.
- brief notes or narrative descriptions by the teacher, peer, parent, or others who assisted the student
- audiotapes
- reflection sheets or other self-evaluation activities (goal setting, task analysis, student charting own performance, self-correction)
- letters or notes of support from peers, employers, or other teachers
- aids and supports used by the student: visual aids, graphic organizers, templates (for example, those used with assistive technology), adapted tools or materials

NOTE: Secondary evidence contributes to scores for Self-Evaluation and Generalized Performance, but does not affect the overall performance level in the content area.

* Labeling of evidence must include:
- student's name
- date (month/day/year)
- % accuracy (number correct divided by the total)
- % independence (or frequency of prompts provided)

Slide10: 

Summary of Scoring Process: Scorers

The Scorer:
1. Receives a portfolio from the Table Leader; removes it from the unsealed white envelope; stores the envelope under the portfolio, or nearby.
2. Verifies all required forms were submitted; NEATLY marks accordingly on the Portfolio Feedback Form.
3. NEATLY records scorer ID, scorer number, and student information on the Portfolio Feedback Form (see page 28).
4. Reviews the entire strand for completeness; records information about each piece of evidence on the Strand Organizer (see page 27).

Slide11: 

Summary of Scoring Process: Scorers (continued)

The Scorer (continued):
5. Reviews the Scoring Guidelines and determines the score in each strand for all rubric areas: Level of Complexity, Demonstration of Skills and Concepts, Independence, Self-Evaluation, Generalized Performance.
6. Using a pen, NEATLY places an X to indicate each content area and strand on the Portfolio Feedback Form, and writes the learning standard number(s) addressed in each strand; NEATLY circles each score on the Portfolio Feedback Form; adds numbered comments from the Comment Key to the Portfolio Feedback Form (see pages 28 and 30).
7. Using a #2 pencil, NEATLY transfers scores from the Portfolio Feedback Form to the top copy of the Student Score Form (see page 29).
8. Removes the top copy of the Student Score Form and attaches it to the top two copies of the Portfolio Feedback Form with a paper clip.

Slide12: 

Summary of Scoring Process: Scorers (continued)

The Scorer (continued):
9. Places the bottom (third) page of the Portfolio Feedback Form facedown in the inside back cover of the student's portfolio; places the portfolio back in the unsealed white envelope.
10. Returns the following materials to the Table Leader: the portfolio in its unsealed white envelope, and the completed Student Score Form attached with a paper clip to the top two copies of the Portfolio Feedback Form.

Slide13: 

Summary of Scoring Process: Table Leaders

The Table Leader:
1. Gives each scorer a portfolio to score; receives portfolios from scorers after scoring is completed; checks the Portfolio Feedback Form for legibility and neatness (if not completed neatly, returns the PFF to the scorer to complete a new form).
2. Determines whether a second score is required:
   - For each scorer, every fifth portfolio is scored twice.
   - When evidence is missing or insufficient in any strand (scored "M"), it must be scored again in that area by an "M-Resolver."
   - Every Grade 10 portfolio is scored twice.
3. If no second score is required, checks that forms are completed accurately, places the portfolio back in the carton, and holds the forms aside with all other forms. If a "second read" is required, gives the portfolio to Scorer 2 and keeps the forms from Scorer 1 separate.
4. If the portfolio is scored twice, checks for agreement between Scorers 1 and 2: if the scorers agree, places the portfolio back in the carton; if the scorers disagree, scores only the rubric areas in question, then places the portfolio back in the carton.
5. Records each scorer's accuracy percentage on the Scorer Tracking Form.
6. When all portfolios in a carton have been scored: checks that all Portfolio Feedback Forms are NEAT and LEGIBLE (if not, returns them to the scorer); brings the carton to the scanning room; retrieves a new carton; distributes portfolios to scorers one at a time; repeats these steps until all portfolios are scored.

Slide14: 

Scoring Level of Complexity

Refer to the Strand Cover Sheet (items 4, 5, and 6). For each strand, use the Resource Guide to confirm that:
- evidence in the strand addresses the outcome listed on the Strand Cover Sheet (item 6); and
- the measurable outcome is linked to a learning standard for a student in this grade. If not, provide a comment.

Note: Evidence is still scorable if it is based on the learning standard, even if the evidence does not match the outcome. Use the Scoring Rubric to determine the score for Level of Complexity.

Scoring Rules

(A) Use the following information, plus the Scoring Rubric, to score Level of Complexity.
Definitions: "Access skills" may be social, motor, or communication skills that allow a student to participate in a standards-based activity, but do not address curriculum content directly. Consult your Table Leader if uncertain. "Entry points" address curriculum content directly, but below grade-level expectations. NOTE: A skill (e.g., "waiting one's turn") may be an entry point in one subject (ELA) and an access skill in another (Math).
If it seems the standard(s) are addressed "at grade-level expectations," the strand must be scored as usual, then set aside for review by scorers identified as content experts, who will make the final determination for Level of Complexity. If uncertain, set it aside anyway.

(B) Level of Complexity may vary within each strand. At least three pieces of evidence must be at the higher Level of Complexity in order to score at that level; otherwise, score at the lower level in that strand. For example, two pieces of primary evidence at entry points and one at access skills must be scored "2" for Level of Complexity.
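Rule (B) above amounts to a simple tally. Here is a minimal sketch of that rule; it is an illustration only (the function name and the numeric encoding of rubric levels are invented, assuming access skills score lower than entry points):

```python
def level_of_complexity(piece_levels):
    """Pick a strand's Level of Complexity from per-piece levels (Rule B).

    piece_levels holds the rubric level of each piece of evidence in the
    strand.  At least three pieces must sit at the higher level for the
    strand to score there; otherwise the strand scores at the lower level.
    """
    high = max(piece_levels)
    return high if piece_levels.count(high) >= 3 else min(piece_levels)
```

With access skills encoded as 2 and entry points as 3, two entry-point pieces plus one access-skill piece score 2, matching the example in Rule (B).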

Slide15: 

Scoring Demonstration of Skills and Concepts (DSC)

Confirm that the following are included in the strand being scored. At least:
- one data chart showing performance on at least 5 different dates
- two pieces of primary evidence that each address the same outcome (or learning standard) as the data chart

All pieces of primary evidence must be labeled with the following:
1. student's name
2. date (mo/day/yr)
3. % accuracy
4. % independence

If not, score M in DSC and Independence, and give a comment. If yes:
- Determine the final 1/3 timeframe on the data chart (or the final three data points, if the final 1/3 contains fewer than three points).
- For data and primary evidence, record % accuracy on the Strand Organizer for each date within the final 1/3 timeframe.
- Calculate the average percentage of accuracy for all evidence in the final 1/3 timeframe.
- Use the Scoring Rubric to determine the score for DSC.

Scoring Rules

(A) If DSC is scored M, then Independence must also be scored M.

(B) Count each data point only once for the DSC score. If a work sample is also included on a data chart, count it once in the final calculation for DSC.

(C) If % accuracy is not provided on primary evidence, calculate it yourself, if you can do so in two minutes or less (if not, score M).

(D) A strand may also include primary evidence related to other outcomes and learning standards in the same strand. When this occurs, score as follows: first, determine whether the "core set" of required evidence is included (i.e., a data chart and two pieces addressing the same outcome or learning standard); then determine whether additional evidence was submitted in the strand. If so, record % accuracy on the Strand Organizer for all data and evidence in the final 1/3 timeframe of the data chart. Obtain the average by totaling the accuracy percentages for all data points and evidence in the final 1/3 timeframe, and dividing by the number of data points and pieces of evidence.
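The "final 1/3 timeframe" computation can be sketched in code. This is an invented illustration, assuming dated (date, % accuracy) pairs; the guidelines do not define a reference implementation:

```python
from datetime import date

def final_third(points):
    """Return the data points in the final 1/3 of the chart's timeframe,
    or the last three points if the final 1/3 holds fewer than three.
    Each point is a (date, pct_accuracy) tuple."""
    points = sorted(points)
    start, end = points[0][0], points[-1][0]
    span = (end - start).days
    cutoff = end.toordinal() - span // 3   # where the final 1/3 begins
    window = [p for p in points if p[0].toordinal() >= cutoff]
    if len(window) < 3:
        window = points[-3:]               # fall back to last three points
    return window

def average_accuracy(points):
    """Average % accuracy over the final-1/3 window."""
    window = final_third(points)
    return sum(pct for _, pct in window) / len(window)
```

For a chart running 1/1/06 to 3/30/06 with only two data points in the final third, the helper falls back to the last three points, as the procedure requires.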

Slide16: 

Scoring Independence

Use the same data points and evidence to calculate Independence that you used to calculate % accuracy. Record % independence on the Strand Organizer at the same time you record % accuracy. Calculate the average % independence for the evidence recorded on the Strand Organizer. Use the Scoring Rubric to determine the score for Independence.

Scoring Rules

(A) If Independence is scored M, then DSC must also be scored M.

(B) Count each data point only once for the Independence score. If a work sample is also included on a data chart, count it once in the final calculation for Independence.

(C) If % cues/prompts are documented rather than % independence, or a ratio is provided (e.g., "6/7 independent"), convert it to a percentage (e.g., 10% cues/prompts = 90% independence). Use a calculator, if needed.

(D) If % independence is not provided on primary evidence, you may calculate it yourself in two minutes or less (if not, score M).

(E) Evidence indicating that a student required assistance "30-40% of the time," or was independent "almost all of the time," is unscorable.

(F) Count cues/prompts, but not accommodations, in the score for Independence.

(G) Full hand-over-hand assistance = 0% independence.
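The conversions in Rule (C) can be sketched as a small helper. This is a hypothetical function for illustration; the notation labels are invented:

```python
def to_pct_independence(value, kind):
    """Normalize an independence notation to a % independence figure.

    kind is one of:
      'independence' - already a percentage independent (returned as-is)
      'cues'         - percentage of trials with cues/prompts
      'ratio'        - (independent_trials, total_trials) tuple
    """
    if kind == "independence":
        return float(value)
    if kind == "cues":
        return 100.0 - float(value)          # 10% cues -> 90% independence
    if kind == "ratio":
        independent, total = value
        return 100.0 * independent / total   # '6/7 independent' -> ~85.7%
    raise ValueError(f"unknown notation: {kind}")
```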

Slide17: 

Scenario #1: Scoring DSC (Accuracy) and Independence

(All evidence in this strand is based on the same measurable outcome.) Evidence includes: one data chart, one work sample (Primary Evidence #2, already charted: Feb. 28, 2006, 80% accuracy, 100% independence), and one video clip (Primary Evidence #3, already charted: 4/29/06).

1. Each of the three labeled pieces of primary evidence is scorable and the strand is complete. (At least one labeled data chart and two labeled pieces of related primary evidence were submitted.)
2. On the Strand Organizer, record % accuracy and % independence for all labeled evidence within or after the final 1/3 timeframe (or last three data points, whichever is more). In this strand, the final 1/3 timeframe begins on 2/28/06.
3. For the final calculation, do not include % accuracy and % independence for evidence already included on the data chart. Since % accuracy and % independence for the work sample (2/28/06) and the video clip (4/29/06) are already included on the data chart, do not include this information a second time.
4. Calculate the average for all evidence in the final 1/3: % accuracy (beginning 2/28/06): 80%, 60%, 100%; avg. = 80%. % independence (beginning 2/28/06): 100%, 100%, 100%; avg. = 100%.
5. Use the Scoring Rubric to determine the final score in each rubric area: Demonstration of Skills = 4; Independence = 4.
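The "count each piece once" bookkeeping used in this scenario can be sketched as a small averaging helper. This is an invented illustration, not part of the official procedure; the function and field names are assumptions:

```python
def strand_averages(chart_points, extra_evidence):
    """Average % accuracy and % independence for a strand's final-1/3 window.

    chart_points   - [(date, acc, ind)] data points already inside the window
    extra_evidence - [(date, acc, ind, on_chart)] work samples, videos, etc.;
                     pieces flagged on_chart are skipped so no data point is
                     counted twice (Scoring Rule B).
    """
    pool = list(chart_points)
    pool += [(d, a, i) for d, a, i, on_chart in extra_evidence if not on_chart]
    acc = sum(a for _, a, _ in pool) / len(pool)
    ind = sum(i for _, _, i in pool) / len(pool)
    return acc, ind
```

Fed the final-1/3 values from Scenario #1 (three charted points, with the work sample and video clip flagged as already charted), it yields 80% accuracy and 100% independence.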

Slide18: 

Scenario #2: Scoring DSC (Accuracy) and Independence

(All evidence in this strand is based on the same measurable outcome.) Evidence includes: one data chart and two work samples. Primary Evidence #2 (within the final 1/3 timeframe, and not included on the chart): 4/15/06 work sample, 100% accuracy, 100% independence. Primary Evidence #3 (after the final date on the chart): 5/01/06 work sample, 100% accuracy, 100% independence.

1. Each of the three labeled pieces of primary evidence is scorable and the strand is complete. (At least one labeled data chart and two labeled pieces of related primary evidence were submitted.)
2. On the Strand Organizer, record % accuracy and % independence for all labeled evidence within or after the final 1/3 timeframe (or last three data points). In this strand, the final 1/3 timeframe begins on 2/28/06.
3. For the final calculation, be sure to include % accuracy and % independence for the two work samples, since they are within or after the final 1/3 and not already included on the data chart.
4. Calculate the average for all evidence in the final 1/3: % accuracy (beginning 2/28/06): 80%, 60%, 100%, 100%, 100%; avg. = 88%. % independence (beginning 2/28/06): 100%, 100%, 100%, 100%, 100%; avg. = 100%.
5. Use the Scoring Rubric to determine the final score in each rubric area: Demonstration of Skills = 4; Independence = 4.

Slide19: 

Scenario #3: Scoring DSC (Accuracy) and Independence

(All evidence in this strand is based on the same measurable outcome.) Evidence includes: one data chart and two work samples. Primary Evidence #2 (not within the final 1/3, and not included on the chart): 9/20/05 work sample, 40% accuracy, 100% independence. Primary Evidence #3 (not within the final 1/3, and included on the chart): 12/2/05 work sample, 60% accuracy, 40% independence.

1. Each of the three labeled pieces of primary evidence is scorable and the strand is complete. (At least one labeled data chart and two labeled pieces of related primary evidence were submitted.)
2. On the Strand Organizer, record % accuracy and % independence for all labeled evidence within or after the final 1/3 timeframe (or last three data points). In this strand, the final 1/3 timeframe begins on 2/28/06.
3. For the final calculation, do not include % accuracy and % independence from the two work samples, since they were completed prior to the final 1/3 timeframe (i.e., before 2/28/06).
4. Calculate the average for all evidence in the final 1/3: % accuracy (beginning 2/28/06): 60%, 40%, 60%; avg. = 53.3%. % independence (beginning 2/28/06): 20%, 40%, 40%; avg. = 33.3%.
5. Use the Scoring Rubric to determine the final score in each rubric area: Demonstration of Skills = 3; Independence = 2.

Slide20: 

Scenario #4: Scoring DSC (Accuracy) and Independence

(All evidence in this strand is based on the same measurable outcome.) Evidence includes: one bar graph (Primary Evidence #1), one field data chart (Primary Evidence #2, field data summarized on the bar graph), and one work sample (Primary Evidence #3, included on both charts): 12/2/05 work sample, 100% accuracy, 50% independence.

1. Each of the three labeled pieces of primary evidence is scorable and the strand is complete. (Rule: a field data chart plus a bar or line graph summarizing the field data are both scorable.)
2. On the Strand Organizer, record % accuracy and % independence for all labeled evidence within or after the final 1/3 timeframe (or last three data points). In this strand, the final 1/3 timeframe begins on 12/3/05.
3. Since the bar graph includes the same information as the field data chart, do not repeat % accuracy and % independence on the Strand Organizer. Similarly, since the work sample is already included on the bar graph, do not repeat the percentages from that work sample in the final tally.
4. Calculate the average for all evidence in the final 1/3: % accuracy (beginning 12/3/05): 100%, 100%, 100%; avg. = 100%. % independence (beginning 12/3/05): 67%, 50%, 83%; avg. = 66.67%.
5. Use the Scoring Rubric to determine the final score in each rubric area: Demonstration of Skills = 4; Independence = 3.

Slide21: 

Scenario #5: Scoring DSC (Accuracy) and Independence

Evidence in this strand includes: one bar graph and four work samples (two show the same skill as the chart).
- Primary Evidence #1: the bar graph
- Primary Evidence #2 (same skill, but not charted): 2/15/06 work sample, 100% accuracy, 50% independence
- Primary Evidence #3 (included on the chart): 3/30/06 work sample, 40% accuracy, 40% independence
- Primary Evidence #4 (different learning standard in the same strand): 12/15/05 work sample, 60% accuracy, 40% independence
- Primary Evidence #5 (different learning standard in the same strand): 3/1/06 work sample, 80% accuracy, 100% independence

1. Each of the five labeled pieces of primary evidence is scorable and the strand is complete. (At least one labeled data chart and two labeled pieces of related primary evidence were submitted.)
2. On the Strand Organizer, record % accuracy and % independence for all labeled evidence within or after the final 1/3 timeframe that addresses the same skill as the data chart, and other related skills. In this strand, the final 1/3 timeframe begins on 2/28/06.
3. On the Strand Organizer, record % accuracy and % independence for evidence produced on or after 2/28/06, including: data points; work samples that show the same skill as the data chart; and work samples that show other skills in the strand.
4. Calculate the average for all evidence in the final 1/3 timeframe: % accuracy (beginning 2/28/06): 60% (2/28/06), 40% (3/30/06), 60% (4/29/06), 80% (3/1/06); avg. = 60%. % independence (beginning 2/28/06): 20%, 40%, 40%, 100%; avg. = 50%.
5. Use the Scoring Rubric to determine the final score in each rubric area: Demonstration of Skills = 3; Independence = 2.

Slide22: 

Scoring Self-Evaluation

On the Strand Organizer, record each example of self-evaluation found in the evidence for the strand. Count as one example of self-evaluation each of the following activities performed by the student:
- selecting work for the portfolio
- choosing materials/activities
- reflecting on performance
- checking steps of an activity
- goal-setting
- choosing workmates
- graphing his or her own performance
- correcting his or her own work

Scoring Rules

(A) Multiple examples of self-evaluation on a single piece of primary evidence = one example of self-evaluation.

(B) If the student uses the same self-evaluation activity or reflection sheet on multiple pieces of evidence, count each as an example of self-evaluation.

(C) Self-evaluation does not include choosing a response to a question during routine instruction (e.g., "Which object is larger?").

(D) When stickers, such as colored or "happy face" stickers, are used to show self-evaluation, they count as examples of self-evaluation ONLY when it is clear from the description provided that the student has chosen the sticker to describe or reflect on his or her performance. If a choice by the student is not evident, do not count the sticker(s) as an example of self-evaluation. Add a comment from the Comment Key.

SCORING RUBRIC: If the total number of examples is 0, the score is M; if 1, the score is 1; if 2 or more, the score is 2.
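Rules (A) and (B) plus the rubric reduce to a short tally. A minimal sketch (the function name and input shape are invented for illustration):

```python
def self_evaluation_score(examples_per_piece):
    """Score Self-Evaluation for a strand.

    examples_per_piece holds, for each piece of evidence, how many
    self-evaluation examples appear on it.
    Rule A: multiple examples on one piece count as one.
    Rule B: the same activity repeated across pieces counts each time.
    """
    total = sum(1 for n in examples_per_piece if n >= 1)  # Rule A cap
    if total == 0:
        return "M"
    return min(total, 2)  # rubric tops out at 2 for two or more examples
```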

Slide23: 

Scoring Generalized Performance

On the Strand Organizer, record the number of contexts and instructional approaches found in the evidence for the strand.

Scoring Rules

(A) Determine the total number of ways in which the skill or knowledge was demonstrated. Different settings for instruction and/or people who worked with the student do not, by themselves, count as examples of Generalized Performance. If the student addresses the same skill using the same approach, but with a different person in each of two different settings, GP = 1. If the student uses a different approach in each of two settings, GP = 2. Homework and work in a community setting each count as a context.

(B) The score for Generalized Performance can never be 0.

(C) Use of age-inappropriate materials: lower the score in Generalized Performance by 1 point if materials used for instruction are not age-appropriate (e.g., use of dolls, cartoons, nursery rhymes, etc., by 16-year-old students). Check with your Table Leader if you are uncertain. If age-inappropriateness was noted, add a comment from the Comment Key.

SCORING RUBRIC: If the total number is 1, the score is 1; if 2, the score is 2; if 3 or more, the score is 3.
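The rubric plus Rules (B) and (C) can be sketched as follows. This is an illustration only; in particular, flooring the Rule (C) deduction at 1 is my reading of Rule (B), not something the guidelines state explicitly:

```python
def generalized_performance_score(num_contexts, age_appropriate=True):
    """Score Generalized Performance from the count of distinct
    contexts/instructional approaches in the strand.

    Rubric: 1 -> 1, 2 -> 2, 3 or more -> 3.
    Rule C deducts a point for age-inappropriate materials;
    Rule B keeps the score at a minimum of 1 (assumed floor).
    """
    score = min(num_contexts, 3)
    if not age_appropriate:
        score -= 1
    return max(score, 1)  # Rule B: GP can never be 0
```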

Slide24: 

Scoring Rules in Special Cases

1) Must all primary evidence be included on a data chart for the strand to be complete?
No. Work samples, videos, and other primary evidence may be included as data points on a graph or chart or not, at the teacher's discretion. Whether or not primary evidence is included on the chart, it is counted for the purpose of determining scores for the strand. However, at least two pieces of primary evidence must be submitted based on the outcome described on the chart.

2) What if strands unmatched to students in that grade are submitted, or fewer than the required number of strands are submitted in a content area?
Score only those strands that are required for a student in that grade. Do not mark any scores for strands that were required but NOT submitted; instead, provide a comment. Do not score strands that were submitted but were not required for submission, such as the following:
- a Math strand unmatched to the student's grade
- ELA strands other than General Standards 4 or 8, and (in grades 4, 7, and 10) ELA Composition
- High School Earth Science

EXCEPTION: For Science and Technology/Engineering in grades 5 and 8, for which three strands are required for submission, score a fourth Science and Technology/Engineering strand if it was submitted.

For High School Science and Technology/Engineering, if three disciplines (strands) are submitted, rather than three learning standards in one discipline, as required, you must do the following: determine if any of the three disciplines includes evidence of three different learning standards; if so, score only that discipline. If not, determine if any discipline includes evidence of more than one learning standard; if so, score only that discipline. If all disciplines include evidence of only one learning standard, score the first discipline in the portfolio.

If a student took a standard MCAS test in a subject required for assessment, do not mark any scores in that content area.

3) What if no Strand Cover Sheet, or multiple Strand Cover Sheets, are submitted for a strand?
Scorers should "bundle" all pieces of evidence in the same strand, regardless of the number of Strand Cover Sheets, and then score the entire strand. Scorers must determine whether the required evidence was submitted in the strand, and whether a total of three portfolio strands was submitted in the content area, as required (if not, see Rule #2 above). If the scorer can organize the materials submitted in the strand in under five minutes, then the scorer can score the strand. If not, provide a comment.

4) What if the portfolio contains forms and cover sheets from previous years?
If a portfolio contains outdated forms, score the strands using current scoring guidelines, provided all required evidence and information are included.

5) Can evidence be submitted from both the current and previous school years?
Yes. Science and Technology/Engineering portfolios in grades 5, 8, and 10 may contain evidence accumulated over two consecutive school years (i.e., the current and one previous school year). All other content areas must include evidence from only the current school year (beginning 7/1/05).

Slide25: 

Scoring Rules in Special Cases (Cont'd)

6) Are photographs, videos, and audiotapes scorable as primary evidence?

Photographs can be scored in the following situations, and ONLY when the subject is clear and the photo is labeled with all required information (student's name, date, % accuracy, % independence):
A photograph may be scored as primary evidence when it clearly shows a work sample that is too large, fragile, temporary in nature, or unsafe to include in a portfolio.
A photograph may be scored as primary evidence when it clearly shows the end product of a sequence of steps (and/or each step in the process) performed by the student.
A photograph may be scored as secondary evidence when it clearly shows the setting, instructional approach, or context in which an activity occurred (e.g., a student sitting at a computer).

Audiotapes can be scored when they are clearly audible or transcribed in writing and labeled with all required information (student's name, date, % accuracy, % independence). They are scored as primary evidence in the following cases:
when the outcome listed on the Strand Cover Sheet is related to communication, use of language, or participation by the student in discussion, recitation, performance, or other oral activity
when the student provides verbal, rather than written, responses as an accommodation and there is clear evidence of what the student was asked to do

7) What if a Grade 11 or 12 portfolio includes Science and Technology/Engineering?

Score this content area as you would a grade 10 portfolio.
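The photograph rules above combine a labeling requirement with two qualifying conditions. A minimal sketch of that check, assuming a hypothetical encoding of a photo's labels and what it shows (illustrative only; real scorers judge the evidence directly):

```python
# The four labels every scorable photo must carry, per the rules above.
REQUIRED_LABELS = {"student name", "date", "% accuracy", "% independence"}

def photo_is_primary_evidence(labels, shows_unincludable_work, shows_sequence_outcome):
    """Return True if a photo qualifies as primary evidence.

    `labels`: set of labels present on the photo.
    `shows_unincludable_work`: photo clearly shows a work sample too
    large, fragile, temporary, or unsafe to include in the portfolio.
    `shows_sequence_outcome`: photo clearly shows the end product of
    (or each step in) a sequence of steps performed by the student.
    """
    # Labeling is a hard prerequisite: no labels, no score.
    if not REQUIRED_LABELS <= set(labels):
        return False
    # Either qualifying condition makes the photo primary evidence.
    return shows_unincludable_work or shows_sequence_outcome
```

A fully labeled photo that shows neither qualifying subject would still fail this check; it might instead qualify as secondary evidence under the setting/context rule.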

Slide26: 

Additional Scoring Scenarios and Rules

* Numbered comment from Comment Key (see Appendix, page 30)

Slide27: 

Additional Scoring Scenarios and Rules (Cont'd)

* Numbered comment from Comment Key (see Appendix, page 30)

Slide28: 

Grades 10, 11, and 12 Portfolios

Each portfolio for a student in grade 10, 11, or 12 must be scored twice, with scoring discrepancies resolved by a Table Leader or M-Resolver.

Portfolios in grades 10, 11, and 12 must be set aside for additional review for the Competency Determination in the following cases:
Scorers must set aside any portfolio that scored 4 or 5 in Level of Complexity in any strand.
Scorers must set aside any portfolio in which the work is close to grade 10 level, even if they are uncertain about the Level of Complexity, and even if Level of Complexity has been scored 3.
All portfolios with Work Description for Grade 10 Competency Determination labels must be set aside. A portfolio will be considered for the Competency Determination, however, even if it does not include these labels.

Once portfolios have been set aside, they will be reviewed by a panel of content experts in ELA and Mathematics, who will determine whether the evidence is 'at grade level' for a student in grade 10, and whether the entire body of evidence is comparable to the performance of a student who has 'passed' the grade 10 MCAS test with a score of 220 (Needs Improvement).

In order for a student to earn the Competency Determination, his or her portfolio must include evidence of specific grade 10 learning standards, as described in the 2006 Educator's Manual for MCAS-Alt. Data charts are not required in portfolios submitted for the Competency Determination; therefore, the strand should not be scored M if a data chart is missing.
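The set-aside decision above is a disjunction of three conditions: a Level of Complexity score of 4 or 5 in any strand, work judged close to grade 10 level, or the presence of Work Description for Grade 10 Competency Determination labels. A hedged sketch of that rule (the function and parameter names are illustrative assumptions, not official terminology):

```python
def set_aside_for_cd_review(strand_levels, near_grade_10_level, has_cd_labels):
    """Decide whether a grade 10-12 portfolio goes to the review panel.

    `strand_levels`: Level of Complexity scores for the portfolio's
    strands. `near_grade_10_level`: scorer judges the work close to
    grade 10 level (even with Level of Complexity scored 3).
    `has_cd_labels`: the portfolio carries Work Description for Grade 10
    Competency Determination labels.
    """
    return (any(level >= 4 for level in strand_levels)
            or near_grade_10_level
            or has_cd_labels)
```

Note the rule errs toward inclusion: a single condition triggers review, and (per the text above) a portfolio without the labels can still be considered for the Competency Determination.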
Score Level of Complexity as follows for Grades 10, 11, and 12 Portfolios:

Level of Complexity = 5, when:
the student is addressing standards at or close to grade-level expectations, AND
all required evidence for a competency portfolio is submitted in the strand (see the 2006 Educator's Manual for MCAS-Alt)

Level of Complexity = 4, when:
the student is addressing standards at or close to grade-level expectations, AND
some required evidence for a competency portfolio is submitted in the strand

Level of Complexity = 3, when:
the student is addressing standards below grade-level expectations (i.e., 'entry points'), regardless of the amount of evidence submitted
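Read as a decision rule, the Level of Complexity assignment above depends on two facts: whether the standards addressed are at or close to grade-level expectations, and how much of the required competency-portfolio evidence is present. A minimal illustrative sketch (the encoding is an assumption; the guide only defines the "all" and "some" evidence cases, so this sketch treats anything less than "all" at grade level as a 4):

```python
def level_of_complexity(at_grade_level, evidence):
    """Assign Level of Complexity for a grade 10-12 portfolio strand.

    `at_grade_level`: True if the student addresses standards at or
    close to grade-level expectations (False means 'entry points').
    `evidence`: "all" or "some" of the required competency-portfolio
    evidence submitted in the strand. Illustrative only; real scoring
    is a trained scorer's judgment against the rubric.
    """
    if not at_grade_level:
        return 3  # below grade-level expectations, regardless of evidence
    if evidence == "all":
        return 5  # grade-level standards plus all required evidence
    return 4      # grade-level standards but incomplete required evidence
```

This is why the set-aside rule keys on scores of 4 and 5: both indicate work at or near grade level, which is what the Competency Determination panel must examine.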

Slide29: 

Appendix A: Rubric for Scoring Portfolio Strands

Slide30: 

Appendix B: Score Forms

Introduction to 2006 MCAS-Alt Score Forms

Scorers will use the following forms during the Scoring Institute to calculate and record scores and comments for all MCAS-Alt portfolios.

Strand Organizer
This form will be used by the scorer as a worksheet and discarded after scoring is completed for each strand in the portfolio. Scorers record information in the appropriate sections of the Strand Organizer for individual pieces of evidence in the strand, to summarize and keep track of important information about each piece.

Portfolio Feedback Form (PFF)
This form will be completed by scorers and returned to teachers in each portfolio to provide direct feedback from a scorer who reviewed the portfolio. Scorers will summarize (in pen) the information from the Strand Organizers on this form and provide numbered comments from the Comment Key. There are three colored copies of each PFF. The top two copies will be collected by the Table Leader and clipped to the Student Score Form. The bottom copy will be returned inside the portfolio, with the Comment Key printed on the reverse side.

Student Score Form (SSF)
Final portfolio scores will be recorded by the scorer on this 'bubble' form using a #2 pencil. Scorers must carefully separate the top copy of the SSF from the perforated packet found in each portfolio and neatly transcribe the information from the PFF onto this top copy. Student Score Forms will be electronically scanned by Measured Progress staff at the scoring site.

Comment Key
The scorer will select appropriate comments from this numbered list in order to provide feedback to the teacher(s) who prepared the portfolio. Comment numbers are placed by scorers in the appropriate boxes on the PFF.

Slide31: 

Appendix: Score Forms - Strand Organizer

Slide32: 

Appendix: Score Forms - Portfolio Feedback Form (PFF)

Slide33: 

Appendix: Score Forms - Student Score Form (SSF)

Slide34: 

Appendix: Score Forms - Comment Key (page 30)
