Purpose, Intent, and Use
  Definition of PVAAS Teacher-Specific Reporting
  Purpose of PVAAS Teacher-Specific Reporting
  Types of Information Provided in Reports
  Reporting by State Assessment or Section/Class
  SAS EVAAS for K-12 and Experience with Teacher Reporting
  Access to PVAAS Teacher-Specific Reporting
  Public Reporting of PVAAS Teacher-Specific Reporting
  Use of PVAAS Teacher-Specific Reporting
  Understanding PVAAS Teacher-Specific Reporting
  PDE's SY12-13 Pilot
Inclusion of PVAAS Reporting in Act 82/Educator Evaluation
  Intent of PDE's Act 82 Regulations, Administrative Manual, and PVAAS FAQs
  Proportion of PA's Educator Effectiveness System
  PVAAS Teacher-Specific Reporting Compared to PVAAS School Reporting in the School Performance Profile
  Including on Final Rating Form
  Teacher Who Changes Districts, Schools, Grades, Subjects, and/or Courses Across School Years
PVAAS Three-Year Rolling Average
  PVAAS Three-Year Rolling Average
  Formula for PVAAS Three-Year Rolling Average
  Three Consecutive Years of PVAAS Reporting
  Example of Not Having Consecutive Years
  Students Included in Three-Year Rolling Average
  Single Year PVAAS Reporting
  Teacher-Specific Data Used If No PVAAS Three-Year Rolling Average
Methodology
  Proportion of Teachers Meeting/Not Meeting the Standard for PA Academic Growth
  Stability of PVAAS Teacher Measures
  Annual Release of PVAAS Teacher-Specific Reporting
  Variables Considered by PDE
  Consideration of Student Characteristics or Attributes
  Minimum Number of Students to Receive a Report
  No Teacher Report Received After Roster Verification Process
  Students Not Included in PVAAS Teacher-Specific Reporting
  Students Included in PVAAS Teacher-Specific Reporting for Keystones
  Summer Keystone Scores
  Students Who Change Schools or LEAs During the School Year
  Students with Two or More Teachers in the Same Subject/Grade/Course
reporting in three other states (Ohio, Tennessee, and North Carolina), as well as 24 regional and district implementations across the United States. SAS EVAAS for K-12 has been involved with research on the effectiveness of teachers as measured by value-added analyses for over 20 years. Pennsylvania has benefited from the experiences and lessons learned about teacher-specific reporting in other states and districts across the country.
become lead teachers; serve as members of school-wide planning committees; participate in curricular planning; and/or provide professional development to colleagues.
partial responsibility for content-specific instruction of assessed eligible content as measured by PA's assessments (PSSA and/or Keystone exams).
PVAAS Teacher-Specific Reporting Compared to PVAAS School Reporting in the School Performance Profile
What these two things have in common is the measurement of growth via PVAAS (the Pennsylvania Value-Added Assessment System). However, the rules of attribution are handled differently for schools than for teachers. PVAAS growth for Teacher-Specific Data in the Educator Effectiveness system is provided only to those teachers who had instructional responsibility for the assessed eligible content in a grade/subject/course that is assessed by a PSSA (the exception being grade 3) or Keystone exam (for those students enrolled in a Keystone-designated course). PVAAS growth in the School Performance Profile, however, is the growth of ALL students in the school who are assessed on eligible content in a grade/subject/course that is assessed by a PSSA (the exception being grade 3) or Keystone exam (for those students enrolled in a Keystone-designated course).
Teacher Who Changes Districts, Schools, Grades, Subjects, and/or Courses Across School Years
PVAAS value-added data will follow a PA-certified teacher as s/he moves from school to school within an LEA. Likewise, PVAAS value-added data will follow a PA-certified teacher as s/he changes PA-assessed grades, subjects, and/or courses within an LEA.

Example: The teacher works in the same LEA for three consecutive school years.
Year 1: The teacher provides instruction in Grade 5 PSSA Math and Reading.
Year 2: The teacher provides instruction in Grade 5 PSSA Math only.
Year 3: The teacher provides instruction in Grade 5 PSSA Reading only.
The teacher will receive a PVAAS score for each PA-assessed grade/subject for each school year, and one PVAAS three-year rolling average at the end of school year three.

If a teacher changes LEAs within the Commonwealth, PVAAS data will follow the teacher only if authorized by the teacher. The process to operationalize these regulations is in development by PDE. The specific regulation, 22 Pa. Code 19.1(IV)(b)(4), reads: "(4) If a classroom teacher, who is working or has worked for other LEAs in the Commonwealth, is being considered for employment by a different LEA, the prospective employer may ask the teacher for written authorization to obtain the teacher's teacher-specific data from a current or previous employer to provide for the continuity of the three year rolling average described in Paragraph IV(b)(2)(iv)."

PVAAS Statewide Team for PDE pdepvaas@iu13.org April 2014
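The bookkeeping behind the example above can be sketched in a few lines. Note that the equal weighting of the three annual estimates is an assumption for illustration only; the actual PVAAS formula for the three-year rolling average is defined by PDE (see the "Formula for PVAAS Three-Year Rolling Average" section).

```python
# Illustrative sketch only: averages the three most recent consecutive
# annual teacher estimates. Equal weighting is an ASSUMPTION here,
# not PDE's published formula.
from statistics import mean
from typing import Optional

def three_year_rolling_average(annual_estimates: list[float]) -> Optional[float]:
    """Return the average of the three most recent annual estimates,
    or None if fewer than three years of reporting are available."""
    if len(annual_estimates) < 3:
        return None
    return mean(annual_estimates[-3:])

# Teacher with estimates from three consecutive school years:
print(three_year_rolling_average([2.0, -1.0, 0.5]))  # 0.5
# Only two years of reporting: no rolling average yet.
print(three_year_rolling_average([2.0, -1.0]))       # None
```

As the document notes, a single-year estimate is reported until three consecutive years are available.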
Methodology
Proportion of Teachers Meeting/Not Meeting the Standard for PA Academic Growth
The observation on the SY12-13 PVAAS teacher reporting scatterplots that there are about as many data points (teachers) above the line (the standard for PA Academic Growth) as below it raises several interesting and complex issues. As an initial point, PVAAS does not simply sort teachers into winners and losers, as the scatterplot might suggest. In addition to a value-added estimate, each district, school, and teacher receives a standard error. PVAAS uses both metrics (the value-added estimate and the standard error) to ascertain whether, on average, there is enough evidence to show that students made decidedly more than the expected growth, decidedly less than the expected growth, or whether there is not enough evidence to show that students made anything other than the expected growth. A district or school may have a slightly negative value-added estimate, yet there may not be enough evidence to say that students are decidedly making less than the expected growth. There is always some uncertainty in statistics; using the standard error with the value-added estimate is a way to factor this uncertainty into the reporting and protect districts, schools, and teachers from the risk of misclassification.

For any subject/grade/course, there are typically a large number of districts, schools, and teachers who have met the standard for PA Academic Growth. When viewing the scatterplots, the numbers of teachers whose students made more than the expected growth (or less than the expected growth) do not simply split down the middle, because many entities made about the expected growth. There is typically a similar proportion of entities making more than the expected growth as making less, but this is not necessarily the case, and the relative size varies by subject/grade/course/year.
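The role of the standard error described above can be illustrated with a small sketch. The two-standard-error band used here is an illustrative assumption, not PDE's published decision rule; the point is only that an estimate is compared against its uncertainty before any classification is made.

```python
# Sketch of combining a value-added estimate with its standard error.
# The n_se = 2.0 threshold is an ASSUMPTION for illustration, not the
# actual PVAAS cut point.

def classify_growth(estimate: float, std_error: float, n_se: float = 2.0) -> str:
    """Classify an estimate relative to the expected-growth line (zero),
    requiring the estimate to clear n_se standard errors of zero."""
    if estimate - n_se * std_error > 0:
        return "evidence of more than expected growth"
    if estimate + n_se * std_error < 0:
        return "evidence of less than expected growth"
    return "no evidence growth differed from expected"

# A slightly negative estimate with a sizable standard error is NOT
# classified as below expected growth, since the evidence is too weak:
print(classify_growth(-0.4, 0.5))  # no evidence growth differed from expected
print(classify_growth(-1.5, 0.5))  # evidence of less than expected growth
print(classify_growth(1.2, 0.5))   # evidence of more than expected growth
```

This is why a scatterplot point sitting just below the line does not by itself mean the teacher's students made less than expected growth.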
Additionally, a multiple-year estimate (the PVAAS three-year rolling average) is required for PVAAS to be used as part of a teacher's evaluation, so this can also result in a different distribution for any subject/grade/course/year. It is important to remember that there are a number of ways to define growth, even within the same statistical model, and we work with PDE and SAS EVAAS for K-12 to consider
each option very closely. Historically, PVAAS reporting for PSSA Math and Reading used a base-year approach (with 2006 as the base year) so that the growth expectation was consistent from year to year. However, Pennsylvania needed to address the transition of its assessment system to align to the PA Core Standards, and an intra-year approach is currently being used.

So, what does that mean? The base year for PVAAS reporting for PSSA Math and Reading in grades 4-8 was reset starting with the SY12-13 assessment results. During the transition period of Pennsylvania's state assessments, each year serves as its own base year, and growth is based on students maintaining their position in the statewide distribution of scores for each year. Pennsylvania will also reset the base year for Math and Reading in grades 4-8 for SY13-14 and SY14-15. In SY15-16, Pennsylvania will decide whether to keep resetting the base year or to set a fixed base year as was done previously with 2006.

Even if statewide performance/achievement results change significantly from the old PSSA to the new PSSA aligned to the PA Core Standards, PVAAS still assesses whether a group of students maintained their relative position in the statewide distribution. If a group of students for a district, school, or teacher was higher achieving, average achieving, or lower achieving on the easier test, is that group at least maintaining its relative position of achievement on the newer, harder test? In this approach, the definition of growth is that students maintain their relative place in the distribution from one year to the next. It is a relative definition of growth specific to each year, but it is also the most fair and statistically valid approach given the testing changes in Pennsylvania.
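The "maintain your place in the distribution" idea can be made concrete with a simplified sketch: convert each year's scores to statewide percentile ranks, then compare a group's mean percentile across years. This is an illustration of the concept only, not the actual PVAAS statistical model, and the score distributions below are hypothetical.

```python
# Simplified illustration of intra-year growth: a group "grows" if it
# holds its relative position in the statewide distribution, even when
# raw scores drop on a harder test. Not the actual PVAAS model.
from statistics import mean

def percentile_rank(score: float, statewide: list[float]) -> float:
    """Percent of statewide scores at or below the given score."""
    return 100.0 * sum(s <= score for s in statewide) / len(statewide)

# Hypothetical statewide distributions: the year-2 test is harder,
# so raw scores are lower statewide.
year1_state = [200, 300, 400, 500, 600, 700, 800]
year2_state = [150, 250, 350, 450, 550, 650, 750]

# A group whose raw scores dropped along with the statewide drop...
group_year1 = [500, 600]
group_year2 = [450, 550]

p1 = mean(percentile_rank(s, year1_state) for s in group_year1)
p2 = mean(percentile_rank(s, year2_state) for s in group_year2)

# ...still holds the same relative position, so under this definition
# the group maintained its growth despite lower raw scores.
print(p1 == p2)  # True
```

The design point is that a relative definition is robust to the test itself getting easier or harder, which is exactly the property needed during an assessment transition.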
SAS EVAAS for K-12 has had years of experience with transitioning standards and assessments with other statewide clients. Learning from the experiences in other states has been very helpful as the plan was established for Pennsylvania.
Not all value-added approaches are created equal. More simplistic value-added approaches do NOT yield robust repeatability estimates and residual coefficients. This has enormous implications for the usefulness of the PVAAS reporting: educators and policymakers can rely on the teacher estimates to inform their decisions. This reliability does not exist only in a research setting. PVAAS teacher value-added estimates, derived from an approach similar to the PVAAS district and school model and provided to teachers in another state over the span of 14 years, show similar repeatability. An analysis of those estimates yielded important insights into how different kinds of teachers may change in effectiveness over time. More specifically:

Teachers with groups of students yielding high growth are very likely to continue to do so. Teachers with groups of students yielding high growth after their first three years of teaching were extremely likely to remain teachers with high-growth groups three years into the future (about 95% were either average or above average in effectiveness).

Teachers with groups of students yielding low growth may improve over time. For teachers with groups of students yielding low growth based on three-year estimates, approximately half will be identified as ineffective three years later.

Thus, if policymakers, administrators, and educators make high-stakes decisions based on three-year estimates, there is very little risk that teachers identified as effective will be identified as ineffective three years later.
programs, pull-out programs, and other unique approaches to delivering effective instruction to students where more than one teacher provides content-specific instruction of the eligible content as assessed by PSSA and/or Keystone exams. LEAs will work with teachers to reflect an accurate proportion of instructional responsibility for each teacher for each student in each state-assessed subject/grade/course.
that all students are accounted for, but SAS EVAAS for K-12 would not include the student in a teacher's value-added reporting.

3. Active N Count: The second N count to be considered is the active N count, meaning the full-time equivalent of 6 students required for a teacher to receive PVAAS teacher-specific reporting. The active N count takes into account the total % Instructional Responsibility for each student. For example, if a student is claimed at 50% Instructional Responsibility, then this student counts as 0.5 active students; a student claimed at 25% Instructional Responsibility counts as 0.25 active students. Once the % Instructional Responsibility is taken into account, a teacher must have a minimum of 6 active students to receive a PVAAS teacher-specific report in that tested subject/grade/course.
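The active N count arithmetic described above can be sketched directly. The function and variable names here are illustrative, not from the PVAAS system itself.

```python
# Sketch of the active N count: sum each claimed student's
# % Instructional Responsibility as a full-time equivalent and require
# at least 6.0 active students for a teacher-specific report.
# Names are illustrative, not from the PVAAS system.

MIN_ACTIVE_N = 6.0

def active_n_count(responsibility_pcts: list[float]) -> float:
    """Full-time-equivalent student count from % Instructional
    Responsibility values (e.g., 50 -> 0.5 active students)."""
    return sum(pct / 100.0 for pct in responsibility_pcts)

def receives_report(responsibility_pcts: list[float]) -> bool:
    return active_n_count(responsibility_pcts) >= MIN_ACTIVE_N

# Ten students claimed at 50% each = 5.0 active students: no report.
print(receives_report([50] * 10))             # False
# Eight at 50% plus four at 100% = 8.0 active students: report issued.
print(receives_report([50] * 8 + [100] * 4))  # True
```

This shows why a co-teacher who shares responsibility for many students can still fall below the reporting threshold, even though the raw roster exceeds six students.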
The data identifying which students are taking a Keystone-related course with a Keystone exam are provided by each LEA. The LEAs will identify the link between the student and the Keystone-related course. This link is then verified or edited in the roster verification process.