
From: Carr, Sherry L
To: Anderson, Eric M
Cc: Codd, Clover
Subject: RE: Why Student Growth Ratings are wasting precious resources
Date: Thursday, September 12, 2013 1:26:00 PM

Thanks.
From: Anderson, Eric M
Sent: Thursday, September 12, 2013 10:31 AM
To: Carr, Sherry L
Cc: Codd, Clover
Subject: RE: Why Student Growth Ratings are wasting precious resources

We are very familiar with the TU white paper. In fact, we met with TU leaders before bargaining began. (It's interesting to know Kristin is on their board.) The white paper is certainly consistent with current best-practice thinking for designing a comprehensive system. For various reasons we were not able to incorporate all of their ideas into our bargaining platform, but we seriously considered everything they proposed. Clover and I would be happy to sit down and discuss this in depth.

Eric
From: Carr, Sherry L
Sent: Thursday, September 12, 2013 5:18 AM
To: Anderson, Eric M
Cc: Codd, Clover
Subject: RE: Why Student Growth Ratings are wasting precious resources

Thanks. To clarify, my inquiry was really more about the white paper that Teachers United wrote and not so much the specifics of Kristin's note. I thought of it because Kristin is on their board.
From: Anderson, Eric M
Sent: Wednesday, September 11, 2013 9:25 AM
To: Carr, Sherry L
Cc: Codd, Clover
Subject: RE: Why Student Growth Ratings are wasting precious resources

Hi Sherry,

Clover and I discussed Kristin's thoughtful email. Here is our response:

We believe that Student Growth Ratings can help teachers and schools improve by stimulating reflective conversations about practice. We recognize, however, that Student Growth Ratings for individual teachers can be influenced by school leadership and by curricular and instructional practices instituted at the building level. This is a key reason that Student Growth Ratings are merely a trigger for an inquiry into practice and are not a weighted component of a teacher's evaluation rating.

To better support reflective inquiries into practice, we recommend that the District provide teachers and school leaders with growth data aggregated for the school, in addition to the individual classroom. Teachers and evaluators would then better understand whether the growth observed in a classroom reflects a broader trend at the school, which could shift the focus of the inquiry to a larger discussion about building-level practices. To reinforce this, we also recommend that Executive Directors of Schools receive comprehensive student growth reports for the schools they supervise.

With respect to Kristin's concerns about the MSP: state tests are the only validated, comprehensive, standards-based assessments that are administered to all students and that are strictly aligned to grade-level teaching expectations. The universality of these tests enables the District to calculate Student Growth Ratings that control for the demographic composition of each teacher's classroom using a value-added model. The model estimates the teacher's aggregate impact on student growth after accounting for the types of students taught. By removing these factors from the equation, teachers and evaluators can focus their attention on other potential explanations for the observed growth results rather than on who the students were, which supports a reflective inquiry process.

We agree that one limitation of the current state tests is that they are administered only once a year, in the spring. The future Smarter Balanced assessments aligned to the Common Core standards promise to offer a more comprehensive system that includes interim tests, which would potentially enable the District to leverage multiple data points throughout the year when formulating Student Growth Ratings.

I hope this helps. Let us know if you wish to discuss further.

Eric
Eric M. Anderson, Ph.D.
Manager, Research, Evaluation, & Assessment
Seattle Public Schools
(206) 252-0844
emanderson@seattleschools.org
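[Editorial note: the sketch below illustrates the general idea of the value-added approach Eric describes, under simplified assumptions. It is not the District's actual model; all data, column names, and coefficients are invented for illustration.]

```python
# Illustrative value-added sketch (hypothetical data and columns): regress each
# student's current score on their prior score and demographic indicators, then
# average the residuals by teacher to estimate the teacher's aggregate impact.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 600
df = pd.DataFrame({
    "teacher": rng.integers(0, 20, n),        # 20 hypothetical teachers
    "prior_score": rng.normal(400, 50, n),    # prior-year test score
    "frl": rng.integers(0, 2, n),             # free/reduced lunch indicator
    "ell": rng.integers(0, 2, n),             # English language learner indicator
})

# Synthetic outcome: current score depends on prior score and demographics,
# plus a per-teacher effect that the procedure tries to recover.
teacher_effect = rng.normal(0, 5, 20)
df["score"] = (0.9 * df["prior_score"] - 8 * df["frl"] - 6 * df["ell"]
               + teacher_effect[df["teacher"]] + rng.normal(0, 20, n) + 60)

# Step 1: predict each student's expected score from prior score and demographics.
X = df[["prior_score", "frl", "ell"]]
model = LinearRegression().fit(X, df["score"])
df["residual"] = df["score"] - model.predict(X)

# Step 2: a teacher's value-added estimate is the mean residual of their students,
# i.e., how much those students outperformed or underperformed expectations.
value_added = df.groupby("teacher")["residual"].mean().sort_values()
print(value_added.round(1))
```

Production value-added models typically use multiple years of data and mixed-effects (shrinkage) estimators rather than a simple average of residuals, but the underlying logic of controlling for student characteristics before attributing growth is the same.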

From: Carr, Sherry L
Sent: Sunday, September 08, 2013 8:43 AM
To: Codd, Clover; Anderson, Eric M
Subject: FW: Why Student Growth Ratings are wasting precious resources

Clover and Eric, I have read this note and the white paper written by Teachers United. Are you familiar with it? I'm wondering if you have discussed any adjustments we might make moving forward. If it is easier to meet and discuss in person or by phone, that's fine too. SC
From: Kristin Bailey-Fogarty [mailto:fogartykristin@gmail.com]
Sent: Thursday, August 29, 2013 1:44 PM
To: SchoolBoard
Cc: Whitworth, Kim; Codd, Clover; Lynne Varner
Subject: Why Student Growth Ratings are wasting precious resources

I think there's a misconception out there that those of us who support high standards for excellent instruction and using growth data in a teacher's evaluation also support the current Student Growth Ratings Seattle Public Schools is using. Identifying the teachers who are effective at moving students is too important a task to do badly. Student Growth Ratings that use MSP (or MAP, last year) data collected in early May for only a few teachers are not the best we can do. In fact, my experience has been that our current May-to-May data collection reveals much more about building leadership and programs than it does about individual teachers.

Two years ago I taught 25-34 students in 7th grade LA/SS blocks. The average growth of my students' reading lexiles was about 57 points - about one year's growth. My principal and BLT implemented data-based interventions - I Can Learn for students who did not meet standard in math and Read 180 for students who did not meet standard in reading. Small class sizes were honored, supports like an Intervention Specialist were put into place, and while we had access to great research-based curriculum, we were given the freedom to do whatever we needed to do to serve our students. Last year my first cohort of reading students, all of whom qualified for Free and Reduced Lunch, grew 160 lexile points - more than three years' growth.

I am the same teacher. This data does not mean I suddenly became a more skilled, intense, or passionate teacher during the 2012-2013 school year. Instead, this huge student growth reflects an excellent use of resources, smart decision making by building leadership, and a shared commitment among staff that we would do what it took to serve these students. I was a more effective teacher with my reading class because excellent supports had been put into place that helped me serve my students.

This year, my colleagues and I will teach using the Common Core standards. We believe it is the right direction in which to take our instruction, and we are not willing to deny our students a year of access to the highest standards just so that they are better prepared for the MSP. Next year, students won't even take the MSP. We are doing what is best for kids, knowing that we will all show lower "student growth" based on the district's metrics.

I am asking you to redirect the excellent assessment team at JSCEE, led by Eric Anderson, toward developing meaningful ways to measure student growth using the Smarter Balanced assessment, and toward finding ways to measure every student in September, January, and June, so that we are not only accurately measuring a teacher's impact on her students but also providing teachers with data they can use to inform instruction.

I don't "hate" Student Growth Ratings - they are a lion with no teeth - but they are a waste of our resources, particularly our human resources at JSCEE, and they are meaningless when it comes to identifying our excellent teachers. They are also meaningless to me as a teacher and to my PLC as we try to use data to improve instruction, and isn't that what data is really for?

Thank you,

Kristin
