NJEA calls for a timeout in evaluation, testing

Teachers report “a chaotic and inconsistent mess”

Published on Thursday, December 19, 2013

Education Commissioner Chris Cerf fields questions about the new teacher evaluation system at the recent NJEA Convention. NJEA President Wendell Steinhauer moderated the discussion.

NJEA is calling upon Commissioner of Education Christopher Cerf and the State Board of Education to immediately correct deficiencies in the state’s new teacher evaluation system and its related curriculum and testing.

“Every day, we are hearing new reports from our members across the state that the roll-out of the new evaluation system is a chaotic and inconsistent mess,” said NJEA President Wendell Steinhauer.  “Administrators are unprepared and untrained for these major initiatives, and the Department of Education lacks the capacity to identify and respond to districts’ needs.

“NJEA has told Commissioner Cerf that New Jersey is rushing headlong into catastrophe, as we hear more and more about the lack of readiness for these unprecedented changes at the district and school levels,” Steinhauer said.

In June, Assemblywoman Mila Jasey, D-Essex, and Assemblyman Patrick Diegnan, D-Middlesex—both of whom were prime sponsors of the tenure reform law enacted in August 2012 that requires the new evaluation and testing programs—called for a one-year delay in implementing them.

“As a prime sponsor of that legislation, I want to see this work,” said Jasey in a news story published at the time. “I don’t want to see it go up in flames because we didn’t give it the proper vetting.”

On June 20, the New Jersey Leadership for Educational Excellence (LEE) Group—which includes the N.J. School Administrators Association, the N.J. Principals and Supervisors Association, the N.J. School Boards Association, the N.J. PTA, NJEA, and the N.J. Association of School Business Officials—also called upon the Christie Administration to “apply for the offered flexibility” from the U.S. Department of Education (USDOE) “to ensure the long term gains of our school reform efforts.”

Earlier in June, the USDOE offered states the opportunity to seek a waiver under the No Child Left Behind Act that would delay implementing plans to use student growth on standardized tests as a key factor in evaluating and employing teachers.

Cerf said New Jersey would not seek a delay in implementing its new evaluation and testing plans because they were covered under the new tenure reform law.

At the 2013 NJEA Convention, Buena Regional Middle School teacher Melissa Tomlinson asked Commissioner Cerf why the state did not take up the federal government’s offer to delay the use of student test scores to evaluate teachers.

“Ideally, the law should be amended to delay the overall implementation of the evaluation system by one year, allowing for 2013-14 to be a pilot year for all districts,” Steinhauer commented.  “While the commissioner and State Board can’t change the law, they do have the authority to make necessary adjustments to the evaluation system and to more strictly enforce its implementation at the local level.”

“We are not blaming district and school administrators for these problems—they have struggled along with our members to meet these new requirements,” he added.  “We are simply urging the state to slow down and get it right.”

Specifically, the current evaluation regulations include the following requirements that are not being met by school districts:

  • Every district was required to establish a District Evaluation Advisory Committee (DEAC), composed of administrators and teachers, to guide the collaborative process called for in the regulations – Few districts have set up DEACs and, in many that have, the committees exist on paper and have not met.
  • After receiving appropriate training, teachers were required to write Student Growth Objectives (SGO) for their 2013-14 students by no later than Nov. 15, 2013, and the regulations call for principals and supervisors to collaborate with teachers on the final approval of SGOs – Little, if any, training was provided by districts on how to write SGOs; in fact, directives from many school administrators to write SGOs before the start of the school year reflected a complete lack of understanding of the process.  In recognition of this pervasive problem, NJEA conducted 111 local and county workshops for thousands of teachers on how to write SGOs according to the regulations, and NJEA published detailed information on SGOs in its monthly professional journal, the NJEA Review.  Despite these and other efforts by educators to learn the new process on their own, and to properly prepare SGOs, district and school administrators routinely rewrote SGOs without consulting their teachers.
  • Excessive and inappropriate student assessment – In many districts, teachers were directed to administer prior years’ final exams to their students in the first week of school as a “baseline” assessment, with the result that the vast majority of students “failed” while providing no useful diagnostic information to teachers on individual students’ baseline learning needs (i.e., why students “failed”).  As described by one teacher, who was required to administer a geometry exam to students who had never been taught geometry, “I didn’t know if they failed because they’d never heard of the Pythagorean Theorem or because of a deficit in algebra, measurement, or arithmetic—the test told me nothing, but turned off the entire class on the second day of school.”

“The data on Student Growth Percentiles (SGP), the method by which individual student test scores will be used to evaluate teachers, also needs to be more closely examined,” Steinhauer said, noting that “in the final report from the Evaluation Pilot Advisory Committee (EPAC), the New Jersey Department of Education gives very little data about the correlation between SGP scores and teacher practice in the classroom.  The report states that in some districts there was a positive correlation between SGP scores and teacher performance in the classroom.  They do not report how many of the districts in the pilot saw such a correlation, or how they explain those districts that did not experience this correlation.”

In addition, Steinhauer said, the EPAC report also pointed out that in the first year of using the new evaluation models, there is a steep learning curve for evaluators.  The report makes the point that districts involved in the second year of the pilot found that “increased familiarity also helped administrators more effectively differentiate levels of teacher practice.”

“In other words,” Steinhauer said, “it took a full year of practice before administrators were able to more accurately differentiate between the levels of teacher practice in the classroom.”

NJEA recommends the following changes:

  • That the two-year review period used to determine a teacher’s effectiveness under the new system be broadened to three years for the first evaluation cycle.
  • That a waiver be provided for teachers whose SGOs were not collaboratively developed or were based on improper student assessment.
  • That the DOE closely monitor the relationship between teachers’ SGP scores and their other ratings, and pursue corrective action with districts in which the two are highly uncorrelated.
  • That a waiver be provided for teachers with SGPs who hosted student teachers during the school year.

“There are many unintended consequences for teachers from the poor implementation of this evaluation system,” Steinhauer concluded.  “As with any major new policy, these problems need to be fixed.  More time would certainly help, too.”
