Audit Report on the Department of Education’s Controls over High School Progress Reports

May 15, 2011 | MJ10-133A

AUDIT REPORT IN BRIEF

In the 2006-2007 school year, the Department of Education (DOE) implemented annual School Progress Reports for the purpose of creating greater accountability, establishing expectations, and uniformly measuring and comparing school progress. The progress reports reflect letter grades (A, B, C, D, or F) that rate how each of the City’s public schools is performing. For high schools, overall scores are based on three general areas: student progress, student performance, and school environment. Since their implementation, DOE has used the progress reports as an integral part of rewarding high performing schools and for identifying chronically low performing schools for restructuring or closure.

This audit determined whether DOE maintained adequate controls to ensure that the data reflected in the annual high school progress reports are reliable (faithfully represent the data recorded in the DOE databases from which they were derived), comparable (provide a clear frame of reference for assessing performance, with information measured uniformly and reported consistently from period to period), and understandable, so that stakeholders (i.e., parents, educators, school officials, legislators, etc.) could reasonably rely on the progress reports for decision-making purposes. This audit did not assess the accuracy of student course grades and test scores awarded by teachers and recorded in the DOE databases or in source documentation. Nor did the audit attest to the appropriateness of the specific attributes measured in the reports or determine whether other attributes would better measure student progress and school performance. These matters were considered outside the scope of this audit.

Audit Findings and Conclusions

The audit determined that DOE maintained adequate controls to ensure that the data reflected in the 2008-2009 high school progress reports were reliable. The audited data elements used in preparing the reports were (with some minor exceptions) verifiable and representative of student data recorded in DOE’s computer databases. With regard to the characteristics of comparability and fairness in reporting, however, DOE has made a number of modifications in underlying attributes, weights, and/or grade scales (i.e., diploma weights, peer groupings, and cut scores) used to calculate peer indexes and measure performance. These changes may hinder one’s ability to effectively use the reports to assess a school’s performance over a period of years.

Further, although we determined that sufficient documentation was available for audit purposes to provide reasonable assurance that the audited student data were representative of the data recorded in DOE’s databases, there were some instances in which hard-copy student files and/or Regents exam documentation were not available for our review.

Audit Recommendations

The audit made 10 recommendations, including that DOE should:

  • Consider including a pro-forma disclosure in the progress reports and/or supplemental information to demonstrate the effect on prior years of significant changes in peer group calculations, cut scores, or other metrics. If such a restatement is not feasible, DOE should provide users with another means of applying current metrics retrospectively to better enable year-to-year comparisons.
  • Perform periodic, independent audits of student data to provide reasonable assurance of its accuracy and reliability.
  • Ensure that student records, Regents exam documentation, and other relevant student information are appropriately tracked and retained by the schools as required.

DOE Response

We received a written response from DOE officials on April 18, 2011. In their response, DOE officials agreed with nine of the audit’s 10 recommendations and partially agreed with the remaining one.
