Student growth percentile (SGP) analysis leverages longitudinal student assessment data to produce a growth percentile for each individual student. An SGP measures how much a student has progressed relative to academic peers with similar test score histories: students who make more progress than their peers receive higher SGPs, while students who make less progress receive lower SGPs. The goal is to give educators, parents, and other educational stakeholders a tool to better understand their students' progress on MCAS.

The basic approach for calculating an SGP is to use the student's current test score together with scores from one or more prior testing windows to fit a regression model. The model uses a series of coefficient matrices to estimate the expected range of outcomes for a student given their past performance. The student's current-year score is then compared against that expected range, and the resulting percentile indicates how well the student is growing.
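The production method fits quantile regressions via coefficient matrices, but the core idea can be illustrated more simply: an SGP is roughly the percentile rank of a student's current score among peers with similar prior scores. The sketch below uses a hypothetical fixed-width "similar prior score" band in place of a fitted model:

```python
def simple_sgp(prior, current, peers, band=5.0):
    """Illustrative growth percentile (NOT the production algorithm).

    prior, current : this student's prior-year and current-year scores
    peers          : list of (prior_score, current_score) pairs
    band           : hypothetical width defining "similar" prior scores
    """
    # Current scores of peers whose prior score is close to this student's.
    similar = [c for p, c in peers if abs(p - prior) <= band]
    if not similar:
        raise ValueError("no peers with a similar prior score")
    # Percentile rank: share of similar peers scoring at or below the student.
    at_or_below = sum(1 for c in similar if c <= current)
    return 100.0 * at_or_below / len(similar)

# Toy cohort of seven peers: (prior score, current score).
peers = [(480, 500), (482, 495), (485, 510), (479, 490),
         (481, 505), (484, 515), (483, 498)]
print(round(simple_sgp(481, 505, peers), 1))  # → 71.4
```

Here the student outgrew five of seven comparable peers, so the illustrative SGP is about 71. The real calculation replaces the crude banding with smooth conditional percentile curves, which is why the coefficient matrices are needed.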

One interesting feature of SGP is that it provides insight into the quality of instruction provided to a particular student by allowing educators to view how much their students are growing versus what is expected based on their performance level. Educators whose students fall in the upper quartile of the median SGP (mSGP) distribution are likely driving significant academic growth and should be rewarded for their efforts. Likewise, educators whose students fall in the bottom quartile of the SGP distribution may be producing insufficient growth for their students and should be provided additional resources.

To ensure that the SGPs are unbiased, the knots and boundaries for the trajectories are determined using a distribution over several years of compiled test data instead of a single year's distribution. This reduces the impact of anomalies in a single year and is a critical aspect of how SGPs are generated. The resulting distributions also tend to be smoother than those produced by medians alone as shown in the figure below.
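The pooling idea can be sketched in a few lines. The example below (with made-up scores) derives interior knots from the quantiles of several years of data combined, so a single anomalous year shifts the knots far less than it would if that year were used alone:

```python
from statistics import quantiles

# Hypothetical score samples from three testing years.
year_1 = [450, 470, 480, 500, 510, 530]
year_2 = [455, 465, 485, 495, 515, 525]
year_3 = [460, 475, 490, 505, 520, 535]

# Knots from the pooled multi-year distribution (20th/40th/60th/80th
# percentile cut points), rather than from any single year.
pooled = year_1 + year_2 + year_3
knots = quantiles(pooled, n=5)

# For comparison: knots from one year alone, which are more sensitive
# to that year's anomalies.
single_year_knots = quantiles(year_2, n=5)
print(knots, single_year_knots)
```

With real data the pooled sample spans many thousands of scores per grade and subject, which is what produces the smoother distributions described above.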

A final consideration for SGP generation is the number of prior test scores to include in the calculation of the regression models and the resulting SGPs. While incorporating multiple years of test score data can help increase the reliability of the SGPs, it also increases the overall size of the dataset and potentially the amount of memory required to process it. It is therefore recommended that schools use the minimum number of prior test score years required to produce reliable SGPs, typically three scores from prior testing windows. If a school chooses to use additional test score data beyond this, the SGP calculations will take longer, and some of the results may not be valid due to overfitting.
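One practical consequence of this recommendation is trimming each student's history before model fitting. The helper below is a hypothetical sketch (the data layout and names are assumptions, not the SGP package's API) that keeps only the most recent prior scores per student:

```python
def build_score_history(records, max_priors=3):
    """Trim each student's history to at most `max_priors` prior scores.

    records : dict mapping student id -> list of (year, score) pairs
    Returns : dict mapping student id -> (prior pairs, current pair)
    """
    history = {}
    for student, scores in records.items():
        # Sort chronologically; the last entry is the current-year score.
        *priors, current = sorted(scores)
        # Keep only the most recent prior testing windows.
        history[student] = (priors[-max_priors:], current)
    return history

records = {
    "s1": [(2018, 440), (2019, 455), (2020, 470), (2021, 480), (2022, 495)],
    "s2": [(2020, 460), (2021, 475), (2022, 490)],
}
hist = build_score_history(records)
print(hist["s1"])  # s1's five-year history trimmed to three priors
```

Students with fewer than three priors (like "s2" here) simply contribute what they have; the point is to cap the history, not to drop students.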