Tuesday, June 5, 2012

And the Best High School on Cape Cod is . . .

Harwich High School
June 4, 2012

PRESS RELEASE: Harwich High School Students Compare Cape Cod High Schools

Following the completion of their Advanced Placement exam, our AP United States Government students took up a challenge presented to them as a post-exam class project: to use the data available on the state Department of Elementary and Secondary Education (DESE) website to compare the public high schools on Cape Cod.

What instigated this was the flurry of recent articles on the subject, many of which concluded, based on the recent USNews #15 national ranking, that Sturgis is head and shoulders above the rest of the high schools on the Cape (or in the state, for that matter). Is this true? Do the rest of the schools need to copy Sturgis’ IB model or get left behind? We wanted to see what the data would show.

The metrics used to create national rankings tend to be very narrow. USNews’ rankings, after setting certain initial benchmarks, are based only on AP/IB participation and performance. The wealth of data on the DESE site gave us the opportunity to look at a wide range of measurements and to create some broader indexes.

Each student took a different area (e.g., MCAS) and gathered the three most recent years of data for each of twelve public high schools on the Cape (Provincetown excluded). We assembled this data on a common spreadsheet (thanks to Google Docs) and did some initial analysis. We then discussed how to aggregate the data and came up with three indicators – one based on MCAS results, one based on SAT and AP scores, and one based on graduation data. We also looked at the demographic data available and came up with a way of creating adjusted indicators. We then ranked the annual results as well as three-year averages of our indicators.

For their final essays, students analyzed all of this, and also considered the “Bloomfield Report” accusing Sturgis of “cherry picking” and the Cape Cod Today article based on Sturgis Executive Director Eric Hieser’s response. [See related article below]

One interesting thing to emerge was the variety of results – different indexes in different years produced different rankings among the twelve schools. Just about every school could point to indicators that show its relative effectiveness. We present a brief analysis here, incorporating the students’ analysis and some quotes from their essays. The complete data is posted on our website.

MCAS

We have included three MCAS indicators in our overall summary. For the first, we simply averaged the DESE’s Student Growth Percentile (SGP) scores for the English and Math tests. These measure the improvement of student results compared to their 8th-grade scores and are not calculated for the science exam, since it is not given in 8th grade. Mr. Hieser cites the SGP data as particularly important in comparing schools. In her final essay, Lizzie Ray wrote, “I find this most compelling because the growth in MCAS test scores from 8th grade to 10th grade actually shows how good the school and teachers were in teaching the information compared to other schools.”

Based on these figures, we see a wide range of results over the last three years. For example, Sturgis averaged a 62 last year, but a 47.3 in 2010 (below the state median of 50) and a 55.8 in 2009. The only schools that were above 50 for all three years were Mashpee, Upper Cape Tech, and Harwich. Based on a three-year average, Harwich had the best results, averaging a 66.8.
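As a quick illustration of the arithmetic, here is a minimal sketch of that first indicator. The numbers are made up for the example, not actual DESE figures: each year’s value is simply the mean of the English and Math SGPs, and the three-year figure is the mean of the annual values.

```python
# Sketch of the SGP indicator: per-year mean of the English and Math
# Student Growth Percentiles, then a three-year average.
# The numbers below are placeholders, not actual DESE figures.
sgp = {
    2009: {"ela": 55.0, "math": 58.0},
    2010: {"ela": 48.0, "math": 52.0},
    2011: {"ela": 61.0, "math": 65.0},
}

annual_sgp_indicator = {
    year: (scores["ela"] + scores["math"]) / 2 for year, scores in sgp.items()
}
three_year_average = sum(annual_sgp_indicator.values()) / len(annual_sgp_indicator)

print(annual_sgp_indicator)          # {2009: 56.5, 2010: 50.0, 2011: 63.0}
print(round(three_year_average, 1))  # 56.5 (the state median SGP is 50)
```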
We decided to create a broader “MCAS Indicator” that included scores as well as the SGP. We added to the SGP figures the Composite Performance Indexes (CPI) calculated for the three exams, multiplied by 0.4 so that they would be roughly equally weighted with the SGPs. This again produced a range of results, perhaps reflecting what teachers often sense: that one class can be very different from the next in the same school. For example, Chatham had the second-best score in 2010 but the twelfth-best in 2011. Overall, Harwich again had the highest scores, with Sturgis second and Nauset third.

We then applied a Demographic Multiplier of our own calculation. We started with the concept that demographics influence test scores. This has been frequently documented – see, for example, the New York Times article on family income and SAT scores. We did not find a common method of adjusting for demographic factors, so we created one. We added the percentage of low-income students, the percentage of Limited English Proficient students, and half the percentage of Special Education students (we halved this last figure because, while it has a significant impact, the procedure for classifying students varies by school and may influence the figure). We then halved the total to be conservative, erring, we believe, on the side of underestimating its influence rather than overestimating it, and added one to create our Demographic Multiplier.

Applying the multiplier to our MCAS Indicator, we created an MCAS Indicator Adjusted. The ranks for the three-year average did not change much – Harwich again had the highest scores, with Sturgis second. However, third through eighth became very close in results, all within 4% of each other: Nauset, Bourne, Mashpee, Upper Cape, Chatham, and Falmouth.

SAT/AP

The second indicator we developed was based on SAT and AP scores. We decided to combine these because they are somewhat similar tests administered by the College Board and are mostly taken by upperclassmen, as opposed to the MCAS, which is taken in 10th grade. This indicator was a bit more complicated to develop. We wanted to consider scores as well as the number of tests for the AP, and then try to balance this with the SAT average scores. So we multiplied the percentage of AP scores in the 3-5 range (generally considered passing) by the number of tests taken per student in the school and added the percentage of students who participated in AP. We then multiplied that figure by five in order to balance it with the SAT averages, dividing the total sum by 20 to get a figure in the 100-200 range.

The results proved to be more stable than the MCAS scores, with only a couple of schools varying their rank by more than two places over the three years. Nauset scored best on this, followed by Barnstable, Sandwich, and Harwich. Using our Demographic Multiplier to adjust this indicator did not change these top four schools.

This indicator necessarily excluded Sturgis, Cape Tech, and Upper Cape, which did not report AP data (except a small amount of data from Upper Cape). In Sturgis’ case, this is of course due to its participation in the International Baccalaureate program. The DESE does not track IB data, and even if we had the figures, this would present the problem of how to compare IB and AP. Because of this problem, we calculated an indicator based only on the SAT scores, which all schools reported. We feel this measure would also have validity.
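To make the arithmetic above concrete, here is a rough sketch of how these indicators can be computed. The numbers are placeholders, and the exact weighting details are our reading of the prose descriptions (for instance, how the three CPIs are pooled with the SGPs, and how the SAT averages enter the division by 20, are not spelled out above), so treat this as an illustration rather than the exact spreadsheet formulas.

```python
# Sketch of the broader indicators as we read the descriptions above.
# All numbers are placeholders, not actual DESE figures, and the exact
# weighting details are our interpretation of the prose.

def mcas_indicator(ela_sgp, math_sgp, cpi_ela, cpi_math, cpi_sci):
    # SGP figures plus 0.4 times the Composite Performance Indexes for the
    # three exams; we read this as summing the CPIs, which makes the CPI
    # term roughly the same size as the two SGPs combined.
    return (ela_sgp + math_sgp) + 0.4 * (cpi_ela + cpi_math + cpi_sci)

def demographic_multiplier(low_income, lep, sped):
    # Fractions of enrollment (e.g. 0.334 for 33.4%): low income plus
    # Limited English Proficient plus half of Special Education,
    # halved again to be conservative, plus one.
    return 1 + 0.5 * (low_income + lep + 0.5 * sped)

def sat_ap_indicator(ap_pass_pct, ap_tests_per_student, ap_participation_pct,
                     sat_reading, sat_writing, sat_math):
    # Percent of AP scores in the 3-5 range times tests per student, plus
    # the AP participation percentage, scaled by 5 and pooled with the
    # three SAT averages; dividing by 20 lands in the 100-200 range.
    ap_part = ap_pass_pct * ap_tests_per_student + ap_participation_pct
    return (5 * ap_part + sat_reading + sat_writing + sat_math) / 20

# Example with made-up figures for a single school-year:
mcas = mcas_indicator(55, 60, 92, 88, 85)              # about 221
multiplier = demographic_multiplier(0.25, 0.02, 0.15)  # about 1.17
print(round(mcas * multiplier, 1))                     # adjusted MCAS indicator
print(round(sat_ap_indicator(60, 1.5, 50, 510, 490, 500), 1))  # about 110
```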
As Jen Gonsalves stated, “Students will receive higher SAT scores when they have a strong foundation of reading and math as taught in the early years of high school.” Based on this measure, Sturgis scored best, followed by Nauset, Harwich, Falmouth, and Sandwich. However, applying the Demographic Multiplier to these figures had a more dramatic effect, because the different schools’ results were much closer together. Based on the three-year average, Barnstable was then at the top, followed by Harwich and Dennis-Yarmouth, with Sturgis fourth.

Graduation

Cape Cod Today recently published an article based on the graduation rates of different high schools, but we wanted to broaden this figure by also including the state data on graduates going on to college. As Tyler Kane argued, “The percentage of students that are moving on after high school to attend a four-year college or university shows how well the school has actually prepared students for college and informed them of its importance.” Our indicator adds the percentage of graduates going to a 4-year college, plus half the percentage of graduates going to a 2-year college, to the graduation rate, plus one fifth of the Special Education graduation rate (discounted for relative weight).

As with the SAT/AP indicators, the results were quite consistent over the three years; half the schools ranked the same in all three. On this indicator, Sturgis scores best, followed by Sandwich, Chatham, Nauset, and Bourne. When we apply the Demographic Multiplier, the scores become closer: Sturgis is still first, narrowly ahead of Sandwich, with Chatham and Bourne tied for third.

Combined Indicators

We then added three of the indicators together to find a combined aggregate of our MCAS, SAT, and Graduation indicators. Based on this, Sturgis ranked first, followed by Harwich, Nauset, Sandwich, Chatham, Bourne, and Falmouth. Finally, we combined the same indicators adjusted for demographics. By this measure, Harwich was first, followed by Sturgis, Bourne, Nauset, Sandwich, Dennis-Yarmouth, and Mashpee.
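The graduation indicator and the combined aggregate follow the same pattern. Again, the sketch below uses made-up numbers, carries over the placeholder values from the earlier sketch, and reflects our reading of the weights described above (including which SAT-based figure enters the sum), so it is illustrative only.

```python
# Sketch of the graduation indicator and the combined aggregate,
# with placeholder numbers and our reading of the weights described above.

def graduation_indicator(pct_4yr, pct_2yr, grad_rate, sped_grad_rate):
    # Percent going on to a 4-year college, plus half the percent going to
    # a 2-year college, plus the graduation rate, plus one fifth of the
    # Special Education graduation rate (discounted for relative weight).
    return pct_4yr + 0.5 * pct_2yr + grad_rate + 0.2 * sped_grad_rate

# Hypothetical single-school figures (percentages as whole numbers):
grad = graduation_indicator(pct_4yr=62, pct_2yr=20, grad_rate=93, sped_grad_rate=80)
print(grad)  # 62 + 10 + 93 + 16 = 181.0

# The combined aggregate simply sums the three indicators described above
# (MCAS, SAT-based, and Graduation); the adjusted version sums the
# demographically adjusted ones instead.
mcas, sat_based = 221.0, 110.0       # carried over from the earlier sketch
combined = mcas + sat_based + grad
print(round(combined, 1))            # 512.0 for these made-up inputs
```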
Conclusion

While we found this exercise interesting, a number of caveats should be noted. There are some technical problems in relying on the three most recent years of data, in that these were not the same three years for every piece of data we considered. The most recent demographic data was for 2011-12, while the scoring data was mostly from 2010-11, and other data points, such as college attendance, were from 2009-10. Also, we cannot be certain that all of the data points are reported in the same way or reflect the same things in different schools. For example, looking at official truancy reports, almost all Cape schools reported a truancy rate of 0, but Bourne reported one of 24.2%. We are not sure what that means, and we did not use this figure. As noted above, we are also unsure of the comparability of the Special Education data.

Our Demographic Multiplier is certainly open to criticism. It represents what we felt to be a fair and conservative approach, as described above, but others may be able to argue convincingly for a different approach that adjusts the data more or less. For example, the Beacon Hill Institute has used one that appears to have a significantly greater adjustment impact, ranking Bourne as the most effective high school on the Cape and fifth in the state. To be most accurate, different multipliers should probably be developed for different indicators, based on exactly what impact demographic factors have on each of them. We would be interested to see the results of other demographically adjusted indexes.

These indicators also do not take into account the different natures and missions of the various schools on the Cape. In particular, we are probably being unfair to the technical schools, whose mission to prepare students for vocational trades is not reflected in the test scores. To some degree, the same may be true for all schools, since for each of them the goals for students go beyond what the data can measure. We feel strongly that these indicators do not give a complete picture of a school, nor are they a fair basis for saying that one school is better than another. As Luz Arregoces noted, “The limitation is that we are unable to acknowledge the actual atmosphere of each school.” Various other factors, such as the relative supportiveness of a school’s climate, may be more important, particularly to parents trying to determine in which school their child will best thrive. We are blessed on Cape Cod with a wide range of successful schools and dedicated staffs. In comparing performance results with overall state data, most Cape schools are above average in a state that has for several years ranked first in the nation in school performance.

Given the results, we are vulnerable to the accusation that we lacked objectivity and designed the study to make Harwich High School look best. While this would be hard to disprove, it would not be a fair charge. We assembled all the data we believed to be useful from the DESE website, and we designed the indicators before calculating them and seeing how the rankings would look. We did not adjust any of them based on the results. And if our goal had been to make ourselves look good, we could have stopped at our first calculation of MCAS SGP and been done.

With all the above caveats noted, we feel that in using reported data to compare schools, broad indicators such as the ones we developed are more appropriate than those used by the various media outlets we have seen. As their teacher, I am very proud of the work my class did on this challenge, and I would like to thank them for their effort and dedication on this and on all we studied this year.

John Dickson, Teacher

Luz Arregoces, Meaghan Callahan, Amanda DeOliveira, Una Doherty, Jen Gonsalves, Colin Hamilton, Tyler Kane, Gardy Kevin Ligonde, John O’Connor, Lizzie Ray, Jackson Van Dyck, Meghan Van Hoose

Related Article: Does Sturgis “Cherry Pick” Its Students?

For part of their final essay analysis, the students in our AP United States Government class considered the “Bloomfield Report” accusing Sturgis of “cherry picking,” as well as the Cape Cod Today article based on Sturgis Executive Director Eric Hieser’s response. We summarize our collective analysis here.

We find Mr. Hieser’s description of Sturgis’ admissions process to be truthful: Sturgis indeed admits its students by a random lottery, and we see no evidence to the contrary. We also find the evidence described by Mr. Bloomfield to be accurate, showing that Sturgis admits a disproportionate number of high-performing students. We found similar evidence in Sturgis’ demographic data, particularly the percentage of students from low-income families. Sturgis has the lowest percentage of any school on the Cape, at 6.8%, which is particularly striking when compared to Barnstable and Dennis-Yarmouth, the districts from which Sturgis draws most of its student body, whose rates are 33.4% and 38.1% respectively. This shows that Sturgis does not enroll a random sample of students.
We think the explanation for this lies in the nature of charter schools generally and the culture of Sturgis in particular. While the lottery to admit students to a charter school is random, the group of families who enter the lottery is not. A charter school lottery will tend to attract families who are more involved and who place a higher value on their child’s education, as well as those who have greater means and more information about and awareness of the system. This will tend to produce a student body of higher-achieving students on average. As Meghan Van Hoose pointed out, “Since it is a school you have to choose to go to, students who don’t want to work hard are most likely not going to go.”

We think Sturgis increases this tendency with its “IB for all” culture. Parents of struggling students who are considering different options for high school will be less likely to choose Sturgis because of it. This reinforces the tendencies noted above and sets Sturgis up as a kind of select or honors school – a result contrary to the guidelines for charter schools under education reform, because of the negative effect it has on the schools from which the students are drawn. As Jackson Van Dyck described it, “While Sturgis does consistently have better scores in most areas, it is not because it specifically chooses the best students. It is because a higher number of top students apply there.”

Despite this effect, and based on the data analyzed above, John O’Connor noted, “Sturgis is not the overwhelming superpower it is rumored to be.” Depending on which indicator is used, Sturgis is at or near the top, and it is first in our combined indicator. But this changes when you start to control for the schools’ different demographic compositions. Based on what we consider a conservative approach to adjusting the results, Sturgis still ranks highly, but not at the top.

So we do not believe that Sturgis “cherry picks” its students. But we do believe that it admits mostly high-performing students, and that this hurts the surrounding schools from which it draws them; those schools would score higher on our indicators if Sturgis were not drawing their top students away. As O’Connor stated, “In the end, that is the way to take Sturgis off its high horse – with data and cold hard facts, not opinions about cherry picking.”