Tuesday, November 6, 2012

Harwich High School 2012 Exit Poll Results

Continuing a long tradition, Harwich High School government students conducted an exit poll today at the Harwich Community Center from 8AM to 1PM. Thirty-two students volunteered to conduct the poll on their day off from school, and 564 voters were interviewed. The exercise gives students valuable insights into the opinions and decision making of actual voters, and teaches them the importance of the act of voting. The results also give us potential insights into the election results.

Overall in our poll, Obama was the clear winner for president, beating Romney 60% to 38%, with Stein and Johnson each receiving about 1%. For the U.S. Senate, Warren was ahead of Brown 56% to 44%. For U.S. Representative, Keating was ahead of Sheldon, 67% to 28%, with Botelho receiving 4%. And for County Commissioner, where voters could choose two candidates, Lyons received 65%, Flynn 47%, and Seinhilber 31%. For the ballot questions, Question 1 received 88% support, Question 2 51%, Question 3 59%, and Question 4 79%.

In addition to asking how people voted, the students asked about some of the influences on people’s votes. When asked which issues were most important in deciding whom to support for President (voters could name more than one), the Economy/Jobs was cited by 64% of voters, Health Care/Education by 40%, Social Security/Medicare by 27%, Values Issues by 26%, the Candidates’ Character by 25%, the Deficit/Debt by 25%, Tax Policy by 24%, Foreign Policy by 17%, and Environmental Issues by 15%.

In looking at how voters who cited each issue voted for President, the three economic and fiscal issues favored Romney, who did significantly better than his total percentage, especially among those who cited the Deficit/Debt, 66% of whom voted for him. Other domestic issues favored Obama: 79% of those who cited Health Care/Education supported him, as did 88% of those citing the Environment and 72% of those citing Social Security/Medicare. Among those citing Values Issues (e.g., abortion, gay marriage), 78% supported Obama, and among those citing the Candidates’ Character, 65% supported him, both above his overall percentage. Foreign Policy was about the same as the total result. It would appear from this that, at least in Massachusetts (or at least in Harwich), Romney was unable to convince voters to focus enough on economic issues, and that the other issues remained important enough, and favored Obama enough, to lead to his convincing victory. It will be interesting to see if the same turns out to be true in other states.

We also asked voters whether they had been influenced by the debates, or by advertising and direct mail/calls. Only 8% reported being influenced by ads, and only 2% by direct contact. For the debates, 23% reported being influenced, and among them Romney did much better, getting 48% support, while among those citing advertising as an influence, 69% supported Obama. This would seem to support the conventional wisdom in this race that the debates helped Romney, but that the ads helped Obama.

We asked voters when they had made their decision in the Presidential race, and 86% indicated that they had decided more than a month ago; these voters supported Obama 61% to 38%. Among those making their decision more recently, Romney did a bit better, but not by enough to make much difference. This may suggest that the dynamics of how people were going to vote were pretty much set before the debates and other recent events occurred, despite temporary fluctuations in national polls. This may be less true in swing states, where the campaigns directed more money and attention.
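As an aside on method: results like these are simple cross-tabulations of the interview sheets - filter the responses to a subgroup, then count each candidate’s share within it. The Python below is a minimal sketch of that tallying; the record layout, field names, and five sample responses are hypothetical stand-ins, not our actual poll data.

```python
# Minimal sketch of an exit-poll cross-tab: among voters who cited a
# given issue, what share supported each candidate? The records below
# are invented examples, not the actual interview data.
from collections import Counter

interviews = [
    {"issues": {"Economy/Jobs"}, "president": "Romney"},
    {"issues": {"Economy/Jobs", "Health Care/Education"}, "president": "Obama"},
    {"issues": {"Health Care/Education"}, "president": "Obama"},
    {"issues": {"Foreign Policy"}, "president": "Obama"},
    {"issues": {"Economy/Jobs"}, "president": "Romney"},
]

def support_among(records, issue, race):
    """Percent support for each candidate among voters citing `issue`."""
    votes = [r[race] for r in records if issue in r["issues"]]
    return {cand: 100 * n / len(votes) for cand, n in Counter(votes).items()}

print(support_among(interviews, "Economy/Jobs", "president"))
# {'Romney': 66.66666666666667, 'Obama': 33.333333333333336}
```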
In looking at the Senate race, while the race was closer, one problem for Brown is that he ran only six points better than Romney. Given the expectation that Obama will win the state easily with around 60% of the vote, this would not seem to be a big enough difference to get Brown to a win, if it holds statewide. More voters cited making later decisions in the Senate race than in the Presidential one. Among the 78% who decided earlier than the last month, the race was closer, with Warren leading 54% to 46%. But among the 13% who decided in the last month, 63% supported Warren, and among the 9% who decided in the last week, 62% supported her, perhaps indicating that her strategy at the end of the campaign of making control of the Senate an important issue paid off. One reason to expect a better result for Brown is that Independents in our poll favored him 53% to 46%, though this may not be enough to help him overcome the Democratic advantage in this state.

Looking at party identification, our poll may be skewed toward the Democratic candidates, as 41% of interviewees were Democrats and only 21% were Republicans. While we would expect more Democrats, that difference may be greater than is true overall in the state, or in Harwich. Among the 37% who called themselves Independent (Unenrolled), Brown did better than Warren, but Obama won with 53%, which may be a good sign for his national prospects.

We broke down the results by gender, finding a significant gender gap, with women favoring the Democrats more than men did. For the presidential race there was a 12-point gap, and for the Senate race a 14-point gap. This latter result is somewhat surprising, since Brown emphasized his pro-choice and other women’s-issue positions, especially late in the campaign. The gender makeup of our sample may also have produced a Democratic skew in our results, since 61% of our respondents were women. Brown had a lead among the men in our poll, 52% to 47%, so this race again may be closer than our overall result suggests.

Lastly, we broke the results down by age, comparing young (18-39), middle-aged (40-59), and older (60+) voters. In the presidential race there was remarkably little difference, as all three groups clearly favored Obama (63%, 60%, and 60% respectively), but there was a significant difference in the Senate race. Brown won among young voters, 50% to 49%, while middle-aged voters favored Warren 55% to 45% and older voters favored her 58% to 41%. Somehow, Warren seems to have been particularly successful in appealing to older voters.

We would like to thank the Harwich Town Clerk, Anita Doucette, for her continued support of our exit poll program, and especially all the voters who took the time to talk with us today. This exercise has become a cornerstone of our Civics Education program at Harwich High School, and an experience our students and future voters will never forget.

John Dickson, Harwich High School, Government Teacher

Friday, November 2, 2012

Ninth District Congressional Debate

Today, for the first time, Harwich High School hosted a Congressional general election debate, featuring Rep. William Keating (D), Mr. Christopher Sheldon (R), and Mr. Daniel Botelho (I). The students did a tremendous job organizing the debate. It was broadcast tonight at 7PM on WCAI, our NPR station, and is streaming now. I also uploaded a video of the event to YouTube (see below). Thank you to our debate team!

Tuesday, June 5, 2012

And the Best High School on Cape Cod is . . .

Harwich High School
June 4, 2012

PRESS RELEASE: Harwich High School Students Compare Cape Cod High Schools

And the Best High School on Cape Cod is . . .

Following the completion of their Advanced Placement exam, our AP United States Government students took up a challenge presented to them as a post-exam class project. The challenge was to use the data available on the state Department of Elementary and Secondary Education (DESE) website to compare the public high schools on Cape Cod. What instigated this was the flurry of recent articles on the subject, many of which concluded, based on the recent USNews #15 national ranking, that Sturgis is head and shoulders above the rest of the high schools on the Cape (or in the state, for that matter). Is this true? Do the rest of the schools need to copy Sturgis’ IB model or get left behind? We wanted to see what the data would show.

The metrics used to create national rankings tend to be very narrow. USNews’ rankings, after setting certain initial benchmarks, are based only on AP/IB participation and performance. The wealth of data on the DESE site gave us the opportunity to look at a wide range of measurements and to create some broader indexes. Each student took a different area (e.g., MCAS) and gathered the three most recent years of data for each of twelve public high schools on the Cape (Provincetown excluded). We assembled this data on a common spreadsheet (thanks to Google Docs) and did some initial analysis. We then discussed how to aggregate the data and came up with three indicators - one based on MCAS results, one based on SAT and AP scores, and one based on graduation data. We also looked at the available demographic data and came up with a way of creating adjusted indicators. We then ranked the annual results as well as three-year averages of our indicators. For their final essay, students analyzed all of this, and also considered the “Bloomfield Report” accusing Sturgis of “cherry picking” and the Cape Cod Today article based on Sturgis Executive Director Eric Hieser’s response. [See related article below]

One interesting thing to emerge was the variety of results - different indexes in different years produced different rankings among the twelve schools. Just about every school could point to indicators that show its relative effectiveness. We present a brief analysis here, incorporating the students’ analysis and some quotes from their essays. The complete data is posted on our website.

MCAS

We included three MCAS indicators in our overall summary. For the first, we simply averaged the DESE’s Student Growth Percentile (SGP) scores for the English and Math tests. These measure the improvement of student results compared to their 8th grade scores, and are not calculated for the science exam since it is not given in 8th grade. Mr. Hieser cites the SGP data as particularly important in comparing schools. In her final essay, Lizzie Ray wrote, “I find this most compelling because the growth in MCAS test scores from 8th grade to 10th grade actually shows how good the school and teachers were in teaching the information compared to other schools.” Based on these figures, we see a wide range of results over the last three years. For example, Sturgis averaged a 62 last year, but a 47.3 in 2010 (below the state median of 50) and a 55.8 in 2009. The only schools that were above 50 for all three years were Mashpee, Upper Cape Tech, and Harwich. Based on a three-year average, Harwich had the best results, averaging a 66.8.
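To make the arithmetic concrete, here is a short Python sketch of this first indicator - average the English and Math SGPs for each year, then average the yearly results over three years. The yearly values below are the Sturgis figures quoted above; since our summary gives only the already-averaged yearly numbers, the per-test helper is illustrative.

```python
# First MCAS indicator: average the English and Math median SGPs for
# each year, then average the yearly results over three years. The
# yearly figures below are the Sturgis averages quoted in the text.

def yearly_sgp(english_sgp: float, math_sgp: float) -> float:
    """Average a school's English and Math Student Growth Percentiles."""
    return (english_sgp + math_sgp) / 2

sturgis_by_year = {2009: 55.8, 2010: 47.3, 2011: 62.0}

three_year = sum(sturgis_by_year.values()) / len(sturgis_by_year)
print(f"Sturgis three-year SGP average: {three_year:.1f}")  # 55.0
```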
We decided to create a broader “MCAS Indicator” that included scores as well as the SGP. We added to the SGP figures the Composite Performance Index calculated for all three exams (multiplied by 0.4 so that it would be roughly equally weighted with the SGPs). These again produced a range of results, perhaps reflecting what teachers often sense: that one class can be very different from the next in the same school. For example, Chatham had the second-best score in 2010 but the 12th in 2011. Overall, Harwich again had the highest scores, with Sturgis second and Nauset third.

We then applied a Demographic Multiplier of our own calculation. We started with the concept that demographics influence test scores. This has been frequently documented - see, for example, the New York Times article on family income and SAT scores. We did not find a common method of adjusting for demographic factors, so we created one. We added the percentage of low-income students, the percentage of Limited English Proficient students, and half the percentage of Special Education students (halved because, while it has a significant impact, the procedure for classifying students varies by school and may influence the figure). We then halved the total figure to be conservative, erring, we believe, on the side of underestimating its influence rather than overestimating it, and added one to create our Demographic Multiplier. Applying the multiplier to our MCAS Indicator, we created an MCAS Indicator Adjusted. The ranks for the three-year average did not change much - Harwich again had the highest scores, with Sturgis second. However, third through eighth became very close in results, all within 4% of each other: Nauset, Bourne, Mashpee, Upper Cape, Chatham, and Falmouth.

SAT/AP

The second indicator we developed was based on SAT and AP scores. We decided to combine these, as they are somewhat similar tests administered by the College Board and are mostly taken by upperclassmen, as opposed to MCAS, which is taken in 10th grade. This indicator was a bit more complicated to develop. We wanted to consider scores as well as the number of tests for the AP, and then to balance this with the SAT average scores. So we multiplied the percentage of AP scores in the 3-5 range (generally considered passing) by the number of tests taken per student in the school, and added the percentage of students who participated in AP. We then multiplied that figure by five in order to balance it with the SAT averages, dividing the total sum by 20 to get a figure in the 100-200 range. The results proved to be more stable than the MCAS scores, with only a couple of schools varying in rank by more than two places over the three years. Nauset scored best on this, followed by Barnstable, Sandwich, and Harwich. Using our Demographic Multiplier to adjust this indicator did not change these top four schools.
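A few details of these formulas are implicit in the prose, so the sketch below is our reading of them, not a definitive implementation. In particular, the multiplier takes its percentages as fractions while the AP figures enter as whole numbers (choices that land the results in the ranges described), the way the three CPIs enter the MCAS Indicator is an assumption, and the example inputs are invented.

```python
# Our reading of the Demographic Multiplier, the MCAS Indicator, and the
# SAT/AP indicator described above. Assumptions are flagged in comments;
# all example inputs are invented.

def demographic_multiplier(low_income, lep, sped):
    """1 + half of (low income % + LEP % + half the Special Ed %).

    Percentages passed as fractions (0.334 for 33.4%) - an assumption;
    fractions keep the multiplier near 1, a deliberately modest adjustment.
    """
    return 1 + 0.5 * (low_income + lep + 0.5 * sped)

def mcas_indicator(sgp_avg, cpi_english, cpi_math, cpi_science):
    """SGP average plus 0.4 times the CPI from each of the three exams.

    Summing the three CPIs before weighting is our assumption."""
    return sgp_avg + 0.4 * (cpi_english + cpi_math + cpi_science)

def sat_ap_indicator(ap_pct_3_to_5, ap_tests_per_student,
                     ap_participation_pct, sat_section_sum):
    """((% of AP scores 3-5 x tests per student) + % participating) x 5,
    plus the sum of the SAT section averages, all divided by 20.

    AP percentages as whole numbers (70, not 0.70) - an assumption that
    puts the result in the 100-200 range the text describes.
    """
    ap_part = ap_pct_3_to_5 * ap_tests_per_student + ap_participation_pct
    return (ap_part * 5 + sat_section_sum) / 20

# Invented example school:
m = demographic_multiplier(low_income=0.25, lep=0.02, sped=0.18)
print(f"multiplier: {m:.3f}")                                      # 1.180
print(f"adjusted MCAS: {m * mcas_indicator(55, 90, 88, 85):.1f}")  # 189.0
print(f"SAT/AP: {sat_ap_indicator(70, 1.1, 30, 1530):.2f}")        # 103.25
```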
This indicator necessarily excluded Sturgis, Cape Tech, and Upper Cape, which did not report AP data (except a small amount from Upper Cape). In Sturgis’ case, this is of course due to its participation in the International Baccalaureate program. The DESE does not track IB data, and even if we had the figures, this would present the problem of how to compare IB and AP. Because of this, we calculated an indicator based only on the SAT scores, which all schools reported. We feel this measure also has validity. As Jen Gonsalves stated, “Students will receive higher SAT scores when they have a strong foundation of reading and math as taught in the early years of high school.” Based on this measure, Sturgis scored best, followed by Nauset, Harwich, Falmouth, and Sandwich. However, applying the Demographic Multiplier to these figures had a more dramatic effect, because the different schools’ results were much closer together. Now, based on the three-year average, Barnstable was at the top, followed by Harwich and Dennis-Yarmouth, with Sturgis fourth.

Graduation

Cape Cod Today recently published an article based on the graduation rates of the different high schools, but we wanted to broaden this figure by also including the state data on graduates going to college. As Tyler Kane argued, “The percentage of students that are moving on after high school to attend a four-year college or university shows how well the school has actually prepared students for college and informed them of its importance.” Our indicator adds the percentage of graduates going to a 4-year college, plus half the percentage of graduates going to a 2-year college, to the graduation rate, plus one fifth of the Special Education graduation rate (discounted for relative weight). As with the SAT/AP indicators, the results were quite consistent over the three years; half the schools ranked the same all three years. On this indicator, Sturgis scored best, followed by Sandwich, Chatham, Nauset, and Bourne. When we applied the Demographic Multiplier, the scores became closer: Sturgis was still first, narrowly ahead of Sandwich, with Chatham and Bourne tied for third.
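The graduation indicator translates almost directly into code; below is a minimal sketch of our reading of it. Treating the percentages as whole numbers, and the example inputs themselves, are assumptions for illustration.

```python
# Graduation indicator as described above: % of graduates attending a
# 4-year college, plus half the % attending a 2-year college, plus the
# graduation rate, plus one fifth of the Special Education graduation
# rate. Whole-number percentages and the inputs below are assumptions.

def graduation_indicator(pct_4yr_college, pct_2yr_college,
                         grad_rate, sped_grad_rate):
    return (pct_4yr_college
            + 0.5 * pct_2yr_college
            + grad_rate
            + 0.2 * sped_grad_rate)

print(graduation_indicator(62, 20, 91, 80))  # 62 + 10 + 91 + 16 = 179.0
```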
Combined Indicators

We then added the three indicators together to find a combined aggregate of our MCAS, SAT, and Graduation indicators. Based on this, Sturgis ranked first, followed by Harwich, Nauset, Sandwich, Chatham, Bourne, and Falmouth. Finally, we combined the same indicators adjusted for demographics. By this measure, Harwich was first, followed by Sturgis, Bourne, Nauset, Sandwich, Dennis-Yarmouth, and Mashpee.

Conclusion

While we found this exercise interesting, a number of caveats should be noted. There are some technical problems in relying on the three most recent years of data, in that these were not the same three years for every piece of data. The most recent demographic data was for 2011-12, while the scoring data was mostly from 2010-11, and other data points, such as college attendance, were from 2009-2010. Also, we can’t be certain that all of the data points are reported in the same way or reflect the same things in different schools. For example, looking at official truancy reports, almost all Cape schools reported a truancy rate of 0, but Bourne reported one of 24.2%. We’re not sure what that means, so we did not use this figure. As noted above, we are also unsure of the comparability of the Special Education data.

Our Demographic Multiplier is certainly open to criticism. It represents what we felt to be a fair and conservative approach, as described above, but others may be able to argue convincingly for a different approach that adjusts the data more or less. For example, the Beacon Hill Institute has used one that appears to have a significantly greater adjustment impact, ranking Bourne as the most effective high school on the Cape and fifth in the state. To be most accurate, different multipliers should probably be developed for different indicators, based on exactly what impact demographic factors have on each of them. We would be interested to see the results of other demographically-adjusted indexes.

Our analysis also doesn’t take into account the different natures and missions of the various schools on the Cape. In particular, we are probably being unfair to the technical schools, whose mission to prepare students for vocational trades is not reflected in the test scores. To some degree, the same may be true for all schools, since every school’s goals for its students go beyond what the data can measure. We feel strongly that these indicators do not give a complete picture of a school, nor are they a fair basis for saying that one school is better than another. As Luz Arregoces noted, “The limitation is that we are unable to acknowledge the actual atmosphere of each school.” Various other factors, such as the relative supportiveness of a school’s climate, may be more important, particularly to parents trying to determine in which school their child will best thrive. We are blessed on Cape Cod with a wide range of successful schools with dedicated staffs. In comparing performance results with overall state data, most Cape schools are above average - in a state that has for several years ranked first in the nation in school performance.

Given the results, we are vulnerable to the accusation that we lacked objectivity and designed the study to make Harwich High School look best. While this would be hard to disprove, it would not be a fair charge. We assembled all the data we believed to be useful from the DESE website, and we designed the indicators before calculating them and seeing how the rankings would look. We did not adjust any of them based on the results. And if our goal were to make ourselves look good, we could have stopped at our first calculation of MCAS SGP and been done. With all the above caveats noted, we feel that in using reported data to compare schools, broad indicators such as the ones we developed are more appropriate than the narrow ones used by the various media outlets we have seen.

As their teacher, I am very proud of the work my class did on this challenge, and I would like to thank them for their effort and dedication on this and everything we studied this year.

John Dickson, Teacher
Luz Arregoces
Meaghan Callahan
Amanda DeOliveira
Una Doherty
Jen Gonsalves
Colin Hamilton
Tyler Kane
Gardy Kevin Ligonde
John O’Connor
Lizzie Ray
Jackson Van Dyck
Meghan Van Hoose

Related Article: Does Sturgis “Cherry Pick” Its Students?

For part of their final essay analysis, the students in our AP United States Government class considered the “Bloomfield Report” accusing Sturgis of “cherry picking,” as well as the Cape Cod Today article based on Sturgis Executive Director Eric Hieser’s response. We summarize our collective analysis here.

We find Mr. Hieser’s description of Sturgis’ admissions process to be truthful: Sturgis does admit its students by a random lottery, and we see no evidence to the contrary. We also find the evidence described by Mr. Bloomfield to be accurate, showing that Sturgis admits a disproportionate number of high-performing students. We found similar evidence in Sturgis’ demographic data, particularly the percentage of students from low-income families. Sturgis has the lowest percentage of any school on the Cape, at 6.8%, which is particularly striking when compared to Barnstable and Dennis-Yarmouth, the districts from which Sturgis draws most of its student body, whose rates are 33.4% and 38.1% respectively. This shows that Sturgis does not enroll a random sample of students.
We think the explanation for this lies in the nature of charter schools generally and in the culture of Sturgis in particular. While the lottery that admits students to a charter school is random, the group of families who enter the lottery is not. A charter school lottery will tend to attract families who are more involved and who place a higher value on their child’s education, as well as those who have greater means and more information about and awareness of the system. This produces a student body of, on average, higher-achieving students. As Meghan Van Hoose pointed out, “Since it is a school you have to choose to go to, students who don’t want to work hard are most likely not going to go.”

We think Sturgis increases this tendency with its “IB for all” culture. Parents of struggling students who are considering different options for high school will be less likely to choose Sturgis because of this. This reinforces the tendencies noted above and sets Sturgis up to be a kind of select or honors school - a result contrary to the guidelines for charter schools under education reform, because of the negative effect it has on the schools from which students are drawn. As Jackson Van Dyck described, “While Sturgis does consistently have better scores in most areas, it is not because it specifically chooses the best students. It is because a higher number of top students apply there.”

Despite this effect, and based on our data analyzed above, John O’Connor noted, “Sturgis is not the overwhelming superpower it is rumored to be.” Depending on which indicator is used, Sturgis is at or near the top, and it is first in our combined indicator. But this changes when you start to control for the different demographic compositions of the schools. Based on what we consider to be a conservative approach to adjusting the results, Sturgis still ranks highly, but not at the top.

So we do not believe that Sturgis “cherry picks” its students. But we do believe that it admits mostly high-performing students, and that it hurts the surrounding schools from which it draws them - schools that would score higher on our indicators if Sturgis were not drawing top students away. As O’Connor stated, “In the end, that is the way to take Sturgis off its high horse - with data and cold hard facts, not opinions about cherry picking.”

Saturday, March 3, 2012

Civil Liberties Panel 2012

Harwich High School hosted our fifth annual Civil Liberties Panel with Superior Court Judge Robert Rufo, assisted by Assistant District Attorney Edward Lynch and Defense Attorney Robert Hofmann. Mr. Dickson substituted in the role of the patrol officer. We conducted a mock pre-trial hearing involving a car stop and search for the first session, and a police response to an underage party for the second. A special thank you to Judge Rufo for his continuing commitment to community outreach and legal education, and to ADA Lynch and Atty. Hofmann for sharing their time and expertise.

Session 1:


Session 2:

Saturday, February 4, 2012

Prof. Wasby's Lecture on the Federal Courts

Retired Professor Stephen Wasby gave a lecture on the federal court system to the A.P. U.S. Government class at Harwich High School on February 3, 2012. This is the fifth annual guest lecture by Prof. Wasby, who is able to present the students with a level of detail and expertise on the federal court system that they could never get from a textbook. They also get a preview of what a college-level lecture is like.

We would like to thank Prof. Wasby for his generosity in sharing his time and expertise with us.

Part 1:


Part 2: