You may remember how the phrase "states' rights" was used as a pejorative by enthusiasts for Alexander Hamilton's model of a powerful central government to dismiss and overrule those who preferred Thomas Jefferson's vision of a federal relationship, wherein the states governed domestic issues and Washington, D.C., governed national ones.
The states' rights label was used primarily in the arena of educational policy, where states resisting integration-motivated forced bussing were loudly criticized, in states' rights terms, by those demanding it.
The ultimate outcome was less integration, not more.
What followed was urban middle-class flight, both white and black, and eventually the withdrawal of forced bussing and similar initiatives. Remember Kansas City's $2 billion court-ordered spending experiment? For practical purposes, it was a complete educational-policy failure, but that's a secondary point.
The primary point is the interesting background this history provides to the adoption of the same states' rights argument by the same folks who had earlier disparaged it, once a national requirement they despised but couldn't prevent (that public school spending produce measurable results toward producing proficient students) led them to a finesse strategy whose adoption they used their political clout to force. Example: for the testing requirement of No Child Left Behind, they won the right for each state, if it wished, to substitute its own preferred (read: easier) test for the more rigorous national one. All except one.
The State of Vermont chose to purchase, deploy, and publicize the results from NECAP, a test designed to show a 2/3 proficiency rate in the same cohort of students of whom only 1/3 rated proficient on the federal NAEP tests, whose results went unpublicized.
The State of Maryland chose to purchase, deploy, and publicize the results from MSA, the Maryland State Assessment. As in Vermont, the non-NAEP test puts public education in a far better light than the NAEP does. A University of Maryland study in 2007 documented exactly that conclusion.
If you Google "Cross-Grade Comparisons Among Statewide Assessments and NAEP" by Schaefer, Liu, and Wang (2007), you'll find on page 7 a pair of charts showing that, just as in Vermont, the seeming 70 percent reading-and-math-proficient result is actually 30 percent NAEP-defined proficient.
Adding insult to injury, the charts show a downward proficiency line across the grades: the longer students stay in school, the worse they perform. That's what I term punting from proficiency.
Just as in Vermont, educators in Maryland don't want to make it too easy for you to see this 1/3-to-2/3 ratio in test score results. As a result, they simply choose, while publishing the MSA scores, not to publish the comparable NAEP scores.
Maryland's Montgomery County page in the state report is typical: on page 3, after reporting the 70-to-80 percent MSA proficiency results for all grades, it recommends that "for information on the NAEP, go to [the website]".
Unlike Vermont, Maryland makes it slightly easier to pursue the NAEP scores, if you insist, by furnishing a link where you can punch up the actual numbers.
For example, in 2009, Maryland's fourth graders came in at 35 percent proficient in math, with whites at 45 percent and blacks at 20 percent (mostly white Vermont came in at 51 percent).
And just as in Vermont, which purchases the NECAP test from private-sector publisher/vendor Measured Progress Inc, Maryland purchases MSA from private-sector vendor/publisher Pearson Education, whose business ancestry goes back to publisher Scott Foresman.
And you thought public educators were predictably anti-corporate in outlook? Maybe they are, judging by the difficulty of finding a clear link between public ed and its test vendors in the publications of either state. Maryland is actually even more opaque than Vermont on this apparently uncomfortable subject: the Pearson link shows up only deep within the Maryland SED website's page on its MSA test protocol.
Similarly, you may have thought that Maryland's parents, like Vermont's, would be uniformly dismayed and displeased by such cooking of the test books; and similarly, you'd be wrong.
Consider Montgomery County, the Baltimore/Washington suburb: median family income near $100,000, population near 1 million, major employer government, major industries corporate and research, and so on.
The county school system publishes some attractive goals:
Goal 1, Ensure Success for Every Student. Milestone 1 says that "all students will achieve or exceed proficiency". They don't: 2/3 fall short by the NAEP measure, 1/3 by the MSA measure.
The county wants to measure parental satisfaction, so it sends out a parental satisfaction survey asking not about Goal 1 but about Goal 3, Strengthen Productive Partnerships for Education. It gets a 20-to-40 percent response rate to such statements as "My child's teacher keeps me informed" and "the school does a good job of getting important information to parents", both statements drawing 90 percent agreement from the parents who respond.
There are no such statements about test scores, such as "I know what level (basic, proficient, or advanced) my child has achieved in reading or math or science", but there is one about "an atmosphere of open communication at my child's school".
I'd guess that the old-fashioned report card no longer exists in the Montgomery County schools and that parents, who are in careers overwhelmingly connected to just those economic sectors most demanding of high literacy and numeracy levels, don't expect the same for their kids and don't seem to care.
The overall satisfaction-with-public-ed question that Montgomery County chose not to ask was actually asked by the Educational Resource Information Center of the U.S. Education Department: nationwide, parents report a 3.82-out-of-5, or 76 percent, "moderately satisfied" response, and "...only one-third of parents gave their children's schools an excellent rating".
As you might logically expect, satisfaction was highest for parents with higher achieving students. I'd guess that Vermont- or Maryland-specific results wouldn't be much different.
Another measure of parental (dis)satisfaction is the percentage of the 5-17 age cohort in non-public alternatives. Neither the Vermont nor the Maryland school bureaucracy, nor any local district, will furnish that number, but the Council for American Private Education and the National Center for Educational Statistics both put it at 11 percent. That percentage was higher in the past, ironically, when both public and alternative schools were producing better proficiency results.
Now, with test scores stagnant, spending rising, and in many states, enrollment shrinking, most of the taxpaying and childrearing public doesn't seem to care. Maybe that's why "punting" is a politically successful educator strategy.
Former Vermonter Martin Harris lives in Tennessee.