NMSI Blog

How To Properly Measure AP Scores

Last week the College Board released its AP Report to the Nation, and the report showed that AP is thriving. This is great news for the country: studies show that students who do well on AP Exams do better in college and have a higher chance of graduating. Intuitively, it is easy to see why. Look at the released AP Exam questions for your favorite subject at http://apcentral.collegeboard.com/apc/public/courses/teachers_corner/index.html, and you'll see firsthand the rigorous material that students who succeed on these exams have mastered. The numbers also back up these studies. About 50,000 more college graduates earned BS degrees in 2012 than ten years earlier in 2002, an increase that correlates with the rise in the number of students passing AP Exams over the same period. I recognize this is a correlation and not causation; however, when controlling for appropriate variables, one finds that passing an AP Exam leads to higher graduation rates. And higher graduation rates lead to higher earnings, a higher tax base, less unemployment, and, most importantly, a stronger America.

A more specific example is the relationship between AP STEM subjects and STEM degrees. I recently spoke to leaders at the National Defense Industrial Association who asked why the National Math and Science Initiative was focused on passing AP scores as a means to produce more STEM professionals. I answered that if they polled their recent hires in engineering, they would find that the one constant among those individuals in high school, assuming their high schools offered AP STEM courses, is that all of them passed an AP STEM Exam. Passing an AP STEM Exam doesn't mean one will become an engineer, but for anyone who wants to become an engineer, passing an AP STEM Exam is the starting point.

My favorite part of the College Board's AP Report to the Nation is how it ranks the states: by the percentage of graduating seniors who passed at least one AP Exam. This ranking makes far more sense than the percentage of exams taken that are passed, which is what many teachers, administrators, and the media too often focus on. The best way to illustrate the point is to compare Maryland to New Jersey. For public schools, New Jersey ranks first in the country in the percentage of AP Exams passed, at 73.4%, much higher than Maryland's 61.2%, which ranks 21st. But these numbers are virtually meaningless. The fact is that students in Maryland are 40% more likely to pass an AP Exam by the time they graduate than students in New Jersey: last year, 29.6% of Maryland's seniors passed an AP Exam, while only 21.2% of New Jersey's seniors achieved the same. By that measure, Maryland is first in the country and New Jersey is 13th. The percentage the College Board uses is much more relevant because it reflects how many of a state's students are actually being served. Maryland could easily raise its percentage of exams passed versus taken by restricting the number of test takers, which is in effect the practice of New Jersey schools, at least compared to those in Maryland. It should be noted that Commissioner Cerf in New Jersey agrees that more students in New Jersey should be exposed to AP courses and is working diligently toward that end. If the College Board ranked states on the percentage of exams passed versus taken, it would deny tens of thousands of worthy students the opportunity to take rigorous AP courses, because schools and states would be more likely to funnel only their elite students into AP so as to post a high passing rate.
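For readers who want to check the arithmetic behind the "40% more likely" claim, here is a quick sketch. The percentages are the figures cited above; the variable names are mine:

```python
# Percent of graduating seniors who passed at least one AP Exam
# (figures cited in the article for last year's public-school seniors).
maryland = 29.6
new_jersey = 21.2

# How much more likely a Maryland senior is to have passed an AP Exam
# than a New Jersey senior, as a relative difference.
relative = maryland / new_jersey - 1

print(f"Maryland seniors are {relative:.0%} more likely to pass an AP Exam")
```

This prints a relative difference of about 40%, which is where the article's claim comes from: the comparison is between the two population-level rates, not the exams-passed-versus-taken rates.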

It's somewhat counterintuitive, but the percentage of exams passed out of exams taken has almost no relevance at all. What does have relevance is the number of students passing out of the total school population. In other words, 50% can be better than 100%. For example, assume two schools have the same number of students. School A has 20 students take AP Physics, and all 20 pass (100%). School B has 100 students take AP Physics, and 50 pass (50%). Too often teachers, administrators, and the media laud School A for its 100% pass rate and do not think as highly of School B with its 50% pass rate. But clearly, School B did better. It had more students master physics, more students given the opportunity for a STEM career, and a greater percentage of its student body master physics than School A: 2.5 times as many. And this doesn't even take into account the other 50 students who were exposed to rigorous physics material and will now know what to expect in college physics. Part of the problem is that many consider those who do not pass to have failed. That may be true in a literal sense, but exposing more kids to rigorous material, even if they don't pass a test, isn't necessarily a bad thing and can be very educationally sound. Those who understand that School B is better than School A, and that 50% is better than 100% in this case, need to speak loudly, so that the administrators, school boards, and media who don't understand can get on the same page. If all schools, like School B, focused on getting more of their students to do well on AP Exams, instead of a percentage of the students who take the class, our country's future would look even brighter.
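The School A versus School B comparison above can be sketched in a few lines of code. The exam counts are the article's; the enrollment figure of 1,000 students per school is an illustrative assumption (the article says only that the schools are the same size):

```python
# Assumption: both schools have the same enrollment; 1,000 students
# each is an illustrative figure, not from the article.
ENROLLMENT = 1_000

schools = {
    "School A": {"took": 20, "passed": 20},
    "School B": {"took": 100, "passed": 50},
}

for name, s in schools.items():
    exam_pass_rate = s["passed"] / s["took"]    # the commonly cited metric
    population_rate = s["passed"] / ENROLLMENT  # the metric the article favors
    print(f"{name}: {exam_pass_rate:.0%} of exams passed, "
          f"{population_rate:.1%} of all students passed")
```

Measured against the whole student body, School B's 50 passers are 2.5 times School A's 20, even though School A's exam pass rate looks perfect. The ranking flips depending on which denominator you choose, which is the article's point.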