The college list—a measure of a school?
I teach at an independent school, and in many ways we are lucky to be spared the tumult caused by the constant need to “measure” progress through standardized testing, and by the constant evaluation of schools and teachers via the shallow means that dominate discussions of education today.
Still, every school needs some measure of how it is doing, and in the independent school world, the measure chosen is often the college list. At first, this seems like the perfect measure of a school. Many of our schools are “prep” schools, designed to “prep” kids for college; all of our parents expect their children to attend college after graduating from our schools; and many of our parents expect that our schools will provide their child with that vital competitive advantage necessary to secure admission to an “elite” college.
I want to explore the flaws of using the college list as a metric of success for a school, and then ask a few questions about how it tends to be used by schools in their self-promotion efforts.
There are a number of reasons I believe the college list is a flawed metric for an independent school. Start by looking at the mission statement of almost any school. You’ll no doubt find plenty of talk of teaching citizenship, academic engagement, and a love of learning, but I doubt you’ll find any mention of “getting students in” to top colleges. If this isn’t in the mission, then the college list is measuring something entirely different from the mission of the school, and using the list as a metric can actually be detrimental to that mission: the things the mission names go unmeasured, while something completely unrelated, the college list, drives the conversation about mission.
Secondly, I want to dispel the notion that schools get students into colleges. Yes, there was a day, not too long ago, when the admissions director at Harvard would make the trek up to Exeter and ask the principal what fine young boys he had for Harvard this year, and the principal would rattle off his list of boys he thought would grow into Harvard men. But those days are far distant from the current admissions landscape. College counselors have no magic code they can use to secure admission for a student, and it can be very dangerous to let the assumption that they do go unchallenged, since parents and students can come to blame the college office for a denial.
Thirdly, a high school should be cautious of tying its self-worth to something that is beyond its control. Which students elite colleges decide to admit every year is something schools have no influence over, and as colleges become more and more diverse, making efforts to reach out to the furthest corners of the globe to recruit students, it’s only natural that the number of students from any particular independent school who earn admission to these colleges will decline. A comparison of Exeter’s college list from 1930 to today’s would surely make it seem as though today’s students are far less accomplished on the narrow measure of securing admission to Harvard, Yale and Princeton. Does this mean that the quality of an Exeter education has somehow slipped in the intervening decades? Surely not. By almost every measure, I’m sure you’ll find that the Exeter student body of today is as accomplished as its 1930 peers, if not more so.
Finally, and most importantly, the college list isn’t a clean measure of anything. There are far too many factors that go into the college decision of a single student, to say nothing of the decisions of an entire senior class. When you read a college list for a school, you have no idea of the various wild cards like legacy, athletic recruiting, diversity recruiting and other special interests that played into each decision. You also have no idea of the various personal and financial considerations that made a student choose one school over another. If a school wanted to boost its college list, it would be a fairly simple task: admit more 9th graders who show incredible athletic potential, particularly in heavily recruited sports like crew and squash. Or simply admit more legacies from colleges where you wish to increase your admit rate.
All of this is to say that I think the decision of where to go to college is one of the most personal a student will make, and our job as schools and teachers is to support that student with thoughtful counsel, the strictest confidence, and little more. I would extend this further, and say that as faculty, we generally do not need to know about the college plans of every student, how many kids are going to our alma mater, or anything more than what individual students choose to share with us.
I know of schools that approach this differently, and choose to publicize their college lists in many different ways. In fact, you can often see this in alumni magazines of high schools. Consider the following two ways a school might choose to present a college list:
Method A seems clearly designed to allow those who choose to measure a school narrowly to simply scroll down the list, find the college they’re interested in, count up the number of kids attending it, and draw their own (often false) conclusions about the high school. Method B seems much more focused on presenting information about individual children.
But before you even make this decision, I would argue that you should ask why you are choosing to print the list at all. Is it to celebrate the students? If so, is this what they want? Does a student want the only time his name is printed in the magazine to be under the name of what he might think the community sees as “backup U”? Does this approach leave any room for the student who decides to take a gap year?
We struggled with something like this at my former school in our school profile (where it is traditionally commonplace to list the colleges where students from our school went). In addition to the number of matriculations at each college over the past four years, we also listed the number of acceptances. Inevitably, this meant that some college would end up with a line that looked something like 7/1, indicating that only 1 of 7 students accepted from our school chose to matriculate there. As you might imagine, this can send a clear message to a college that our students view it as a safety, even when that may not be the case and the data point is just the result of a couple of anomalous years. So we later switched to listing only the matriculations, which led to less confusion for colleges trying to divine meaning from tiny fluctuations in yield.
All this gets back to my original thought about deepening the conversation about quality in education. How do you measure whether a school is achieving its mission? What do you do with the information you find? And how do you communicate the unique strengths of a school to the wider community?