
The college list—a measure of a school?

January 12, 2011

I teach at an independent school, and in many ways we are lucky to be spared the tumult caused by the constant need to “measure” progress through standardized testing, and the shallow evaluation of schools and teachers that seems so dominant in discussions of education today.

Still, every school needs some measure of how they are doing, and in the independent school world, the measure chosen is often the college list. At first, this seems like the perfect measure of a school. Many of our schools are “prep” schools, designed to “prep” kids for college, all of our parents expect their children to attend college after graduating from our schools, and many of our parents expect that our schools will provide their child with that vital competitive advantage necessary to secure admission to an “elite” college.

I want to explore the flaws of using the college list as a metric of success for a school, and then ask a few questions about how it tends to be used by schools in their self-promotion efforts.

There are a number of reasons I believe the college list is a flawed metric of an independent school. You can start by looking at the mission statement of almost any school. You’ll no doubt find plenty of talk of teaching citizenship, academic engagement, and a love of learning, but I doubt you’ll find any mention of “getting students in” to top colleges. If this isn’t in the mission, then the college list is measuring something entirely different from the mission of the school, and using the list as a metric can actually be detrimental to that mission, since the things in the mission often go unmeasured while something completely unrelated, like the college list, drives the conversation about mission.

Secondly, I want to dispel the notion that schools get students into colleges. Yes, there was a day, not too long ago, when the admissions director at Harvard would make the trek up to Exeter, ask the principal what fine young boys he had for Harvard this year, and the principal would get to rattle off his list of boys he thought would grow into Harvard men. But those days are far distant from the current admissions landscape. College counselors have no magic code they can use to secure admission for a student, and it can be very dangerous to let the assumption that they do go unchallenged, as parents and students can come to blame the college office for a denial.

Thirdly, a high school should be cautious of tying its self-worth to something that is beyond its control. Which students elite colleges decide to admit every year is something schools have no influence over, and as colleges become more and more diverse, making efforts to reach out to the furthest corners of the globe to recruit students, it’s only natural that the number of students from any particular independent school who will earn admission to these schools will decline. A comparison of Exeter’s college list from 1930 to today’s would surely make it seem as though today’s students are far less accomplished on the narrow measure of securing admission to Harvard, Yale, and Princeton. Does this mean that the quality of an Exeter education has somehow slipped in the past 80 years? Surely not. By almost every measure, I’m sure you’ll find that the Exeter student body of today is as accomplished, if not more so, than their 1930 peers.

Finally, and most importantly, the college list isn’t a clean measure of anything. There are far too many factors that go into the college decision of a single student, to say nothing of the decisions of the entire senior class. When you read a college list for a school, you have no idea of the various wild cards like legacy, athletic recruiting, diversity recruiting, and other special interests that played into each decision. You also have no idea of the various personal and financial considerations that made a student choose one school over another. If a school wanted to boost its college list, it would be a fairly simple task—admit more 9th graders who show incredible athletic potential, particularly in heavily recruited sports like crew and squash. Or simply admit more legacies from colleges where you wish to increase your admit rate.

All of this is to say that I consider the decision of where a student chooses to go to college to be one of the most personal he or she will make, and our job as schools and teachers is to support that student with thoughtful counsel, the strictest confidence, and little more. I would extend this further, and say that as faculty, we generally do not need to know about the college plans of every student, how many kids are going to our alma mater, or anything more than what individual students choose to share with us.

I know of schools that approach this differently, and choose to publicize their college lists in many different ways. In fact, you can often see this in alumni magazines of high schools. Consider the following two ways a school might choose to present a college list:

Two ways of presenting the college list.

Method A seems clearly designed to allow those who choose to measure a school narrowly to simply scroll down the list, find the college they’re interested in, count up the number of kids attending it, and draw their own (often false) conclusions about the high school. Method B seems much more focused on presenting information about individual children.

But before you even make this decision, I would argue that you should ask why, in the first place, you are choosing to print the list at all. Is it to celebrate the students? If so, is this what they want? Does a student want the only time his name is printed in the magazine to be under the name of what he might think the community sees as “Backup U”? Does this approach leave any room for the student who decides to take a gap year?

We struggled with something like this at my former school in our school profile (where it is traditionally commonplace to list the colleges where students from our school went). In addition to the number of matriculations in the past four years at each college, we also listed the number of acceptances. Inevitably, this would mean that some college would end up with a line that looked something like 7/1, indicating only 1 of 7 acceptances from our school chose to matriculate there. As you might imagine, this can send a clear message to a college that our students view it as a safety, even when that may not be the case and the data point is just the result of a couple of anomalous years. So we later switched to listing only the matriculations, which led to less confusion for colleges trying to divine meaning from tiny fluctuations in yield.

All this gets back to my original thought about deepening the conversation about quality in education. How do you measure whether a school is achieving its mission? What do you do with the information you find? And how do you communicate the unique strengths of a school to the wider community?

12 Comments
  1. January 12, 2011 8:24 am

    I like your thought on the mission statement. If it is not in there, why are we measuring it? People will argue that “mentioning” college admissions/acceptance is not measuring, but in a way it is. You are comparing yourself to something. I like Presentation B of the college list because it seems to put more focus on the student. I wonder about listing all of the colleges to which a student has been accepted. Would only being accepted by one seem inferior? And as you mentioned, what about a gap year? What about those for whom college is not the right fit, so they do not apply to any? We really need to take a good look at what do we want to

    • January 12, 2011 10:54 am

      I think you’re right. Too often, I think we don’t address these issues directly. We talk glowingly about our mission, but don’t mention how we actually measure to see if we’re living the mission. Or we try to avoid talking about the college process altogether, and yet leave the college list in our publications to let parents draw their own conclusions. My thought is we have to address both. We have to explain what the college process means to us, and how we judge its success, if not by the college list. And this needs to happen almost the very moment you enroll a kindergartner, since the 5% parents out there already have their 5 year old in their alma mater’s sweatshirt.

      Also, a point I forgot to make is that if we let the college list drive the process, this might create some perverse incentives for the college counseling staff to steer students away from schools that might be the best fit for them. Counselors might feel incentives to steer kids to take on more loans for a more “prestigious” school, encourage kids to apply to schools for the sake of racking up admits, or discourage gap years. We can also lead students to think that they are letting us as a school down if they don’t score admission to a prestigious school, which can have bad repercussions on a student’s self-esteem and undermine their satisfaction at being admitted to a college not deemed “elite.”

  2. January 12, 2011 9:41 am


    Measuring the success of any independent school’s mission is complicated. All kinds of data could go into the equation. For example:

    *combined SAT scores
    *profile of AP scores
    *historical trends of SAT and AP scores
    *% of junior class that are National Merit finalists
    *qualitative survey data of students’ satisfaction with the school
    *athletic achievements
    *artistic accomplishments
    *# of debate trophies
    *# of robotics championships
    *# of teachers who have published
    *# of PhDs on the faculty

    You get the picture. The list is almost endless.

    What a school chooses to focus on or promote seems to depend on what the school culture values. In my opinion, most of these variables really don’t tell you much about the quality of the experience for students. The quality of the experience lies in the relationships between teachers and students. At the heart of what makes a good school, and what allows a school to fulfill its mission, is how it defines and evaluates good teaching. Good teachers are the heart and soul of a school. That is how I would measure excellence. What are the teachers like, what is their attitude towards growth and improvement, and how do they relate to students? I know that is not what your post focuses on, but I would say the college list is totally irrelevant to me, as an educator and parent of a junior. I want to know if my daughter is cared for, nurtured, appreciated for her talents, encouraged to take risks, encouraged to question, encouraged to think deeply about ideas, and encouraged to connect beyond herself. If a school does this, then my child is well-served regardless of where she goes to college.

    However, it is very hard as a parent to stay focused on this when all the noise is about the COLLEGE LIST. This mostly comes from anxious parents. I think it is a school’s responsibility to fight back and help parents see the big picture. There are hundreds of excellent colleges.

    Anyway, thoughtful post and good questions to ponder.

    Bob Ryshke

    • January 12, 2011 11:07 am

      I think you’re absolutely right that it is the school’s responsibility to educate the parent body about the more complete measures of success it uses to assess itself, and the flaws in using any of the narrow metrics you’ve described, especially the college list. I think that requires a long-term, very thoughtful approach to parent education that I don’t see many schools taking on. It’s easy to see why, when you’re up against the WSJ ranking schools by the percentage of kids they send to Harvard, Yale, and Princeton.

      Also, your list seems composed of things that, while easy to measure, don’t really get at most schools’ mission statements. I think we need to create some easily measurable metrics that do assess whether schools are meeting their mission statements. One thought I had after reading Bo’s comment on mentoring is asking our students if they feel they have a mentor, and what that means to them. I’m going to try to blog on this soon. It turns out that this is one of the primary metrics Harvard uses to measure undergraduate happiness and satisfaction.

    • Agnesm permalink
      January 14, 2011 8:14 pm

      I think it is easier to use numbers than to figure out a way to assess the intangibles. Many people feel safe looking at numbers; they believe that numbers “tell it like it is.” We are so data driven! The intangibles? The mission? The values? No, we are not doing a good job emphasizing them in relation to transcripts, SATs, and APs.

      • January 14, 2011 8:18 pm

        This is a great point. We do sometimes have an inherent bias to think that numbers are “unbiased,” but we as a society cling to stories. I think we need to find a way to tell the stories of our schools and students. I know this is wishy-washy and vague, but I do have in my ever-growing pile o’ draft posts a plan to write about some specific metrics that both tell stories and can be quantified. First on my list will be mentoring.

  3. January 13, 2011 3:45 pm

    Hi John:

    This is a very important topic, and one I also think about often. I love what Pat Bassett has written about “Demonstrations of Learning,” and how we could better evaluate our schools by declaring loudly to the world what our graduates will be able to do upon graduation, holding them to that standard, and reporting our success rate at these high measures.

    I often think that many of our missions include some of the following things: providing an education to students that is in their eyes engaging, meaningful, and rewarding; providing a school setting where students feel mentored and supported; teaching such that all of our students, across the board, gain a better-than-average grounding in their mastery of skills; and ensuring students acquire the very valuable higher-order critical thinking skills such as complex problem-solving, document interpretation, and effective written communication.

    Fortunately, we now have a trio of tools available that measure exactly these things, and that we can use (as I do at my school) to loudly communicate how well we are doing, to try to offset a narrower review based upon the college list and SAT scores/NMSQT qualifications. The High School Survey of Student Engagement, the NWEA Measures of Academic Progress, and the College and Work Readiness Assessment provide these data.

    If you are interested further, you can see this fine monograph at NAIS on these three tools:

    I have also written about these three tools often on my blog at; this post is one of the overviews I have done:

    Keep up the cause! This is so important!

    • January 13, 2011 4:01 pm

      Jonathan and John:

      I think using these newer tools to measure progress, achievement, and college readiness is a great idea. I will read your blog post. John and I have discussed a few of these tools. Unfortunately, Westminster is not using any of them currently. From my experience at other independent schools (Trinity School (NYC), Phillips Academy, Marlborough School (LA), Moses Brown School (RI), North Shore Country Day (IL), etc.), it would take a mindset shift to move the community to using these tools instead of the college list, SAT, NMSQT, etc. However, we should talk about the advantages of moving in this direction.


    • January 13, 2011 4:12 pm

      Thanks so much for this comment! I agree with all your thoughts and know the CWRA very well. I was in on the ground floor of launching it at St. Andrew’s in Delaware. As you’re no doubt aware, it’s a truly unusual standardized test—one that students actually enjoy taking and that makes them really think. I’ve been advocating its use at my school, and hope that it continues to grow in popularity. Connecting the CWRA with the CLA in the Classroom initiative seems like a powerful model of instruction, as it empowers faculty to design their own performance tasks for use in their classrooms.

      I also loved the videos you guys put out describing the CWRA at St. Gregory’s.

  4. January 14, 2011 2:09 pm

    I agree with your main point about college lists being misleading, but claiming that the mission statement has anything at all to do with what a school (or any other institution) is about misses the point. Mission statements are purely administrative BS that are intended to suck up people’s time and attention while producing no useful product. They are a diversionary tactic to keep people from noticing how much administrators are getting paid. Occasionally they are used as PR fluff to try to convince parents (or other customers) that something wonderful is happening. I have yet to see a mission statement that actually guided any decision making anywhere. As a parent choosing schools for my son, I did look at the college list on the profiles of schools. It does concern me that the high school my son is attending sends only about 1/4 of its graduates to 4-year colleges. (Which 4-year colleges does not matter to me so much, but the paucity of college-bound kids does put some real limits on how high the curriculum can go.) There is a better school in town (which he attended for middle school), but I can’t really afford it and it also had somewhat limited opportunities in science and

    • January 14, 2011 4:53 pm

      I should clarify my point a bit. All of the schools I’ve taught at are private secondary schools where 99%+ of kids went to four-year colleges. The college list I’m talking about is the list of which four-year schools students went to, which is often taken as a metric of quality when comparing one private school to another. It is a totally different matter at most schools, where not every student goes to college. The fact that most students are prepared for college and earn admission to a four-year school does, as you say, tell you something about the general academic strength of the student body and the curriculum (but I would also say it tells you something about the wealth of the community and the education level of parents in that community as well). The gap between a student prepared to go to a four-year college and one who just graduates high school would seem to be a lot bigger than the gap between one who matriculates at a state school and one who attends an Ivy, in my opinion, so trying to read the tea leaves of the private school college list, and draw some meaning from this school sending 5 kids to Brown versus that school sending only 3, really is not all that useful.

      And while I think you are right that some mission statements are written as fluff for view books, I don’t think it has to be that way, and I think a lot of improvement could come to schools if parents, students, and educators tried to hold their institutions to these words.

      • January 15, 2011 1:36 pm

        I agree that distinguishing between Harvard, Stanford, UC Berkeley, UC Santa Barbara, … tells essentially nothing about the school. Seeing a diversity of schools rather than all students going to one school does tell parents something about the advising by the counselors. If almost everyone is going to just one or two default schools (no matter what the “rank” of the school), that is a sign of inadequate counseling. Seeing everyone going to lower-tier schools is also a sign that the students lack ambition (or are being dissuaded from aiming high). Of course, college lists for private schools in California are a bit different from ones on the east coast, because the public University of California is as prestigious as private universities (and now costs as much for out-of-state students), so there is not the same pressure to get into a private university. My son was in a private middle-school/high-school for middle school, and so I saw the mission statements, college lists, and all the other PR junk. There were a few things in all that material which were informative: the class sizes at the school were genuinely small (too small, in fact, driving the tuition up too much—we might have stayed at the school if the classes had been 50% larger and the tuition 25% lower), the building was very nice with good art studios and computer labs (though lacking a library or PE facilities), the art and music training was exceptional (but the science training was second-rate), … The college lists were uninformative—essentially anyone who could afford the private school was going to college. So, basically, I agree with you. The college list provides limited information for deciding between schools that have over 90% of their graduates going to college. I’m interested in what measures you can come up with that would be more useful.
