
Thread: Rankings and oddities

  1. #1
    Rankings and oddities

    I've pulled together a bit of data for anyone who would like to view the Boston Magazine data, the Newsweek rankings and the US News and World Report rankings. They are all in a spreadsheet, so you can sort and filter to your heart's content.

    I am left wondering how Weymouth scored as highly as it did in the Boston Magazine rankings, and just how many US News and World Report bronzes were awarded. Milton and Framingham stood out as lower performers that nevertheless made Newsweek's rankings.

    I should note that I added a column averaging the cost-efficiency and academic performance rankings - it's pretty meaningless, but I wanted to sort by it out of curiosity.
    Attached Files
    Last edited by Kim Reichelt; 08-26-2008 at 02:48 PM. Reason: addition of final paragraph

  2. #2
    Gaming the rankings...

    This article in today's Boston Globe talks about how institutions can "game" the rankings.

    The source material for this article comes from here, a piece in Inside Higher Ed (which is a more useful read than the Globe piece).

    I've wondered how much of this might go on in some of the HS rankings, particularly the Newsweek one, which relies solely on AP participation.
    Last edited by Kim Reichelt; 06-04-2009 at 09:00 AM. Reason: to include the Inside Higher Ed article

  3. #3
    Gaming the rankings, even if not intentionally

    Newsweek's rankings, which seem to keep being mentioned in Town Crier letters to the editor and discussion board posts, have a serious flaw.

    Let's set aside for a moment that they focus on a single metric (AP exams taken), and that even for that single metric they ignore whether the corresponding AP courses are actually taken or whether the exams are passed. There is another flaw that, at least for some subset of schools, renders the measure even more meaningless.

    Here's how Newsweek describes what they do:

    We take the total number of Advanced Placement, International Baccalaureate or Cambridge tests given at a school in May, and divide by the number of seniors graduating in May or June.

    If a school has a stable population, such a metric (again, setting aside its value) is at least internally consistent. But if a school does not, the metric breaks down. The calculation gives a tremendous advantage to schools whose populations dwindle over the four years (as often happens at many charters). Since the metric is AP exams divided by graduating seniors, younger students who take an exam (and so are counted in the numerator) but never graduate from the school (and so are not counted in the denominator) can dramatically inflate the result, as sketched below.

    For example, in Massachusetts, three charters made the top 100.
    • Sturgis Charter Public (64 graduating v. 108 freshmen)
    • MATCH Charter (34 graduating v. 94 freshmen)
    • Mystic Valley Regional Charter (41 graduating v. 97 freshmen)


    It does not appear that such population dwindling (which should be a huge negative in evaluating a school, not a positive) is accounted for in the metric.
