Chicago Magazine’s Top Schools 2016

August 28, 2016 at 11:02 am 73 comments

Chicago Mag

Wanted to share the lists from a new Chicago magazine piece that ranks CPS elem and high schools (there are also suburban rankings, for those interested).

The rankings take into account a mix of test scores, attendance, and 5-Essentials ratings (a very helpful resource for understanding how parents and teachers rate their school).

As we know, CPS’s selective schools typically top “ranking” lists due to taking the top-scoring students in the city.  This elementary list is a bit different because it uses MAP test growth rather than attainment.  This explains why some non-selective schools popped up on this list while Decatur and Edison are not on the list.  These schools may have also lost some points on the 5 Essentials rating (I cannot fully tell how Chicago Mag pulls the 5 Essentials number for its rating.)

In any case, as with any ranking list, it’s not about the fine details of who is better than whom, but rather about helping parents look at options they may not have considered.  The lists may shine a light on some schools that are doing good things but aren’t necessarily “on the radar” for everyone.

Here’s the link with full info:

ELEMENTARY SCHOOLS (does not include charters)

  1. Skinner West (S) – Near West Side
  2. Wildwood IB World Magnet School – Forest Glen
  3. Blaine – Lake View
  4. Dixon – Chatham
  5. Hawthorne – Lake View
  6. Skinner North (S) – Near North Side
  7. McDade Classical (S) – Chatham
  8. Healy – Bridgeport
  9. Powell Jr. Paideia Com Acad – South Shore
  10. Whistler – West Pullman
  11. Edgebrook – Forest Glen
  12. Sherwood – Englewood
  13. Mitchell – West Town
  14. Prescott – Lincoln Park
  15. Coonley – North Center

HIGH SCHOOLS (non-charter)  S = Selective (I’ve added the CPS rating after each school)

  1. Payton (S) – Near North (1+)
  2. North Side College Prep (S) – North Park (1+)
  3. Whitney Young (S) – Near West Side (1+)
  4. Jones (S) – Loop (1+)
  5. Lane Tech (S) – North Center (1+)
  6. Brooks (S) – Roseland (1+)
  7. Von Steuben (S) – North Park (1+)
  8. Lindblom (S) – West Englewood
  9. Phoenix Military Acad (S) – Near West Side (1+)
  10. Kenwood – Kenwood (1+)
  11. Lincoln Park – Lincoln Park (not noted as selective in Chgo Mag, but probably has selective programs) (1+)
  12. Senn – Edgewater (1)
  13. Chicago High School For Agricultural Sciences (S) – Mt. Greenwood (1+)
  14. World Language HS – West Lawndale (1)
  15. Westinghouse (S) – East Garfield Park (1+)
  16. Amundsen – Lincoln Square (1)
  17. Alcott – Lake View (1+)
  18. Prosser Career Acad (S) – Belmont Cragin (1)
  19. Rickover Naval Acad (S) – Edgewater (1+)
  20. Solorio – Gage Park (2+)
  21. Mather – West Ridge (2+)


CHARTER HIGH SCHOOLS

  1. Noble Butler – Pullman (2+)
  2. Chicago Math and Science Acad – Rogers Park (1+)
  3. Noble UIC – West Side (1+)
  4. CICS Northtown – North Park (1+)
  5. Noble Baker – South Chicago (2+)

73 Comments

  • 1. robin in WRP  |  August 28, 2016 at 11:07 am

    Great to see rising neighborhood high schools, like Mather and Amundsen

  • 2. chickpeatravels  |  August 28, 2016 at 11:34 am

    Which do you think is a better school Nta or InterAmerican?

  • 3. Cynthia  |  August 28, 2016 at 11:39 am

    Lakeview didn’t make it. Surprised, because of the media attention. Not surprised at the same time.

  • 4. Jason  |  August 28, 2016 at 12:07 pm

    The high school rankings are completely inaccurate, and laughably so. Where is Lindblom? CPO you should add the 2016 US News and/or Newsweek high school rankings to the OP as those are more reputable and vetted. Don’t know much about this magazine, but when they have to keep adding schools they “missed” to their rankings, I think that calls into question their methods and integrity.

  • 5. South side obsessed  |  August 28, 2016 at 12:47 pm

    Interesting that Lindblom is not on there. Neither is either one of the LaSalles, especially since LaSalle I was named a Blue Ribbon school last year. Super surprised to see Powell on there. They have come a longggggg way. Blaine made it despite their political problems last year. A testament to the former principal, I would say.

  • 6. cpsobsessed  |  August 28, 2016 at 12:48 pm

    Well, “inaccurate” implies that data is wrong. Chicago Mag shows (most of) their input which is all numbers based. So @Jason, when you say “inaccurate”, what are you referring to? What is the “true” accurate ranking of CPS schools? There are many ways one can rank them. You seem to have an idea of what the true ranking should be… Lindblom may be similar to how Decatur or Edison isn’t included (perhaps high attainment, low growth for some reason?) It is a level 1+ school with high 5 Essentials scores, ACT avg 23. The “growth” figure is just average (again, something possible in very good schools with high scoring kids.)

    Does USNews report just within Chicago? If so, can you share a link?

  • 7. Vicki  |  August 28, 2016 at 2:51 pm

    Regarding high schools – the ACT growth is interesting. The kids at NSCP are going in with the highest scores in the city and showing growth in the lower half of this group? Am I interpreting this correctly?

  • 8. Marketing Mom  |  August 28, 2016 at 5:31 pm

    This is very interesting and can help folks look beyond the usual handful of high schools they only consider.

  • 9. cpsobsessed  |  August 28, 2016 at 7:46 pm

    @Vicki – you’re referring to the 58% growth # for NSCP? My understanding is that schools with a lot of high scoring kids may have a difficult time reaching very high growth (e.g. with Edison and Decatur not showing on the elem list.)

    I’d have to read a little more about how HS growth #s are calculated to understand that measure better.

  • 10. Vicki  |  August 28, 2016 at 8:38 pm

    @cpsobsessed…yes. However, CPS website shows Edison to have great growth, same with Decatur (with the exception of 6th grade math). The NSCP data shows average growth. My understanding of growth is “what are they doing with the kids once they get them?” Payton is taking those kids and pushing them further. Is NSCP just keeping them at the same level? Not sure how many 32+ they had on ACT. It would be interesting to look at the real data, not just averages.

  • 11. cpsobsessed  |  August 28, 2016 at 10:08 pm

    Here’s what it says ACT growth is based on. I’m unclear if this is measuring the same kids year over year? Kids don’t take the ACT twice though, so I’m still unclear on how that’s calculated.

    Student Growth measures the change in ACT EPAS®† standardized test scores between two points in time, in this case between Spring 2014 and Spring 2015. This growth is compared to the average national growth for schools that had similar scores in 2014. A 50th percentile score means the school grew at the same rate as the national average. The charts below show the Student Growth by grade level.
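The quoted definition can be illustrated with a toy sketch: compute a school’s score gain, then rank it against the gains of schools that had similar baseline scores. All numbers below are made up for illustration; this is just the percentile idea as described in the quote, not ACT’s actual methodology.

```python
# Toy sketch of the quoted ACT growth measure (all numbers made up):
# a school's score gain is ranked against the gains of schools that
# had similar baseline scores, yielding a percentile where 50 means
# the school grew at the national average rate.

def growth_percentile(school, peers):
    """Percent of similar-baseline peers whose gain this school beat (ties count half)."""
    gain = school["y2015"] - school["y2014"]
    peer_gains = [p["y2015"] - p["y2014"] for p in peers]
    beaten = sum(1 for g in peer_gains if g < gain)
    tied = sum(1 for g in peer_gains if g == gain)
    return 100.0 * (beaten + 0.5 * tied) / len(peer_gains)

# Hypothetical peer group with similar 2014 averages (~20 on the EPAS scale)
peers = [
    {"y2014": 20.1, "y2015": 21.0},
    {"y2014": 19.8, "y2015": 20.9},
    {"y2014": 20.3, "y2015": 21.5},
    {"y2014": 20.0, "y2015": 20.6},
]
school = {"y2014": 20.2, "y2015": 21.2}
print(round(growth_percentile(school, peers), 1))
```

In this made-up peer group the school’s gain ranks in the middle, so it lands at the 50th percentile, matching the quote’s “grew at the same rate as the national average.”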

  • 12. Eagles Mom  |  August 28, 2016 at 10:45 pm

    OMG? Where is Lindblom?

  • 13. Jen K  |  August 29, 2016 at 8:29 am

    I’ve always hoped they would do a non-SEES list and this comes close. Most families will not get into one of those schools so it would be helpful to see which neighborhood/magnet/charters are performing well. I would also like to see a longer list – maybe top 30 elementary? So happy to see some of these unexpected schools get recognized for the work they are doing.

  • 14. Chris  |  August 29, 2016 at 10:56 am

    ” I would also like to see a longer list – maybe top 30 elementary?”

    It is pretty silly to do 20 HS, and only 15 ES–that means 1 in 6 (non-Charter) HS’s is on the list, but only about 1 in 27 ES.

  • 15. cpsobsessed  |  August 29, 2016 at 12:11 pm

    Agreed. I wonder if we can request the full ranking? I understand they can’t print it in the mag but perhaps online…..

  • 16. tacocat  |  August 29, 2016 at 2:47 pm

    Not sure how ACT ‘growth’ is indicative of a good school. If the school has outstanding scores two years in a row, I would call that good. But this survey would seem to penalize for lack of growth.

  • 17. Blaine  |  August 29, 2016 at 3:26 pm

    Anyone able to shed light on how Blaine makes the list with zero data in the 5Essentials…and is ranked higher than schools who have higher growth? Props to Blaine for what they do, just not clear on how they are so highly ranked in a list that touts using non-traditional data that is unavailable for that school.

  • 18. cpsobsessed  |  August 29, 2016 at 4:54 pm

    @Tacocat – I don’t *think* ACT growth looks at say ACT 2016 vs ACT 2015 (as it would be impossible to grow year after year.) But I can’t figure out what the measure is.
    I do believe that growth scores on highly selective schools with all very high scoring kids can look distorted.
    I’ll see what I can find on that growth measure, just out of curiosity.

  • 19. Chicago School GPS  |  August 29, 2016 at 5:18 pm

    The debate about ranking metrics aside, it is great to see quite a few “hidden gems” and improving schools on the list. Hopefully this exposure will spur families to take a look at those schools, many of which have high percentages of low-income students but are making great strides.

  • 20. donna  |  August 29, 2016 at 6:39 pm

    I think growth is most important. I believe the metric used is EPAS (now defunct) and the growth would measure EXPLORE to PLAN to ACT. Same kids using the same series of assessments. As someone posted earlier – what are they doing with the kids that they get? It’s great that Edison gets gifted children…but they also need to show that the children are growing. I am actually shocked by the meager growth at North Side. I am impressed with Walter Payton. For some reason, I don’t find a student coming in with high PLAN scores and showing only average growth impressive. On the flip side – show me any school that takes kids and makes them grow!!!

    As a teacher, I want to see growth. I can tell you it is very easy to take already smart, hard-working kids and get them to comply with what you give them, resulting in average growth. It would be interesting to see all of the ACT scores coming out of those schools. Sometimes the average is not the best indicator.

  • 21. cpsobsessed  |  August 29, 2016 at 8:17 pm

    idk.. we know Payton and NSCP have similar entry scores.
    ACT scores last year were:
    Payton 29.6
    NSCP 28.7
    There may be some weird adjustments/scales that make NSCP’s growth look low, but obviously both are turning out very strong average scores.

  • 22. Newcomer  |  August 29, 2016 at 10:57 pm

    OR….. don’t choose a high school based on “ACT growth.” There are many ways to improve one’s ACT score, and so many kids are using those methods (online training, tutors, books, self-study, bootcamp) that it would be a reckless gamble if your kid just walked in and took the test once. Do your ACT prep on the sidelines but certainly don’t depend on your high school for that. High school is for immersing in great literature, learning other languages, doing advanced math, singing in a musical, wheel-throwing pottery, publishing a newspaper, running cross country, writing poetry, debating smart opponents, joining offbeat clubs, playing lacrosse, being an integral ensemble member in an orchestra/ jazz band / choir, perfecting time management, figuring out what your major might be, figuring out who you are, painting murals, studying with your friends, arguing with them, exposing yourself to new ideas, growing into an adult, learning how to be a mentor and a mentee, and supporting your peers while they do all of that…NONE of which will be tested on the ACT.

  • 23. Don Justice  |  August 30, 2016 at 7:37 am

    The metrics are misleading. Absolute ACT score is useful. Growth is useful, with the caveat that prepping kids to do well on the ACT will improve skills, but the test is not measuring creativity or critical thinking. The 5 Essentials is easily manipulated by telling staff and students that low scores hurt you, so only put down perfect marks. We know from past scandals that attendance rate is manipulated. Have a student not showing up? Mark that they are now homeschooled or moved to Mexico. Graduation rate is a joke, as principals push for everybody to pass, regardless of whether or not the student deserves it. Missing a credit? Independent study. Find a scared teacher, or bully one, to just give some worksheets and a semester credit. Or complete virtual high school online in a week with the counselor doing most of the work. Freshmen-on-track is manipulated through bullying, threats of low ratings for teachers who fail anyone, or attaching so much extra remediation work to failing a student that no teacher will fail anyone.

  • 24. I wondered, too...  |  August 30, 2016 at 8:32 am

    @17. I wondered the same thing! How can a school that does not have the 5 Essentials factored in make the “top” list? By rights, that should drop a “perfect” score on this rubric to a mere 75% (the Essentials are allegedly 25% of the Chicago Mag score).

    Maybe someone at Chicago Magazine just wants to give CPS the finger over its decision to fire the Blaine principal? LOL.

  • 25. cpsobsessed  |  August 30, 2016 at 10:09 am

    @I wondered… hahaha, maybe??
    My guess is either:
    1. Even though CPS doesn’t report data on the site because of a low sample size, the data may exist in a bigger file that Chicago Mag used (I can’t recall where they said they got their data.)
    2. For schools without one measure, they just treated it as missing and used the other measures. (but to your point, would be odd to give that school a high-ranking status.)

    In either case, I think Blaine has always been a strong performer on scores and has a strong parent base. Not sure if they’ve chosen a new principal yet, but given the neighborhood demos it’ll likely continue to be a solid elementary choice. (If you can afford to live there.)

  • 26. Learning CPS  |  August 30, 2016 at 10:40 am

    @ 25 The data, or lack thereof, for Blaine directly from the 5Essentials reports is here:

    Doubt ChicagoMag had access to anything more.

    Agreed, Blaine is a good school, all of these are good schools. But this is the kind of thing that makes how the list is put together a little confusing. Would love to see more than 15 elementary schools and get a better view of what the key differences were.

    RE: Growth as a metric. I think it can be very valuable for showing progress in a school, especially if they started lower and moved up. But for SE schools it is a tricky thing, and I don’t think lower growth scores at an SE school inherently show a lack of good teaching – especially if attainment is at the top (which for SE it almost always is). The new CPS NWEA scores for 2015-2016 are out now, and growth in many SEES-only schools (those without a neighborhood component) is lower than it has been. If you look at the RIT scores of SE kids, they are often significantly higher than expected for their grade level to start in the Fall. I’ve read that while RIT can go up to close to 300, RIT scores of 240/250 are considered pretty top achieving scores on the scale, and most kids aren’t expected to hit those until high school. SE kids are hitting those in grades 3-5 in many cases. How can the test then demonstrate significant growth if you are starting with scores near the top? You can only advance curriculum so far at that stage, and these tests are simply measuring how far into difficult questions the kids get in reading and math. I’d be concerned if there was stagnant or no growth in an SEES school, but small growth is probably right on track.

    Rankings just on top knowledge attainment test scores definitely don’t tell the whole picture and favor SE schools. Rankings on knowledge growth don’t show the whole picture either and work against how an SE school is set up. I think both need to be included in any ranking.

  • 27. cpsobsessed  |  August 30, 2016 at 10:55 am

    @26 – great explanation.
    I believe growth and attainment were included for both elem and HS (although elem used PARCC, which was in its first pilot year, so potentially not an ideal measure.)

  • 28. Learning CPS  |  August 30, 2016 at 12:01 pm

    @27 – Totally agree that PARCC is not a good measure, especially in its first year of use when many schools had varying participation. Not to harp on Blaine more, but 85% of their students refused PARCC, so even if that is tied into this ranking, it is a bad data point no matter how great the 15% of kids who took it were.

    This ranking approach also weighted elementary NWEA growth significantly more than PARCC attainment. PARCC was 20%, but reading and math growth were EACH 20%, meaning that almost half of the elementary score came from growth. A school with high attainment but smaller growth numbers will not necessarily show as well with that weighting. I would also add that attendance at the elementary level is a minor indicator of school quality, and much more likely to be tied directly to parent involvement in making sure the child gets to school. Skipping school is far less common in the elementary grades. Even most of the CPS Level 3 rated elementary schools have over 90% attendance. Weighting that at 15% probably doesn’t show all that much.
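The weighting described here can be sketched as a simple composite. The 20/20/20/15 weights come from this comment; the remaining 25% is assumed to be the 5 Essentials share mentioned earlier in the thread, and all school inputs are entirely hypothetical:

```python
# Sketch of the elementary composite described in this thread.
# Weights for PARCC, growth, and attendance come from the comment above;
# the remaining 25% is ASSUMED to be the 5 Essentials share mentioned
# earlier. All school inputs are hypothetical, normalized to a 0-100 scale.

WEIGHTS = {
    "parcc_attainment": 0.20,
    "reading_growth":   0.20,
    "math_growth":      0.20,
    "attendance":       0.15,
    "five_essentials":  0.25,  # assumption: fills out the remaining 25%
}

def composite(metrics):
    """Weighted average of 0-100 metrics; weights must sum to 1."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# High attainment with average growth vs. lower attainment with strong growth
selective = {"parcc_attainment": 95, "reading_growth": 50, "math_growth": 50,
             "attendance": 97, "five_essentials": 80}
growing   = {"parcc_attainment": 70, "reading_growth": 90, "math_growth": 90,
             "attendance": 94, "five_essentials": 80}
print(round(composite(selective), 2), round(composite(growing), 2))
```

With these made-up numbers the high-growth school outscores the high-attainment school, which is the pattern commenters are observing in the published list.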

    Bringing more factors into the ratings is not a bad thing, I’m just not sold that this particular list got it all quite right yet.

  • 29. Cliff  |  August 30, 2016 at 12:12 pm

    OR maybe they are using the 2014 scores?

    Their “methodology” page (which is definitely under-specified) doesn’t say.

  • 31. cpsobsessed  |  August 30, 2016 at 12:36 pm

    for an admin who urged his entire school to opt out of PARCC and (as it seems) not respond to the 5 essentials survey, he does like to tout the ratings. (this comment intended non-snarkily.)

  • 32. JMOChicago  |  August 30, 2016 at 2:00 pm

    If they used NWEA/MAP to measure growth, hypothetically, that test “is unlimited in terms of how far up or down it adapts to determine an individual student’s level.” So it doesn’t matter if your child is proficient or beyond proficient when they are admitted. The NWEA/MAP still purports to measure growth because it is an ADAPTIVE TEST. The test that a child will take at a non-SEES school in the 3rd grade may be different than the test that a 3rd grader takes at a SEES school in the same year, because the test adapts in real time (according to

    Whether you are the parent of a SEES or non-SEES student, the purpose of education is to foster educational growth no matter where a student begins in their academic career.

    I can understand that it may be a shock to the system (and possibly the ego) when your child’s school drops from a list that measures growth. On the other hand, this is a great opportunity for parents to meet with teachers and administrators and push for answers as to how their child’s school plans to address issues around educational growth.

    The past method of ranking schools based on achievement scores on a standardized test has ALWAYS been problematic and misleading.

    This method at least tries to correct for the problems of past methods. If you are a parent or principal or teacher who is feeling defensive because your school was not included, I’d encourage you to take the opportunity to check out the components of your scores that led to that and talk about improvements.

    I’d also like Chicago Magazine and RTI to publish the data set in the open; I think that would only help schools/parents discuss the ranking and what needs to improve (WBEZ and Catalyst are the leaders in publishing their data.) And, like other readers, I’m very curious about the Blaine scores…Blaine is a very good school. But the complete absence of data counting towards 25% of their score…I would be curious as to how that was handled for ranking purposes.

  • 33. Learning CPS  |  August 30, 2016 at 3:07 pm

    @32 It is “unlimited” but I would love to see how often anyone goes above 300.

    “The RIT scale is theoretically infinite, but most students’ scores fall between the values of 100-300.”

    The growth metric doesn’t reflect that. Decatur, an example of a school not on this list that has been on previous lists, has kids in 5th grade topping out the final Math RIT score at almost 260, and Wildwood (the #2 school on this list) doesn’t even have 8th graders doing that. Those kids didn’t come in in 3rd grade at 260; they have grown as expected in the 2 years they have been at the school, they just didn’t have to make a huge leap in each grade because they didn’t start that low to begin with.

    When comparing the start/end Math RIT averages of Skinner West, Blaine, Wildwood (the top 3) and Decatur, Decatur’s starting and ending RIT scores are higher in almost every grade-to-grade comparison. Even if you compare Decatur 3rd grade vs Blaine 4th grade Math (really more apples-to-apples, since Decatur teaches reading and math a grade above), Decatur’s final 3rd grade numbers are higher than Blaine’s 4th grade numbers. Decatur’s growth from start to end of the year isn’t going to be as much when you are already starting so high on the scale, and despite the scale being “infinite” it really has a ceiling of sorts. The growth percentages are smaller, but the kids aren’t going backwards between grades or leveling off, which would be more indicative of a problem.

    That’s my only issue with growth being weighted at 40%, nothing to do with ego or defensiveness, and not that it isn’t a critical part of being a successful educator/school. If you are physically capable of running, at your fastest, a 5-minute mile, and you start your efforts to reach it at a 6-minute-mile pace…you just don’t have as much to do to improve as someone who starts at a 10-minute-mile pace. Does that mean that when they both reach 5 minutes per mile, the one who didn’t have as far to go should be less valued? This approach is kind of saying that just like previous approaches have been dismissing schools that are able to foster large growth numbers in favor of schools who are made up of students who pretty much are going to hit top numbers from the start.

    Despite my uncertainty about how this data really pans out, I actually love the discussion this list is bringing up and that schools not usually on anyone’s radar are being recognized.

  • 34. cpsobsessed  |  August 30, 2016 at 3:52 pm

    @Learning CPS – I agree, the growth metric is funky and I’m not conceptually a fan of it having a high share of the calculation.
    On the other hand, it’s the only way to showcase schools with lower scores that are making good headway.
    So I suppose I can see why Chicago Mag chose to include both!
    This is the difficulty of trying to get one score to represent a school (or anything else of complexity.)

  • 35. JMOChicago  |  August 30, 2016 at 4:27 pm

    So, you believe that when the documentation for NWEA says: “The RIT scale is theoretically infinite, but most students’ scores fall between the values of 100 and 300” that a) the NWEA is not capable of registering a score higher than 300 for any individual student? Or b) that most students are incapable of exceeding 300 whether they are “gifted” or not “gifted”? or c) other?

  • 36. JMOChicago  |  August 30, 2016 at 4:42 pm

    Also: “This approach is kind of saying that just like previous approaches have been dismissing schools that are able to foster large growth numbers in favor of schools who are made up of students who pretty much are going to hit top numbers from the start.”

    I think that is exactly what it is saying. This is a problematic issue….why?

  • 37. Learning CPS  |  August 30, 2016 at 5:10 pm

    @CPSO – Yes.

    @ JMO Chicago – I think it is overcorrecting too far, that’s all. And only in the weighting of growth, not in the effort to include it. Perhaps it is too difficult to get a true, even measure of growth and attainment that can be stood on as “best schools”. This approach is obviously trying to get closer to that than simply looking at top test scores, and I don’t have an issue with trying…just not sure they got there…yet.

    And as far as a top RIT score, if NWEA is saying that most kids don’t go above 300, then that is probably a good gauge of how high scores tend to get, not that they can’t get higher. Do I think no student could achieve a higher score? Not at all, just that it is probably more of a rarity than a norm, and assuming that growth depends on kids going past a mark that the test’s facilitators note as a typical top mark seems to ask a lot. The highest RIT scores shown by any CPS student are between 260 and 268 or so. And those scores are all coming from SE schools. Could those kids go higher, maybe, but they certainly shouldn’t be looked at as not doing well enough because they didn’t.

  • 38. JMOChicago  |  August 30, 2016 at 5:40 pm

    “Could those kids go higher, maybe, but they certainly shouldn’t be looked at as not doing well enough because they didn’t.”**

    So if an SE student started the year at, say, 250 and went to 260…that should be counted as a better effort than a student in the same grade who started the year at 150 and went to 200 by the end of the year?
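Those two hypothetical students can be put side by side numerically. A common way growth models handle this (an assumption here; the thread doesn’t document NWEA’s exact model) is to compare each student’s gain to a typical gain for students with the same starting score, rather than comparing raw points; the expected-gain numbers below are invented:

```python
# The two hypothetical students from the comment, compared two ways:
# raw point gain, and gain relative to a typical gain for that starting
# score (the "expected growth" idea; expected-gain numbers are invented).

def raw_gain(start, end):
    return end - start

def gain_vs_expected(start, end, expected_gain):
    """Positive means the student beat the typical gain for that starting score."""
    return (end - start) - expected_gain

# Invented expectation: high starters are typically expected to gain less
high_starter = gain_vs_expected(250, 260, expected_gain=8)   # gained 10, expected 8
low_starter  = gain_vs_expected(150, 200, expected_gain=40)  # gained 50, expected 40
print(raw_gain(250, 260), raw_gain(150, 200))  # very different raw gains
print(high_starter, low_starter)               # both beat expectations
```

Raw gains differ five-fold (10 vs. 50 points), but measured against the invented expectations for their starting scores, both students come out ahead of their benchmark.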

    I’m not saying that the above is incorrect, I’m just trying to understand how your frame on what education is supposed to accomplish is informing your questioning of the methodology. Nor am I disagreeing with the observation that there is no perfect way (yet) to measure a quality school.

    Based on my beliefs (informed by an MSEd) re: what a school is supposed to accomplish, I do think weighting growth more than achievement is appropriate, though I do not yet know if this (Chicago Magazine) formula is the best formula, or just better than preceding ones.

    If someone believes that the 10 growth points of an SE student are worth more than the 50 growth points of a non-SE student…what assumption is that based upon? That yearly standardized test score growth measured each year is finite (thus your analogy to the physical limits of the human body in minutes per mile)? That the NWEA has discovered the yearly limit of test achievement matched to a student’s grade level? Or that students who test into an SE as a 4-5 year old show slower growth in a subsequent year while their peers catch up? (Even though their peers are not provided with the same resources, homogeneous cohort, or shield from turnover/mobility as their SE counterparts?)

    If those 10 SE-student growth points are worth more, what weighting needs to be used to balance them against the 50 growth points of a non-SE student?

    I don’t know the answers to those questions. But the answers to those questions are very much worth examining. Because at the end of the day, our assumptions and beliefs about all of this drive behavior, our choices for our students, and educational policy in Chicago.

    **I don’t think that anyone is saying that SE students don’t do well. I think the question being asked is “Is each student’s school doing the best job it can for that student? Could it do better? Which schools are taking the students they are given and dramatically improving their educational prospects–whether they test in to a school, have parents with the knowledge/resources to enter a lottery, or walk in to their neighborhood school?”

  • 39. harry potter  |  August 30, 2016 at 7:04 pm

    Just to toss this in there, every single teacher in the state of Illinois will now have a significant portion of their summative rating based on their students’ test score growth. Schools are looking much more closely at growth over attainment as accelerated growth on a yearly basis should end up in attainment in the long run.

  • 40. Don Justice  |  August 31, 2016 at 6:14 am

    SE students may show slower growth because, unlike the Noble Street charters, everything an SE teaches does not revolve around a test. The SAT and ACT still do not measure creativity and only poorly measure critical thinking skills. SE students are focusing on learning that which will help them succeed beyond school, while neighborhood schools and charters focus their curricula on whatever a test measures — little more. Let’s not even talk about the emotional intelligence that is lost as schools are forced to shift from literature to non-fiction.

  • 41. Learning CPS  |  August 31, 2016 at 7:57 am


    “So if an SE student started the year at, say, 250 and went to 260…that should be counted as a better effort than a student in the same grade who started the year at 150 and went to 200 by the end of the year?”

    I never said that it should be counted as a better effort, but in this approach it seems it would be counted as a significantly worse effort, and I don’t think that is the truth either.

  • 42. JMOChicago  |  August 31, 2016 at 8:43 am

    “SE students are focusing on learning that which will help them succeed beyond school, while neighborhood schools and charters focus their curricula on whatever a test measures — little more.”

    Having been in MANY neighborhood schools, I can tell you with absolute certainty that this is not the case across the board.

  • 43. JMOChicago  |  August 31, 2016 at 9:05 am

    “I never said that it should be counted as a better effort, but in this approach it seems it would be counted a significantly worse effort and I don’t think that is truth either.”

    How is improvement at all a “worse” effort? What “truth” are we looking for exactly? If the question is “which schools are significantly improving the educational outcomes for their students overall, no matter where they begin?”…what should the measure be?

    The question is not “where are we clustering students that score high on a specific test early?”

    Here is what is interesting to me.

    An aggregate score is used for growth. Some of those students in a class are going to improve much more than others. Some of them might even lose ground. For a school to show strong growth in the aggregate, that is a school that is really working hard.

    I am not a fan of standardized tests. They have way too many flaws. Unfortunately it is the measure we’re stuck with at the moment. However, I think the NWEA/MAP is a much better instrument than the PARCC in terms of differentiation and feedback…which is the real usefulness of any test for students and teachers.

    Don Justice: Your implication is that SE students are receiving a “better” education than their non-SE peers. No doubt SE students have advantages in the system that their non-SE peers don’t have…(aside from the fact that they tested very well on admissions tests at 4-5 years old). They are protected more from the disruptive mobility that makes bringing a cohort of students along in the curriculum more difficult. Their administrators have more control over class size and composition. There is more predictability as to the composition and performance of the cohort from one year to the next. The families tend to skew higher income than most of the non-SE schools, with more outside resources. Those are advantages. I know of one neighborhood school where a PhD teaches middle school social science, while a Fulbright alum teaches them math. The school is 80%-ish low income. Classroom discussions on literature are lively, complex and challenging…even if the text is non-fiction. Students are given full access to music programs and, if they try out for band and make it, the music director teaches them their instrument. An instrument that is often donated, refurbished, or purchased with collected Campbell Soup labels. Half of the graduating class each year is accepted into Selective Enrollment high schools and programs…but all of the students are celebrated, because they are taught leadership and team skills (as well as provided with experiences mentoring younger students) in the upper grades.

    They didn’t make this Chicago Magazine list. But are they providing an excellent education for their students given the resources provided? Yes, I think so. Could they do better? Yes, always room for improvement. Instead of stamping their feet and trying to find ways to discount the methodology, their staff is taking the feedback seriously and working to improve in areas where there is room to improve with the students that they serve. Neighborhood schools are not the educational wasteland that you are implying.

  • 44. Learning CPS  |  August 31, 2016 at 10:16 am

    @JMO – Just to be clear, at no point have I suggested that the schools on this list are undeserving (beyond the legitimate questions regarding the data on Blaine, which is a good school regardless), or that I’m trying to discount the data, or that there isn’t something to learn from this approach that past rankings have neglected. I’m actually trying to understand exactly what schools that have made past lists but aren’t on this one could be doing better.

    The two primary differences in this methodology at the elementary level are the inclusion of growth and the 5Essentials. In terms of what schools not on this list can do to improve, I can see how any school getting lower scores on the 5Essentials can identify opportunities for change by looking at how the different elements are rated based on the behavioral statements included in the survey.

    I struggle to see what an SEES school specifically can do to increase growth ratios in students who are starting at such a high place and who won’t necessarily see the big leaps in growth scores that a school with kids starting at a lower place might. And by weighting that so strongly, I think it is doing a disservice to the teachers at SEES schools, who are already teaching an advanced curriculum, by saying they aren’t doing enough to achieve educational gains in their students.

    This ranking methodology, as I see it with the somewhat limited information from ChicagoMag and no way to see any schools ranked below the top 15, seems to reward schools with higher growth ratios, and many SE schools have lower growth ratios while still being way ahead of most schools in actual scores. Since growth is weighted at 40%, that leads to a list suggesting many SE schools are doing less to provide educational growth for their students. I don’t think that is the reality, just as saying SE schools are the best schools because their test scores are higher isn’t the only reality either. If anything, the very approach that creates the student body at an SE school (bringing in a group of potentially top-performing kids as the entire student population) can work against it in this way of defining the “best” schools.
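    To make the weighting effect concrete, here is a toy sketch. Only the 40% growth weight comes from ChicagoMag’s description; the split among the remaining factors, and both schools’ numbers, are invented purely for illustration.

```python
# Toy illustration of how a heavy growth weight can rank a high-growth,
# mid-attainment school above a high-attainment, low-growth one.
# Only the 40% growth weight is from ChicagoMag; the remaining split
# among attainment, attendance, and 5Essentials is assumed.

WEIGHTS = {"growth": 0.40, "attainment": 0.30, "attendance": 0.15, "essentials": 0.15}

def composite(scores):
    """Weighted composite of factor scores, each normalized to 0-100."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# Hypothetical "SE school": top attainment, modest growth.
se_school = {"growth": 55, "attainment": 99, "attendance": 96, "essentials": 85}
# Hypothetical "neighborhood school": lower attainment, strong growth.
nbhd_school = {"growth": 90, "attainment": 75, "attendance": 95, "essentials": 90}

print(round(composite(se_school), 2))    # 78.85
print(round(composite(nbhd_school), 2))  # 86.25
```

    With these made-up numbers, the neighborhood school outranks the SE school even though its attainment is 24 points lower, which is exactly the pattern being debated here.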

    I’m really not trying to be contentious about this, just digging into the data to try and see the story behind it.

  • 45. CPS MomOf2  |  August 31, 2016 at 11:27 am

    I’m not bothered by average growth at schools with already very high attainment levels, especially in selective enrollment schools. I would assume most kids in these schools are already performing at or near the top of their potential, and average to slightly above average growth would show that the school is doing its job. Remember we are talking about children whose brains are not fully developed. Even the most gifted students are still growing, and to expect that they will reach some arbitrary RIT number ignores the fact that their brain just might not be developed enough to do that. On the other hand, high growth measures at schools with below average attainment scores show that the school is doing a good job catching those kids up to the average. Hopefully those high growth numbers are sustained through all grades.

    Honestly, I don’t believe these rankings really tell you much about quality schools, and my kids go to one of the non-selective schools on the list.

  • 46. JMOChicago  |  August 31, 2016 at 12:01 pm

    “I struggle to see what an SEES school specifically can do to increase growth ratios in students who are starting at such a high place and who won’t necessarily see the big leaps in growth scores that a school with kids starting at a lower place might. ”

    This comes from the assumption that there is a shared ceiling on the amount of growth for every student. For example, that 99% of kids can’t get beyond a 300 RIT when they are third graders.

    Is there?

    Only the folks at the NWEA can tell us if their test has a ceiling where they “run out” of higher-level questions above 300 RIT. I don’t think the research supports a physiological limit that would restrict every student of a given age to 300 RIT.

    If there is a ceiling on the amount of growth for every student collectively…if that magic number is 300 RIT…then I have a problem with the theory behind SE schools and how we define “gifted.” Because that means it is the luck of the draw as to who gets to a higher level at the time when there are admissions spots available (generally Kindergarten). That a child who experiences a lot of their cognitive growth at ages 4-5 is going to have an advantage over a child who experiences more cognitive growth at age 8 in accessing a school environment with a more homogeneous cohort, year-by-year cohort stability, and an advanced curriculum.

    Over-relying on achievement scores with SE admissions to assess schools can mask a lot of problems. I think the NWEA/MAP does a good job of setting the ceiling VERY high so that atypical (“gifted”) students can see their growth. There are definitely NWEA/MAP researchers who are aware of the issue.

  • 47. Learning CPS  |  August 31, 2016 at 3:52 pm

    @JMO – You’re kind of losing me on your last post. I’m not sure what your point is now. I did some digging, and according to this, there is a ceiling to the test and there are different versions. Not sure how that has anything to do with the process for SE selection, which we can all agree has its flaws.

    There are no restrictions as to when a student might get there, nor do I think the expectation is that every child will. It is probably safe to say, since NWEA’s highest norm scores are in the 260s (95th percentile, and only at the highest grade levels), that most kids don’t get much past maybe 280, and that certainly isn’t expected in elementary school.

    Since some kids start at or close to the norm highs, and SE kids start above their grade-level norm highs, expecting that the only way to show successful growth for them is to reach the top score seems unrealistic, in particular since answering those more difficult questions requires them to have been taught concepts several grades ahead. Even SE schools tend to teach only 1-2 grades ahead. The test will show growth, but large growth is less likely. Hence a smaller growth metric in SE schools makes sense, but in a rating system that values big growth it won’t measure up.

  • 48. cpsobsessed  |  August 31, 2016 at 10:54 pm

    Looks like the Chicago Mag editor was on Chicago Tonight this evening discussing the issue and the methodology they used. I haven’t figured out how to watch it online yet.

  • 49. Just Sayin  |  September 1, 2016 at 4:34 pm

    According to the Chicago Mag. link, Lindblom is ranked #8 on the list of Top 20 High Schools.

  • 50. karet  |  September 1, 2016 at 6:38 pm

    This is a great high school list. I regularly have outstanding students who attended Mather, Prosser and Von Steuben, but I never hear those schools mentioned on this site (I teach at a local university). Hooray!

    Also pleased with Skinner North’s high ranking — apparently it IS possible for very high scoring kids to achieve high growth.

  • 51. cpsobsessed  |  September 1, 2016 at 9:18 pm

    @49, well yay for Lindblom! Boo for Chicago Mag not checking the list before printing the issue.

  • 52. Tone  |  September 2, 2016 at 1:05 pm

    Chicago Magazine can do anything it likes, but it clearly cooked the books with Blaine at #3 and a NA for 25% of the score.

    BS list.

  • 53. klm  |  September 2, 2016 at 8:22 pm

    OK, I know maybe I’m old-fashioned, but isn’t school supposed to be about learning/knowledge? Shouldn’t a school be judged by what its students can do at the time they are assessed, not by how much “progress” its students have made?

    I get that it’s easier for the offspring of educated, non-poor families that live in non-struggling neighborhoods to rock achievement tests (mainly because they can actually read, do math, and understand science better than their disadvantaged peers).

    I’m sure that there are better colleges than the University of Chicago, MIT or Stanford, because these “better” colleges have students that make more “progress.” For example, moving from the 35th to the 45th percentile (a full 10 points), rather than the 0-1% at these name schools (where students stay at the 98th-99th percentiles, since moving up another 10 percentile points from there is obviously statistically impossible).

    I mean, who knew that some schools where many students objectively can’t do nearly as much academically as at other schools are “better” than those other (sometimes much) higher-achieving schools?

    Not me.

    BTW, I love the spirit of the above “Best Schools” list. Yes, growth is essential. Yes, the collective, community-based cooperation and participation is so important to a school’s being successful, etc. Yes, kids can become well educated when teachers, parents, administrators and support staff do their magic. It happens here in Chicago and should be celebrated. Hooray!!!! (For real)

    But ……facts is facts.

    I know for a fact that there are some colleges that do great things, working with their students to get them into STEM careers, pharmacy or med school, etc., and it’s a wonderful thing. I’m thinking, for example, of Xavier University in Louisiana, which has been a model for getting African-American students prepared for, admitted into, and successful long-term in the medical professions. They’ve patented the secret sauce for this, and it’s a wonderful story of an American institution of higher learning making an enormous impact in reducing the disparities in the number of African-Americans who successfully become physicians, pharmacists, research biologists, etc. It’s a wonderful college, in this sense.

    That said, is Xavier a “better” college than one, like Northwestern or Amherst, where students don’t show as much “growth,” or relative success compared to the high average freshman ACT/SAT scores?

    In the same way, it’s great that Dixon is doing so well (I’m a big champion of Chatham and want it to remain as a bastion of the black middle-class, despite its recent crime issues that threaten its reputation as a viable neighborhood). Dixon really is a success story at a time when there has been so much bad news coming from the Southside. It’s proof that good things have always and will continue to happen in Chatham —-yeahhh!!!!

    But Dixon’s a “better” school than Edison RGC?

    For real?

    Come on.

  • 54. karet  |  September 3, 2016 at 10:17 pm

    Maybe you guys can confirm if I am interpreting Chicago Mag’s use of “growth” accurately. They say:

    “NWEA/MAP growth (math) and NWEA/MAP growth (reading) are the percentages of students meeting or exceeding the national averages for year-over-year growth on the NWEA/MAP tests.”

    I read this to mean that your school’s score won’t be any better (or worse) if your school’s students who exceed the national growth average are exceeding it by a huge amount or just barely. What is being measured is the *number* of students who are exceeding the national average (even if only by a tiny bit). “High growth” just means that most/all of your students are beating the average growth rate. That doesn’t seem like too difficult a bar for the SEES. How do you understand their explanation?
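    If that reading is right, the metric boils down to something like this. All of the student RIT values and the national growth norm below are made up for illustration, not actual NWEA data.

```python
# Sketch of the growth metric as ChicagoMag describes it: the share of
# students whose year-over-year RIT gain meets or exceeds the national
# average gain. A student beating the norm by 1 point counts exactly the
# same as a student beating it by 20, as karet notes above.

def pct_meeting_growth(students, national_avg_gain):
    """students: list of (fall_rit, next_fall_rit) pairs."""
    meeting = sum(1 for fall, nxt in students if (nxt - fall) >= national_avg_gain)
    return 100.0 * meeting / len(students)

# Hypothetical 3rd-grade cohort; assume the national average gain is 8 RIT points.
cohort = [(210, 221), (198, 205), (225, 234), (190, 199), (230, 236)]
print(pct_meeting_growth(cohort, 8))  # 3 of 5 students gained >= 8 points -> 60.0
```

    Under this reading, the *size* of each student’s gain is irrelevant once it clears the national average, which is why the bar might not be as hard for SEES as the raw growth numbers suggest.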

  • 55. CPS parent  |  September 4, 2016 at 12:25 am

    I think the list is pretty accurate. I like seeing some schools on the list that were not recognized before. To me, it’s important that we don’t just rank schools by their test scores. For example, many people do not know that more than half of Healy’s eighth-grade students got accepted to Walter Payton this year. Great job, Healy!

  • 56. Alcott is a surprise to be ranked  |  September 5, 2016 at 1:56 pm

    @3 Cynthia, I wasn’t surprised Lake View didn’t get mentioned; I was surprised Alcott High School got rated in the Top 20. It seems a lot of attention is being given to Alcott, with the Lathrop Homes coming down soon and new families moving in right down the block. Lake View seems to be plagued by many things, and the staff must know something, as 30 of them have left since the start of last school year, including a principal, assistant principal, and athletic/community director.

  • 57. cpsobsessed  |  September 5, 2016 at 10:02 pm

    @54 Karet – I agree, it does seem like the SE schools should be able to meet that benchmark fairly easily. But it doesn’t take starting levels into account. So maybe there is more upside when a school starts at a lower level? And less upside in the older elem grades when the starting scores are very high?

  • 58. karet  |  September 6, 2016 at 10:33 am

    @57. Honestly, I don’t know! If your kid’s RIT in math doesn’t go up from fall to spring, that would suggest s/he hasn’t learned any new material or concepts. I would say that’s a pretty good indicator that your school could be doing more, even in the upper grades. I suppose you could argue that some kids in SEES are so advanced that it’s hard for the schools to differentiate / adapt the curriculum. That’s a challenge, but it shouldn’t be an impossible one. (SN seems to be doing a pretty good job.)

  • 59. cpsdad  |  September 6, 2016 at 2:10 pm

    There is a ceiling to the MAP because there are different tests given to different grades.

    “The RIT scale does have a ceiling to which it measures. Students who have reached the ceiling should be either moved to the next test level, or if exceeding the topmost RIT level, measured with a different assessment. Factors that determine a ceiling effect include:

    Whether the student is operating at the extreme end of the RIT scale – the number of items in the question bank are fewest at the highest and lowest extremes of the RIT scale.
    Which version of MAP is used to test a student – each version of MAP measures a different RIT range.”

    Primary 110 ——————————> 240

    MAP 2-5 160 —————————–> 260

    MAP 6+ 160 ————————————————–> 320


    Primary 110 —————————-> 220

    MAP 2-5 150 —————————–> 250

    MAP 6+ 160 ————————————————> 300

    I know a child who hit a 260 in 3rd grade for math. For the next two years the student was still given a projected growth target but could not possibly reach it, because they were still given the MAP 2-5 test. Additionally, I believe there is a limit to the number of questions offered (40-45), so even if they could continue, they are cut off at some point.
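    The truncation cpsdad describes can be sketched in a couple of lines. The 260 ceiling matches the MAP 2-5 figure quoted above; the student scores are hypothetical.

```python
# Illustration of the ceiling effect: if a student's "true" score exceeds
# the ceiling of the MAP version administered, the measured gain is
# truncated. 260 is the MAP 2-5 ceiling quoted from NWEA above.

CEILING_MAP_2_5 = 260

def measured_growth(prior_rit, true_rit, ceiling=CEILING_MAP_2_5):
    """Growth as the test can report it: the true score is capped at the ceiling."""
    return min(true_rit, ceiling) - prior_rit

# A 3rd grader already at the ceiling: even if their ability grows by
# 15 points, the version they were given can report no gain at all.
print(measured_growth(260, 275))  # 0
print(measured_growth(230, 245))  # 15 -- same true growth, fully visible
```

    This is why karet’s point in the next comment matters: per NWEA’s own guidance, the student should have been moved to the next test level once they approached the ceiling.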

  • 60. karet  |  September 6, 2016 at 10:10 pm

    @59, So … if the school had followed the NWEA guidelines, that kid should have been moved to the next level (and would not have hit a ceiling). The school failed to administer the test correctly.

  • 61. OgdenParent2016  |  September 7, 2016 at 11:48 am

    CPS has posted last year’s school-level NWEA results:

    Some rather depressing math growth at Ogden. Looks like our 1+ status is a goner 😦

  • 62. Choose Amundsen or Lake View  |  September 11, 2016 at 8:10 am

    Ok, I know it’s a year away. If you had to choose between Amundsen and Lake View as a high school for your child, which one would it be, and why? And if you’ve already faced this choice, why did or didn’t you pick one of them?

  • 63. cpsobsessed  |  September 11, 2016 at 12:17 pm

    @62: given that HS decisions won’t need to be made until March, I would be inclined to see how things shake out at Lake View this year, given the interim principal. I’d consider this a transition time for LVHS, but the fundamentals are in place. I’d want to see the new leadership’s goals before making a personal decision.

    Amundsen has an IB program, which is a plus. LVHS talked about starting a special track (AP? Accelerated, I can’t recall the details) but that is another topic for the upcoming year.

    Amundsen has the advantage of a principal who has been in place for a few years now, which can help with the school’s strategic goals. But a strong, visionary leader at LVHS could certainly help attract families there.

  • 64. Learning CPS  |  September 11, 2016 at 2:56 pm

    @OgdenParent2016 – I wouldn’t stress too much about that right now. Just about every school on this ChicagoMag list, and many good ones not on it, have lower growth numbers with the 2015-16 scores. There is also a note on the spreadsheet that the percentiles are based on updated NWEA national norms, which might be higher than the ones that were being used before. Growth is a useful metric, and was used heavily for the ChicagoMag ranking list, but I don’t think it is the end-all of how CPS is rating its schools. CPS touted attainment more in its press release about the scores, and your attainment scores are still high.

  • 65. cpsobsessed  |  September 11, 2016 at 10:42 pm

    @61 – Ogden has one of the top attainment levels in the district in both math and reading. I’m sure it’ll remain Level 1+.

  • 66. cpsobsessed  |  September 11, 2016 at 10:48 pm

    Top Reading NWEA score Schools

    Reading RIT Grades 3-8
    YOUNG HS 247.4
    LANE TECH HS 244.5
    TAFT HS 237.8
    EDISON 236
    KENWOOD HS 235.9
    KELLER 233.9
    LINDBLOM HS 233.2
    DECATUR 232.5
    LENART 231.3
    OGDEN 230.1
    MCDADE 230
    MORGAN PARK HS 229.2
    LINCOLN 228.8
    DISNEY II 228.2
    WHISTLER 227.9
    COONLEY 226.9
    SKINNER 226.7
    HAWTHORNE 226.4
    BLAINE 226.1
    EDGEBROOK 226.1
    SOUTH LOOP 225.3
    ALCOTT ES 225.1
    JACKSON A 225.1
    BELL 225
    CULLEN 225

  • 67. cpsobsessed  |  September 11, 2016 at 10:49 pm

    Top Math NWEA Score Schools
    Grades 3-8

    YOUNG HS 268.1
    LANE TECH HS 263.5
    TAFT HS 253.4
    EDISON 250.6
    KELLER 248.6
    KENWOOD HS 247.9
    LENART 243
    DECATUR 242.5
    OGDEN 240.6
    HAWTHORNE 239.8
    MCDADE 239.4
    HEALY 238.4
    SKINNER 238.2
    LINCOLN 237.4
    DISNEY II 237.4
    MORGAN PARK HS 236.7
    ALCOTT ES 236.5
    BLAINE 235.8
    COONLEY 235.7
    HAINES 235.5
    WHISTLER 235.5
    BELL 234.5
    BURLEY 234.4

  • 68. cpsobsessed  |  September 11, 2016 at 11:15 pm

    Kind of interesting (and not super surprising) to rank on score alone. Keep in mind the ACs will have higher averages since they don’t have lower grades in their average. Hawthorne always impresses me, as they don’t have any selective element to admission.

  • 69. Arnold Davis  |  September 12, 2016 at 5:16 pm

    @62 and @63. I’ve been on the LSC at Lake View for the last two years. Speaking for myself, the new principal at Lake View, Mr. Karafiol, is great, and the school is off to a terrific start. You can learn more about him here: I do not expect Lake View to lose any momentum. As for the folks who left, the former assistant principal and athletic director got outstanding positions based on their success at Lake View. While I am always sorry to see people go who are great at what they do, I am glad the educational world sees Lake View High School students, faculty and staff as a go-to source of talent.

  • 70. cpsobsessed  |  September 12, 2016 at 6:01 pm

    Thanks @Arnold. Please keep us updated throughout the year!

  • 71. Thank you  |  September 13, 2016 at 8:32 am

    Thank you, Mr. Davis. I plan on following both schools closely to help my son make up his mind about where to go for high school. One question, and this comes from speaking to one of the 15 teachers who left Lake View: did the last principal really bring back a temporary gym teacher last year who was responsible for student files being found in an alley in 2009, when he worked as an aide?

  • 72. Vikingmom  |  September 14, 2016 at 1:20 pm

    @62 re: Amundsen v Lakeview, I would recommend, besides going to the Open Houses (Amundsen’s is November 5), making a visit to each school while it is in session, just to get a feel.
    As many know, I’ve been very happy with Amundsen, having a daughter who just graduated from there and a son who just started. I do also know people who have been happy with sending their kids to Lakeview.

  • 73. Nicole  |  September 16, 2016 at 1:36 pm

    So I know Lindblom was added to the list after the fact, but do we know how they decided where to rank it? I saw a tweet from SchoolSparrow last night that said Lindblom’s students boycotted last year’s PARCC exam, which I believe is a component of Chicago Mag’s rating methodology. Seems similar to Blaine being on the list despite not having data for important fields. I admit I’m not totally up to speed on this, so maybe this has already been addressed.
