Sun Times/Blaine Principal Analysis shows neighborhood schools outperform charters on MAP growth

September 1, 2014 at 6:03 pm 56 comments

A very compelling piece was published in the Sun-Times today (coinciding with the eve of school starting) in which the current Blaine principal (who has been outspoken about the system) reports results from an analysis of MAP test data showing that neighborhood schools greatly outpace charters in GROWTH over the past year.  This is different from attainment, as it shows how much students improved compared to where they started the year.  Math growth is around the same and attainment levels are around the same.  (Not surprisingly, selectives and magnets are better than other schools.)  The Sun-Times has corroborated his data.

Take a read and see what you think of his analysis.  Is this a reason to stop expansion of charters?  Some charters?  All charters?

The data is certainly compelling.  I ranked the charters and found that they break out to roughly 1/3 in the top third of growth nationally, 1/3 in the middle, and 1/3 in the lower third.  Perhaps the top 1/3 of charters are worth maintaining?  For students in some neighborhoods, even having the chance to attend a school with average growth is an improvement, right?

My questions: Why isn’t math growth on track with reading?  Will this trend continue, with neighborhood schools showing really strong reading growth?  What’s going on at some of these schools with low attainment (i.e., low scores) but huge growth?  Hopefully someone in CPS is finding out what’s happening there to see if it’s replicable.

All in all, you have to admire someone who is willing to step up and publicly critique CPS from the inside, using data to support the argument.

ARTICLE AND LINK TO DATA

http://www.suntimes.com/news/otherviews/29378381-452/drop-cps-reform-strategy-cps-neighborhood-school-growth-outpaces-charters.html#.VATsFRGYbmS

For original CPS MAP data, charts, graphs, and other data files, please visit http://schoolscomparison.blogspot.com/

By Troy A. LaRaviere                                                         

When Mayor Rahm Emanuel recently heralded a small gain on the average Chicago Public Schools elementary “MAP” test results, I knew something wasn’t quite right. It wasn’t what he said; it was what he didn’t say. You see, this is the first year the MAP scores can provide a more decisive apples-to-apples comparison of charter schools and traditional public schools.

The result? Public school students learned far more in one year than charter school students did.

Until now, schools were judged on student attainment scores, not student growth. This is important because — like magnet schools — charter schools lean heavily on their ability to enroll students who are more likely to have higher attainment than their neighborhood peers by virtue of the degree of parent involvement needed to enter a child into a charter school lottery. Chicago’s charter schools also expel students at more than 12 times the rate of our public schools, which calls into question their own confidence in their ability to effectively teach the most difficult to reach children. When you consider those factors, the attainment of charter school students could be more a result of their admissions and expulsion policies than their teaching.

This is where the MAP assessment comes in. The MAP is designed to measure teaching and learning. In fact, CPS trusts it so much that it uses the results to determine teacher and principal evaluation ratings. It’s also used to rate schools on CPS’s five-level rating system.

If CPS can use MAP growth results so broadly to rate teachers, principals, and schools, one would expect CPS to use those same results to rate its school reform strategy, which is dominated by the proliferation of privatized charter and turnaround schools, where a private operator replaces all or nearly all of a school’s staff. How did Emanuel’s reform schools do? What kind of learning growth did they foster? Why didn’t Mr. Emanuel say anything about it? Surely he knew.

So I downloaded the publicly available MAP results and conducted a preliminary analysis. For the sake of consistency I am deferring to the Sun-Times results, which were similar to mine. The MAP report lists the “growth percentile” assigned to each school based on student results. If a school gets a growth percentile of 99, then the average growth of the students in that school is greater than the average growth of 99 percent of schools in the United States that took the MAP assessment.
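For readers who want the mechanics, the percentile-rank idea the article describes can be sketched in a few lines of Python. This is an illustration only: the function and every number below are invented, not NWEA’s actual procedure.

```python
# Illustrative sketch of the growth-percentile idea described above:
# a school's average student growth is ranked against a national
# distribution of school-average growth values. This is NOT NWEA's
# actual methodology; the function and all numbers are invented.

def growth_percentile(school_avg_growth, national_avg_growths):
    """Percent of schools nationally with lower average growth."""
    below = sum(1 for g in national_avg_growths if g < school_avg_growth)
    return round(100 * below / len(national_avg_growths))

# Toy national distribution of school-average score growth
national = [2.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0, 6.5, 7.0, 8.0]

# A school averaging 6.8 points of growth beats 8 of these 10 schools
print(growth_percentile(6.8, national))  # prints 80
```

A school at the 99th percentile would out-grow 99 percent of the schools in the comparison pool, which is all the headline numbers in the piece claim.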

In terms of assessing the effectiveness of charter schools I believe the most accurate comparison is to public magnet schools since both charters and magnets have lottery admissions processes that increase the likelihood of enrolling students with involved parents. In essence, charters are privately-run magnet schools and therefore should be measured against publicly-run magnet schools. I believe that turnaround schools should be compared to neighborhood schools since they both must accept students within their attendance boundaries. Using the Sun-Times results, the comparisons are as follows:

READING

* The most dramatic performance gaps are in reading, where the public magnet school growth percentile is 83, while the charter score is 48.

* The public neighborhood percentile is at 75, while turnarounds are at 51.

* Although neighborhood schools must enroll any student in their attendance boundary, their students’ reading growth percentile is 27 points higher than that of lottery-driven charter schools. Neighborhood schools are at 75 and charters are at 48.

MATH

* In math, the public magnet school growth percentile is 67, while the lottery-driven charter schools are at 49.5 — over 17 points lower.

* The neighborhood school growth percentile is at 55 while the turnaround school percentile is at 43 — 12 points lower.

* Even with their admissions limitations, public neighborhood schools outperformed the growth in lottery-driven charter schools by more than five percentile points, with neighborhood and charter schools at 54.9 and 49.5 respectively.

A simple look at a list of the schools reveals even more. Of the 490 Chicago schools for which elementary grade MAP data was available, 60 of those schools are charter (12 percent), 24 are turnaround (5 percent), and 406 (83 percent) are traditional public schools. When sorted by growth percentile rank, I found the following:

* Although charters and turnarounds make up 17 percent of district schools, they account for none of the 60 schools with the highest growth percentiles.

* Of the 30 lowest performing schools in CPS more than half are charters or turnarounds.

* Of the 10 lowest performing schools in CPS, seven are charters or turnarounds.

* Nearly nine out of 10 charter/turnaround schools are in the bottom half of CPS performance.

In summary, charters and turnarounds are overrepresented among the schools with the lowest student growth, and not represented at all among schools with the highest student growth.

CPS testing and accountability officials told me their numbers looked similar to mine and that any minor differences may have been the result of the inclusion of one or two schools not included in the data available at the time of my analysis. This led to another striking revelation. Eight of the city’s charter schools — including five Learn Charter Schools — had no MAP growth data at all. When I asked how this was possible I was told these charters had not “opted in” to the MAP assessment. You read that correctly; CPS allows some charter schools not to participate in the assessment used to hold regular public schools accountable.

The 50th percentile represents the “average” for U.S. schools. The reading growth percentile scores of 83 and 75 for students in Chicago’s public magnet and neighborhood schools stand in stark contrast to the often-promoted picture of traditional Chicago public schools as “failing.” On the contrary, the 51st and 48th respective growth percentiles of turnarounds and charters clearly indicate that it is these reforms that are failing Chicago’s students. There may be a few exceptions, but exceptions don’t create good school systems; critical mass does. Our public schools have developed this critical mass while charter schools have fallen short.

This creates an inexcusably dire situation when considered in the context of the racial achievement gap. As large numbers of African American and Hispanic students are funneled into the low-growth charter/turnaround system, the high-growth public system is becoming increasingly Caucasian and Asian. The students on the low end of the achievement gap — the students who need the most growth — are being fed to a system that produces the least. In December, the Sun-Times reported that the achievement gap between white and black students was widening. It now appears we have identified a cause.

In the face of these results, the mayor’s next press conference on schools should be much different than his last. He should announce that CPS will cease its effort to divert funding from public neighborhood schools into his failed charter experiment. An immediate surge of investment in public neighborhood schools should follow. He should also announce an immediate publicity campaign to inform parents who made the charter “choice” of the learning growth disparity between these different types of schools so those parents can then make a more informed choice about where to send their children. Unfortunately many of the schools in those parents’ neighborhoods have been shut down. It is a tragic irony that a so-called “choice” system has left thousands of families with no choice at all.

In the past, when public school advocates have mentioned the difficulties of teaching in schools in low-income minority neighborhoods, charter and “choice” advocates have had a “no excuses” response. “Hold the public schools accountable!” has been the battle cry. Will the mayor now hold his charter schools accountable? Let’s hope Mr. Emanuel remains consistent with that “no excuses” mantra now that his own reforms have failed.

Looking forward to that next press conference.

Troy A. LaRaviere is principal at Blaine Elementary School, a parent at Kellogg Elementary School, a graduate of Chicago Public Schools and Chairperson of the Administrators Alliance for Proven Policy and Legislation in Education (AAPPLE).

 


56 Comments

  • 1. marcsims  |  September 1, 2014 at 6:19 pm

How Chicago’s school choice system is tracking kids into separate high schools based on achievement: http://www.wbez.org/news/big-sort-110502 (“The Big Sort”). WBEZ analysis of test scores found few schools in the city enroll a full span of students. Low-scoring and high-scoring students attend completely different high sc…

  • 2. cpsobsessed  |  September 1, 2014 at 6:24 pm

Thanks @marcsims – another great data-based article. I tried looking at the top MAP growth schools to see if they seemed to skew towards higher ingoing scores (to the point of the WBEZ article), but I really saw that the top growth schools are a mix of lower- and higher-performing raw-score schools. I was a bit surprised by that. So there seem to be some schools that look ‘bad’ (i.e., Level 3) but are really knocking it out of the park on growth.

  • 3. cpsobsessed  |  September 1, 2014 at 6:31 pm

    FYI, here are the top Charter growth schools (all 74th percentile +) on reading:

    KIPP CHTR CREATE
    KIPP ASCEND CHTR CAMPUS
    CICS-AVALON /SO SHORE
    KIPP CHTR BLOOM
    UNO CHTR – SPC DANIEL ZIZUMBO
    UNO CHTR-TAMAYO
    UNO CHTR – ROGERS PARK ES
    CHGO MATH & SCI ACAD CAMPUS HS
    UNO CHTR – SANDRA CISNEROS
    PERSPECTIVES CHTR IIT
    ROWE
    UNO CHTR – ESMERALDA SANTIAGO
    UNO CHTR 15

  • 4. peoplebelieveinfantasy  |  September 1, 2014 at 6:32 pm

What drives school achievement is the IQ of the student. Charters with high-IQ students do well; neighborhood schools with high-IQ students do well (Edgebrook?).

    Charters can be a scam; look at the Turkish schools under FBI investigation. But regular schools are higher cost (pensions cost money).

    Across populations, the ability to learn is mostly determined nine months before a student is born, by the IQ of the parents.

  • 5. Growth Percentile  |  September 1, 2014 at 7:42 pm

    “I tried looking at the top MAP growth schools and to see if they seemed to skew towards higher ingoing scores (to the point of the WBEZ article) but I really saw that the top growth schools are a mix of lower- and higher-performing raw score schools.”

    The growth percentile metric seems really suspect. As I understand it, it is supposed to be based on (according to the methodology document posted by Principal LaRaviere): “Average growth of the school compared to national average growth for schools with the same average pretest score. The school is assigned a percentile representing where it would fall on the national distribution. The national average is the 50th percentile.”

That is, the Decaturs, Skinners, Coonleys, ERGCs, Bells, etc. would be compared to each other. Yet all the highest-achieving schools have very high growth percentiles, all in the mid to high 90s, with most at 99. (I only looked at the reading scores but assume it’s similar for math.) While I suppose this is theoretically possible because these schools are also being compared on a national basis, and maybe there is phenomenal Chicago exceptionalism, it is incredibly unlikely. There is a correlation of about 0.65 between attainment and growth. That’s not supposed to happen the way the metric is described.

    This means that the growth metric is not really a sensible way of comparing across schools at different attainment levels. It is true that Principal LaRaviere did also report another comparison based only on lower attainment schools, which has at least some merit to it. (His claim that it doesn’t matter that the SEES are included in his other comparisons is plainly wrong.) But the growth metric in my view is sufficiently suspect (given that it does not appear to follow its stated methodology) that I question any analysis based on it.

  • 6. cpsobsessed  |  September 1, 2014 at 7:54 pm

@Growth Percentile. Ohhhh…. very interesting. Thank you for reading that closely and for pointing that out. So the growth percentile is scaled to the school to reflect its ingoing level, which makes sense, since like schools are compared to like schools. But then making direct comparisons is equitable in one way and not in others. I need to mull that over some more.

    Do you know where the report can be found that looks at the lower attainment schools?

  • 7. Growth Percentile  |  September 1, 2014 at 8:09 pm

    “Do you know where the report can be found that looks at the lower attainment schools?”

    In Principal LaRaviere’s shared google docs drive, there’s a folder labeled “Chart Photos”. He also discusses this at the link below. I have to say, while I admire his initiative in all of this, his discussion is more than a little disingenuous there. He cherry picks a couple of examples where high attainment schools have somewhat lower growth. I don’t think there can be much doubt that there is a strong correlation between attainment and growth metrics as reported (though I think the math numbers and the combined numbers may be a little less consistent than the reading numbers–I looked at reading only because there was a convenient spreadsheet provided for that and I did not see an analogous one for math, though the underlying math numbers are certainly among the spreadsheets posted).

    http://comparisonfaq.blogspot.com/

  • 8. Alicia W  |  September 1, 2014 at 9:53 pm

LOL, “growth gap”! Of course it is easier for (neighborhood) schools with a lower starting percentile base to achieve higher “growth”. A school going from an absolute MAP percentile of 50 to 55 is 10% “growth”, while a high-performing (charter) school going from 80 to 85 absolute percentile (up the same 5 absolute points) only shows 6.25% growth. Hey, Payton was 99th percentile last year and only 99th percentile this year…they are a lowly 0% growth school. I’d rather have my kid in the “lower growth” school with the brighter kids.

    Listen folks, when someone starts throwing out “growth” stats without referencing the base percentile, they are trying to pull a fast one on you. Either Troy doesn’t understand statistics or he is being disingenuous and trying to make the competition look bad. Why are Troy and the unions afraid of charters? He should focus on improving his sub-par school instead of worrying about other schools.

  • 9. IBobsessed  |  September 1, 2014 at 10:20 pm

    The parents at Blaine would be very surprised to hear that their school is sub par.

Um, because it is not sub par in any way commonly referenced in discussions of school quality. Wonder what agenda makes Alicia assume LaRaviere’s school is sub par.

Question for Alicia: Would you still want your child to go to the school with the highest starting base percentile if your child was achieving significantly below that percentile? Say if they were at the 50th percentile? I’d choose the school that showed it can grow students at the 50th percentile. Who knows if the school with the 85 base would know how to meet that child’s needs.

  • 10. Growth Percentile  |  September 1, 2014 at 10:24 pm

    Listen folks, when someone starts a post with “LOL”, the rest of it is guaranteed to be nonsensical beyond belief.

  • 11. cpsobsessed  |  September 1, 2014 at 10:36 pm

    Thanks for the follow-up link @Growth Percentile. I read his responses to some of the questions people have posed, and I think they’re solid and well thought out.

I think this grouping of schools for the growth percentile seems a valid way to compare schools: is the school’s percentile above average or not, compared to schools with similar ingoing scores? Seems valid to me. We do a similar thing where I work with marketing research studies, to compare brands of different sizes. (Kids and brands are comparable, right?)

    I do agree about removing the selective schools from the mix, since they’re an anomaly in most school districts.

  • 12. Growth Percentile  |  September 1, 2014 at 11:32 pm

    “I read his responses to some of the questions people have posed, and I think they’re solid and well thought out.”

    But pretty much the entirety of his FAQ page goes toward arguing that growth is not related to attainment, when it plainly is. Now, I would cut him some slack as he is partly going by the methodology document. Though I also think that to find his anecdata on high attainment/low growth schools, he would have looked through the data enough to realize that most high attainment schools are high growth and decided, consciously or not, to ignore it. I just don’t see how his responses on that front can really be solid when they are mostly wrong.

    “I think this grouping of schools for the Growth Percentile seems a valid way to compare schools. Is the school’s percentile above average or not compared to schools with similar ingoing scores. Seems valid to me.”

As I said initially, there is some merit to this approach (though again, his statement that one does not need to resort to this approach is wrong). When there is reason, however, to question the metric of interest, as I believe is clearly the case here, then I’d really want to resolve that first before relying too much on comparisons of that metric across groups.

    I am curious what the informed pro charter camp has to say.

  • 13. SoxSideIrish4  |  September 1, 2014 at 11:37 pm

    That is why CPS changed the Ratings of schools last week…bc some schools won’t be able to have growth like other schools. Schools that have attained 90% in reading and math, regardless of growth, will be level 1.

    As for Alicia W~Blaine is FAR from sub par and you are very well aware of that!

  • 14. CLB  |  September 1, 2014 at 11:53 pm

@8 That is not how MAP growth figures are calculated. For any student, the growth percentile is set by comparing the change in the student’s score from the prior spring to the current one against the average change for other students who started at the same prior-spring score.

    Percentile changes are not the same as an integer change in the score itself. The distance between scores at the 50th and 55th percentiles is much less than the distance between the 80th and 85th percentiles.
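CLB’s second point, that equal percentile jumps are not equal score jumps, falls out of any bell-shaped score distribution. A sketch using Python’s standard library (the mean and standard deviation are invented, RIT-like numbers):

```python
# Under a normal score distribution, the raw-score gap between the 50th
# and 55th percentiles is smaller than the gap between the 80th and
# 85th, so "5 percentile points" means different amounts of learning at
# different starting points. Scale parameters here are invented.

from statistics import NormalDist

scores = NormalDist(mu=220, sigma=15)  # toy RIT-like scale

mid  = scores.inv_cdf(0.55) - scores.inv_cdf(0.50)
high = scores.inv_cdf(0.85) - scores.inv_cdf(0.80)

print(f"50th to 55th: {mid:.2f} points; 80th to 85th: {high:.2f} points")
```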

  • 15. CLB  |  September 2, 2014 at 12:53 am

@11 @12 Given the strong attainment-growth correlation (it’s .61 in math for all schools, grades 3-8), LaRaviere should have compared traditional schools and charters with similar attainment levels in 2013 if he wanted to show that traditional schools do better than charter schools in terms of student growth from the same starting position in CPS.

    However, as I read him, his argument is that when we look at how well traditional schools and charters did at student growth compared to the national norm group, traditional schools did better at boosting growth than did charter schools in CPS (because the point of comparison is schools with similar initial positions).

  • 16. Bic  |  September 2, 2014 at 6:53 am

    Report Excerpt:

    “Generous teacher pensions continue as Illinois’ financial crisis worsens
    State has worst U.S. credit rating; Chicago on same path as bankrupt Detroit

    The group’s Labor Day report found more than 100,000 retired Illinois educators had been paid back what they invested into the system just 20 months after leaving work, a financial burden linked to union collective bargaining, which can cost taxpayers $2 million or more per teacher over the course of retirement.

    Meanwhile, the Illinois financial situation is only worsening. Creditors have found the state and its largest city, Chicago, to be on the same path as Detroit. In March Moody’s cut Chicago’s credit rating to Baa1 from A3, giving it the lowest credit rating of any major U.S. city other than Detroit. Illinois has the worst credit rating of any state in the nation.

    As a result, the pension is about $54 billion underfunded,… Compare that number to the state’s annual budget of $35 billion and the situation looks even more desperate.

    More than half of Illinois state educators retire at age 59 or younger and receive $2 million in benefits after their career ends, the institute estimates. Because of a guaranteed cost of living adjustment of 3 percent annually after 25 years in retirement, many of these individuals are earning more than double what they were making at the height of their career, the institute found.

    Along with the annual cost-of-living adjustment, teacher salary spikes are also putting pressure on the pension system, watchdog groups warn.

    In the final four years of her career as Butler School District Superintendent, Sandra Renner saw her salary spike 31 percent to $288,240 — giving her a starting pension of $210,480, upon retirement, according to Open the Books data.

    Two years ago school administrator Mohsin Dada also received a nice pay boost. His income jumped 137 percent from $156,160 to $358,750 in his final year before retirement — giving him a pension of $254,700.

    However, Mr. Dada decided retirement wasn’t for him, because that same year he was appointed as chief financial officer of the North Shore School System — collecting a $239,895.95 salary, according to Open the Books. Between his pension and salary, Mr. Dada is clearing near a half-million dollars annually.

    Because local school systems are only on the hook to pay an increased salary for a few years, there’s a big disconnect when it comes to who really is footing the bill and the impact it is having on the pension system, Mr. Dabrowski said.

    Many unions try to make these spikes part of the teachers’ salary negotiations, and the school systems oblige, knowing they will only be responsible for four years of higher salary. Then the burden shifts to the state pension system, where it will be responsible for footing the higher salary for the entirety of the retirees’ lifetime

    “The stage has been set for a big political battle between the unions, government workers, taxpayers and the poor and disadvantaged, who will see some of their benefits cut as pension costs climb,” he said.

    “The state is on the verge of economic collapse, and the alternatives are massive tax increases or massive cuts in services that the state can’t support,” Mr. Dabrowski said. “It’s unfair to ask taxpayers to pay more if you still have workers who can retire in their 50s on $2 million salaries without trying to reform those things first.””

    http://www.washingtontimes.com/news/2014/sep/1/generous-teacher-pensions-continue-as-illinois-fin/?page=all

  • 17. cpsobsessed  |  September 2, 2014 at 7:08 am

@Growth Percentile: I’m not sure I’m following your objection to the calculation. When I *thought* the growth percentile was compared the same across all schools, my first thought was, “Well, charters and neighborhoods may have different types of populations with different ingoing abilities, so now I have to look at how attainment (score at beginning of year) factors in.”

    But this adjustment factor where each school is compared only to schools with similar attainment does this for me. It’s kind of like how the gifted test gives you a percentile based on your child’s birth month rather than comparing all 4 year olds.

    So what we DON’T know is 1) whether neighborhood or charter grew more at an absolute level (one could figure that out from the data) but we DO know 2) how each grew given its incoming population.

    Is it that you feel #1 is a more valid measure?
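The “compared only to similar peers” idea in this exchange (like percentiling the gifted test by birth month) amounts to ranking within an attainment band rather than against all schools. A minimal sketch with invented bands, school names, and numbers:

```python
# Rank each school's growth within a band of schools with similar
# starting attainment, instead of against all schools. Bands, school
# names, and growth figures are all invented for illustration.

from collections import defaultdict

schools = [  # (name, attainment_band, raw_growth)
    ("A", "high", 7.0), ("B", "high", 5.0), ("C", "high", 6.0),
    ("D", "low",  4.0), ("E", "low",  6.5), ("F", "low",  3.0),
]

by_band = defaultdict(list)
for name, band, growth in schools:
    by_band[band].append(growth)

def within_band_percentile(band, growth):
    """Percent of same-band peers with lower raw growth."""
    peers = by_band[band]
    below = sum(1 for g in peers if g < growth)
    return round(100 * below / len(peers))

# School E has modest raw growth but tops its low-attainment band,
# while school C's similar raw growth ranks lower in the high band.
print(within_band_percentile("low", 6.5))   # prints 67
print(within_band_percentile("high", 6.0))  # prints 33
```

The design question the thread is wrestling with is exactly this one: whether the published growth percentile actually behaves like `within_band_percentile` or like a rank against all schools.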

  • 18. Growth Percentile  |  September 2, 2014 at 9:27 am

    “However, as I read him, his argument is that when we look at how well traditional schools and charters did at student growth compared to the national norm group, traditional schools did better at boosting growth than did charter schools in CPS (because the point of comparison is schools with similar initial positions).”

    I read him to be saying that that is a valid comparison because attainment is not related to growth, because each school is being compared to other schools at similar attainment levels (nationally). Thus, including SEES is valid because they are being judged against other very high attainment schools, so that a high growth score does not necessarily follow from having high attainment. (Even if that were true, I am not sure it makes sense, but that is a separate issue.) But attainment is plainly correlated with growth. The high attainment SEES almost all have very high growth scores. I don’t think it’s due to chance.

  • 19. Growth Percentile  |  September 2, 2014 at 9:30 am

    “Is it that you feel #1 is a more valid measure?”

    No, not at all. My point is that the growth metric doesn’t seem to be what it purports to be. If you accept that I’m right on that, then it’s hard to know how to base any analysis on that metric, as we don’t really know what it is measuring.

    The reason I say that the growth metric does not seem to be what it purports to be is that each school is supposed to be compared wrt growth against other schools with similar starting attainment scores. Yet, virtually all of the high attainment schools have very very high growth scores and, more generally, there is a strong positive correlation between attainment and growth. It’s not supposed to be this way. Among e.g. all the high attainment schools, some of them should have really low growth scores and some really high scores and some in the middle. Yet virtually all the high attainment schools have high growth scores. (It’s possible the CPS schools are different in this respect than the national norms but that seems unlikely.) This makes me think that something is seriously wrong with the growth metric.

  • 20. cpsobsessed  |  September 2, 2014 at 9:39 am

Ah, got it. When you are looking at the “high attainment” schools, are those mostly selectives? Or also some of the other high-attainment schools (top magnets, schools in $ neighborhoods)?


  • 21. Growth Percentile  |  September 2, 2014 at 10:00 am

“When you are looking at the ‘high attainment’ schools, are those mostly selectives? Or also some of the other high-attainment schools (top magnets, schools in $ neighborhoods)?”

As I said above, I looked only at the reading scores, and it may be slightly different for the math scores. I just sorted by the attainment percentile (which I think strictly speaking is the post, not pre, score and therefore not quite right, but I was too lazy to link up with the pre scores from a different spreadsheet and I don’t think it makes a difference). The average reading growth score of the 25 or so schools with a 99 attainment level was 98. Yes, a mix of the usual suspects: SEES/top magnets/neighborhood$.

  • 22. Growth Percentile  |  September 2, 2014 at 10:00 am

Also, the correlation statistic is computed across all schools, though it could be driven in part by the very highest attainment schools.

  • 23. AE  |  September 2, 2014 at 10:09 am

    I have long been confused by the MAP data, at least as it relates to my kids’ school (high achieving neighborhood w/ RGC). The “National School Growth Percentile” at my school is 99 for reading, but the “Percentage of Students Making National Average Growth” in reading is only 64. My quick review of the data suggests this is the same at other “high achieving” schools (both selective and high scoring neighborhood schools) — They all seem to have a 99 in the “National School Growth Percentile” category, but much much (much!) lower numbers when it comes to “% of Students Making National Average Growth.” Anyone know why?

  • 24. Missy  |  September 2, 2014 at 11:24 am

    Wow this is the information I needed last week. I was trying to decide which school, my neighborhood cluster magnet or the new charter. I looked up all the information that was available for both. I did visit both schools unfortunately it was summer; I didn’t see them in action. I did visit the principal, teachers, & staff of both schools. On one hand the neighborhood school ratings blew my mind a 6 on greatschools.org out of 10. The “MAP” score was 9 out of 10 but the percentage of exceeds was 0%. On the other hand the new charter didn’t have any data available. Even with their other schools that opened in Fall 2013 hardly any data. I tell you what they had that really stood out, tons and tons of parents and faculty reviews. The majority of those reviews were positive to the 10th degree! Some of their other network schools had high ratings and some were low, but the parent’s reviews were pretty consistent.
    I kept thinking to myself, I need to talk to education professional that understands the inner workings of what is going. Of course, I didn’t have a clue who th at person should be. Thank God for CPSOBESSED.COM!!! I truly didn”t understand how the percentage was formulated, and will this benefit my above average daughter. I am literate, but I believed I may have perceived the growth aspect in the wrong format. I rationalized the growth percentage as possibly i.e a 5th grade student that may have started the year in a 3rd or 4th grade level and grew to their present grade level or close to their grade level.
    I am so happy you posted this information. Thank you so much. Everything you stated was exactly how I felt. I honestly wasn’t able to make an informed decision. My daughter was in a lottery to be accepted in this charter school. Currently, I feel like we are the ones pulling the lottery ticket with the charter school. Did we pick a 6, 8 or 10 rating school or 1, 2, 3 rating?
    In conclusion, my suggestion would be for CPS to figure out a way to motivate parents and staff to post descriptive reviews of their schools online, e.g., on the school’s Facebook page, the school’s website, greatschools.org, etc. Especially now, since parents have this “sense” of “choice,” we are using as many resources as possible to try to find the right fit. I totally agree: charter schools MUST be held accountable to the same standards as CPS schools, if not higher. They are the ones making these great claims! The proof is in the pudding! That’s what I gather Common Core is all about. Everything should be transparent, because they are still receiving tax dollars. We came so far, and I just don’t want to see us regress as a people. So I really would like to be involved in this. It is important to me.
    Thank you again for your professional critique and constructive criticism and have a wonderful school year!

  • 25. CLB  |  September 2, 2014 at 2:17 pm

    “…virtually all of the high attainment schools have very very high growth scores and, more generally, there is a strong positive correlation between attainment and growth. It’s not supposed to be this way.”

    I don’t see why we should assume a nil correlation between attainment and growth. The factors that produce strong attainment may also contribute to strong growth, but they are not determinative. The growth measure controls for attainment in the national norm sample by comparing schools with the same attainment.

  • 26. CLB  |  September 2, 2014 at 2:35 pm

    @23 “They all seem to have a 99 in the “National School Growth Percentile” category, but much much (much!) lower numbers when it comes to “% of Students Making National Average Growth.” Anyone know why?”

    Because the % of students in any school who make the average national growth varies. There is no reason to assume that a school with a 99 percentile growth rank has 99% of its students meet the national growth average. At a school in the 99th percentile for growth but with 68% of students meeting the average, some of the students who did meet it might have significantly exceeded the average growth, bumping the school into the 99th percentile when RITs are averaged.

    A percentile tells you a rank: a school at the 99th percentile had higher average growth than 99% of the other schools. But that does not mean that all the schools in the top 1% had the same performance. Some might have done significantly better than others even within that 1% range.
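    A toy illustration of why those two numbers can diverge (the growth figures and standard deviations below are made up for illustration, not real NWEA norms): student-level growth varies a lot within a school, while school-level average growth varies much less across schools, so a school can rank near the top of all schools even though only about two-thirds of its own students beat the national average.

    ```python
    from statistics import NormalDist

    national_avg_growth = 5.0   # hypothetical national average student growth (RIT points)
    within_school_sd = 6.0      # hypothetical spread of growth across students in a school
    school_mean_growth = 7.2    # this school's average student growth

    # Share of THIS school's students beating the national average growth:
    pct_students = 1 - NormalDist(school_mean_growth, within_school_sd).cdf(national_avg_growth)

    # School-level means vary far less than student-level growth (say sd = 0.9
    # across schools), so the same +2.2-point edge ranks near the top of schools:
    between_school_sd = 0.9
    school_pctl = NormalDist(national_avg_growth, between_school_sd).cdf(school_mean_growth)

    print(round(100 * pct_students))  # 64 -- about two-thirds of students make average growth...
    print(round(100 * school_pctl))   # 99 -- ...yet the school sits near the 99th percentile
    ```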

  • 27. Growth Percentile  |  September 2, 2014 at 9:01 pm

    “I don’t see why we should assume a nil correlation between attainment and growth. The factors that produce strong attainment may also contribute to strong growth, but they are not determinative. The growth measure controls for attainment in the national norm sample by comparing schools with the same attainment.”

    Even if the factors that produce high attainment also contribute to strong growth, the high-attainment schools are (supposedly) being compared against each other. If their growth scores are based on comparisons with each other, then some should have low growth, some middle, and some high, even if the high-attainment schools all show high growth when compared to lower-attainment schools.

  • 28. CLB  |  September 2, 2014 at 11:37 pm

    @27 If there is a nil correlation between growth and attainment, we would expect high attainment schools to have varying growth ranks among the national norm sample. But we are only examining a subset of the norm sample (and a non-random one at that). The observed pattern in Chicago alone says nothing about the pattern in the norm group and therefore the quality of the growth measure.

    But there is no reason to assume that the correlation is nil. It seems plausible that student characteristics and conduct that produce high attainment ranks also produce high growth ranks to some degree.

  • 29. Growth Percentile  |  September 3, 2014 at 6:39 am

    “If there is a nil correlation between growth and attainment, we would expect high attainment schools to have varying growth ranks among the national norm sample. But we are only examining a subset of the norm sample (and a non-random one at that). The observed pattern in Chicago alone says nothing about the pattern in the norm group and therefore the quality of the growth measure.”

    The observed pattern in Chicago alone tells you something–I would say quite a lot–about the pattern in the norm group. I agree as a purely theoretical matter the CPS schools could be so exceptional as to have a high correlation when in fact there is none in the national group (and I noted this in some of my comments above). I can’t really prove this without results for the national group, of course, but the results for the CPS schools are striking. (The non-randomness of the sample is really the only thing that can save your point.)

    “But there is no reason to assume that the correlation is nil. It seems plausible that student characteristics and conduct that produce high attainment ranks also produce high growth ranks to some degree.”

    Again, the reason to assume a zero correlation is that each school is supposed to be compared to other schools of the same attainment level. All the highest attainment schools are being compared to each other. The characteristics can result in very high absolute growth, but if all those high attainment schools are compared to each other, then many of them have to have low ranks, and we don’t see that. (Now, it’s possible the 99 percentile attainment schools are being compared to e.g. the top quintile rather than other 99s, and this could drive some of the pattern. Taking a quick look at the data suggests this can’t be driving everything.)

    In fact, it does appear that student/family characteristics could be driving growth scores at the higher attainment schools. If this is so, then it is not appropriate to combine schools together in the analysis, as is done in the primary analysis that Principal LaRaviere reports (though he also reports the additional analysis I noted above).

  • 30. Vincent Johnson  |  September 3, 2014 at 11:01 am

    I read something similar about Blaine and other schools on the City Notes blog. I highly recommend checking it out for all the posts, not just the one about Blaine. http://danielkayhertz.com/2014/08/21/gentrification-and-integration-in-chicago-public-schools/

  • 31. Patricia  |  September 3, 2014 at 12:16 pm

    @Growth Percentile. “In fact, it does appear that student/family characteristics could be driving growth scores at the higher attainment schools. If this is so, then it is not appropriate to combine schools together in the analysis, as is done in the primary analysis that Principal LaRaviere reports (though he also reports the additional analysis I noted above).”

    I appreciate you conveying your point and have been noodling on this a bit. When I first read this post, I thought: how can LaRaviere’s analysis accurately compare “magnets” to “charters”? It is apples to oranges. The justification that applying through a lottery makes them comparable seemed out of touch to me. I think this is the fundamental flaw in the analysis that leads to questions about what is really driving the data.

    I went to the cps website and mapped out all the “magnets” and then all the “charters”. You see clear geographic differences with more charters further West and South. So geography as well as socio-economic factors drive who enrolls in charters and magnets. Does the analysis further break out comparing specific geographies? Is the analysis comparing for example Hawthorne to CICS? Or am I missing something?

    I have said before that I think comparing charters to neighborhood schools is in some respects a silly discussion because they can’t accurately be compared. There is a need for both, and the question is proportion. My guess is that 75% neighborhood and 25% charter is a good mix that can serve students well. It would be far more productive to recognize that neighborhood schools serve a need AND charters serve a need. Map out the strengths and weaknesses of both neighborhood schools and charters, then take the air time to make both better. Why should education be an all-or-nothing model? What else in America works that way? (Other forms of government are a different story. 😉)

    I do appreciate LaRaviere’s analysis, but I also take it with a grain of salt because he is clearly doing this for his own political gain/agenda. Gotta love the election cycles!

  • 32. CLB  |  September 3, 2014 at 12:57 pm

    Some schools do have high growth percentiles and lower attainment ones.

    SHERWOOD 99th growth, 46th attainment
    HANSON PARK 99th growth, 44th attainment
    DUNNE TECH ACAD 99th growth, 42nd attainment
    MADISON 99th growth, 39th attainment
    WARREN 99th growth, 35th attainment
    CULLEN 99th growth, 29th attainment
    NASH 99th growth, 24th attainment
    DOOLITTLE 99th growth, 23rd attainment
    CALDWELL 99th growth, 21st attainment

    These are for math.

    There is no a priori reason for a geographic subset of the norm population to have the same properties as the norm population. If it did, we wouldn’t need the norm group. It is not a random sample.

    But again, I believe that there is an attainment-growth link, even in the national norm group. The idea that they are not correlated seems dubious. Some of the factors that cause students to have high/low attainment also cause them to have high/low growth.

    LaRaviere is making a very narrow point but a legitimate one. The mayor, BBB, and many charter advocates (not necessarily charter school administrators) claim that schools must be held accountable using test scores. And CPS argues that the NWEA MAP is a good test because it measures growth rather than attainment, which is often a function of socio-economic status rather than school quality. Accepting those positions for the sake of argument, the test data shows that charter schools are less capable, on average, of boosting growth than traditional schools are, even after excluding SE schools from the analysis. For a system that claims to be evidence-based and data-driven, it is very puzzling that it continues to reduce the funding of and number of neighborhood schools and increase the funding for and the number of charter schools. When the test data that they laud undermines their policy, they ignore the data.

  • 33. Growth Percentile  |  September 3, 2014 at 6:27 pm

    “Some schools do have high growth percentiles and lower attainment ones.”

    I didn’t say anything to the contrary. I said that almost all (not all but almost all) the highest attainment schools had extremely high growth percentiles. And that there is a substantial positive correlation between attainment and growth among schools overall.

    “There is no a priori reason for a geographic subset of the norm population to have the same properties as the norm population. If it did, we wouldn’t need the norm group. It is not a random sample.”

    You can certainly reach statistically reliable conclusions from a random sample (as to whether a statement about the population is true). I also acknowledged above that the sample here is not random. But the results from the non-random sample (maybe let’s call it a convenience sample) are so striking that I think you can still reach reliable conclusions, even if you can’t fit them into a statistical test.

    “But again, I believe that there is an attainment-growth link, even in the national norm group.”

    So what is the point about the national norm, exactly?

    “The idea that they are not correlated seems dubious.”

    I don’t know what to add on this beyond what I have said above.

    “Accepting those positions for the sake of argument,”

    So your point is that if you accept positions you don’t believe in you can demonstrate CPS’s inconsistency?

    “even after excluding SE schools from the analysis”

    As I said, when he goes beyond this and looks only at lower attainment schools, I find some merit in that analysis. Though, again, I question anything based on a growth percentile metric that seems to be measured in a way that is inconsistent with the stated methodology.

  • 34. SoxSideIrish4  |  September 4, 2014 at 9:32 am

    Color Coded Reading Growth~Charters, Rahm, #CPS FAILURES https://docs.google.com/file/d/0BytSj0QyFz1ecG5IUFNlWTlHTG8/edit?pli=1

  • 35. CLB  |  September 4, 2014 at 8:39 pm

    @Growth Percentile

    You are correct that something is funny with “National Growth Percentile.” I had assumed that it was a function of aggregating the individual score data into school level data and then turning it into percentiles, and that is indeed part of it, but the other part is the odd nature of what they mean by national growth percentile in the SQRP data.

    Step 1: The average pretest and posttest scale scores are computed at each grade level in the school (grades 3-8 for NWEA and grades 9-11 for EXPLORE/PLAN/ACT).
    Step 2: For each grade level, the national 50th percentile posttest score is determined using school-level norms provided by the assessment publisher. The posttest norm for each grade level is adjusted for the average pretest score, meaning it is the national average score for a school with the same average pretest score at that grade level.
    Step 3: The 50th percentile posttest scores for each grade level are weighted by the number of students in the grade level and averaged in order to calculate an all-grades score. This score represents the 50th percentile nationally for a school that had the same pretest scores and the same proportion of students in each grade level. This “national average comparison score” will be different for every school, based on the school’s pretest scores and proportion of students in each grade level.
    Step 4: The school’s actual posttest scores for each grade level will be weighted by the number of students in the grade level and averaged. The resulting score will be compared to the “national average comparison score” to determine the school’s percentile.

    Specifically, CPS will calculate the difference in terms of standard deviation units using a school-wide standard deviation. The standard deviations are then converted to percentiles using a normal distribution curve. The benchmarks in the SQRP correlate with the following standard deviations:
    10th percentile = -1.31058
    30th percentile = -0.53884
    40th percentile = -0.26631
    70th percentile = 0.510073
    90th percentile = 1.253565

    I’m with them until the “Specifically…” sentence. I don’t know where the s.d.’s are coming from. I could not duplicate their percentile results. It seems that rather than ranking the CPS schools compared to national norm schools directly using NWEA data, CPS is creating a new school-level measure itself and ranking the CPS schools compared to each other. In this case, the non-random sampling is irrelevant.
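    For what it’s worth, those listed cut points do line up with standard-normal quantiles if you evaluate them at the half-percentile midpoints (9.5%, 29.5%, 39.5%, 69.5%, 89.5%) rather than at 10%, 30%, etc. A quick check (the “midpoint” reading is just my guess at what CPS did, but the numbers match to four decimals):

    ```python
    from statistics import NormalDist

    nd = NormalDist()  # standard normal

    # SQRP benchmark percentiles and the s.d. values quoted in the handbook:
    cps_cutoffs = {10: -1.31058, 30: -0.53884, 40: -0.26631, 70: 0.510073, 90: 1.253565}

    for pctl, quoted_z in cps_cutoffs.items():
        naive_z = nd.inv_cdf(pctl / 100)             # e.g. 10th pctl -> -1.2816 (doesn't match)
        midpoint_z = nd.inv_cdf((pctl - 0.5) / 100)  # e.g. 9.5%      -> -1.3106 (matches)
        assert abs(midpoint_z - quoted_z) < 1e-3
        print(f"{pctl}th: naive {naive_z:+.4f}, midpoint {midpoint_z:+.4f}, quoted {quoted_z:+.5f}")
    ```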

    The problem is with the aggregation. NWEA itself does not have a combined-grade score for a school. This is because RIT score variability narrows as grades go up, so direct, aggregate grade x-to-grade y comparisons have no real meaning. And NWEA does not intend its data to be used to assess the quality of schools or performance v. each other. CPS is weighting the grades by the number of students per grade but has no adjustment to offset RIT differences across grades.

    Very bizarre.
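    The four quoted steps amount to an enrollment-weighted average by grade, compared against a pretest-matched benchmark. A minimal sketch with made-up numbers (the per-grade norms here are invented for illustration; real ones come from NWEA’s school-norms tables, conditioned on each grade’s average pretest score):

    ```python
    # grade: (n_students, school_avg_posttest, national_50th_pctl_posttest_given_pretest)
    grades = {
        3: (60, 205.0, 203.5),
        4: (55, 212.0, 211.0),
        5: (50, 216.5, 217.0),
    }
    total_n = sum(n for n, _, _ in grades.values())

    # Step 3: enrollment-weighted average of the per-grade national norms.
    national_comparison = sum(n * norm for n, _, norm in grades.values()) / total_n

    # Step 4: the same weighting applied to the school's actual posttest averages.
    actual = sum(n * post for n, post, _ in grades.values()) / total_n

    # CPS then converts this gap to s.d. units and a percentile; note the gap
    # mixes grades whose RIT scales have different variability, which is the
    # aggregation problem discussed above.
    print(round(actual - national_comparison, 2))  # 0.73 RIT points above the benchmark
    ```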

  • 36. Growth Percentile  |  September 4, 2014 at 10:41 pm

    “The problem is with the aggregation. NWEA itself does not have a combined-grade score for a school. This is because RIT score variability narrows as grades go up, so direct, aggregate grade x-to-grade y comparisons have no real meaning. And NWEA does not intend its data to be used to assess the quality of schools or performance v. each other. CPS is weighting the grades by the number of students per grade but has no adjustment to offset RIT differences across grades.”

    Interesting generally. I agree with this in part. I think you could still look at an aggregate measure, in that you are comparing growth relative to a benchmark school that has the same distribution of students across grades and the same starting scores within each grade. But you would need to compare the difference between the aggregate score and the benchmark relative to the distribution of that benchmark, not relative to other CPS schools. As you note, RIT score variability narrows for later grades, so, among other things, the distribution of students across grades, as well as the scores within each grade, will matter. If the school were, to take an extreme example, all 3rd graders versus all 8th graders, the distributions of scores would differ greatly. The percentile thresholds that CPS uses, which you reference above, can’t possibly make sense. It is not immediately obvious to me how this skews things, and it may not affect the charter-school argument that much. But the methodology certainly seems wrong.

    I also took a quick look at the policy documents that you quote from. I have a hard time understanding any legitimate rationale for the adjustment quoted below. Basically, very high-attainment students are assumed to have at least 50th-percentile growth (among very high-attainment students), which makes no sense to me.

    “Another adjustment that will be made to students’ scores when calculating the posttest average used in this metric applies only to students taking the NWEA-MAP. For students who maintain an attainment percentile of 99% on both their pre- and posttest scores but do not make national average growth (i.e., 50th percentile), their posttest scale score will be adjusted to the RIT scale score that corresponds to 50th percentile growth based on their pretest score. As with the adjustment for outliers described above, this adjusted score does not replace the student’s score on record and is only used for calculating the posttest average used in this metric. While the student’s actual score will be used in the attainment percentile, the adjusted score will also be used in the “Percent of Students Making National Average Growth Metric” (see metric definition in the “Indicators in Elementary Model” section below).”

    I also wonder if growth distributions are available for higher starting RIT values than in the report below, as the high end of the range in the tables I see here might not be high enough for some of the higher attainment schools.

    http://legacysupport.nwea.org/sites/www.nwea.org/files/resources/2012%20RIT%20Scale%20School%20Norms%20User's%20Guide.pdf

  • 37. cpsobsessed  |  September 5, 2014 at 7:52 am

    Isn’t that a complicated way of saying “weighted average by grade”?


  • 38. Data Cruncher  |  September 5, 2014 at 7:58 am

    @36

    It appears from your quote that, for students who consistently score high attainment, the data has been artificially manipulated to make them look good from the angle of growth. Otherwise, it would indeed be difficult to argue that a student who scored 99% in two consecutive tests has been making much progress. Unless, of course, there is an expected trajectory against which to measure children's intellectual growth over the ages of, say, 6 to 18. Does the test use such a freestanding benchmark approach? If not, the data manipulation is really something interesting to examine closely.

  • 39. Growth Percentile  |  September 5, 2014 at 8:27 am

    “Isn’t that a complicated way of saying “weighted average by grade”?”

    Yes, it is. The problem, as CLB points out, is that these averages and the expected growth of these averages depend on the characteristics of the school. E.g., a school with a greater proportion of lower grades would probably have higher expected growth, because RIT increases are greater in the lower grades. Similarly, the mix of high versus low scorers among students would affect expected growth too. The comparison to expected growth is (supposedly) being done on an individualized basis. But the deviations from expected growth, which determine the growth percentiles, are seemingly being compared across CPS schools, which doesn’t make much (or any) sense.

    Again, it is not immediately obvious to me that this creates a bias in the charter school comparison (especially the one for lower attainment schools) but there are enough oddities to cast this whole exercise into question. That is, how they calculate these percentiles really doesn’t make sense.

  • 40. Growth Percentile  |  September 5, 2014 at 8:30 am

    “Unless, of course, there is an expected trajectory against which to measure children's intellectual growth over the ages of, say, 6 to 18.”

    There are (or should be) expected growth levels by attainment level (though the published document I referenced above doesn’t seem to cover the highest attainment levels, and its tables are for school averages, not individual students, which I think will differ). So for a 99th-percentile-attainment student at a given RIT score, there should be a distribution of growth in score over the next year. E.g., the 20th percentile of growth might be 3 points and the 80th might be 8 points (just making the numbers up). But the adjustment above seems to assume that all students maintaining 99th-percentile attainment pre/post achieved at least 50th-percentile growth. Everyone (by which I mean the 99ers) is (weakly) above average!

    There was no justification given for this adjustment that I saw. There is also an outlier adjustment, which was implemented to limit the impact of outlier students (at the top or bottom) from overly influencing the school score, which more or less makes sense. But not obvious what legitimate reason there is for this adjustment.
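    A back-of-the-envelope look at how much that floor could inflate a school’s average growth percentile, using a toy model that assumes, purely for illustration, that 99/99 attainers’ true growth percentiles are spread evenly from 1 to 99:

    ```python
    from statistics import mean

    # True growth percentiles for a cohort of 99/99 attainers, spread uniformly:
    true_pctls = list(range(1, 100))

    # SQRP-style adjustment: anyone below the 50th is credited with the 50th;
    # above-median growth is kept as-is.
    adjusted = [max(p, 50) for p in true_pctls]

    print(mean(true_pctls), round(mean(adjusted), 1))  # 50 vs 62.4: the floor adds ~12 points
    ```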

  • 41. Data Cruncher  |  September 5, 2014 at 9:26 am

    So, the consistently good students look good in attainment scores (e.g. 99%), but just average in terms of growth (50% or so)? But from the CPS website, to take the example of Edison RGC, its 2013 NWEA reading and math growth percentiles were both 99. This is very different from the earlier theory that consistently excellent students make only mediocre growth (50%). Has CPS or the test-giver assumed that, for a school of near-universal high scorers, maintaining mediocre individual growth is actually remarkable, thus meriting a 99% collective appraisal? Conversely, a school can look really impressive with the growth it makes even though its students are not as good academically. If this is the case, it seems that, to compare schools, neither the attainment nor the growth score is in itself an adequate benchmark.

  • 42. CLB  |  September 5, 2014 at 9:33 am

    @40 I get what they are doing with the 99th percentile attainment students who make growth below the mean for 99th pctl. attainers. What surprises me is that they don’t do this for students starting at least at the 96th pctl and up. Basically, despite vague protests to the contrary, the RIT score is fairly noisy for students scoring at the 95th pctl. and up. Northwestern’s gifted program had blogged that they were frequently seeing inconsistent RIT scores for students above the 95th pctl. How noisy is unclear because NWEA refuses to release the recent technical reports to the public (I emailed and called and was repeatedly refused access; in public submissions to RFPs, the technical reports are redacted as proprietary info.). Even though ISATs were “secure” tests, the technical reports were available publicly. With NWEA, both the technical reports and the items used are secret. Accountability means “just trust us” when it comes to testing.

    I’m guessing that CPS found this noise was really bad for CPS students scoring in the 99th pctl. for attainment (this was the case with my daughter; her winter RIT was higher than her spring RIT in math but still in the 99th pctl. in the spring. Obviously, she did not lose knowledge, but the growth was negative). To compensate, students showing negative growth but attaining at the 99th pctl. are assigned 50th pctl. growth status for the SQRP purposes. Students who have growth above the 50th pctl. keep their true growth figures for the SQRP.

    The only way to replicate what CPS has done is to either get them to cough up the actual “national average comparison score” for each school or replicate it yourself by looking up the school’s 2013 attainment for each grade, matching it to the grades in the NWEA 2012 School Norms report, and then running the weighting for each school. With over 480 schools and 6 grades in most of them, this is an onerous task.

  • 43. Growth Percentile  |  September 5, 2014 at 9:44 am

    “So, the consistently good students look good in attainment scores (e.g. 99%), but just average in terms of growth (50% or so)?”

    In theory, a student with a 99 percentile attainment is being compared with other students with the same entering RIT score. There’s a distribution of how those very high attainment students grow. Some of them have to be at the bottom, some middle and some top (even though the bottom growth students are in all likelihood still very high attainment). What I take CPS to be doing is for any student that is 99 attainment pre and post, they put that student at 50th percentile growth even if his/her growth was e.g. only 20th percentile. I find it very odd that virtually all the 99 percentile attainment schools have high 90s percentile growth, when they should be on a distribution given that they are being compared to each other. (I know it’s not a random sample but still very odd.) I don’t know if this adjustment accounts for much/all of this oddity.

  • 44. Growth Percentile  |  September 5, 2014 at 9:46 am

    “To compensate, students showing negative growth but attaining at the 99th pctl. are assigned 50th pctl. growth status for the SQRP purposes. Students who have growth above the 50th pctl. keep their true growth figures for the SQRP.”

    Is this noise only downward? Why adjust only the low outliers? Makes the school growth metric pretty biased upward when most students are 99/99, which I suspect is the case at some of the highest attainment schools.

    “looking up the school’s 2013 attainment for each grade, matching it to the grades in the NWEA 2012 School Norms report”

    Are there more detailed charts elsewhere? Because I don’t think the scores reported go high enough for some of the high attainment schools.

  • 45. klm  |  September 5, 2014 at 9:49 am

    To really make a fair comparison, we’d have to have a true apples-to-apples comparison, which is very difficult. The fact is, parents whose kids are having problems in school (academic and/or behavioral) will often pull them out of “regular” public school and give charters a try, and some charters are happy to have the bodies to get the per-student funds (just like non-charter CPS schools). The hope is that a more “strict” or more structured environment may help the student come along better (or that’s how charter schools are often marketed). Accordingly, charters often get kids who aren’t necessarily the strongest students, at least in relative terms. The above-discussed results MAY reflect this.

    Don’t get me wrong, I know that there are charters that are just plain “not good schools” academically (often due to high teacher turnover and the use of novice teachers, who are often less effective because they haven’t honed the fine art of teaching yet). However, even among a high-risk population (urban, low-income, non-Asian minority), charters often get more than the usual share of the highest-risk students.

    I’m not sure that the above stats are necessarily an indicator of charter schools’ comparative weakness next to CPS schools. We all know many CPS schools rock, academically. However, many are not offering an academic environment, as objectively measured, that meets the standards of people who want a high-performing school for their kids (hence this blog: people trying to get their kids into one of the ‘good’ ones). Same with charter schools.

    Sometimes, I wish CPS teachers and administrators would just worry about what’s going on in their own schools and not worry so much about what’s going on in charters.

    If somebody doesn’t like charter schools, don’t send your kids to one. Meanwhile, people in Lawndale, Roseland, and Englewood often like having the charter “option” (where before there was none), and who can blame them? Not me, no matter what I think about some questionable charter operators. It would be easy for me to be anti-charter, since my kids go to fantastic, high-scoring CPS schools that I love (I don’t need a charter school for my kids). Many people are not so lucky, and I’m not fixated on taking away their ability to send their kids to a school other than their local CPS school, one that I would never send my own kids to.

    People need to examine individual schools closely, considering all relevant inputs and outputs, and then determine what is working and not working.

  • 46. Data Cruncher  |  September 5, 2014 at 10:01 am

    “I find it very odd that virtually all the 99 percentile attainment schools have high 90s percentile growth, when they should be on a distribution given that they are being compared to each other.”

    Indeed, and it seems this can only be explained by CPS rewarding those schools of consistently high scorers by placing them on top of a base of 50% growth minimum. Maybe this reward varies in size across the good schools, depending on how high their students have attained en masse. We know too little of this complicated metric. I sympathize with CLB’s frustration.

  • 47. Growth Percentile  |  September 5, 2014 at 10:05 am

    “To compensate, students showing negative growth but attaining at the 99th pctl. are assigned 50th pctl. growth status for the SQRP purposes. Students who have growth above the 50th pctl. keep their true growth figures for the SQRP.”

    And why, e.g., take a student who scores at 5th-percentile growth and put him/her at 50th-percentile growth? I could see dropping students below the 10th or above the 90th (or, probably better, assigning them the 10th- or 90th-percentile growth scores), but why assume they are at the 50th percentile? There are actually students who are at 5th/10th/15th/etc. percentile growth. Not everyone can be above average, except in CPS.

  • 48. CLB  |  September 5, 2014 at 10:05 am

    @41 CPS is giving the students who scored at the 99th pctl. in attainment a minimum of 50th-pctl. growth (the mean growth for their attainment). Students in the 99th pctl. who scored higher on growth keep their true growth scores. Students at other pctl. levels keep their true growth scores. What this means is that at schools where 50% or more of the students score at the 99th pctl., those 50% will have at least 50th-pctl. growth figures. This is what is biasing the growth percentiles upward within CPS for the high-attainment schools. So those Edison RGC growth figures are in effect “fixed”: the 99th-pctl.-attaining students are automatically given growth levels of at least the 50th pctl. for 99th-pctl. achievers. I explain the reasons for this in @42.

    What Growth Percentile and I had been arguing over was whether the high-growth:high-attainment relationship in CPS was due to a bias in the growth percentile calculation (his position) or an artifact of non-random sampling & aggregation procedures (my position). I was wrong about the non-random sampling; it’s irrelevant. The growth pctl. calculation is both biased toward high attaining schools and also, because of aggregation, biased toward schools that have more high-achieving students in the lower grades.

    No one said (or should have said) that students achieving high attainment, esp. the 99th pctl., always have mediocre growth. Rather, the magnitude of their growth is less than that of students at lower attainment levels, and it is much harder to accurately measure their growth. Put differently, if I had the same set of teachers and resources at one school whose students attained at the 50th pctl. on average and at another school where they attained at the 90th pctl., the absolute growth scores are going to be lower at the 90th-pctl. school. To account for this, NWEA provides different growth norms for students at each RIT level. But this means that in order to compare schools, CPS must convert equal-interval RIT scores into unequal-interval percentile scores, which makes meaningful comparisons hard to interpret.

    This is not a problem for NWEA because they are not advertising the MAP as a test for accurately comparing schools to each other. They intend the test to be used to detect whether a student is learning more or less for any given level of achievement and to provide hints at what areas he or she is weak or strong in. CPS, however, does have a problem because it is using the test to evaluate the quality of schools, which NWEA has repeatedly said is not what the test is designed to do and cannot do unless other factors that affect student scores (socio-economic status, health, school resources, parental involvement, etc.) are brought into play.

  • 49. Chris  |  September 5, 2014 at 10:15 am

    “it would indeed be difficult to argue that a student who scored 99% in two consecutive tests has been making much progress. ”

    Um, what? The second test, given at a later time, is going to (or at least should) have a “harder” scaled score. Now, it’s perfectly possible for that 99, 99 pairing to be a *regression*, but it’s also possible for it to be a blow-the-doors-off progression (last in the 99th on the first test, highest overall scorer on the second).

    Either way, it is not at all difficult to argue that there was “much progress”, it’s simply somewhat harder to *prove*.

  • 50. Data Cruncher  |  September 5, 2014 at 10:22 am

    “Either way, it is not at all difficult to argue that there was “much progress”, it’s simply somewhat harder to *prove*.”

    Without adequate proof, an argument can’t easily stand.

  • 51. Growth Percentile  |  September 5, 2014 at 10:37 am

    “The growth pctl. calculation is…, because of aggregation, biased toward schools that have more high-achieving students in the lower grades.”

    I don’t think this is exactly right. At least as stated, the “national average comparison score” (that you describe @35 above) controls for the population across grades and the scores within grades. (I have some doubts as to whether CPS implemented it correctly, but that’s a separate matter.) I’m not sure the growth percentile for, e.g., schools with more lower-grade students** will be biased up/down (on average). But it will be skewed toward the extremes, because deviations for such a school from the “national average comparison score” will be larger than for other schools where less growth is to be expected because of student composition.

    ** I think it is having more lower-grade students that matters. Having them be “high-achieving” works in the opposite direction because, I think, variance in growth is less for high-achieving students than for low-achieving students in a given grade.
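The skew-toward-extremes point can be sketched in a toy simulation (all numbers are assumptions for illustration: the per-student spreads and school sizes are made up, not NWEA norms):

```python
import random
import statistics

random.seed(0)

# Assumed: per-student growth deviations from the "national average
# comparison score" spread more widely in lower grades than upper grades.
LOWER_GRADE_SD = 6.0  # hypothetical RIT-point spread, lower grades
UPPER_GRADE_SD = 3.0  # hypothetical RIT-point spread, upper grades

def school_mean_deviation(n_lower, n_upper):
    # Average deviation across one school's students.
    devs = ([random.gauss(0, LOWER_GRADE_SD) for _ in range(n_lower)] +
            [random.gauss(0, UPPER_GRADE_SD) for _ in range(n_upper)])
    return statistics.mean(devs)

# Simulate many 100-student schools of each composition.
mostly_lower = [school_mean_deviation(80, 20) for _ in range(2000)]
mostly_upper = [school_mean_deviation(20, 80) for _ in range(2000)]

# Schools weighted toward lower grades show a wider spread of school-level
# means, so they land at the extremes of a ranking more often.
print(statistics.stdev(mostly_lower))
print(statistics.stdev(mostly_upper))
```

Neither composition is biased up or down on average here; only the spread differs, which matches the skew-not-bias distinction above.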

  • 52. CLB  |  September 5, 2014 at 11:06 am

    @49. In many cases, the RIT score drops between t and t+1 but remains high enough that the student stays in the 99th percentile for achievement (CPS calls it “attainment”; NWEA calls it “status”). True, it is an issue of true progress v. detected progress, but schools are being ranked, teachers and principals are being evaluated, and many students are being grouped based on the detected progress.

  • 53. Lemonjello Jones  |  September 20, 2014 at 5:08 pm

    1 percentile to 2 percentile = 100% growth!!! 😀
    98 percentile to 99 percentile = 1% growth… 😦

    When reality hurts, and you are starting at a very low point, just use growth stats.
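The arithmetic behind the quip above is easy to check: the identical one-point gain looks dramatic or negligible as a relative change, depending entirely on the starting point.

```python
def pct_change(start, end):
    """Relative change, as a percentage of the starting value."""
    return 100 * (end - start) / start

print(pct_change(1, 2))    # 100.0 -- a one-point gain from a low base
print(pct_change(98, 99))  # ~1.02 -- the same gain from a high base
```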

  • 54. red  |  September 21, 2014 at 6:52 pm

    So why are kids from charters getting into selective enrollment schools? On the West Side of Chicago the charter may not be perfect, but it’s safe, homework is assigned every night, and it’s a college-prep curriculum. I live in Lawndale and I would never send my kid to a neighborhood school around here. The parents are atrocious and kids want to fight every day. I tried the neighborhood school and it didn’t work for me.

  • 55. Tier4Mom  |  September 30, 2014 at 8:53 pm

    From my understanding, the higher achieving schools will not be ranked based on growth because it is much harder to show significant growth once students are already way above grade level.

  • 56. TB  |  October 3, 2014 at 8:42 am

    Here’s a handy guide to understanding the growth scores the state is using. http://reportcards.dailyherald.com/schools/
