Opinion: ELL students and changing minds

Alexander Ooms, a senior fellow at the Donnell-Kay Foundation, is a member of the boards of the Charter School Institute, the West Denver Preparatory Charter School, and the Colorado chapter of Stand for Children.

Additional attention to English Language Learner (ELL) students is unquestionably a good thing. Particularly given the large percentages of ELL students both in Denver and across Colorado, there can be no doubt that this is a critical issue. There is simply not enough concerted attention paid to how schools support ELL students — and especially to which specific strategies, at both the district and school level, are most effective.

What there should not be, however, is opinion substituting for fact.

A recent discussion on these pages does exactly that. The claim is that any attempt at quantitative assessment — through state and district tools such as School Performance Frameworks, or representations of this data on sites like ColoradoSchoolGrades.com — unjustly punishes schools with high percentages of ELL students.

These kids, so the theory goes, don’t learn as fast as their non-ELL peers, and schools that have more of them will always do worse on academic growth. And growth percentiles are the primary driver in most assessments. By holding all schools equally accountable for the academic growth of their students, as a member of the Denver school board put it, these systems are shamefully guilty of:

“accountability blinders that punish schools and kids for their English-proficiency differences by trying to lump them all into the same bucket as native and fluent English speakers”

Well, there is a blindness here, but it’s not the assessments.  It’s us. Conventional wisdom dictates that including scores from ELL students will depress academic growth — and I’ll admit that I believed it as well (although to a lesser extent than some).  I doubt I’m the only one.  But we are all mistaken, as this perspective could be Exhibit A for the blind acceptance of opinion and conjecture at the expense of data.

For what the data shows is that, over the past five years across Colorado, ELL students do better on median growth percentiles than non-ELL students — in aggregate, in every subject, and at virtually every grade level. Let’s start with a view across Colorado (red signifies where ELL students did worse than non-ELL students):

In aggregate, there is one subject score in the past five years where ELL students performed worse than non-ELL students. One.  So perhaps the claim “While we all want students to grow and succeed, [ELL] student growth must be reviewed under a different lens” takes on a different meaning.

Think you should have lower expectations for ELL students on academic growth, and for the schools and teachers that educate them?  Well, think again. Here is the 2011 Colorado growth data by grade level:

ELL students performed worse than their non-ELL peers on just three subject scores — two of them in 4th grade — out of 21 scores total. One of these three is much lower (4th grade reading). But in six subject scores, ELL students scored 5 or more points higher than their non-ELL peers. Not, perhaps, what the many commenters on the original piece had in mind.

When challenged about holding a view contrary to one he had previously advocated, the economist John Maynard Keynes is reputed to have said: “When the circumstances change, I change my mind. What do you do, Sir?” So, I’ll admit it: I found the above surprising. This data presents an entirely different set of circumstances from the one I had previously envisioned.

But the mind resists. Perhaps ELL student growth is only higher when aggregated at the state level, and the results are different in urban centers like Denver? Well, no: in Denver, the gap between ELL students and their non-ELL peers actually increases. Over the past five years, ELL students in DPS have outperformed non-ELL students on growth, every year and in every subject, by an average of between 2 and 5 percentage points:

This data, which one can compile for oneself using CDE’s DataLab, turns the argument inside out. Schools with large proportions of ELL students have higher growth percentiles — and thus better rankings — than if they had no ELL students at all. In fact, if any schools are at a structural disadvantage on this issue, it is those without ELL students, not those with them.
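For anyone who wants to replicate the comparison, the computation itself is simple once the data is in hand. Here is a minimal sketch in Python, assuming a hypothetical per-student export of growth percentiles with an ELL flag; the file name and column names are illustrative guesses, not the actual DataLab schema:

```python
import csv
from statistics import median

# Minimal sketch: compare median growth percentiles for ELL and
# non-ELL students. "growth_export.csv", "growth_percentile", and
# "ell" are hypothetical names, not CDE DataLab's real schema.
ell_scores, non_ell_scores = [], []

with open("growth_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        score = float(row["growth_percentile"])
        if row["ell"].strip().lower() == "yes":
            ell_scores.append(score)
        else:
            non_ell_scores.append(score)

print(f"ELL median growth percentile:     {median(ell_scores):.1f}")
print(f"Non-ELL median growth percentile: {median(non_ell_scores):.1f}")
```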

How does this impact policy? Let’s look at the four schools that were cited as specific examples — where a high proportion of ELL students was assumed to be damaging their median growth percentile scores, and thus their ColoradoSchoolGrades.com rankings:

Not once do ELL students score lower than non-ELL students. Not in any school. Not in any subject. All of these schools have higher growth among their ELL students than among their non-ELL students. And it’s not a small advantage: the overall average is 10 points higher. All four of these schools have growth in their ELL populations well above the state median (50). All four have growth in their non-ELL students well below the state median. What is being penalized again?

Complain all you want about these schools’ low rankings on assessments — but don’t blame the ELL students and their teachers. Without their scores, the schools would all have ranked lower, not higher, on assessment data. It is the ELL scores that are holding these schools up, not pushing them down.

When conditions change, I change my mind. If there is a lesson in the past few years, it is that many conditions people assumed simply don’t hold up under the sunlight of data and assessment. There are schools serving mostly high-poverty students that show remarkable academic growth. There are schools with affluent populations performing below the state averages. There are specific demographic groups whose students are experiencing growth far beyond assumptions. There are schools where the achievement gap persists, masked by aggregated averages. Conditions are not what many people thought they were.

And this is the fundamental point of a site like ColoradoSchoolGrades.com. It is to present data that eliminates misconceptions about schools and students that have no basis in fact. It is to try to reduce subjective opinion and unwarranted bias. It is to provide data as an entry point to further inquiry and a more complete understanding.

So when a sitting school board member criticizes this effort by writing:

As I retrospectively consider last year’s board decision to phase out Montbello High School, for example, imagine how differently the school’s performance would have seemed had we considered the 25 percent of students who are not ready to take the CSAP in English.  Not being a statistician, I will defer to those who can calculate the probability of a 25 percent bump in overall achievement if those scores are even just disaggregated from the whole.  I wonder if the board’s decision would have been different if we knew then what I know now.

Well, we don’t have to speculate here, and we don’t need a complicated statistical analysis. We need transparency and access to data — which, thanks in large part to the organizations maligned elsewhere in that post, are increasingly common. It’s now remarkably easy to find out exactly what the growth percentile at Montbello would have been had it not included ELL students (and it’s in the chart above). Note first that while ELL students make up 25 percent of the school overall, the cohort used to measure academic growth had an even higher proportion — a full 40 percent of students were ELL. That, under the initial assumptions, was the injustice that unfairly sealed the school’s doom.

So what would the school’s median growth percentile have been with a “bump in overall achievement” from eliminating the scores of the ELL students? A full five points, um, lower. Yup, pull out the ELL students, and the academic growth scores at Montbello would have been even worse. The decision to close Montbello might not have been different, but the conversation probably should have been more informed.
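The arithmetic is worth making concrete. A school’s median growth percentile is simply the median of its students’ individual growth percentiles, so removing a higher-scoring subgroup can only pull the median toward the lower-scoring remainder. A toy example in Python (the numbers are invented for illustration; they are not Montbello’s actual scores):

```python
from statistics import median

# Invented growth percentiles for a cohort that is 40 percent ELL,
# with the ELL students growing faster than their peers, as the
# Montbello data showed.
ell     = [48, 52, 55, 58]          # higher-growth subgroup
non_ell = [30, 33, 36, 40, 42, 45]  # lower-growth subgroup

print(median(ell + non_ell))  # whole cohort: 43.5
print(median(non_ell))        # ELL removed:  38.0 (lower, not higher)
```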

And that’s the point of data and assessments, isn’t it?  To rid us collectively of false assumptions and the dogma of our preconceived notions.  The erroneous and misshapen critique that evaluation systems punish schools with high ELL populations proves the fundamental point of why these systems are so critically important.  Don’t rely on speculation. Don’t assume that you know how good or bad a school is based on conjecture. Don’t conjure up some false reasoning for why some schools are penalized by the results. Start with the basic data, ask questions, and dig deeper.

And the growth data for ELL students surfaces some really interesting further questions: Why do ELL students have higher growth? Which ELL students (based on the CELA test) are progressing the most, and how quickly? Do schools with specific ELA programs see faster growth than those without? Are there schools that seem able to move their ELL students — particularly those with low initial CELA scores — along even faster?

CDE apparently has started to analyze CELA cohort growth scores, and hopefully it will make these available soon. There is a lot more to do here, and this subject deserves increased attention, not less. But it also needs the right kind of inquiry.

We should try to answer a lot more questions about ELL students, but we should do so with attention to data and to fact — not based on the assumptions, accusations, and preconceived notions that have far too often held all of us in thrall. Circumstances are often not what they first seem.
