First Person

Opinion: ELL students and changing minds

Alexander Ooms, a senior fellow at the Donnell-Kay Foundation, is a member of the board of the Charter School Institute, the West Denver Preparatory Charter School and the Colorado chapter of Stand for Children.

Additional attention to English Language Learner (ELL) students is unquestionably a good thing. Particularly given the large percentages of ELL students both in Denver and across Colorado, there can be no doubt that this is a critical issue. There is simply not enough concerted attention on how schools support ELL students — and especially on which specific strategies, at both the district and school levels, are most effective.

What there should not be, however, is opinion substituting for fact.

A Denver Public Schools student does class work in this EdNews file photo

A recent discussion on these pages does exactly that. The claim is that any attempt at quantitative assessment — through state and district tools such as School Performance Frameworks, or representation of this data on sites like ColoradoSchoolGrades.com — unjustly punishes schools with high percentages of ELL students.

These kids, so the theory goes, don’t learn as fast as their non-ELL peers, and schools that have more of them will always do worse on academic growth.  And growth percentiles are the primary driver in most assessments.  By holding all schools equally accountable for the academic growth of their students, as a member of the Denver school board put it, these systems are shamefully guilty of:

“accountability blinders that punish schools and kids for their English-proficiency differences by trying to lump them all into the same bucket as native and fluent English speakers”

Well, there is a blindness here, but it’s not the assessments.  It’s us. Conventional wisdom dictates that including scores from ELL students will depress academic growth — and I’ll admit that I believed it as well (although to a lesser extent than some).  I doubt I’m the only one.  But we are all mistaken, as this perspective could be Exhibit A for the blind acceptance of opinion and conjecture at the expense of data.

For what the data shows is that, over the past five years across Colorado, ELL students do better on median growth percentiles than non-ELL students — on aggregate, on every subject, and at virtually every grade level. Let’s start with a view across Colorado (red signifies where ELL students did worse than non-ELL students):

In aggregate, there is one subject score in the past five years where ELL students performed worse than non-ELL students. One.  So perhaps the claim “While we all want students to grow and succeed, [ELL] student growth must be reviewed under a different lens” takes on a different meaning.

Think you should have lower expectations for ELL students on academic growth, and for the schools and teachers that educate them?  Well, think again. Here is the 2011 Colorado growth data by grade level:

ELL students performed worse than their non-ELL peers on just three subject scores — two of them in 4th grade — out of 21 scores total. One of these three is much lower (4th grade reading).  But in six subject scores, ELL students were 5 points or more better than their non-ELL peers. Not, perhaps, what the many commentators on the original piece had in mind.

When questioned about a view contrary to what he had advocated previously, the economist John Maynard Keynes is reputed to have said: “When the circumstances change, I change my mind. What do you do, Sir?” So, I’ll admit it: I found the above surprising.  This data presents an entirely different set of circumstances than I previously envisioned.

But the mind resists.  Perhaps the ELL student growth is only higher when aggregated at the state level, and the results are different in urban centers like Denver?   Well, no – in Denver, the gap between ELL students and their non-ELL peers actually increases.  Over the past five years, ELL students in DPS have outperformed non-ELL students on growth, every year and in every subject, by an average of between 2 and 5 percentage points:

This data, which one can compile for oneself using CDE’s DataLab, turns the argument inside out.  Schools with large proportions of ELL students have higher growth percentiles — and thus better rankings — than if they had no ELL students at all. In fact, if any schools are at a structural disadvantage on this issue, it is those without ELL students, not those with them.
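
(For readers who want to check this themselves, the comparison is straightforward to reproduce. The sketch below is only an illustration of the calculation, not DataLab’s official method: it assumes a hypothetical student-level CSV export with made-up column names — year, subject, ell_status, growth_percentile — and takes the median growth percentile for each group. DataLab can also report these medians directly.)

```python
from collections import defaultdict
from csv import DictReader
from statistics import median

# Illustrative sketch only. Assumes a hypothetical student-level export from
# CDE's DataLab with columns: year, subject, ell_status, growth_percentile.
def median_growth_by_group(path):
    groups = defaultdict(list)
    with open(path, newline="") as f:
        for row in DictReader(f):
            key = (row["year"], row["subject"], row["ell_status"])
            groups[key].append(float(row["growth_percentile"]))
    return {key: median(values) for key, values in groups.items()}

for (year, subject, status), mgp in sorted(
        median_growth_by_group("datalab_export.csv").items()):
    print(f"{year} {subject} {status}: median growth percentile = {mgp:.1f}")
```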

How does this impact policy?  Let’s look at the four schools that were cited as specific examples — where a high proportion of ELL students was assumed to be damaging their median growth percentile scores, and thus their ColoradoSchoolGrades.com rankings:

Not once do ELL students score lower than non-ELL students.  Not in any school.  Not in any subject.  All of these schools have higher growth in their ELL students than in their non-ELL students. And it’s not a small advantage: the overall average is 10 points higher. All four of these schools have growth in their ELL populations well above the state median (50).  All four have growth in their non-ELL students well below the state median. What is being penalized again?

Complain all one wants about these schools’ low rankings on assessments — but don’t blame the ELL students and their teachers. Without their scores, the schools would all have ranked lower, not higher, on assessment data. It is the ELL scores that are holding these schools up, not pushing them down.

When conditions change, I change my mind.  If there is a lesson in the past few years, it is that many conditions previously assumed simply don’t hold up under the sunlight of data and assessment. There are schools with mostly high-poverty students with remarkable academic growth.  There are schools with affluent school populations performing below the state averages.  There are specific demographic groups where students are experiencing growth far beyond assumptions.  There are schools where the achievement gap persists, masked by aggregated averages.  Conditions are not what many people thought they were.

And this is the fundamental point of a site like ColoradoSchoolGrades.com. It is to present data that eliminates misconceptions about schools and students that have no basis in fact.  It is to try to reduce subjective opinion and unwarranted bias. It is to provide data as an entry point to further inquiry and a more complete understanding.

So when a sitting school board member criticizes this effort by writing:

As I retrospectively consider last year’s board decision to phase out Montbello High School, for example, imagine how differently the school’s performance would have seemed had we considered the 25 percent of students who are not ready to take the CSAP in English.  Not being a statistician, I will defer to those who can calculate the probability of a 25 percent bump in overall achievement if those scores are even just disaggregated from the whole.  I wonder if the board’s decision would have been different if we knew then what I know now.

Well, we don’t have to speculate here, and we don’t need a complicated statistical analysis.  We need transparency and access to data — which, thanks in large part to the organizations maligned elsewhere in that post, are increasingly common. It’s now remarkably easy to find out exactly what Montbello’s growth percentile would have been had it not included ELL students (and it’s in the chart above). Note first that while ELL students make up 25 percent of the school overall, the cohort used to measure academic growth had an even higher proportion — a full 40 percent of its students were ELL.  That, under the initial assumptions, was the injustice that unfairly sealed the school’s doom.

So what would the school’s median growth percentile have been with a “bump in overall achievement” from eliminating the scores of the ELL students?  A full five points, um, lower.  Yup, pull out the ELL students, and the academic growth scores at Montbello would have been even worse.  The decision to close Montbello might not have been different, but the conversation probably should have been more informed.
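
(To make the arithmetic concrete, here is a tiny illustration with invented numbers — not Montbello’s actual student-level data, which is not published. It simply shows why removing a higher-growth 40 percent of a cohort pulls the median of what remains down; the figures are chosen only to mirror the five-point gap described above.)

```python
from statistics import median

# Invented growth percentiles for a 10-student cohort, 40 percent ELL,
# where ELL students grow faster than their non-ELL peers (illustration only).
ell = [62, 58, 55, 51]              # 4 of 10 students
non_ell = [48, 44, 42, 40, 38, 35]  # the other 6 students

print("Median with ELL students:   ", median(ell + non_ell))  # 46.0
print("Median without ELL students:", median(non_ell))        # 41.0 -- five points lower
```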

And that’s the point of data and assessments, isn’t it?  To rid us collectively of false assumptions and the dogma of our preconceived notions.  The erroneous and misshapen critique that evaluation systems punish schools with high ELL populations proves the fundamental point of why these systems are so critically important.  Don’t rely on speculation. Don’t assume that you know how good or bad a school is based on conjecture. Don’t conjure up some false reasoning for why some schools are penalized by the results. Start with the basic data, ask questions, and dig deeper.

And the growth data for ELL students surfaces some really interesting further questions: Why do ELL students have higher growth?  Which ELL students (based on the CELA test) are progressing the most, and how quickly?  Do schools with a specific ELA program see faster growth than those without one?  Are there schools that seem able to move their ELL students — particularly those with low initial CELA scores — along even faster?

CDE apparently has started to analyze CELA cohort growth scores, and hopefully it will quickly make these available.  There is a lot more to do here, and this subject deserves more attention, not less.  But it also needs the right kind of inquiry.

We should try to answer a lot more questions regarding ELL students, but we should do so with an attention to data and to fact – not based on the assumptions, accusations, and preconceived notions that have far too often held all of us in thrall. Circumstances are often not what they first seem.

First Person

Why the phrase ‘with fidelity’ is an affront to good teaching

PHOTO: Alan Petersime

“With fidelity” are some of the most damaging words in education.

Districts spend a ton of money paying people to pick out massively expensive, packaged curriculums, as if every one of a thousand classrooms needs the exact same things. Then officials say, over and over again, that they must be implemented “with fidelity.” What they mean is that teachers better not do anything that would serve their students’ specific needs.

When that curriculum does nothing to increase student achievement, it is not blamed. The district person who found it and purchased it is never blamed. Nope. They say, “Well, the teachers must not have been implementing it with fidelity.”

It keeps happening because admitting that schools are messy and students are human and teaching is both creative and artistic would also mean you have to trust teachers and let them have some power. Also, there are some really crappy teachers out there, and programs for everyone are often meant to push that worst-case-scenario line a little higher.

And if everyone’s doing just what they’re supposed to, we’ll get such good, clean numbers, and isn’t that worth a few thousand more dollars?

I was talking with a friend recently, a teacher at an urban school on the East Coast. He had been called to task by his principal for splitting his kids into groups to offer differentiated math instruction based on students’ needs. “But,” the principal said, “did the pacing guide say to differentiate? You need to trust the system.”

I understand the desire to find out if a curriculum “works.” But I don’t trust anyone who can say “trust the system” without vomiting. Not when the system is so much worse than anything teachers would put together.

Last year, my old district implemented Reading Plus, an online reading program that forces students to read at a pace determined by their scores. The trainers promised, literally promised us, that there wasn’t a single reading selection anywhere in the program that could be considered offensive to anyone. God knows I never learned anything from a book that made me feel uncomfortable!

Oh, and students were supposed to use this program — forced-pace reading of benign material followed by multiple-choice questions and more forced-pace reading — for 90 minutes a week. We heard a lot about fidelity when the program did almost nothing for students (and, I believe quite strongly, did far worse than encouraging independent reading of high-interest books for 90 minutes a week would have done).

At the end of that year, I was handed copies of next year’s great adventure in fidelity. I’m not in that district any longer, but the whole district was switching over to SpringBoard, another curriculum, for language arts classes. On came the emails about implementing with fidelity and getting everyone on the same page. We were promised flexibility, you know, so long as we also stuck to the pacing guide of the workbook.

I gave it a look, I did, because only idiots turn down potential tools. But man, it seemed custom-built to keep thinking — especially any creative, critical thought from either students or teachers — to a bare minimum.

I just got an email from two students from last year. They said hi, told me they missed creative writing class, and said they hated SpringBoard, the “evil twin of Reading Plus.”

That district ran out of money and had to cut teachers (including me) at the end of the year. But if they hadn’t, I don’t think I would have lasted long if forced to teach from a pacing guide. I’m a good teacher. Good teachers love to be challenged and supported. They take feedback well, but man do we hate mandates for stuff we know isn’t best for the kids in our room.

Because, from inside a classroom full of dynamic, chaotic brilliance;

from a classroom where that kid just shared that thing that broke all of our hearts;

from a classroom where that other kid figured out that idea they’ve been working on for weeks;

from that classroom where that other kid, who doesn’t know enough of the language, hides how hard he works to keep up and still misses things;

and from that classroom where one kid isn’t sure if they trust you yet, and that other kid trusts you too much, too easily, because their bar had been set too low after years of teachers who didn’t care enough;

from inside that classroom, it’s impossible to trust that anyone else has a better idea than I do about what my students need to do for our next 50 minutes.

Tom Rademacher is a teacher living in Minneapolis who was named Minnesota’s Teacher of the Year in 2014. His book, “It Won’t Be Easy: An Exceedingly Honest (and Slightly Unprofessional) Love Letter to Teaching,” was published in April. He can be found on Twitter @mrtomrad and writes on misterrad.tumblr.com, where this post first appeared.

First Person

What I learned about the limits of school choice in New York City from a mother whose child uses a wheelchair

PHOTO: Patrick Wall

As a researcher interested in the ways online platforms impact learning and educational decision-making, I’ve been trying to understand how New York City parents get the information to make a crucial decision: where to send their children to school.

So for the past six months, I’ve been asking local parents about the data they used to choose among the system’s 1,700 or so schools.

I’ve heard all sorts of stories about the factors parents weigh when picking schools. Beyond the usual considerations like test scores and art programs, they also consider the logistics of commuting from the Bronx to the East Village with two children in tow, whether the school can accommodate parents and children who are still learning English, and how much money the parent-teacher association raises to supplement the school’s budget.

But for some families, the choice process begins and ends with the question: Is the building fully accessible?

The federal Americans with Disabilities Act requires public buildings constructed after 1992 to be fully accessible to people in wheelchairs. However, most New York City public school buildings were constructed prior to that law, and high construction costs have limited the number of new, fully accessible buildings.

As a result, a shocking 83 percent of New York City schools have been found non-compliant with the ADA, according to a two-year federal Department of Justice investigation whose findings the city Department of Education largely disputes. Recently, the city’s Office of Space Management has begun surveying buildings for full accessibility, but more work remains to be done.

One parent’s struggle to find a school suitable for her son, who has a physical disability but no cognitive issues, illustrates what a major role accessibility plays in some families’ decision-making.

Melanie Rivera, a mother of two and a native New Yorker living in Ditmas Park in Brooklyn’s District 22, shared her story with me — and gave me permission to share it with others. Here is what she told me, in her own words:

My son Gabriel is seven years old. He was born with a condition called arthrogryposis, which affects the development of his joints. His hips, knees, and feet are affected and he has joint contractures, so his legs don’t bend and straighten the way most people’s do. In order to get around, he uses a combination of crutches and a wheelchair.

Before I had my differently-abled son, I was working in a preschool for children with special needs. The kids I worked with had cognitive developmental disabilities.

Despite my professional experience, I was overwhelmed when it was my turn to help my child with different abilities navigate the public school system. I can only imagine the students falling by the wayside because their parents don’t have that background.

When I was completing my son’s kindergarten application, I couldn’t even consider the academics of the school. My main priority was to tour the schools and assess their level of accessibility.

There are only a couple of ADA-accessible schools in my district, and there was no way of indicating on my son’s kindergarten application that he needed one. When we got the admissions results, he was assigned to his zoned school – which is not accessible.

I entered lengthy and extensive mediation to get him into an ADA-accessible school. At that point, I knew I would just have to take what I could get. For families whose children have special needs, “school choice” can ring hollow.

The process of finding any accessible school was a challenge. The DOE website allows families to search for ADA-accessible schools. But the site describes most schools as “partially accessible,” leaving it up to parents to call each school and say, “What do you mean by this?”

When I called the schools and asked, “Are you a barrier-free school?” the staff in the office didn’t know what the term meant. They might reply, “Oh yeah, we have a ramp.” I’d have to press further: “But can you get to the office? Can you get to every floor in the building?” The response was often, “Oh, I don’t know.”

Even the office staff didn’t know. But for my son’s sake, I needed to know.

Gabriel deserves the full range of academic and social experiences. So every day I make sure he’s learning in the least-restrictive environment — from the classroom, to phys ed, to field trips.

I believe the Department of Education also wants to make schools accessible and to place students with different abilities in settings where they’ll flourish, but the current system is not equipped to follow through on those good intentions. While I see gradual changes, I still know that if I don’t find the best placement for my son the system definitely won’t.

At the school level, administrators should know the details of their own school’s accessibility. Teachers should learn to include children with different abilities in their classrooms. Such a commitment means recognizing the value of inclusivity — not viewing accessibility as something ADA says you must do.

Before I had Gabriel, I never thought about accessibility. I never looked at street cutouts or thought about how to enter a store with steps. We’re probably all guilty of perpetuating exclusion at one point or another.

Recognizing that will allow us to change the status quo. It will allow every individual with a physical disability to fully participate in the public school system.

Claire Fontaine is a researcher at Data & Society, a research institute in New York City focused on social, cultural, and ethical issues arising from technological development. Kinjal Dave is a research assistant at Data & Society. You can read more about their project, which seeks to better understand the ways in which diverse New York City parents draw on school performance data, online dashboards, and school review websites when researching schools for their children.