Saturday, January 21, 2006

Education City in the news

My biggest reason for skepticism about Qatar's Education City was that you just never heard about it in the US media. Well, CNET has just run a nice series about Education City.

I think the articles pretty well reflect what I saw and heard there. One aspect deserves highlighting:

Early on, the Qatar Foundation and the government agreed not to interfere with academic policies or admissions, two key requirements for participating universities.

This is really the key to understanding that Education City is for real. The universities take this academic freedom very seriously. When a rumor spread that a Jewish faculty member believed he wouldn't be able to go to the campus, the Dean immediately got involved (the rumor turned out to be false).

That's why education is the perfect way to pull the country (and, perhaps, the region) ahead. Either you allow the free and open exchange of ideas among all people, or you'll never be world-class.

Monday, January 16, 2006

What counts in What Works

The What Works Clearinghouse may be the best idea that the US Department of Education has ever had, and I'm not saying that just because I have a paper in it.

The What Works Clearinghouse masquerades as a website intended to help teachers and administrators sort through the piles of what passes for "research" in the educational community so that they can find out what has been shown, through careful experimentation, to actually work.

But the real mission of the What Works Clearinghouse is to raise the bar - to change the conversation about effectiveness in education. So, when the principal of North High School tells the local paper that they started having math pep rallies and now test scores are up, South High School doesn't just start doing that. They call North and ask "Do you really think it was the pep rallies? What about those 3 new teachers you hired? What about the afterschool tutoring program you started? Do you have any proof that the pep rallies caused the rise in test scores, or was it something else?"

And when that starts happening, the educational market moves. Schools stop buying textbooks or computer software that haven't been proven to increase student learning more than what they're currently doing.

The thing is, the WWC is working.

It hasn't reached principals and administrators yet, but the publishers see it coming. They're scared. Terrified.

They know that they need to get some research done, fast.

But they don't quite get it yet.

Consider the What Works Clearinghouse's report on Middle School Math. The WWC identified 10 studies that were either very well designed or moderately well designed. Of those 10 studies, though, only two show statistically significant results. In other words, it's not enough to do a well-designed study. You also need to show, in your study, that your educational materials are actually better than the comparison. And that's hard to do.

Part of this may be that the studies were too small or the data wasn't analyzed properly, but part of it is due to the fact that it's really hard for educational materials to make a difference. How much of a difference does a textbook really make? Compared to a teacher? Compared to school funding or administration?

The fact is, most textbooks are the same. So, when you compare them to each other, you don't see much difference. Lots of other factors are more important in mathematics achievement.

This isn't to say that educational materials don't matter. They matter a lot. But you can't expect to produce educational materials that are just a bit different from what students are currently using and see a big difference. You need to create educational materials that are really different.

And here's where the publishers really don't quite get it.

They think that getting into the WWC is a matter of marketing. They're afraid that, if they're not in there, they won't be able to sell.

But getting into the WWC is up to development, not marketing. You need to produce different materials to produce a different effect.

Consider this press release from Pearson. The release claims that they did a WWC-quality study for their Pearson Prentice Hall Algebra 1 text and that everything's great. The press release points to this report.

Here's what the report says:

One of the report's bullet points reads: "Users of the program consistently performed as well as students who used other programs."

In other words, their text did not work any better than what they compared it to. There isn't any indication that even high-performing students, who are singled out as improving the most, improved relative to the control group.

The main result, improvement from pre- to post-test, is a red herring. Considering that this was a year-long study, I'd certainly hope that students improve from pre- to post-test. In a year of math class, they'd better learn something. Improvement from pre- to post-test is explicitly not what the What Works Clearinghouse cares about.
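To make that concrete, here's a minimal sketch, with entirely made-up scores, of the difference between the pre-to-post gain a press release can tout and the treatment-versus-control comparison the WWC actually looks for (the group sizes, score values, and use of scipy are all hypothetical, purely for illustration):

```python
# Hypothetical pre/post test scores -- entirely made-up numbers for illustration.
from scipy.stats import ttest_ind

treatment_pre  = [52, 48, 61, 55, 49, 58]   # students using the new text
treatment_post = [68, 63, 74, 70, 64, 72]
control_pre    = [51, 50, 60, 54, 47, 59]   # students using the comparison text
control_post   = [67, 65, 72, 69, 61, 73]

# Both groups improve over the year -- that's the "red herring" comparison.
gain_treatment = [post - pre for pre, post in zip(treatment_pre, treatment_post)]
gain_control   = [post - pre for pre, post in zip(control_pre, control_post)]
print("Mean gain, new text:       ", sum(gain_treatment) / len(gain_treatment))
print("Mean gain, comparison text:", sum(gain_control) / len(gain_control))

# The question that matters: did the treatment group gain *more* than the control?
stat, p = ttest_ind(gain_treatment, gain_control)
print(f"Difference in gains: t = {stat:.2f}, p = {p:.2f}")
# With numbers like these, p is large: both classes learned something,
# but there's no evidence the new text is what made the difference.
```

Both made-up classes gain about 14 points over the year, so the pre-to-post story sounds impressive either way; only the between-group comparison can tell you whether the new materials had anything to do with it.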

In the scientific community, this would be a disappointing result. In the educational community, it's a press release.
