What Do We Mean By Success?

We learn a great deal from what others are doing. In this issue, we are focusing on several 1:1 initiatives in Latin America. Each of these initiatives is large and shows a commitment at either the provincial/state or the federal level. There are a number of other examples of commitments to 1:1 at these levels – Australia and the state of Maine are two we hear about frequently. But even this level of government commitment does not mean implementations are problem-free. As you’ll read in the article summarizing a report from the Inter-American Development Bank (IDB), Peru’s countrywide OLPC initiative has not only met with a number of challenges, but, according to the IDB, there is “little solid evidence regarding the effectiveness of this program.”

True, this initiative and many others have run into problems. Were these avoidable? Most likely a number of them were. But they did not occur because the idea of 1:1 itself is flawed. We – readers and the media alike – need to separate discussion of how a vision is implemented from condemnation of the vision itself.

Some problems were the result of what the IDB calls ‘magical thinking’ – the idea that simply adding laptops to a school, without making any other changes, automatically changes everything. Some may have been the result of not paying enough attention to lessons learned from the many 1:1 initiatives of the last 20 years, lessons that form the basis of planning and implementation frameworks such as AALF’s 21 Steps to 21st Century Learning (Note: an expanded, revised edition of the 21 Steps to 21st Century Learning guidebook will be available from AALF this fall). Still others were the result of new, and too often buggy, hardware and software – perhaps such large-scale user testing of the technology should not have been combined with the launch of such a large-scale 1:1 initiative. Mistakes were made, but the report, and the media coverage of it, point to two additional, and central, problems.

First, there is the question of how to measure success. Is it based on the results of standardized testing, or are there new measures at which we need to look? Although we tend to speak of 21st century skills and new ways to assess those skills, too often the success or failure of an initiative (as reported to the public) is judged on the results of standardized tests. Don’t new skills call for new forms of assessment? Clearly, there is a disconnect between the vision and goals of an initiative and how we evaluate it.

A second, perhaps more urgent, issue is how information about initiatives is communicated to stakeholders and to the world at large. How much of what we read or hear is the real story? Read more deeply into the IDB report (as well as Claudia Urrea’s response to it) and you’ll see the results are not as negative as the headlines would indicate. The initiative has achieved positive shifts in a number of important areas: reasoning abilities, verbal fluency, and processing speed all showed improvement across a variety of assessments. Unfortunately, bad news makes better sound bites, while successes can seem rather bland. So there is a real challenge in refocusing media attention on what has been accomplished and what it can mean for the future of students in Peru and elsewhere around the world.

Yes, there have been problems, but there’s an opportunity to learn from them. That’s what we need to be doing.
August 8th, 2012 @ 10:52AM