MORE!

 

And that's how my life changed. I must have been twenty, maybe twenty-one, and all I wanted was to study history and philosophy: that's why I commuted between the Città Universitaria (history) and Villa Mirafiori (philosophy), though at heart I preferred Villa (go figure). And it was right there, one day, I think it was late November, that a friend asked me to go with him to a lecture in General Pedagogy. "Listen, Giuseppe, I'm not interested." "Neither am I, but apparently it counts towards teaching."

"Può servire per l'insegnamento": la cosa, negli anni novanta, sembrava avere senso. "Ok, andiamo, vediamo, poi decidiamo". Il corso (annuale: allora esistevano questi corsi così, che iniziavano a novembre e finivano a maggio) era già cominciato da qualche settimana. Entrammo nell'aula (l'aula due) per ultimi e ci sedemmo in fondo, attraversando imbarazzati lo spazio tra la cattedra e i nostri posti. Una mossa sciocca. Infatti il professore riconobbe immediatamente nella trentina di volti "due ragazzi nuovi". "Ci sono due ragazzi nuovi, vedo. Bene, oggi procediamo così: qualche volontario racconterà a questi due nuovi la storia che abbiamo sin qui letto, gli altri interverranno per aggiungere cose che ritengono rilevanti". Io, sbalordito, pensai: "vabbe' questo è pazzo: è chiaro che nessuno avrà il coraggio di prendere parola e arrischiare figuracce davanti a lui". E invece iniziarono, rivolti a noi, dando le spalle al professore, a raccontare la storia di questa colonia di sbandati nell'Ucraina Sovietica dilaniata dalla guerra civile, e di questo pedagogista che sbotta, abbandona Rousseau per fare a cazzotti con un adolescente bandito (Zadorov) per poi costruire un'umanità Nuova fatta di collettivo e prospettiva, rimanendo in equilibrio tra marxismo e sistema sovietico (a un certo punto per i kulaki si mette male). Il professore interveniva di tanto in tanto, voce tra le voci, per aggiungere e approfondire, ma erano le ragazze e i ragazzi, di testa loro, a dettare tempi e contenuti della narrazione.

However thrilling I found Giannantoni's lectures on ancient philosophy (they carried me bodily there, to Elea, with Parmenides), I thought something like this was simply astonishing. That professor, working behind the scenes, since the star of the show was certainly not him, had used what today we would call "authentic" and "formative" assessment to "include" two newcomers who would otherwise have been left disoriented. "Just look at what a marvellous thing this pedagogy is; when I grow up I want to do exactly this, I want to teach like him."

Giuseppe and I changed our study plans and wrote our theses on pedagogical topics (with other supervisors), and today he teaches at the University of Ferrara. Over time that professor took to calling me Ivan, at the suggestion of one of his assistants (who now teaches in L'Aquila), because apparently I had the features of a sailor from the cruiser Aurora.

I later learned that Professor Nicola Siciliani de Cumis's abilities are not within everyone's reach: I, for one, have never scaled such heights.

And again on the subject of accountability and research: knocking with your feet!

These questions, administered by Invalsi, might perhaps make sense within a study negotiated with the contexts involved. Investigating the associations between expectations and achievement is useful, but these are questions that, by the very fact of being asked, trigger dynamics that must be anticipated and handled with care (I am thinking, for instance, of the literature on stereotype threat). Yet these items are dumped on students and on their contexts without the latter having any say in the matter. Nor is it made clear what the benefits are for those who answer, or for the contexts being measured.

Now, in Rome they say that a guest should knock with their feet, but it seems to me that Invalsi misreads the saying. It is an invitation to arrive with your hands full of gifts, not to kick the door in.

 

I worked for years on the Pisa study. I take a critical stance on how the survey is used, and some of the concerns expressed below are on the whole well founded, even though, in general, my impression is that Pisa gets blamed for the flaws and weaknesses that each context displays in its own education policy. Let me recall that no test is valid in absolute terms: it depends on the purposes for which we administer it and on the use we make of the data. So: are we really sure that the problem is Pisa? Pisa compares results gathered with semi-structured tests. It is not methodologically equipped to tell us which education system, let alone which school, is qualitatively better. Anyone drawing such inferences is playing on the public's scientific illiteracy. And is, moreover, confusing measurement with evaluation.

Still: are those who wrote the letter and, more generally, those who criticise the Pisa tests aware that more than half of the questions are open-ended? Are they aware that many items require test-takers to set out their own point of view critically? Or that a fundamental dimension of the "reading literacy" construct is the critical evaluation of the content and/or the form of what has been read?

In short: my impression is that the Pisa survey is contested more than it is known.

Nevertheless, I agree with the gist of the criticisms: in general, the way the measures are used is aberrant and, in many contexts, invalidates the possibility of using them to inform our evaluation of the equity and effectiveness of education systems.

Here you will find an excellent piece by Bruno Losito on the matter.

 

Cristiano Corsini

 

 

1 The letter from the academic world on the role played by OECD-PISA in recent years

 

 

Dear Dr Schleicher,

 

We write to you in your capacity as OECD’s (Organisation for Economic Co-operation and Development) director of the Programme of International Student Assessment (Pisa). Now in its 13th year, Pisa is known around the world as an instrument to rank OECD and non-OECD countries (60-plus at last count) according to a measure of academic achievement of 15-year-old students in mathematics, science, and reading. Administered every three years, Pisa results are anxiously awaited by governments, education ministers, and the editorial boards of newspapers, and are cited authoritatively in countless policy reports. They have begun to deeply influence educational practices in many countries. As a result of Pisa, countries are overhauling their education systems in the hopes of improving their rankings. Lack of progress on Pisa has led to declarations of crisis and “Pisa shock” in many countries, followed by calls for resignations, and far-reaching reforms according to Pisa precepts.

 

We are frankly concerned about the negative consequences of the Pisa rankings. These are some of our concerns:

 

• While standardised testing has been used in many nations for decades (despite serious reservations about its validity and reliability), Pisa has contributed to an escalation in such testing and a dramatically increased reliance on quantitative measures. For example, in the US, Pisa has been invoked as a major justification for the recent “Race to the Top” programme, which has increased the use of standardised testing for student-, teacher-, and administrator evaluations, which rank and label students, as well as teachers and administrators according to the results of tests widely known to be imperfect (see, for example, Finland’s unexplained decline from the top of the Pisa table).

 

• In education policy, Pisa, with its three-year assessment cycle, has caused a shift of attention to short-term fixes designed to help a country quickly climb the rankings, despite research showing that enduring changes in education practice take decades, not a few years, to come to fruition. For example, we know that the status of teachers and the prestige of teaching as a profession have a strong influence on the quality of instruction, but that status varies strongly across cultures and is not easily influenced by short-term policy.

 

• By emphasising a narrow range of measurable aspects of education, Pisa takes attention away from the less measurable or immeasurable educational objectives like physical, moral, civic and artistic development, thereby dangerously narrowing our collective imagination regarding what education is and ought to be about.

 

• As an organisation of economic development, OECD is naturally biased in favour of the economic role of public [state] schools. But preparing young men and women for gainful employment is not the only, and not even the main goal of public education, which has to prepare students for participation in democratic self-government, moral action and a life of personal development, growth and wellbeing.

 

• Unlike United Nations (UN) organisations such as UNESCO or UNICEF that have clear and legitimate mandates to improve education and the lives of children around the world, OECD has no such mandate. Nor are there, at present, mechanisms of effective democratic participation in its education decision-making process.

 

• To carry out Pisa and a host of follow-up services, OECD has embraced “public-private partnerships” and entered into alliances with multi-national for-profit companies, which stand to gain financially from any deficits—real or perceived—unearthed by Pisa. Some of these companies provide educational services to American schools and school districts on a massive, for-profit basis, while also pursuing plans to develop for-profit elementary education in Africa, where OECD is now planning to introduce the Pisa programme.

 

• Finally, and most importantly: the new Pisa regime, with its continuous cycle of global testing, harms our children and impoverishes our classrooms, as it inevitably involves more and longer batteries of multiple-choice testing, more scripted “vendor”-made lessons, and less autonomy for teachers. In this way Pisa has further increased the already high stress level in schools, which endangers the wellbeing of students and teachers.

 

These developments are in overt conflict with widely accepted principles of good educational and democratic practice:

 

• No reform of any consequence should be based on a single narrow measure of quality.

 

• No reform of any consequence should ignore the important role of non-educational factors, among which a nation’s socio-economic inequality is paramount. In many countries, including the US, inequality has dramatically increased over the past 15 years, explaining the widening educational gap between rich and poor which education reforms, no matter how sophisticated, are unlikely to redress.

 

• An organisation like OECD, as any organisation that deeply affects the life of our communities, should be open to democratic accountability by members of those communities.

 

We are writing not only to point out deficits and problems. We would also like to offer constructive ideas and suggestions that may help to alleviate the above mentioned concerns. While in no way complete, they illustrate how learning could be improved without the above mentioned negative effects:

 

1 Develop alternatives to league tables: explore more meaningful and less easily sensationalised ways of reporting assessment outcomes. For example, comparing developing countries, where 15-year-olds are regularly drafted into child labour, with first-world countries makes neither educational nor political sense and opens OECD up for charges of educational colonialism.

 

2 Make room for participation by the full range of relevant constituents and scholarship: to date, the groups with greatest influence on what and how international learning is assessed are psychometricians, statisticians, and economists. They certainly deserve a seat at the table, but so do many other groups: parents, educators, administrators, community leaders, students, as well as scholars from disciplines like anthropology, sociology, history, philosophy, linguistics, as well as the arts and humanities. What and how we assess the education of 15-year-old students should be subject to discussions involving all these groups at local, national, and international levels.

 

3 Include national and international organisations in the formulation of assessment methods and standards whose mission goes beyond the economic aspect of public education and which are concerned with the health, human development, wellbeing and happiness of students and teachers. This would include the above mentioned United Nations organisations, as well as teacher, parent, and administrator associations, to name a few.

 

4 Publish the direct and indirect costs of administering Pisa so that taxpayers in member countries can gauge alternative uses of the millions of dollars spent on these tests and determine if they want to continue their participation in it.

 

5 Welcome oversight by independent international monitoring teams which can observe the administration of Pisa from the conception to the execution, so that questions about test format and statistical and scoring procedures can be weighed fairly against charges of bias or unfair comparisons.

 

6 Provide detailed accounts regarding the role of private, for-profit companies in the preparation, execution, and follow-up to the tri-annual Pisa assessments to avoid the appearance or reality of conflicts of interest.

 

7 Slow down the testing juggernaut. To gain time to discuss the issues mentioned here at local, national, and international levels, consider skipping the next Pisa cycle. This would give time to incorporate the collective learning that will result from the suggested deliberations in a new and improved assessment model.

 

We assume that OECD’s Pisa experts are motivated by a sincere desire to improve education. But we fail to understand how your organisation has become the global arbiter of the means and ends of education around the world. OECD’s narrow focus on standardised testing risks turning learning into drudgery and killing the joy of learning. As Pisa has led many governments into an international competition for higher test scores, OECD has assumed the power to shape education policy around the world, with no debate about the necessity or limitations of OECD’s goals. We are deeply concerned that measuring a great diversity of educational traditions and cultures using a single, narrow, biased yardstick could, in the end, do irreparable harm to our schools and our students.

 

2 PISA's reply

 

The letter by Dr Heinz-Dieter Meyer and other academics (OECD and Pisa tests are damaging education worldwide – academics, theguardian.com, 6 May) makes a series of false claims regarding the Organisation for Economic Co-operation and Development's Pisa programme. There is nothing that suggests that Pisa, or other educational comparisons, have caused a "shift to short-term fixes" in education policy. On the contrary, by opening up a perspective to a wider range of policy options that arise from international comparisons, Pisa has provided many opportunities for more strategic policy design. It has also created important opportunities for policy-makers and other stakeholders to collaborate across borders. The annual International Summit of the Teaching Profession, where ministers meet with union leaders to discuss ways to raise the status of the teaching profession, is an example. Not least, while it is undoubtedly true that some reforms take time to bear fruit, a number of countries have in fact shown that rapid progress can be made in the short term, eg Poland, Germany and others making observable steady progress every three years.

 

Equally, there are no "public-private partnerships" or other "alliances" in Pisa of the type Dr Meyer implies. All work relating to the development, implementation and reporting of Pisa is carried out under the sole responsibility of the OECD, under the guidance of the Pisa governing board. The OECD does, of course, contract specific technical services out to individuals, institutions or companies. Where it does, these individuals, institutions or companies are appointed by the OECD following an open, transparent and public call for tender. This transparent and open process ensures that each task is carried out by those entities that demonstrate they are best-qualified and provide the best value for money. No individual academic, institution or company gains any advantage from this since the results of all Pisa-related work are placed in the public domain.

 

Furthermore, in the article by Peter Wilby (Pisa league tables killing 'joy of learning', 6 May) it is stated that Pearson is overseeing the Pisa 2015 assessment, which is not the case. Pearson was one of a number of contractors who have been appointed through a competitive tendering process to develop and implement Pisa 2015. Pearson's contract to develop the assessment framework has been completed and has now come to an end.

Andreas Schleicher

Acting director of education, OECD

 

 

 

Cristiano Corsini

Associate Professor of Experimental Pedagogy,

Chair of the Master's Degree Programme in Pedagogical Sciences (LM-85),

D'Annunzio University of Chieti-Pescara,

Department of Philosophical, Pedagogical and Economic-Quantitative Sciences.

 

 
