PISA studies


The PISA studies of the OECD are international assessments of student achievement that have been carried out every three years since 2000 in most OECD member states and in a growing number of partner countries. Their aim is to measure the everyday and job-relevant knowledge and skills of fifteen-year-olds. The acronym PISA is expanded differently in the two official languages of the OECD: in English as Programme for International Student Assessment and in French as Programme international pour le suivi des acquis des élèves (international programme for monitoring the learning outcomes achieved by pupils).

This article deals with the international study by the OECD. The OECD concept expressly allows participating states to add national components to the international test. Germany made use of this option to varying degrees from 2000 to 2018:

  • Ten times as many students were tested for the PISA-E comparison of federal states as for the German contribution to the international study.
  • In PISA-International-Plus , some school classes were tested a second time after one year in order to measure learning progress over the course of the ninth or tenth school year.

In addition, the OECD offers expansion modules that are used by only some of the countries.

In 2011/12, the OECD had another study carried out that examines the skills of 16 to 65 year olds: the PIAAC , referred to by the media as “PISA for adults”.

Concept

The following features differentiate PISA significantly from previous school performance assessments :

  • PISA is carried out on behalf of the governments (in Germany: the Conference of Ministers of Education , in Switzerland: the Confederation and the cantons ).
  • PISA is to be continued on a regular basis .
  • PISA examines students of an age group , not a school grade .
  • PISA does not concentrate on a single school subject, but examines three areas: reading competence, mathematical competence and basic scientific literacy. This concept of education is referred to in English as literacy or numeracy.
  • Tasks are embedded in “personally or culturally relevant contexts”.
  • PISA is not based on the intersection of national curricula , but claims to "... go beyond the measurement of school knowledge and capture the ability to use area-specific knowledge and skills to cope with authentic problems."
  • This is intended to serve the development of human capital, which the OECD defines as "the knowledge, skills, competences ... that are relevant for personal, social and economic well-being" (OECD 1999).

The OECD's contractual task is policy advice. PISA is intended not only to describe the current situation but also to trigger improvements. Insofar as PISA is based on its own educational concept, it at least implicitly claims to influence national curricula in turn.

Each PISA study covers the three areas of reading, mathematics and science. In each round, one area is examined in depth: reading literacy in 2000, mathematics in 2003, science in 2006. This cycle is currently being run through for a third time (2018, 2021, 2024). The results are published in December of the following year, the technical reports a few months later.

In addition, a cross-cutting theme is examined in each study: learning strategies and self-regulated learning in 2000, problem solving in 2003, basic information technology literacy in 2006. This additional assessment is not carried out in all states.

Test tasks

After each test round, some of the test items are "released" and published. All released tasks can also be found online on the websites of the OECD and the national project centres. The test booklets typically contain about twenty items. Each task unit consists of introductory material and one to seven subsequent tasks.

The “Lake Chad” task unit from the PISA 2000 reading test illustrates how broad the PISA literacy concept is. The introduction to this unit does not contain a reading text in the conventional sense, but consists mainly of two diagrams (“non-continuous reading material”) showing the fluctuations in water levels over the last 12,000 years and the disappearance and emergence of large animal species. This is followed by five tasks, for example:

  • Question 1: "How deep is Lake Chad today?" ( Multiple choice , five options)
  • Question 2: "In which year does the graph in Figure A start?" (Free text)
  • Question 3: "Why did the author choose this year as the beginning of the graph?" (Free text)

Questions 4 and 5 are then again in multiple choice format.

Implementation and evaluation

PISA test documents

PISA is one of several projects with which the OECD has been increasingly involved in educational monitoring since the 1990s . The coordination and final editing of the international reports are the responsibility of a small working group at the OECD headquarters in Paris, headed by the German Andreas Schleicher . Politically, the project is steered by a council of government representatives; scientifically, it is accompanied by a committee of experts and sub-committees. The creation and evaluation of the test tasks was put out to tender and awarded to a consortium of several companies in the test industry under the leadership of the Australian Council for Educational Research (ACER).

National project centers have been set up in the participating countries. A sample of at least 5,000 students is drawn in each state; in some states, especially to enable regional comparisons, a multiple of this.

The test comprises a two-hour "cognitive" test session, followed by a questionnaire session of nearly one hour. In the cognitive test, not all students work on the same tasks: in 2003, thirteen different test booklets were used (plus a short booklet in special schools in some countries). Of a total of 165 different tasks, each individual student worked on only about 50.

The student solutions are coded by trained assistants, digitally recorded and transmitted to the international project centre in Australia for further analysis. Most of the tasks are ultimately scored only as "wrong" or "right". Depending on how many students solved a task correctly, the task is assigned a certain "difficulty value". Depending on how many tasks a student solved, the student is assigned a range of "plausible" "competence values". The difficulty and competence scales are subsequently normalized so that the competence values in the OECD country mean have a mean of 500 and a standard deviation of 100. To compensate for the fact that the test booklets differed in difficulty and that individual tasks could not be scored in individual countries (for example because of printing errors), the entire "scaling" of difficulty and competence values is calculated with the aid of a complex mathematical model of student response behaviour, so-called item response theory.
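The core of this scaling can be illustrated with a minimal sketch. The snippet below uses a one-parameter (Rasch-type) response model and a naive rescaling to mean 500 and standard deviation 100; the toy ability values are hypothetical, and the real PISA procedure (conditioning, plausible values, OECD weighting) is considerably more elaborate.

```python
import numpy as np

def p_correct(theta, b):
    """Rasch model: probability that a student with ability theta
    solves an item with difficulty b (both on the logit scale)."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Toy ability estimates on the internal logit scale (hypothetical values).
abilities_logit = np.array([-1.2, -0.3, 0.0, 0.4, 1.1])

# Illustrative rescaling so that the mean is 500 and the SD is 100.
mean, sd = abilities_logit.mean(), abilities_logit.std()
pisa_scores = 500 + 100 * (abilities_logit - mean) / sd
print(pisa_scores.round(1))
```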

The task difficulty values allow a "didactic" interpretation of the test results: a student with, say, 530 competence points solves a task with a difficulty value of 530 with a probability of 62 percent (the figure of 62 percent was fixed arbitrarily). If one then looks at published sample tasks with difficulty values in the vicinity of 530, one gets an impression of what a competence value of 530 means. It must be borne in mind, however, that the test takes place under considerable time pressure (just over two minutes per task).
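A sketch of the arithmetic behind the 62 percent convention, assuming the one-parameter logistic (Rasch) model sketched above (the model choice is an assumption of this illustration, not a statement about the exact PISA implementation):

```latex
P(\text{correct}) = \frac{1}{1 + e^{-(\theta - \beta)}} = 0.62
\quad\Longleftrightarrow\quad
\theta - \beta = \ln\frac{0.62}{0.38} \approx 0.49 \text{ logits}
```

Under this reading, the reported difficulty of an item is its raw Rasch difficulty (the point where the solution probability is 50 percent) shifted upwards by roughly 0.49 logits, which is what makes the interpretation "a student whose competence value equals the item's difficulty value solves it with 62 percent probability" work.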

Almost all further evaluations are based on examining the statistical distribution of student competence scores in the participating states or more finely disaggregated populations.

Quantitative results

PISA measures student performance in points on an arbitrary scale. The point values ​​can only be interpreted when they are placed in a context. This is done regularly by comparing different countries. The reports of the OECD and its project partners consist to a large extent of country rankings.

Development of performance 2000–2015

The most elementary and most widely noted statistic summarizes student performance into mean values. The following table shows the results to date for the predominantly German-speaking countries, some other OECD countries and the non-OECD country Liechtenstein; the OECD rank is given in brackets.

Mean competence scores (values for 2000 / 2003 / 2006 / 2009 / 2012 / 2015; OECD rank in brackets)
Country | Mathematics | Reading | Natural sciences
Germany | 490 ± 3 (20), 503 ± 3 (16), 504 ± 4 (14), 513 ± 3 (10), 514, 506 | 484 ± 3 (21), 491 ± 3 (18), 495 ± 4 (14), 497 ± 3 (16), 508, 509 | 487 ± 2 (20), 502 ± 4 (15), 516 ± 4 (8), 520 ± 3 (9), 524, 509
Liechtenstein | 514 ± 7, 536 ± 4, ±, 536 ± 4, 535 | 483 ± 4, 525 ± 4, ±, 499 ± 3, 516 | 476 ± 7, 525 ± 4, ±, 520 ± 3, 525
Luxembourg | 446 ± 2 (26), 493 ± 1 (20), 490 ± 1 (22), 489 ± 1 (24), 490, 486 | 441 ± 2 (26), 479 ± 2 (23), 479 ± 1 (22), 472 ± 1 (30), 488, 481 | 443 ± 2 (26), 483 ± 1 (24), 486 ± 1 (25), 484 ± 1 (29), 491, 483
Austria | 515 ± 3 (11), 506 ± 3 (15), 505 ± 4 (13), 496 ± 3 (18), 506, 497 | 507 ± 2 (10), 491 ± 4 (19), 490 ± 4 (14), 470 ± 3 (31), 490, 485 | 519 ± 3 (8), 491 ± 3 (20), 511 ± 4 (12), 494 ± 3 (24), 506, 495
Switzerland | 529 ± 4 (7), 527 ± 3 (7), 530 ± 3 (4), 534 ± 3 (3), 531, 521 | 494 ± 4 (17), 499 ± 3 (11), 499 ± 3 (11), 501 ± 2 (11), 509, 492 | 496 ± 4 (18), 513 ± 4 (9), 512 ± 3 (11), 517 ± 3 (10), 515, 506
Belgium | 520 ± 4 (9), 529 ± 2 (6), 520 ± 3 (8), 515 ± 2 (8), 515, 507 | 507 ± 4 (11), 507 ± 3 (9), 501 ± 3 (10), 506 ± 2 (8), 509, 499 | 496 ± 4 (17), 509 ± 3 (11), 510 ± 3 (13), 507 ± 3 (15), 505, 502
Finland | 536 ± 2 (4), 544 ± 2 (1), 548 ± 2 (1), 541 ± 2 (2), 519, 511 | 546 ± 3 (1), 543 ± 2 (1), 547 ± 2 (2), 536 ± 2 (2), 524, 526 | 538 ± 3 (3), 548 ± 2 (1), 563 ± 2 (1), 554 ± 2 (1), 545, 531
France | 517 ± 3 (10), 511 ± 3 (13), 496 ± 3 (17), 497 ± 3 (16), 495, 493 | 505 ± 3 (14), 496 ± 3 (14), 488 ± 4 (17), 496 ± 3 (18), 505, 499 | 500 ± 3 (12), 511 ± 3 (10), 495 ± 3 (19), 498 ± 4 (21), 499, 495
Italy | 457 ± 3 (24), 466 ± 3 (26), 462 ± 2 (27), 483 ± 2 (29), 485, 490 | 487 ± 3 (20), 476 ± 3 (25), 469 ± 2 (24), 486 ± 2 (23), 490, 485 | 478 ± 3 (23), 483 ± 3 (24), 475 ± 2 (26), 489 ± 2 (27), 494, 481
Japan | 557 ± 6 (1), 534 ± 4 (4), 523 ± 3 (6), 529 ± 3 (4), 536, 532 | 522 ± 5 (8), 498 ± 4 (12), 498 ± 4 (12), 520 ± 4 (5), 538, 516 | 550 ± 6 (2), 548 ± 4 (2), 531 ± 3 (3), 539 ± 3 (2), 547, 538
Canada | 533 ± 1 (6), 532 ± 2 (5), 527 ± 2 (5), 527 ± 2 (5), 518, 516 | 534 ± 2 (2), 528 ± 2 (3), 527 ± 2 (3), 524 ± 2 (3), 523, 527 | 529 ± 2 (5), 519 ± 2 (8), 534 ± 2 (2), 529 ± 2 (5), 525, 528
Mexico | 387 ± 3 (27), 385 ± 4 (29), 406 ± 3 (30), 419 ± 2 (34), 413, 408 | 422 ± 3 (27), 400 ± 4 (29), 410 ± 3 (29), 425 ± 2 (34), 424, 423 | 422 ± 3 (27), 405 ± 3 (29), 410 ± 3 (30), 416 ± 2 (34), 415, 416
Netherlands | disq., 538 ± 3 (3), 531 ± 3 (3), 526 ± 5 (6), 523, 512 | disq., 513 ± 3 (8), 507 ± 3 (9), 508 ± 5 (7), 511, 503 | disq., 524 ± 3 (5), 525 ± 3 (6), 522 ± 5 (8), 522, 509
Turkey | k. T., 423 ± 7 (28), 424 ± 5 (29), 445 ± 4 (32), 448, 420 | k. T., 441 ± 6 (28), 447 ± 4 (28), 464 ± 4 (32), 475, 428 | k. T., 434 ± 6 (28), 424 ± 4 (29), 454 ± 4 (32), 463, 425
United States | 493 ± 8 (19), 483 ± 3 (24), 474 ± 4 (25), 487 ± 4 (25), 481, 470 | 504 ± 7 (15), 495 ± 3 (15), ± ( ), 500 ± 4 (14), 498, 497 | 499 ± 7 (14), ± ( ), 489 ± 4 (21), 502 ± 4 (17), 497, 496

Shanghai (China) took first place in all subjects in 2009 and 2012; in 2015, Singapore took first place. In addition to Finland, Japan and Canada, South Korea, New Zealand, Australia and Hong Kong (which does not belong to the OECD) are regularly in the top group. Italy, Portugal, Greece and Luxembourg are regularly found at the bottom of the table, ahead of Turkey and Mexico.

When breaking down by language groups, the following is noticeable:

  • In Belgium, performance in the Dutch-speaking part of the country is much better than in the French-speaking part; it is often even above the results of the Netherlands, in the international top group.
  • In Switzerland, the differences between the German- and French-speaking groups are rather small; Italian-speaking Switzerland lags somewhat behind.
  • The results from South Tyrol are remarkable and consistently in the international top group. Schools with German as the language of instruction did slightly better than the Italian-language ones.
  • In Finland, the approximately five percent Swedish-speaking minority scores 10 to 35 points worse than the Finnish-speaking majority.
  • In Canada, the English-speaking majority scores significantly better than the French-speaking minority.

The results from Liechtenstein have larger error bars because hardly more than 350 fifteen-year-olds live there. At least the various sampling problems do not arise, because, as in Luxembourg, all students were tested. In addition, Liechtenstein is the only country that is tested not by a national organization but by the St. Gallen University of Education in neighbouring Switzerland.

On the major differences between the German federal states → PISA-E .

The correlation with the TIMSS studies, which continue in parallel to PISA in some countries, is only moderate; officially this is explained by differences in content and by normalization effects due to different sets of participants.

Levels of competence and risk groups

In order to give the numerical results a tangible meaning, the consortium divides the point scale, by arbitrary cuts, into six "competence levels" plus one level below them. Based on the tasks that can be solved at a given level, a verbal description is worked out of what students at that level can typically do. Note that the proportion of students at a given level, averaged across the OECD, is fixed by the construction of the difficulty and competence scales; only the mostly minor differences between states can be interpreted.
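Purely as an illustration, a score can be mapped to a level as in the sketch below. The cut scores are rounded values roughly in the spirit of the PISA 2000 reading scale and are an assumption of this sketch, not taken from the article; the number of levels and the exact cut points differ by domain and cycle.

```python
# Illustrative (assumed) reading cut scores; each tuple is (upper bound, level).
CUTS = [(335, "below 1"), (408, "1"), (481, "2"), (553, "3"), (626, "4")]

def reading_level(score: float) -> str:
    """Return the competence level a reading score falls into."""
    for upper, level in CUTS:
        if score < upper:
            return level
    return "5"

print(reading_level(370))  # -> "1"
print(reading_level(530))  # -> "3"
```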

Students "below" level 1 are referred to internationally as "at risk". The German project management, however, widened the term "risk group" to include level 1 as well. In parts of the public this was shortened and, contrary to the statements in the international reports, received as if almost a quarter of all fifteen-year-olds were unable to calculate or read in any meaningful way.

Influence of social background

After the two-hour test session to measure cognitive performance, the students work on a questionnaire with questions about family background, the school environment, learning habits and the like. The official result reports and numerous secondary studies show how these context variables affect cognitive test performance.

In PISA 2000 it appeared that the connection between test results and parental occupation was stronger in Germany than in any other country. This result was not replicated in the following rounds: the strongest correlation was found in Hungary in 2003 and in the Czech Republic in 2006. The German parameters (quantile differences, gradients and correlation coefficients of test performance as a function of an occupational classification or a socio-economic-cultural index) were mostly in the upper part of a broad middle field; some of the deviations from the OECD average were statistically insignificant. A sketch of what such a "gradient" measures follows below.
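The "gradient" referred to here is essentially a regression slope of test score on a socio-economic index. A minimal sketch with made-up numbers (both the index values and the scores are hypothetical, not PISA data):

```python
import numpy as np

# Hypothetical data: socio-economic index (ESCS-like, mean ~0, SD ~1) and test scores.
escs = np.array([-1.5, -0.8, -0.2, 0.0, 0.4, 0.9, 1.3])
score = np.array([430.0, 455.0, 480.0, 500.0, 505.0, 540.0, 560.0])

# Social gradient: OLS slope in score points per index unit, plus the correlation.
slope, intercept = np.polyfit(escs, score, 1)
corr = np.corrcoef(escs, score)[0, 1]
print(f"gradient: {slope:.1f} points per index unit, r = {corr:.2f}")
```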

These evaluations are based on different social indices, some of which take into account only the parents' occupation, while others also include their educational qualifications and the cultural possessions in the household. The German consortium and the international project management disagree about the appropriate quantification of social background; the German report on PISA 2006 uses a different index throughout than the international report.

A further problem of interpretation is that social status and immigrant background are highly correlated. The data alone do not show to what extent the poorer performance of migrant children is due to their below-average social situation or, for example, to inadequate linguistic integration.

In Germany it is striking that first-generation immigrants (454 points in mathematics in 2003) do better on average than children born in the country to immigrant parents (second generation, 432 points); non-immigrant students scored 525 (the OECD-wide figures in the same order: 475, 483, 523). From this it was sometimes concluded that second-generation children in Germany generally perform worse than first-generation children. A further breakdown, however, explains this paradoxical result by the fact that the proportions of the most important countries of origin differ considerably between the first and second generation (e.g. a larger share of young people of Turkish origin within the second generation; cf. the remarks on students with a migration background). For the same country of origin, the results of the second generation are consistently better than those of the first.
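The composition effect described here can be made concrete with a tiny sketch. The shares and scores below are purely hypothetical; they only demonstrate how the overall second-generation mean can fall even though every origin group improves.

```python
# Hypothetical illustration: within each country of origin the second generation
# scores higher, yet the overall second-generation mean is lower because the
# group shares differ between the generations.
first_gen  = {"origin A": (0.7, 480), "origin B": (0.3, 390)}   # (share, mean score)
second_gen = {"origin A": (0.4, 490), "origin B": (0.6, 400)}

def overall_mean(groups):
    return sum(share * score for share, score in groups.values())

print(overall_mean(first_gen))   # 453.0
print(overall_mean(second_gen))  # 436.0, lower overall despite higher scores within groups
```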

The poor performance of Turkish-origin young people is a quantitatively significant problem (mathematics 2003: first generation 382, second generation 411). Surprisingly, students with a migration background performed slightly better on language-heavy tasks than on largely language-free ones; the reasons for this are unclear.

Gender-specific performance divergences

The results from PISA 2003 show that girls have a considerable lead in reading (34 points internationally, 42 points in Germany). The lead of boys in mathematics is about a quarter of that (11 points internationally, 9 points in Germany). Girls are ahead in problem solving (2 points internationally, 6 points in Germany), although this difference is not significant. No significant gender difference was found in science either (boys lead by 6 points internationally and by 6 points in Germany). Wuttke (2007) argues that this result is due to the mixture of tasks from different areas; in line with national tests that are more closely tied to the curriculum, he finds the everyday observation confirmed that boys on average achieve higher results in physics and girls in biology.

PISA 2009 showed that the competence differences between girls and boys in Germany have remained practically unchanged since PISA 2000. The OECD report Equally prepared for life? How 15-year-old boys and girls perform in school examines gender-specific performance differences and is based primarily on the results of PISA 2009 (as well as the IGLU and TIMSS studies). The researchers conclude that gender biases ("role models" and "stereotypes" in the language of PISA) influence girls' educational outcomes. Girls have a lower self-assessment of their mathematical competence because stereotypes and wrong role models impose this on them, and therefore perform worse, not the other way round. The decision about girls' future educational and career paths is, according to the report, influenced above all by social developments and less by the girls and young women themselves. The language weaknesses diagnosed more frequently among boys are, where they are discussed at all, explained above all by their pedagogically incompatible behaviour: boys do not fit into a feminized learning environment such as school and are more likely to reject activities such as reading that are regarded as "unmanly".

Results of the PISA 2015 study

The results of PISA 2015 were presented on December 6, 2016; a total of 600,000 students from 72 countries took part. The four lists below give the country rankings for mathematics, natural sciences, reading comprehension and the overall average.
Mathematics:
1. Singapore 564
2. Hong Kong 548
3. Macau 544
4. Taiwan 542
5. Japan 532
6. People's Republic of China * 531
7. South Korea 524
8. Switzerland 521
9. Estonia 520
10. Canada 516
11. Netherlands 512
12. Denmark 511
12. Finland 511
14. Slovenia 510
15. Belgium 507
16. Germany 506
17. Ireland 504
17. Poland 504
19. Norway 502
20. Austria 497
21. New Zealand 495
22. Vietnam 495
23. Australia 494
23. Russia 494
23. Sweden 494
26. France 493
27. United Kingdom 492
28. Czech Republic 492
29. Portugal 492
30. Italy 490
OECD average 490
31. Iceland 488
32. Spain 486
33. Luxembourg 486
34. Latvia 482
35. Malta 479
36. Lithuania 478
37. Hungary 477
38. Slovakia 475
39. Israel 470
39. United States 470
41. Croatia 464
42. Kazakhstan 460
43. Argentina 456
44. Greece 454
45. Malaysia 446
46. Romania 444
47. Bulgaria 441
48. Cyprus 437
49. United Arab Emirates 427
50. Chile 423
51. Turkey 420
51. Moldova 420
53. Uruguay 418
54. Montenegro 418
55. Trinidad and Tobago 417
56. Thailand 415
57. Albania 413
58. Mexico 408
59. Georgia 404
60. Qatar 402
61. Costa Rica 400
62. Lebanon 396
63. Colombia 390
64. Peru 387
65. Indonesia 386
66. Jordan 380
67. Brazil 377
68. Macedonia 371
69. Tunisia 367
70. Kosovo 362
71. Algeria 360
72. Dominican Republic 328
Natural sciences:
1. Singapore 556
2. Japan 538
3. Estonia 534
4. Taiwan 532
5. Finland 531
6. Macau 529
7. Canada 528
8. Vietnam 525
9. Hong Kong 523
10. People's Republic of China * 518
11. South Korea 516
12. New Zealand 513
12. Slovenia 513
14. Australia 510
15. Germany 509
15. Netherlands 509
15. United Kingdom 509
18. Switzerland 506
19. Ireland 503
20. Belgium 502
20. Denmark 502
22. Poland 501
22. Portugal 501
24. Norway 498
25. United States 496
26. France 495
26. Austria 495
28. Sweden 493
28. Spain 493
28. Czech Republic 493
OECD average 493
31. Latvia 490
32. Russia 487
33. Luxembourg 483
34. Italy 481
35. Hungary 477
36. Argentina 475
36. Lithuania 475
36. Croatia 475
39. Iceland 473
40. Israel 467
41. Malta 465
42. Slovakia 461
43. Kazakhstan 456
44. Greece 455
45. Chile 447
46. Bulgaria 446
47. United Arab Emirates 437
48. Romania 435
48. Uruguay 435
50. Cyprus 433
51. Malaysia 431
52. Moldova 428
53. Albania 427
54. Turkey 425
55. Trinidad and Tobago 425
56. Thailand 421
57. Costa Rica 420
58. Qatar 418
59. Colombia 416
59. Mexico 416
61. Georgia 411
61. Montenegro 411
63. Jordan 409
64. Indonesia 403
65. Brazil 401
66. Peru 397
67. Lebanon 386
67. Tunisia 386
69. Macedonia 384
70. Kosovo 378
71. Algeria 376
72. Dominican Republic 332
Reading comprehension:
1. Hong Kong 537
2. Canada 537
3. Singapore 535
4. Finland 526
5. Ireland 521
6. Estonia 519
7. South Korea 517
8. Japan 516
9. Norway 513
10. Germany 509
10. Macau 509
10. New Zealand 509
13. Poland 506
14. Slovenia 505
15. Australia 503
15. Netherlands 503
17. Denmark 500
17. Sweden 500
19. Belgium 499
19. France 499
21. Portugal 498
21. United Kingdom 498
23. Taiwan 497
23. United States 497
25. Spain 496
26. Russia 495
27. People's Republic of China * 494
OECD average 493
28. Switzerland 492
29. Latvia 488
30. Croatia 487
30. Czech Republic 487
30. Vietnam 487
33. Italy 485
33. Austria 485
35. Iceland 482
36. Luxembourg 481
37. Israel 479
38. Argentina 475
39. Lithuania 472
40. Hungary 470
41. Greece 467
42. Chile 459
43. Slovakia 453
44. Malta 447
45. Cyprus 443
46. Malaysia 443
47. Uruguay 437
48. Romania 434
48. United Arab Emirates 434
50. Bulgaria 432
51. Turkey 428
52. Costa Rica 427
52. Kazakhstan 427
52. Montenegro 427
52. Trinidad and Tobago 427
56. Colombia 425
57. Mexico 423
58. Moldova 416
59. Thailand 409
60. Jordan 408
61. Brazil 407
62. Albania 405
63. Qatar 402
64. Georgia 401
65. Peru 398
66. Indonesia 397
67. Tunisia 361
68. Dominican Republic 358
69. Macedonia 352
70. Algeria 350
71. Kosovo 347
71. Lebanon 347
Overall average:
1. Singapore 552
2. Hong Kong 536
3. Japan 529
4. Canada 527
4. Macau 527
6. Estonia 524
6. Taiwan 524
8. Finland 523
9. South Korea 519
10. People's Republic of China * 514
11. Ireland 509
11. Slovenia 509
13. Germany 508
13. Netherlands 508
15. New Zealand 506
15. Switzerland 506
17. Denmark 504
17. Norway 504
17. Poland 504
20. Belgium 503
21. Australia 502
21. Vietnam 502
23. United Kingdom 500
24. Portugal 497
25. France 496
26. Sweden 496
27. Austria 492
27. Russia 492
27. Spain 492
OECD average 492
30. Czech Republic 491
31. United States 488
32. Latvia 487
33. Italy 485
34. Luxembourg 483
35. Iceland 480
36. Lithuania 475
36. Croatia 475
36. Hungary 475
39. Israel 472
40. Argentina 469
41. Malta 464
42. Slovakia 463
43. Greece 459
44. Kazakhstan 448
45. Chile 443
46. Bulgaria 440
46. Malaysia 440
48. Romania 438
48. Cyprus 438
50. United Arab Emirates 433
51. Uruguay 430
52. Turkey 424
53. Trinidad and Tobago 423
54. Moldova 421
55. Montenegro 419
56. Costa Rica 416
56. Mexico 416
58. Albania 415
58. Thailand 415
60. Colombia 410
61. Qatar 407
62. Georgia 405
63. Jordan 399
64. Brazil 395
64. Indonesia 395
66. Peru 394
67. Lebanon 376
68. Tunisia 371
69. Macedonia 369
70. Algeria 362
70. Kosovo 362
72. Dominican Republic 339

* PISA tests were only carried out in the provinces of Guangdong and Jiangsu and the cities of Shanghai and Beijing , which together have a population of around 200 million.


Results of the PISA 2009 study in the main subject of reading literacy

As in the first study from 2000, the focus of the 2009 PISA study was on reading literacy. The comparison with the first study documents a positive development on a broad basis for Germany:

  1. Reading literacy has increased significantly since 2000. This is mainly due to the decline in the proportion of young people with very poor reading skills.
  2. In 2009, young people with a migration background achieved significantly better reading skills, at 470 points. The increase compared with 2000 is 26 points. The gap to young people without a migration background (514 points on average) has narrowed significantly.
  3. The social gradient, i.e. the dependence of skills on the social background, has decreased significantly compared to 2000. In this regard, Germany is now moving in line with the OECD average. Children from working-class families and children from families whose parents carry out routine activities benefited in particular. The development was not at the expense of the young people from education-related groups: They continued to perform well.
  4. The relationship between family language and reading skills has decreased. This speaks in favor of better compensation for disadvantages by schools.
  5. The joy of reading has increased significantly.
  6. The expansion of the grammar school did not lead to a decrease in performance at this school, but to an increase in the competencies of the student body as a whole.

In some areas the positive developments are less pronounced:

  1. There is still a very large difference in reading skills between boys and girls. This gap has widened slightly, though not significantly; the increase is due to girls handling non-linear texts better. Boys, on the other hand, achieve higher mathematical and scientific competence on average; the gender differences there are less pronounced, amounting to only about a quarter of the girls' lead in reading.
  2. Despite its flattening, the social gradient is still high.
  3. The enormous improvement in reading skills in children with a migration background is also present in young people with a Turkish background, but with 18 points it is significantly less pronounced than in the other young people with a migration background.
  4. The majority of young people still say that they do not read for pleasure.

Reception

PISA 2000, PISA 2003 and PISA 2006 generated a great deal of media coverage in some participating states; in Germany, the word "PISA" has become the epitome of all problems of the education system.

In Germany

Germany did not take part in any international school comparison in the 1970s and 1980s. The change of direction began with participation in the mathematics study TIMSS in 1995. The mediocre results were discussed extensively by education politicians and subject-teacher associations, but reached the general public only briefly.

The publication of the first PISA results at the end of 2001 was preceded by several weeks of advance reporting and achieved such overwhelming media coverage that there was soon talk of a "PISA shock", a phrase recalling the "Sputnik shock" and the 1960s debate over the "educational catastrophe" conjured up by Georg Picht.

At the end of 2002, PISA hit the headlines again when a performance comparison between the German federal states was published for the first time. The north-south divide came as relatively little surprise; many commentators put it into perspective by pointing out that the Abitur rate in Bavaria is comparatively low and that a certain statistical measure of social selectivity is particularly high there (see the main article PISA-E).

A few days before the publication of PISA 2006 (partial results had again leaked to the press), the conflict between the OECD and the German project group escalated. Andreas Schleicher referred to the OECD report, according to which no improvements beyond the statistical error margin had been achieved in the three areas of science, mathematics and reading, nor in the coupling of results with social origin. The science test had essentially been redesigned, while performance on the few test items used in both 2003 and 2006 remained unchanged; environmental topics were said to have favoured Germany in 2006. The German project leader Manfred Prenzel, by contrast, maintained that the results could very well be compared. These differing assessments also appear in the official reports.

CDU education ministers, in particular the Hessian minister Karin Wolff, took this as an occasion to demand Schleicher's dismissal. The immediate accusation was that he had violated a self-imposed embargo by commenting before publication. The underlying allegations against him were:

  • Schleicher assumes the role of a "supervisor" who prescribes how the data may be interpreted.
  • A great deal of money is being spent on measuring trends, only for it now to emerge how problematic comparisons over three and six years are.

The OECD rejected these allegations: the interpretation corresponded to the results of the study, and comparability over the years was certainly given, just not in the newly tested individual areas. From the outset it had been planned that one area would be tested in depth every three years: after reading in 2000, this was mathematics in 2003 and science in 2006.

Some CDU education ministers publicly considered withdrawing from the PISA study. The education monitor of the business-oriented Initiative New Social Market Economy and the Institute for Quality Development in Education were named as alternatives. At the same time, the OECD threatened the German project leader with withdrawing the "PISA" label from the comparative PISA-E test if he scaled the data with his own method, as the result would then have nothing to do with PISA.

The PISA conflict is part of a wider dispute between conservative education politicians in Germany and international organizations. The recurring point of contention is the tripartite school system and the question of whether it disadvantages migrant children and children from the lower classes. In December 2006 a corresponding paper by the EU Commission was rejected, and in spring 2007 the UN special rapporteur on the right to education, Vernor Muñoz, was sharply attacked for his critical report. In the summer of 2007, the OECD's annual international study Education at a Glance was likewise described by conservative teachers' associations as "ideological". A major problem with the PISA study in this respect is that the OECD only distinguishes between students born in Germany and those born abroad; the school success of migrant children and of children with "purely German roots" can therefore only be compared to a limited extent.

In Austria

In Austria, the "PISA shock" arrived with a delay: after the belief in 2000 that the country had performed significantly better than Germany, the 2003 result was perceived as a "crash". Education Minister Elisabeth Gehrer thereupon commissioned a review by the statisticians Erich Neuwirth, Ivo Ponocny and Wilfried Grossmann, whose investigation report, published in 2006, uncovered numerous inconsistencies in the sampling and data analysis. In particular, the 2000 sample in vocational schools had not been drawn correctly. In a foreword to this research report, the OECD's PISA coordinator Andreas Schleicher played down the corrections and claimed that the OECD had previously pointed out that an interpretation of the Austrian results was only "permissible to a limited extent". After a further delay, in early 2007 the OECD adopted the results of Neuwirth et al. almost unchanged into the international dataset and thus officially corrected the Austrian results for 2000 downwards. According to these corrected data there never was a crash: in both 2000 and 2003 Austrian student performance was similarly mid-table, with no statistically significant changes.

The apparent miscalculation of the 2000 PISA result is revealing because the alleged PISA crash from 2000 to 2003 was supposed to certify that the newly installed conservative government had caused the decline through its education policy. The conservative education minister Elisabeth Gehrer (ÖVP), who resigned after the elections of October 2006, sought to refute the supposed decline by means of the review she had commissioned, which revised downwards the 2000 results achieved under a social democratic government. In both cases the Austrian PISA result was instrumentalized for party politics, first calculated too high and then corrected downwards, without the PISA regulations having specified clearly enough how results are to be calculated correctly and samples standardized worldwide.

The result of the 2009 PISA study also raises fundamental questions about validity and informative value. The PISA consortium officially admitted that numerous questionnaires showed clear signs of a boycott called for by the teachers' unions. The Austrian sample was nevertheless evaluated after those questionnaires had been removed. The result in all three areas was significantly worse than in the test rounds before (2000–2006) and after (2012), which suggests that the boycott distorted the result. In cross-year trend comparisons, the OECD now excludes the Austrian data for 2009 and reports them as missing without giving a reason.

As in Germany, social differences are striking. The governing parties (ÖVP and FPÖ) preferred to point to the poor German language skills of foreign children. The then minister Gehrer additionally pointed to misconduct by parents who did not look after their children enough. The opposition (SPÖ and Greens) suggested introducing a comprehensive school in place of the highly differentiated school system, an idea strongly influenced by the Finnish model: in the Finnish school system there are extreme performance differences within schools, but hardly any between schools. In Austria, by contrast, the opposite was found.

A technical-statistical problem that became acute in the Austrian case still appears to be unsolved: PISA is not carried out according to uniform criteria in the individual countries. In Austria, apprentices, migrants without knowledge of German and special-school students are also tested. For PISA 2009 in Tyrol, for example, only three students with a migration background were drawn at the Neustift secondary school; they had not had a regular school career, had lived in Austria for only a few years and, at the age of 16, hardly seemed to belong to the target population.

In South Tyrol

Since 2003, the Italian PISA centre INVALSI has enabled the regions and autonomous provinces to take part in PISA with a sample large enough for a separate evaluation. For the Autonomous Province of Bolzano-South Tyrol, the German-language test booklets are taken from Austria. The province regularly achieves results well above the Italian national average.

In Luxembourg

Luxembourg ranks below the OECD average in the three areas examined (reading, mathematics and science) and behind most other European countries. When interpreting the results, the following background factors, among others, are important:

  • The proportion of foreigners is 44.5% (2013, statec.lu).
  • Multilingualism is very pronounced (Luxembourgish, German, French, English and often another language such as Portuguese).
  • The entire student body was tested, so no bias from a favourable sample selection is possible.

The Finland model

Finland was generally seen as the “test winner” in public reception in Germany and Austria. Numerous explanations for Finland's excellent performance have been suggested (see also: Finland's education system ):

  • a reading tradition rooted in the Reformation ,
  • high motivation to learn to read through films in the original language with subtitles in television and cinema,
  • Community feeling in a small country: every individual is important
  • comparatively low social differences in the population,
  • due to the comparatively low immigration, few problems with immigration-related lack of language skills,
  • an unstructured comprehensive school system ,
  • excellent staffing of schools, including social pedagogues ; where necessary, a second teacher comes to class,
  • Higher quality of teachers: Teachers are selected from the top 10 percent of a year in an extensive process before, during and after their studies.
  • Class sizes of usually less than 20 students,
  • excellent material equipment of the schools: friendly buildings, library , canteen,
  • extensive autonomy of schools combined with effective quality control . Instead of prescribing detailed curricula, the Finnish education bureaucracy restricts itself to setting learning objectives and developing nationwide tests to check how well the objectives have been achieved.
  • Familiarity with standardized tests.

The enthusiasm for Finland also provoked critical voices, which pointed out that alcoholism is widespread among Finnish schoolchildren and that the suicide rate is alarming. Statistically, Finland's good performance is put into perspective as soon as demographic and, above all, social background variables are controlled for.

The school structure debate

Supporters of the comprehensive school used the PISA results for a new edition of the German and Austrian school structure debate. They referred in particular to:

  • the excellent performance of Finland and some other countries,
  • the above-average correlation between the German test results and the social or migration background,
  • the strong correlation between choice of school type and family background.

Opponents object that the PISA results are by no means clear:

  • Among the "test losers", too, there are countries with comprehensive school systems.
  • In the intra-German comparison, federal states that, like Bavaria, consistently maintain a tracked school system with strict admission requirements for higher-level schools do best.
  • The conditions in Germany and Finland are not comparable for a number of reasons; it is completely speculative to attribute the Finnish success primarily to the school structure.
  • The teachers in Finland are selected differently and better trained than in Germany, so that it is not the type or structure of the school but the quality of the teachers that has a say in the educational standard.

Political reactions

As a direct reaction to the PISA shock, the German education ministers decided to develop nationwide "educational standards" and to found the Institute for Quality Development in Education (IQB), which operationalized these standards in the form of test tasks.

Impact on schools

It was politically intended from the start that PISA should have an impact on everyday school life. Mathematics educators involved in the project hoped, for example, to push through their idea of more meaningful teaching (Meyerhöfer in Jahnke/Meyerhöfer 2007). The influence of the PISA sample tasks becomes tangible, for example, where new mathematics curricula place greater emphasis on working with graphics and tables.

Criticism

The PISA studies not only generated exceptional media coverage, but also heated scientific debates. Due to the complexity of the subject, criticism is an interdisciplinary undertaking in which educators, psychologists and other scientists with statistical expertise (mathematicians, physicists, economists) participate. Depending on their origin, they have published their comments in widely scattered, sometimes remote places. It was only after a certain delay that the first edited volumes appeared, which bundled the previously scattered criticism (Jahnke / Meyerhöfer 2006, expanded 2007; Hopmann / Brinek / Retzl 2007).

In May 2014, education professor Heinz-Dieter Meyer (State University of New York) and school principal Katie Zahedi published an open letter pointing out the negative consequences of PISA and calling for the three-year test cycle to be interrupted in order to pause for thought; Hundreds of pedagogy professors , representatives of teachers' associations and prominent intellectuals (including Noam Chomsky , Konrad Liessmann ) joined this call .

Purpose of PISA

PISA's utilitarian educational goal is criticized in particular by francophone authors: first, it distorts the test results in favour of Anglo-Saxon countries, and second, it creates pressure to adapt curricula so that skills directly relevant to everyday life are given greater weight. This threatens, for example, the specific character of French mathematics teaching, which attaches great importance to rigorous proof. In this context, reference is made to the economic goals of the OECD and to the lack of transparency and democratic legitimacy in PISA's decision-making processes. A related objection is that PISA, with its focus on mathematics, the mother tongue and the natural sciences, promotes the marginalization of social science and arts subjects.

The mathematics educator Thomas Jahnke criticizes the basic idea of wanting to "standardize" education (cf. educational standards) and interprets PISA as the opening-up of a market for the testing industry. The philosopher Konrad Paul Liessmann criticizes PISA as an economistic attempt to abolish (humanistic) education altogether and to replace it with mere knowledge (as opposed to education). He deplores the transformation of the school as an educational institution into a vocational school for children, and with it the end of the conscious and intellectual person and their reduction to employee and consumer.

Scientists who were centrally involved in the PISA study, such as the educational researcher Eckhard Klieme, readily confirm in this context that the study is also an instrument of educational policy and that the OECD pursues an agenda with PISA. They counter, however, that as responsible scientists they make their own contributions to the public debate in order not to let PISA be instrumentalized, and that all results are openly available. In the end, they argue, the studies are nevertheless an "instrument of education that tells us a lot about the problems of our education system and contributes to more honesty and transparency".

Methodology: validity of the instruments

Following the tests in 2000 and 2003, only a small proportion of the tasks used (the "instruments" in the language of psychology) were published. Numerous authors have criticized these sample tasks, particularly the mathematics educator Meyerhöfer. In a didactic analysis using methods of objective hermeneutics, he argues that PISA does not meet its own claim of testing a specific "mathematical literacy".

The translation problem, unresolved since the very first comparative school studies, distorts international comparisons in various ways:

  • Origin of the tasks (mainly from the Anglo-Saxon area and the Netherlands)
  • different readability of different languages
  • Texts tend to become longer in translation; task texts are around 16% longer in German than in English (Puchhammer in Hopmann/Brinek/Retzl 2007).
  • When translators understand a task, they tend to build in help (Freudenthal 1975).
  • When translators do not see all the pitfalls, the translated task can become considerably more difficult.
  • Manifest translation errors have occurred in individual tasks.

Another problem is differing familiarity with the task format. Meyerhöfer speaks here of "test ability"; the significance of "testwiseness" has long been discussed in the USA. Wuttke (2007) found that up to 10 percent of German-speaking students do not understand the multiple-choice format and tick more than one answer option.

In addition, it must be borne in mind that the same tasks cannot be regarded as equally difficult in different linguistic and cultural contexts. Differences in results are therefore to be expected that reflect not the ability of the test-takers but the linguistic translatability and interpretability of the tasks.

Methodology: validity of statistics

When evaluating PISA and similar studies, the basic problem arises that performance differences within each country are much greater than typical differences between countries. A measurement accuracy in the lower percentage range is therefore required to make statistically significant statements about such differences (a sketch of this arithmetic follows the list below). In PISA this is formally achieved by very large samples (around 5,000 students per state). However, the official standard errors do not take into account possible systematic biases (Wuttke 2007). Such distortions are caused, among other things, by:

  • Unreliable initial data (there are no original lists with all fifteen-year-olds; the sampling is extremely complicated and cannot be checked).
  • Performance-related willingness to participate (in 2007 it became known that participation rates in some states were so low that students were rewarded with up to 50 dollars or a day off school for taking part).
  • inconsistent exclusion of students with learning disabilities (Hörmann in Hopmann / Brinek / Retzl 2007)
  • Some countries, including Finland, have excluded dyslexics from the test (OECD: Technical Reports).
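As announced above, a crude sketch of how the official standard errors translate into significance statements about country differences. It assumes independent samples and ignores exactly the systematic biases listed above, which is one reason such comparisons can be over-optimistic; the two example countries and their values are hypothetical.

```python
import math

def significant_difference(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Naive z-test: do two country means differ by more than the error margin?
    Ignores the systematic biases discussed above."""
    diff = mean_a - mean_b
    se_diff = math.sqrt(se_a**2 + se_b**2)
    return abs(diff) > z_crit * se_diff, diff, se_diff

# Hypothetical example: 506 +/- 3 points versus 512 +/- 3 points.
print(significant_difference(506, 3, 512, 3))
# -> (False, -6, 4.24...): the 6-point gap lies within the error margin.
```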

Interpretation of the results

From a systems-theoretical point of view it is criticized that the system boundaries in PISA are not drawn appropriately (countries are compared even if they do not have a uniform education system); that equating student performance with school-system performance is not justified (because besides the school system there are a number of other input variables); and that a system comparison can at best provide food for thought, not political recommendations for action (Bank 2008). Specifically, one may doubt the model character of the Finnish school system if one controls for the input variable "migrant share" and compares it not with a fictitious all-German school system but specifically with the Saxon or Bavarian one (Bender 2006). If the PISA successes are set in relation to education expenditure, the Finnish system even appears comparatively inefficient (Bank 2008).

The educational goal of "literacy" postulated by PISA blurs the boundaries between the individual test areas, and the results are highly correlated. Heiner Rindermann (2006) therefore argues that PISA can, to a good approximation, be interpreted as an intelligence test.

Further analysis

The Criminological Research Institute of Lower Saxony (KFN) found in 2007 that the groups that performed worst in PISA are those characterized by the highest media consumption.

Literature

PISA studies, Germany

  • J. Baumert et al: PISA 2000. Basic competences of schoolchildren in an international comparison. German PISA consortium. Leske + Budrich, Opladen 2001, ISBN 3-8100-3344-8 .
  • M. Prenzel et al. (Ed.): PISA 2003. Results of the second international comparison. German PISA consortium. Summary. Leibniz Institute for Science Education, Kiel 2004 ( short version ( memento from June 18, 2006 in the Internet Archive ); PDF; 235 kB)
  • M. Prenzel et al. (Ed.): PISA 2003. The educational level of young people in Germany - results of the second international comparison. Waxmann, Münster 2004, ISBN 3-8309-1455-5 .
  • M. Prenzel et al. (Ed.): PISA 2006. The results of the third international comparative study. Waxmann, Münster a. a. 2007, ISBN 978-3-8309-1900-1 .
  • OECD: PISA 2006 - School Achievement in International Comparison - Scientific Competencies for Tomorrow's World W. Bertelsmann Verlag, Bielefeld 2008, ISBN 978-3-7639-3582-6 .
  • PISA Consortium Germany (Ed.): PISA 2006 in Germany. The skills of young people in the third country comparison. Waxmann, Münster a. a. 2008, ISBN 978-3-8309-2099-1 .
  • Andreas Frey, Päivi Taskinen, Kerstin Schütte, PISA-Konsortium Deutschland (ed.): PISA 2006 Skalen Handbuch. Documentation of the survey instruments. Waxmann, Münster a. a. 2009, ISBN 978-3-8309-2160-8 .
  • Eckhard Klieme , Cordula Artelt, Johannes Hartig, Nina Jude, Olaf Köller, Manfred Prenzel , Wolfgang Schneider, Petra Stanat (eds.): PISA 2009. Balance after a decade. Waxmann, Münster a. a. 2010, ISBN 978-3-8309-2450-0 .

Summaries, reviews, critiques

  • Dittmar Graf: PISA's crooked tour - Why the flagship study is not scientific. In: Skeptiker (magazine) . No. 3, 2017, pp. 112-120.
  • Hans Brügelmann: PISA & Co: Benefits and limits of performance comparisons - on an international, institutional and individual level. Preparatory text for a contribution to the Encyclopedia Educational Science Online. [Accessed June 30, 2012]
  • Johann C. Fuhrmann, Norbert Beckmann-Dierkes: “Finland's PISA Successes: Myth and Transferability” , KAS International Information 07/2011. Berlin 2011, pp. 6–22.
  • Joachim Wuttke: PISA: Addenda to a debate that has not been conducted. In: Communications of the Society for Didactics of Mathematics. Volume 87, 2009, pp. 22-34. (didaktik-der-mathematik.de ; PDF; 843 kB)
  • Armin von Bogdandy, Matthias Goldmann: The Exercise of International Public Authority through National Policy Assessment. The OECD's PISA Policy as a Paradigm for a New International Standard Instrument. In: International Organizations Law Review. Volume 5, 2008, pp. 241-298. (Available online as NYU Institute for International Law and Justice Working Paper ( January 12, 2016 memento on the Internet Archive ))
  • Volker Bank: The value of the comparison. In: Chemnitz European Studies. Volume 8, 2008, pp. 257-274.
  • Stefan Hopmann, Gertrude Brinek, Martin Retzl (eds.): According to PISA, PISA. PISA According to PISA. LIT-Verlag, Vienna 2007, ISBN 978-3-8258-0946-1 . (Bilingual anthology with contributions from seventeen researchers)
  • Joachim Wuttke: The insignificance of significant differences. PISA's claim to accuracy is illusory. In: T. Jahnke, Meyerhöfer: PISA & Co - Critique of a Program. 2nd Edition. Franzbecker, Hildesheim 2007. Online version (2013) in SSOAR (PDF; 1.4 MB)
  • Thomas Jahnke, Wolfram Meyerhöfer (Ed.): PISA & Co - Critique of a Program. 2nd Edition. Franzbecker, Hildesheim 2007, ISBN 978-3-88120-464-4 . (Anthology with contributions by nine researchers)
  • Erich Neuwirth, Ivo Ponocny, Wilfried Grossmann (eds.): PISA 2000 and PISA 2003. In-depth analyzes and contributions to the methodology. Leykam, Graz 2006, ISBN 3-7011-7569-1 (comprehensive erratum on the Austrian results from PISA 2000)
  • Peter Bender: What do PISA & Co. tell us if we get involved with them? in: Jahnke, Meyerhöfer (2006), pp. 281–337.
  • Hans Brügelmann , Hans Werner Heymann : PISA - findings, interpretations, conclusions. In: Pedagogy. Volume 54, Issue 3, Berlin 2002, pp. 40-43. ISSN  0233-0873
  • Josef Kraus : The PISA hoax. Our children are better than their reputation. How parents and schools can promote potential. Signum, Vienna 2005, ISBN 3-85436-376-1 (deliberately polemical pamphlet)
  • W. Potthoff, J. Schneider, F. Schrage: Impulses for the active school. Suggestions for better centering and profiling of the education system according to PISA. Reformedagogischer Verlag, Freiburg 2004, ISBN 3-925416-27-7 .
  • Volker Ladenthin: PISA - Law and Limits of a Global Empirical Study. An educational theory consideration. (PDF; 255 kB). In: Quarterly journal for scientific pedagogy. Volume 79, No. 3, Paderborn 2003, pp. 354-375.
  • Hans Freudenthal : Pupils achievements internationally compared --- the IEA. In: Educational studies in mathematics. Volume 6, Dordrecht 1975, pp. 127-186. ISSN  0013-1954 (criticism of the basic approach of international comparative studies)


Individual evidence

  1. Organization for Economic Cooperation and Development
  2. German project management 2012, 2015 and 2018 German project management 2009 at the German Institute for International Educational Research German project management 2003 and 2006 - IPN at the University of Kiel
  3. International basic concept according to German project management 2000
  4. OECD publication with sample exercises (PDF; 418 kB).
  5. For details see methodology of the PISA studies .
  6. Sources: OECD reports “First Results” 2001, 2004 (for PISA 2000 and 2003); for the results of PISA 2006: "PISA 2006. Science Competencies for Tomorrow's World."; for the results of PISA 2009: “Eckhard Klieme, Cordula Artelt, Johannes Hartig, u. a. (2010): PISA 2009 - Balance after a decade. Waxmann Verlag ”. Abbreviations: "k. T. “= no participation; "Disq." = Disqualified due to insufficient participation rate. These are the dates originally published; the correction published in 2006 for the Austrian results from 2000 is not taken into account. The number after the “±” sign is the official standard error, which indicates the stochastic uncertainty of the sampling and the item response modeling; the first decimal place is also given in the original reports.
  7. Ramm et al. 2004: Ramm et al.: Socio-cultural origin: Migration. In: PISA 2003: The educational level of young people in Germany - results of the second international comparison. Waxmann, Münster, ISBN 3-8309-1455-5 , pp. 269/270.
  8. ^ PISA-Konsortium Deutschland: PISA 2003: Results of the second international comparison - summary. ( Memento of May 17, 2012 in the Internet Archive ) (PDF; 263 kB) pp. 20–21.
  9. ^ E. Klieme et al.: PISA 2009 balance sheet after a decade - summary. ( Memento from September 20, 2011 in the Internet Archive ) (PDF; 1.5 MB) p. 16.
  10. OECD : Gender-related prejudices influence the educational outcomes of boys and girls. Retrieved December 22, 2010.
  11. ^ OECD Program for the International Student Assessment. ( Memento of February 27, 2015 in the Internet Archive ) Retrieved on September 11, 2012 (English, official PISA data, for the list see "Executive Summary").
  12. Comparing Countries 'and Economies' performance.
  13. ^ PISA study - Organization for Economic Co-operation and Development. Retrieved April 13, 2018 .
  14. Klieme et al.: PISA 2009: Balance after a decade. Waxmann, Münster 2010 ..
  15. ^ PISA 2006: Scientific Competencies for Tomorrow's World . OECD Briefing Note for Germany
  16. Summary of the German results by the Kieler ( Memento of March 24, 2012 in the Internet Archive ) (PDF; 391 kB) IPN
  17. E. Neuwirth, I. Ponocny, W. Grossman (eds.): PISA 2000 and PISA 2003: In-depth analyzes and contributions on methodology. Leykam, Graz 2006.
  18. See the relevant article in the online version of the liberal Austrian daily Der Standard: Sampling errors caused the "Pisa crash" . Retrieved April 19, 2011.
  19. See p. 135 of the PISA report PISA 2009 RESULTS: LEARNING TRENDS. Change in Student Performance since 2000. Vol. V. Accessed April 18, 2011.
  20. PISA 2012 results: What students know and can do. Volume 1, 2014, ISBN 978-3-7639-5321-9 , pp. 400, 323, 417.
  21. And then it's the teachers' fault again. In: Tyrolean daily newspaper. No. 128, May 10, 2009, p. 2.
  22. McKinsey study (English) for PISA September 2007 ( Memento of 10 August 2011 at the Internet Archive )
  23. oecdpisaletter.org
  24. ^ T. Jahnke: The PISA entrepreneurs. In: Research & Teaching . Volume 15, 2008, pp. 26-27.
  25. ^ KP Liessmann: Theory of Unbildung. The errors of the knowledge society. Paul Zsolnay, Vienna 2006.
  26. Eckhard Klieme , DIPF , Why We Need PISA. (= Leibniz Journal. 1/2015). leibniz-gemeinschaft.de
  27. "Attic" instead of "floor of the attic" for "attic floor", "hemisphere" instead of "hemisphere" for "hemisphere", "research" for "scientific experiments" - cf. Wuttke (2007)
  28. Joachim Wuttke: Pisa - an expensive random number generator. In: Berliner Zeitung . December 8, 2007, accessed June 18, 2015 .
  29. More or Less. BBC Radio 4 , April 22, 2011, accessed June 14, 2013 .
  30. Christian Pfeiffer et al.: The PISA losers - victims of media consumption. (PDF; 150 kB). KFN Hanover. ( kfn.de ( Memento from February 25, 2016 in the Internet Archive ))