Programme for International Student Assessment (2000 to 2012)

The Programme for International Student Assessment was run several times before the most recent cycle in 2012; the first PISA assessment was carried out in 2000. The results of each assessment cycle take about a year and a half to analyse: the first results were published in November 2001, while the release of the raw data and the publication of the technical report and data handbook only took place in spring 2002. The triennial repeats follow a similar schedule, so seeing a single PISA cycle through from start to finish takes over four years. 470,000 15-year-old students representing 65 nations and territories participated in PISA 2009, and an additional 50,000 students representing nine nations were tested in 2010.[1]

Each assessment cycle focuses on one of the three competence fields of reading, mathematics and science, though the other two are tested as well. A full rotation through the three domains thus takes nine years: after 2000, reading was again the main domain in 2009.

Period Focus OECD countries Partner countries Participating students Notes
2000 Reading 28 4 + 11 265,000 The Netherlands disqualified from data analysis. 11 additional non-OECD countries took the test in 2002.
2003 Mathematics 30 11 275,000 UK disqualified from data analysis. Also included test in problem solving.
2006 Science 30 27 400,000 Reading scores for US excluded from analysis due to misprint in testing materials.[2]
2009[3] Reading 34 41 + 10 470,000 10 additional non-OECD countries took the test in 2010.[4]
2012[5] Mathematics 34 31 510,000

Results


PISA 2012

PISA 2012
Maps: the results for the 2012 "Maths", "Science" and "Reading" sections on a world map.
OECD members as of the time of the study are in boldface.
Mathematics
1   Shanghai, China 613
2   Singapore 573
3   Hong Kong, China 561
4   Taiwan 560
5   South Korea 554
6   Macau, China 538
7   Japan 536
8   Liechtenstein 535
9    Switzerland 531
10   Netherlands 523
11   Estonia 521
12   Finland 519
13=   Canada 518
13=   Poland 518
15   Belgium 515
16   Germany 514
17   Vietnam 511
18   Austria 506
19   Australia 504
20=   Ireland 501
20=   Slovenia 501
22=   Denmark 500
22=   New Zealand 500
24   Czech Republic 499
25   France 495
26   United Kingdom 494
27   Iceland 493
28   Latvia 491
29   Luxembourg 490
30   Norway 489
31   Portugal 487
32   Italy 485
33   Spain 484
34=   Russia 482
34=   Slovakia 482
36   United States 481
37   Lithuania 479
38   Sweden 478
39   Hungary 477
40   Croatia 471
41   Israel 466
42   Greece 453
43   Serbia 449
44   Turkey 448
45   Romania 445
46   Cyprus 440
47   Bulgaria 439
48   United Arab Emirates 434
49   Kazakhstan 432
50   Thailand 427
51   Chile 423
52   Malaysia 421
53   Mexico 413
54   Montenegro 410
55   Uruguay 409
56   Costa Rica 407
57   Albania 394
58   Brazil 391
59=   Argentina 388
59=   Tunisia 388
61   Jordan 386
62=   Colombia 376
62=   Qatar 376
64   Indonesia 375
65   Peru 368
Science
1   Shanghai, China 580
2   Hong Kong, China 555
3   Singapore 551
4   Japan 547
5   Finland 545
6   Estonia 541
7   South Korea 538
8   Vietnam 528
9   Poland 526
10=   Liechtenstein 525
10=   Canada 525
12   Germany 524
13   Taiwan 523
14=   Netherlands 522
14=   Ireland 522
16=   Macau, China 521
16=   Australia 521
18   New Zealand 516
19    Switzerland 515
20=   Slovenia 514
20=   United Kingdom 514
22   Czech Republic 508
23   Austria 506
24   Belgium 505
25   Latvia 502
26   France 499
27   Denmark 498
28   United States 497
29=   Spain 496
29=   Lithuania 496
31   Norway 495
32=   Italy 494
32=   Hungary 494
34=   Luxembourg 491
34=   Croatia 491
36   Portugal 489
37   Russia 486
38   Sweden 485
39   Iceland 478
40   Slovakia 471
41   Israel 470
42   Greece 467
43   Turkey 463
44   United Arab Emirates 448
45   Bulgaria 446
46=   Serbia 445
46=   Chile 445
48   Thailand 444
49   Romania 439
50   Cyprus 438
51   Costa Rica 429
52   Kazakhstan 425
53   Malaysia 420
54   Uruguay 416
55   Mexico 415
56   Montenegro 410
57   Jordan 409
58   Argentina 406
59   Brazil 405
60   Colombia 399
61   Tunisia 398
62   Albania 397
63   Qatar 384
64   Indonesia 382
65   Peru 373
Reading
1   Shanghai, China 570
2   Hong Kong, China 545
3   Singapore 542
4   Japan 538
5   South Korea 536
6   Finland 524
7=   Taiwan 523
7=   Canada 523
7=   Ireland 523
10   Poland 518
11=   Liechtenstein 516
11=   Estonia 516
13=   Australia 512
13=   New Zealand 512
15   Netherlands 511
16=   Macau, China 509
16=    Switzerland 509
16=   Belgium 509
19=   Germany 508
19=   Vietnam 508
21   France 505
22   Norway 504
23   United Kingdom 499
24   United States 498
25   Denmark 496
26   Czech Republic 493
27=   Austria 490
27=   Italy 490
29   Latvia 489
30=   Luxembourg 488
30=   Portugal 488
30=   Spain 488
30=   Hungary 488
34   Israel 486
35   Croatia 485
36=   Iceland 483
36=   Sweden 483
38   Slovenia 481
39=   Lithuania 477
39=   Greece 477
41=   Russia 475
41=   Turkey 475
43   Slovakia 463
44   Cyprus 449
45   Serbia 446
46   United Arab Emirates 442
47=   Thailand 441
47=   Chile 441
47=   Costa Rica 441
50   Romania 438
51   Bulgaria 436
52   Mexico 424
53   Montenegro 422
54   Uruguay 411
55   Brazil 410
56   Tunisia 404
57   Colombia 403
58   Jordan 399
59   Malaysia 398
60=   Argentina 396
60=   Indonesia 396
62   Albania 394
63   Kazakhstan 393
64   Qatar 388
65   Peru 384

PISA 2012 was presented on 3 December 2013, with results for around 510,000 participating students in all 34 OECD member countries and 31 partner countries.[5] This testing cycle had a particular focus on mathematics, where the mean score was 494. A sample of 1,688 students from Puerto Rico took the assessment, scoring 379 in mathematics, 404 in reading and 401 in science.[6] A subgroup of 44 countries and economies, with about 85,000 students, also took part in an optional computer-based assessment of problem solving.[7]

Shanghai had the highest score in all three subjects. It was followed by Singapore, Hong Kong, Chinese Taipei and South Korea in mathematics; by Hong Kong, Singapore, Japan and South Korea in reading; and by Hong Kong, Singapore, Japan and Finland in science.

The participating students were sampled from about 28 million students of the same age group in 65 countries and economies,[8] including the OECD countries, several Chinese cities, Vietnam, Indonesia and several countries in South America.[5]

The test lasted two hours, was paper-based and included both open-ended and multiple-choice questions.[8]

The students and school staff also answered a questionnaire to provide background information about the students and the schools.[5][8]

The mean score in reading was 496 and in science 501.[citation needed]

The results show distinct groups of high performers in mathematics: the East Asian countries and economies, with Shanghai scoring the best result of 613, followed by Singapore, Hong Kong, Chinese Taipei and South Korea. Among the Europeans, Liechtenstein and Switzerland performed best, with the Netherlands, Estonia, Finland, Poland, Belgium, Germany and Austria all posting mathematics scores "not significantly statistically different from" one another. The United Kingdom, Ireland, Australia and New Zealand were similarly clustered around the OECD average of 494, with the United States trailing this group at 481.[5]
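When the report describes scores as "not significantly statistically different", it means the gap between two country means is small relative to the sampling uncertainty of both. As a rough sketch only (PISA's actual comparisons use replicate weights and plausible values; the standard errors below are invented for illustration), a simple two-sided z-test on the difference of two means captures the idea:

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Two-sided z-test: is the gap between two independent means significant?"""
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)  # standard error of the difference
    return abs(mean_a - mean_b) > z_crit * se_diff

# Invented standard errors, for illustration only:
print(significantly_different(519, 1.9, 518, 1.8))  # 1-point gap: False (not significant)
print(significantly_different(613, 3.3, 573, 1.3))  # 40-point gap: True (significant)
```

With typical standard errors of a few score points, gaps of one or two points fall well within sampling noise, which is why several countries are reported as statistically indistinguishable.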

Qatar, Kazakhstan and Malaysia showed the greatest improvement in mathematics, while the USA and the United Kingdom showed no significant change.[9] Sweden had the greatest fall in mathematics performance over the preceding ten years, with a similar downward trend in the other two subjects, and leading politicians in Sweden expressed serious concern over the results.[10][11]

On average, boys scored better than girls in mathematics, girls scored better than boys in reading, and the two sexes had quite similar scores in science.[9]

Indonesia, Albania, Peru, Thailand and Colombia were the countries where most students reported being happy at school, while students in Korea, the Czech Republic, the Slovak Republic, Estonia and Finland reported the least happiness.[8]

PISA 2009

PISA 2009

The PISA 2009 cycle included results in mathematics, science and reading for all 65 participating countries and economies, comprising the OECD members and partner countries.[3][12][13]

Of the partner countries, only selected areas of three (India, Venezuela and China) were assessed. PISA 2009+, released in December 2011, included data from 10 additional partner countries whose testing was delayed from 2009 to 2010 because of scheduling constraints.[4][14]

OECD members as of the time of the study are in boldface. Participants in PISA 2009+, which were tested in 2010 after the main group of 65, are italicized.
Mathematics
1   Shanghai, China 600
2   Singapore 562
3   Hong Kong, China 555
4   South Korea 546
5   Taiwan 543
6   Finland 541
7   Liechtenstein 536
8    Switzerland 534
9   Japan 529
10   Canada 527
11   Netherlands 526
12   Macau, China 525
13   New Zealand 519
14   Belgium 515
15   Australia 514
16   Germany 513
17   Estonia 512
18   Iceland 507
19   Denmark 503
20   Slovenia 501
21   Norway 498
22   France 497
23   Slovakia 497
24   Austria 496
25   Poland 495
26   Sweden 494
27   Czech Republic 493
28   United Kingdom 492
29   Hungary 490
30   Luxembourg 489
31   United States 487
32   Portugal 487
33   Ireland 487
34   Spain 483
35   Italy 483
36   Latvia 482
37   Lithuania 477
38   Russia 468
39   Greece 466
40   Malta 463
41   Croatia 460
42   Israel 447
43   Turkey 445
44   Serbia 442
45   Azerbaijan 431
46   Bulgaria 428
47   Uruguay 427
48   Romania 427
49   United Arab Emirates 421
50   Chile 421
51   Mauritius 420
52   Thailand 419
53   Mexico 419
54   Trinidad and Tobago 414
55   Costa Rica 409
56   Kazakhstan 405
57   Malaysia 404
58   Montenegro 403
59   Moldova 397
60   Miranda, Venezuela 397
61   Argentina 388
62   Jordan 387
63   Brazil 386
64   Colombia 381
65   Georgia 379
66   Albania 377
67   Tunisia 371
68   Indonesia 371
69   Qatar 368
70   Peru 365
71   Panama 360
72   Tamil Nadu, India 351
73   Himachal Pradesh, India 338
74   Kyrgyzstan 331
Science
1   Shanghai, China 575
2   Finland 554
3   Hong Kong, China 549
4   Singapore 542
5   Japan 539
6   South Korea 538
7   New Zealand 532
8   Canada 529
9   Estonia 528
10   Australia 527
11   Netherlands 522
12   Liechtenstein 520
13   Germany 520
14   Taiwan 520
15    Switzerland 517
16   United Kingdom 514
17   Slovenia 512
18   Macau, China 511
19   Poland 508
20   Ireland 508
21   Belgium 507
22   Hungary 503
23   United States 502
24   Norway 500
25   Czech Republic 500
26   Denmark 499
27   France 498
28   Iceland 496
29   Sweden 495
30   Latvia 494
31   Austria 494
32   Portugal 493
33   Lithuania 491
34   Slovakia 490
35   Italy 489
36   Spain 488
37   Croatia 486
38   Luxembourg 484
39   Russia 478
40   Greece 470
41   Malta 461
42   Israel 455
43   Turkey 454
44   Chile 447
45   Serbia 443
46   Bulgaria 439
47   United Arab Emirates 438
48   Costa Rica 430
49   Romania 428
50   Uruguay 427
51   Thailand 425
52   Miranda, Venezuela 422
53   Malaysia 422
54   Mauritius 417
55   Mexico 416
56   Jordan 415
57   Moldova 413
58   Trinidad and Tobago 410
59   Brazil 405
60   Colombia 402
61   Tunisia 401
62   Montenegro 401
63   Argentina 401
64   Kazakhstan 400
65   Albania 391
66   Indonesia 383
67   Qatar 379
68   Panama 376
69   Georgia 373
70   Azerbaijan 373
71   Peru 369
72   Tamil Nadu, India 348
73   Kyrgyzstan 330
74   Himachal Pradesh, India 325
Reading
1   Shanghai, China 556
2   South Korea 539
3   Finland 536
4   Hong Kong, China 533
5   Singapore 526
6   Canada 524
7   New Zealand 521
8   Japan 520
9   Australia 515
10   Netherlands 508
11   Belgium 506
12   Norway 503
13   Estonia 501
14    Switzerland 501
15   Poland 500
16   Iceland 500
17   United States 500
18   Liechtenstein 499
19   Sweden 497
20   Germany 497
21   Ireland 496
22   France 496
23   Taiwan 495
24   Denmark 495
25   United Kingdom 494
26   Hungary 494
27   Portugal 489
28   Macau, China 487
29   Italy 486
30   Latvia 484
31   Greece 483
32   Slovenia 483
33   Spain 481
34   Czech Republic 478
35   Slovakia 477
36   Croatia 476
37   Israel 474
38   Luxembourg 472
39   Austria 470
40   Lithuania 468
41   Turkey 464
42   Russia 459
43   Chile 449
44   Costa Rica 443
45   Malta 442
46   Serbia 442
47   United Arab Emirates 431
48   Bulgaria 429
49   Uruguay 426
50   Mexico 425
51   Romania 424
52   Miranda, Venezuela 422
53   Thailand 421
54   Trinidad and Tobago 416
55   Malaysia 414
56   Colombia 413
57   Brazil 412
58   Montenegro 408
59   Mauritius 407
60   Jordan 405
61   Tunisia 404
62   Indonesia 402
63   Argentina 398
64   Kazakhstan 390
65   Moldova 388
66   Albania 385
67   Georgia 374
68   Qatar 372
69   Panama 371
70   Peru 370
71   Azerbaijan 362
72   Tamil Nadu, India 337
73   Himachal Pradesh, India 317
74   Kyrgyzstan 314

PISA 2006

PISA 2006
OECD members as of the time of the study are in boldface. Reading scores for the United States were disqualified.
Mathematics
1   Taiwan 549
2   Finland 548
3   South Korea 547
4   Hong Kong, China 547
5   Netherlands 531
6    Switzerland 530
7   Canada 527
8   Macau, China 525
9   Liechtenstein 525
10   Japan 523
11   New Zealand 522
12   Belgium 520
13   Australia 520
14   Estonia 515
15   Denmark 513
16   Czech Republic 510
17   Iceland 506
18   Austria 505
19   Slovenia 504
20   Germany 504
21   Sweden 502
22   Ireland 501
23   France 496
24   United Kingdom 495
25   Poland 495
26   Slovakia 492
27   Hungary 491
28   Norway 490
29   Luxembourg 490
30   Lithuania 486
31   Latvia 486
32   Spain 480
33   Russia 476
34   Azerbaijan 476
35   United States 474
36   Croatia 467
37   Portugal 466
38   Italy 462
39   Greece 459
40   Israel 442
41   Serbia 435
42   Uruguay 427
43   Turkey 424
44   Thailand 417
45   Romania 415
46   Bulgaria 413
47   Chile 411
48   Mexico 406
49   Montenegro 399
50   Indonesia 391
51   Jordan 384
52   Argentina 381
53   Colombia 370
54   Brazil 370
55   Tunisia 365
56   Qatar 318
57   Kyrgyzstan 311
Science
1   Finland 563
2   Hong Kong, China 542
3   Canada 534
4   Taiwan 532
5   Japan 531
6   Estonia 531
7   New Zealand 530
8   Australia 527
9   Netherlands 525
10   Liechtenstein 522
11   South Korea 522
12   Slovenia 519
13   Germany 516
14   United Kingdom 515
15   Czech Republic 513
16    Switzerland 512
17   Austria 511
18   Macau, China 511
19   Belgium 510
20   Ireland 508
21   Hungary 504
22   Sweden 503
23   Poland 498
24   Denmark 496
25   France 495
26   Croatia 493
27   Iceland 491
28   Latvia 490
29   United States 489
30   Slovakia 488
31   Spain 488
32   Lithuania 488
33   Norway 487
34   Luxembourg 486
35   Russia 479
36   Italy 475
37   Portugal 474
38   Greece 473
39   Israel 454
40   Chile 438
41   Serbia 436
42   Bulgaria 434
43   Uruguay 428
44   Turkey 424
45   Jordan 422
46   Thailand 421
47   Romania 418
48   Montenegro 412
49   Mexico 410
50   Indonesia 393
51   Argentina 391
52   Brazil 390
53   Colombia 388
54   Tunisia 386
55   Azerbaijan 382
56   Qatar 349
57   Kyrgyzstan 322
Reading
1   South Korea 556
2   Finland 547
3   Hong Kong, China 536
4   Canada 527
5   New Zealand 521
6   Ireland 517
7   Australia 513
8   Liechtenstein 510
9   Poland 508
10   Sweden 507
11   Netherlands 507
12   Belgium 501
13   Estonia 501
14    Switzerland 499
15   Japan 498
16   Taiwan 496
17   United Kingdom 495
18   Germany 495
19   Denmark 494
20   Slovenia 494
21   Macau, China 492
22   Austria 490
23   France 488
24   Iceland 484
25   Norway 484
26   Czech Republic 483
27   Hungary 482
28   Latvia 479
29   Luxembourg 479
30   Croatia 477
31   Portugal 472
32   Lithuania 470
33   Italy 469
34   Slovakia 466
35   Spain 461
36   Greece 460
37   Turkey 447
38   Chile 442
39   Russia 440
40   Israel 439
41   Thailand 417
42   Uruguay 413
43   Mexico 410
44   Bulgaria 402
45   Serbia 401
46   Jordan 401
47   Romania 396
48   Indonesia 393
49   Brazil 393
50   Montenegro 392
51   Colombia 385
52   Tunisia 380
53   Argentina 374
54   Azerbaijan 353
55   Qatar 312
56   Kyrgyzstan 285

PISA 2003


The results for PISA 2003 were released on 14 December 2004. This PISA cycle tested 275,000 15-year-olds in mathematics, science, reading and problem solving, and involved schools from 30 OECD member countries and 11 partner countries.[15] Note that the science and reading means displayed are for "All Students", even though not all of the students answered questions in those two domains; the 2003 OECD Technical Report (pages 208 and 209) gives different country means for the students who were actually exposed to these domains.[16]

PISA 2003
OECD members at the time of the study are in boldface. The United Kingdom was disqualified due to a low response rate.
Mathematics
1   Hong Kong, China 550
2   Finland 544
3   Korea 542
4   Netherlands 538
5   Liechtenstein 536
6   Japan 534
7   Canada 532
8   Belgium 529
9   Macau, China 527
10    Switzerland 527
11   Australia 524
12   New Zealand 523
13   Czech Republic 516
14   Iceland 515
15   Denmark 514
16   France 511
17   Sweden 509
18   Austria 506
19   Germany 503
20   Ireland 503
21   Slovakia 498
22   Norway 495
23   Luxembourg 493
24   Poland 490
25   Hungary 490
26   Spain 485
27   Latvia 483
28   United States 483
29   Russia 468
30   Portugal 466
31   Italy 466
32   Greece 445
33   Serbia and Montenegro 437
34   Turkey 423
35   Uruguay 422
36   Thailand 417
37   Mexico 385
38   Indonesia 360
39   Tunisia 359
40   Brazil 356
Science
1   Finland 548
2   Japan 548
3   Hong Kong, China 539
4   Korea 538
5   Liechtenstein 525
6   Australia 525
7   Macau, China 525
8   Netherlands 524
9   Czech Republic 523
10   New Zealand 521
11   Canada 519
12    Switzerland 513
13   France 511
14   Belgium 509
15   Sweden 506
16   Ireland 505
17   Hungary 503
18   Germany 502
19   Poland 498
20   Slovakia 495
21   Iceland 495
22   United States 491
23   Austria 491
24   Russia 489
25   Latvia 489
26   Spain 487
27   Italy 486
28   Norway 484
29   Luxembourg 483
30   Greece 481
31   Denmark 475
32   Portugal 468
33   Uruguay 438
34   Serbia and Montenegro 436
35   Turkey 434
36   Thailand 429
37   Mexico 405
38   Indonesia 395
39   Brazil 390
40   Tunisia 385
Reading
1   Finland 543
2   Korea 534
3   Canada 528
4   Australia 525
5   Liechtenstein 525
6   New Zealand 522
7   Ireland 515
8   Sweden 514
9   Netherlands 513
10   Hong Kong, China 510
11   Belgium 507
12   Norway 500
13    Switzerland 499
14   Japan 498
15   Macau, China 498
16   Poland 497
17   France 496
18   United States 495
19   Denmark 492
20   Iceland 492
21   Germany 491
22   Austria 491
23   Latvia 491
24   Czech Republic 489
25   Hungary 482
26   Spain 481
27   Luxembourg 479
28   Portugal 478
29   Italy 476
30   Greece 472
31   Slovakia 469
32   Russia 442
33   Turkey 441
34   Uruguay 434
35   Thailand 420
36   Serbia and Montenegro 412
37   Brazil 403
38   Mexico 400
39   Indonesia 382
40   Tunisia 375
Problem solving
1   Korea 550
2   Hong Kong, China 548
3   Finland 548
4   Japan 547
5   New Zealand 533
6   Macau, China 532
7   Australia 530
8   Liechtenstein 529
9   Canada 529
10   Belgium 525
11    Switzerland 521
12   Netherlands 520
13   France 519
14   Denmark 517
15   Czech Republic 516
16   Germany 513
17   Sweden 509
18   Austria 506
19   Iceland 505
20   Hungary 501
21   Ireland 498
22   Luxembourg 494
23   Slovakia 492
24   Norway 490
25   Poland 487
26   Latvia 483
27   Spain 482
28   Russia 479
29   United States 477
30   Portugal 470
31   Italy 469
32   Greece 448
33   Thailand 425
34   Serbia and Montenegro 420
35   Uruguay 411
36   Turkey 408
37   Mexico 384
38   Brazil 371
39   Indonesia 361
40   Tunisia 345

PISA 2000


The results for the first cycle of the PISA survey were released on 14 November 2001. 265,000 15-year-olds were tested in 28 OECD countries and 4 partner countries in mathematics, science and reading; an additional 11 countries were tested later, in 2002.[17]

PISA 2000
OECD members as of the time of the study are in boldface. The 11 partner countries tested in 2002 after the main group of 32 are italicized.
Mathematics
1   Hong Kong, China 560
2   Japan 557
3   Korea 547
4   New Zealand 537
5   Finland 536
6   Australia 533
7   Canada 533
8    Switzerland 529
9   United Kingdom 529
10   Belgium 520
11   France 517
12   Austria 515
13   Denmark 514
14   Iceland 514
15   Liechtenstein 514
16   Sweden 510
17   Ireland 503
18   Norway 499
19   Czech Republic 498
20   United States 493
21   Germany 490
22   Hungary 488
23   Russia 478
24   Spain 476
25   Poland 470
26   Latvia 463
27   Italy 457
28   Portugal 454
29   Greece 447
30   Luxembourg 446
31   Israel 433
32   Thailand 432
33   Bulgaria 430
34   Argentina 388
35   Mexico 387
36   Chile 384
37   Albania 381
38   Macedonia 381
39   Indonesia 367
40   Brazil 334
41   Peru 292
Science
1   Korea 552
2   Japan 550
3   Hong Kong, China 541
4   Finland 538
5   United Kingdom 532
6   Canada 529
7   New Zealand 528
8   Australia 528
9   Austria 519
10   Ireland 513
11   Sweden 512
12   Czech Republic 511
13   France 500
14   Norway 500
15   United States 499
16   Hungary 496
17   Iceland 496
18   Belgium 496
19    Switzerland 496
20   Spain 491
21   Germany 487
22   Poland 483
23   Denmark 481
24   Italy 478
25   Liechtenstein 476
26   Greece 461
27   Russia 460
28   Latvia 460
29   Portugal 459
30   Bulgaria 448
31   Luxembourg 443
32   Thailand 436
33   Israel 434
34   Mexico 422
35   Chile 415
36   Macedonia 401
37   Argentina 396
38   Indonesia 393
39   Albania 376
40   Brazil 375
41   Peru 333
Reading
1   Finland 546
2   Canada 534
3   New Zealand 529
4   Australia 528
5   Ireland 527
6   Hong Kong, China 525
7   Korea 525
8   United Kingdom 523
9   Japan 522
10   Sweden 516
11   Austria 507
12   Belgium 507
13   Iceland 507
14   Norway 505
15   France 505
16   United States 504
17   Denmark 497
18    Switzerland 494
19   Spain 493
20   Czech Republic 492
21   Italy 487
22   Germany 484
23   Liechtenstein 483
24   Hungary 480
25   Poland 479
26   Greece 474
27   Portugal 470
28   Russia 462
29   Latvia 458
30   Israel 452
31   Luxembourg 441
32   Thailand 431
33   Bulgaria 430
34   Mexico 422
35   Argentina 418
36   Chile 410
37   Brazil 396
38   Macedonia 373
39   Indonesia 371
40   Albania 349
41   Peru 327

Comparison with other studies


The correlation between PISA 2003 and TIMSS 2003 grade 8 country means is 0.84 in mathematics and 0.95 in science; the values fall to 0.66 and 0.79 if the two worst-performing developing countries are excluded. Correlations between different scales and studies are around 0.80. These high correlations indicate either common causes of country differences (e.g. educational quality, culture, wealth or genes) or a homogeneous underlying factor of cognitive competence. European Economic Area countries perform slightly better in PISA, while the Commonwealth of Independent States and Asian countries perform better in TIMSS. Content balance and years of schooling explain most of the variation.[18]
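The figures above are ordinary Pearson correlations computed over country means. A minimal sketch of the calculation (the score vectors here are made-up placeholders, not actual PISA or TIMSS data):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical country means on two assessments (not real data):
study_a = [550, 544, 532, 516, 495, 466, 423, 385]
study_b = [586, 570, 521, 504, 508, 471, 410, 392]
print(round(pearson(study_a, study_b), 3))
```

Excluding the lowest-scoring countries narrows the spread of the score vectors, which is one reason the reported correlations drop when the two worst-performing developing countries are removed (restriction of range).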

Reception


The results from PISA 2003 and PISA 2006 were featured in the 2010 documentary Waiting for "Superman".[19]

China


Education professor Yong Zhao has noted that PISA 2009 did not receive much attention in the Chinese media, and that the high scores in China are due to excessive workload and testing, adding that it's "no news that the Chinese education system is excellent in preparing outstanding test takers, just like other education systems within the Confucian cultural circle: Singapore, Korea, Japan, and Hong Kong."[20]

Students from Shanghai, China, had the top scores of every category (Mathematics, Reading and Science) in PISA 2009. In discussing these results, PISA spokesman Andreas Schleicher, Deputy Director for Education and head of the analysis division at the OECD’s directorate for education, described Shanghai as a pioneer of educational reform in which "there has been a sea change in pedagogy". Schleicher stated that Shanghai abandoned its "focus on educating a small elite, and instead worked to construct a more inclusive system. They also significantly increased teacher pay and training, reducing the emphasis on rote learning and focusing classroom activities on problem solving."[21]

Schleicher also states that PISA tests administered in rural China have produced some results approaching the OECD average. Citing further, as-yet-unpublished OECD research, Schleicher said, "We have actually done Pisa in 12 of the provinces in China. Even in some of the very poor areas you get performance close to the OECD average."[22] Schleicher says that for a developing country, China's 99.4% enrollment in primary education is "the envy of many countries". He maintains that junior secondary school participation rates in China are now 99%, and that in Shanghai, not only has senior secondary school enrollment reached 98%, but admissions into higher education have reached 80% of the relevant age group. Schleicher believes that this growth reflects quality, not just quantity, which he contends the top PISA ranking of Shanghai's secondary education confirms.[22] He believes that China has also expanded school access and has moved away from learning by rote.[23] According to Schleicher, Russia performs well in rote-based assessments, but not in PISA, whereas China does well in both rote-based and broader assessments.[22]

Denmark


University of Copenhagen Professor Svend Kreiner, who examined in detail PISA's 2006 reading results, noted that in 2006 only about ten percent of the students who took part in PISA were tested on all 28 reading questions. "This in itself is ridiculous," Kreiner told Stewart. "Most people don't know that half of the students taking part in PISA (2006) do not respond to any reading item at all. Despite that, PISA assigns reading scores to these children."[24]

Finland


The stable, high marks of Finnish students have attracted a lot of attention. According to Hannu Simola,[25] the results reflect a paradoxical mix of progressive policies implemented in a rather conservative pedagogic setting: teachers' high levels of academic preparation, social status, professionalism and motivation for the job coexist with the adherence of both teachers and pupils to traditional roles and methods in Finland's changing, but still quite paternalistic, culture. Others advance Finland's low poverty rate as a reason for its success.[26][27] Finnish education reformer Pasi Sahlberg attributes Finland's high educational achievements to its emphasis on social and educational equality and its stress on cooperation and collaboration, as opposed to the competition among teachers and schools that prevails in other nations.[28]

India


Of the 74 countries and economies tested in the PISA 2009 cycle, including the "+" nations, the two Indian states came 72nd and 73rd out of 74 in both reading and mathematics, and 72nd and 74th in science. India's poor performance may not be linguistic, as some suggested: 12.87% of US students indicated that the language of the test differed from the language spoken at home, while a significantly higher 30.77% of Himachal Pradesh students did so.[29] However, unlike American students, those Indian students with a different language at home did better on the PISA test than those with the same language.[29] India's poor performance on the PISA test is consistent with its poor performance in the only other instance in which India's government allowed an international organization to test its students,[30] and with India's own testing of its elite students in a study titled Student Learning in the Metros 2006.[31] These studies were conducted using TIMSS questions. The poor result in PISA was greeted with dismay in the Indian media.[32] The BBC reported that, as of 2008, only 15% of India's students reach high school.[33]

Italy / South Tyrol


In 2003 South Tyrol (Provincia Autonoma di Bolzano / Autonome Provinz Bozen), a predominantly German-speaking province in the north of Italy, took part in the PISA project for the first time in order to obtain a regional result as an adjudicated region. In the rest of Italy PISA is conducted by INVALSI (Istituto nazionale per la valutazione del sistema educativo di istruzione e di formazione), a formally independent research institution affiliated with the Ministry of Education, whereas in South Tyrol PISA was carried out by the regional Education Authority itself (Intendenza scolastica / Schulamt, renamed Bildungsdirektion in 2018),[34] which is part of the South Tyrolean regional government. At the end of 2004, in the months prior to the announcement of the test results, the regional Education Authority in Bolzano / Bozen downplayed the validity of the PISA assessment and commissioned alternative school evaluations, preparing the public for a mediocre test result. According to the official PISA 2003 report, however, South Tyrol seemed even to beat the PISA world champion, Finland.

Critique

Right from the beginning, there was scepticism as to how South Tyrol had succeeded in outdoing the neighbouring Italian and Austrian provinces. On the front page of its weekend edition of 29/30 January 2005, the South Tyrolean newspaper Neue Südtiroler Tageszeitung published a harsh critique and revealed that the South Tyrolean Education Authority had secretly eliminated more than 300 students from the 1,500 officially drawn as the South Tyrolean test sample by the PISA Consortium. Soon more inconsistencies surfaced:

  • Lack of independence of the South Tyrolean PISA board: In South Tyrol, PISA was not conducted by a nominally independent body such as the local university Freie Universität Bozen or the European Academy EURAC, both of which have ample expertise in the field of education, but by the so-called Pedagogical Institute (Pädagogisches Institut, director: Rudolf Meraner), which was part of the Education Authority, which in turn was part of the regional government. A few years later the Pedagogical Institute was nominally absorbed into the Education Authority and renamed the Department for Innovation and Counselling (with the same director, Rudolf Meraner).
  • Exploitation for political ends: In the most influential mass media, the regional PISA results were presented as a triumph of the regional government and of the policy of the ruling SVP (Südtiroler Volkspartei, or South Tyrolean People's Party), although the legal framework for all high schools in Italy is a purely national domain. For all the political autonomy granted to South Tyrol, the region still has the same types of schools and follows the same curricula as all other Italian regions. Mass media such as the most-read South Tyrolean newspaper, Dolomiten, whose owner Michel Ebner is a prominent member of the ruling SVP, also made no attempt to explain why the secondary schools attended by the Italian-speaking students in the same province did considerably worse, although the same SVP-run regional government is in charge of both the Italian and the German school administration.[35]
  • Harassment of critics: People who criticized the official PISA results and pointed out violations of the technical rules were officially threatened by the provincial governor, Luis Durnwalder, with libel action for slandering South Tyrol. On 16 March 2006, Durnwalder announced in a press conference that an Austrian teacher would be prosecuted and sued for damages simply because the teacher, in a letter sent to the Austrian Ministry of Education, had mentioned the fact that the South Tyrolean Education Authority had eliminated 17 percent of the students from the regional sample, thereby rendering the regional PISA result invalid. A year later, however, Durnwalder had to admit that he had never taken legal action against the teacher and that there were no legal proceedings, evidently because the critique was correct.[36] It is also noteworthy that the director of the Pedagogical Institute, Rudolf Meraner, and others repeatedly deleted the original German Wikipedia article about the South Tyrolean PISA results, replacing it with official government statements.
  • Regional results deliberately misrepresented as national results: By definition, the PISA result of South Tyrol is a subnational result, which is not fully valid because of the small sample of 1,500 students. Such regional results mainly serve documentary purposes and cannot be compared with national results. Nonetheless, the South Tyrolean Education Authority and the regional government repeatedly, and falsely, claimed that the South Tyrolean test results were national, i.e. fully valid, results, whereas neighbouring regions like Tyrol (Austria) and Trentino (Italy), according to South Tyrolean press releases, only had subnational results.[37]
  • Manipulation of the sample: elimination of 17 percent of students: For a subnational result, regions had to test 1,500 students. Among all regions worldwide with a subnational result, South Tyrol was the only one that failed to test 1,500 students. For reasons never specified, the Education Authority eliminated 292 students (i.e. 17 percent) from the 1,500-student sample, testing only 1,208 of the students selected by the PISA consortium. By failing to test all 1,500 students, the South Tyrolean Education Authority violated the technical PISA rules and deprived the PISA result of even its limited technical validity as a subnational result. Later on, the Education Authority had to admit that it had actually excluded all vocational students, whose performance is generally considered inferior to that of high school students, as well as all students of the third forms of so-called middle schools (scuole medie / Mittelschulen), who would have been part of the sample only if they had repeated a class and were considered underachievers, since 15-year-old students are normally enrolled at high schools. This manipulation can easily be deduced from the official PISA report as the difference between the total population of 15-year-olds (4,908) and the total enrolled population of 15-year-olds at grade 7 or above (4,087).[38] In subsequent PISA assessments, the OECD, or rather the PISA consortium, did not publish these key figures, so it was no longer possible to ascertain the number of students eliminated from the sample against the rules.
  • Incorrect figures about target population and actual sample: In Italy, according to the figures officially made available by INVALSI, the number of all 15-year-old students (574,611) paradoxically exceeds the number of all 15-year-old people (561,304).[39] Because of this mistake, it is impossible to establish how many Italian students were actually identified as the Italian target population, and it is also impossible to find out what percentage of South Tyrolean students had actually been presented as the target population to the PISA consortium. In theory, the Education Authority could have excluded certain types of students from the target population even before the sample of 1,500 students was drawn by the PISA consortium, from which the Education Authority later eliminated 17 percent.
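The sample arithmetic described in the points above can be sketched as a minimal illustration. The variable names are ours; only the four published numbers are taken from the figures cited in the text:

```python
# Figures cited from the official PISA 2003 report (see above).
total_15_year_olds = 4908      # total population of 15-year-olds
enrolled_grade_7_plus = 4087   # 15-year-olds enrolled at grade 7 or above
sample_drawn = 1500            # students selected by the PISA consortium
sample_tested = 1208           # students actually tested

# Gap between the total and the enrolled population, from which the
# exclusion of vocational and repeating middle-school students is deduced:
enrolment_gap = total_15_year_olds - enrolled_grade_7_plus

# Students eliminated from the drawn sample:
eliminated = sample_drawn - sample_tested

print(enrolment_gap, eliminated)  # 821 292
```

The two differences (821 students missing from the enrolment figure, 292 eliminated from the drawn sample) are exactly the quantities the critics could still reconstruct from the 2003 report, and which later reports no longer allowed anyone to compute.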

Comparison with similar assessments

The stunning South Tyrolean 2003 PISA results can hardly be reconciled with similar high school evaluations that were not conducted or influenced by the South Tyrolean Education Authority itself. Three international or national large-scale assessment projects painted a gloomy picture of South Tyrolean students' performance.

  • Admission test at Austrian Universities of Medicine (EMS = Eignungstest für das Medizinstudium): Over the last two decades, candidates graduating from South Tyrolean high schools have traditionally scored very poorly on the entrance examination. The quota system at Austrian Universities of Medicine ensures that 75 percent of all applicants admitted to medicine are Austrian nationals or students from South Tyrol, who are considered an Austrian minority in Italy. Hence, for Austrians and South Tyroleans it is easier to get a place at an Austrian University of Medicine, even with poor scores, than for Germans and other EU nationals, because the number of EU students must not exceed 20 percent. Nevertheless, according to a study financed by the Austrian Ministry of Education, even within the large 75 percent quota South Tyroleans could not get a place because of their mediocre entrance test results. South Tyrolean applicants had an average score of 96.5, whereas, for example, German applicants had an average score of 103.1. Notably, South Tyrolean applicants did even worse than Austrian applicants, although Italian high school graduates are one year older. Worse still, South Tyrolean high school graduates show a sharp gender gap, with female candidates doing much worse than their male colleagues, which reflects the Italian high school system: certain types of high schools popular with female students (e.g. licei linguistici, licei scienze umane) offer little mathematics and science. The difference between the average female and the average male South Tyrolean score in the 2007 EMS was dramatic: 94.8 versus 100.1 points.[40] Later, the Swiss EMS organisation barred the Austrian Universities of Medicine from using its EMS test, and the Austrian universities introduced a new entrance test scheme with an extra quota for female students. South Tyrolean mass media did not cover the EMS debacle at all.
Within the South Tyrolean parliament, however, there have been debates about the reasons for the poor performance of South Tyrolean high school students.[41] As a consequence of the EMS debacle, the former South Tyrolean governor, Luis Durnwalder, envisaged a collaboration between the University of Medicine in Innsbruck and the hospital in Bolzano (Bozen), which was to lead to a Euregio Medical School open to all North Tyrolean (i.e. Austrian) and South Tyrolean applicants without any entrance exam. Eventually, the South Tyrolean government adopted a more pragmatic approach: it subsidized, as it were, extra places for South Tyrolean students. This seems to be a breach of the national quota system and of the entrance requirements laid down by Austrian law. Durnwalder, however, frankly admitted to resorting to this solution when he complained about the large amount of money South Tyrol paid to the University of Medicine in Innsbruck to buy extra places for South Tyrolean students.[42]
  • German DESI assessment of linguistic skills in German (mother tongue) and English (foreign language): In parallel with the 2003 PISA assessment, the South Tyrolean Education Authority commissioned a second assessment, evidently because it expected a mediocre PISA result and had downplayed the importance of PISA. Like PISA, the DESI assessment was conducted against the technical rules because, again, all vocational students attending a vocational school (Berufsschule) and working part-time in a workshop or small firm, i.e. one third of the target population, were excluded from the test. The South Tyrolean Education Authority also excluded the entire English test section from DESI, thereby depriving DESI of its main purpose, namely a comparison of German-speaking students' language competence in L1 and L2. The Education Authority argued that in (bilingual) South Tyrol, German-speaking students start with Italian as their first foreign language at elementary school. This explanation, however, lacks credibility because in Germany, too, many students assessed by DESI had main foreign languages other than English, e.g. French or even Latin (Bavaria), and the amount of weekly classroom teaching in English from the first form of elementary school is the same in Germany and South Tyrol. In their official report, the German experts responsible for the DESI assessment in South Tyrol generally praised the quality of teaching, but the concrete results in the report reveal drastic shortcomings. For instance, in the semantic field of railway station (Bahnhof), not a single South Tyrolean student with German as his or her mother tongue knew the German word for a signal box (Stellwerk). Instead of marking such errors as errors, however, the German experts resorted to a methodologically disputable assumption.
They claimed, without checking, that all words which South Tyrolean students did not know are, by definition, not used in the South Tyrolean variant of German and must therefore be excluded from the test (item bias), so that only correct answers were counted. The DESI testers from Germany, however, did not check whether the words excluded from DESI because of item bias were actually unknown. In fact, all words excluded from the South Tyrolean DESI questionnaire are common German words used in South Tyrol as well, i.e. for these words there are no South Tyrolean variants at all. Despite all the technical exceptions and modifications, the 2003 South Tyrolean DESI result was very disappointing. Only 14 percent of South Tyrolean high school students reached the best achievement group, whereas in Germany almost half of the 15-year-old students belong to this group. On the other hand, one fourth of the South Tyrolean German-speaking students fell into the lowest achievement group, which in Germany, despite all the social problems in big towns, comprises only 7 percent of all students.[43]
  • Italian INVALSI assessments: Traditionally, the annual assessments conducted by the Italian INVALSI in the pre-PISA period painted a sorry picture of the northernmost Italian Regione Autonoma di Trentino-Alto Adige, comprising the two autonomous provinces of Trentino (Italian-speaking) and South Tyrol (predominantly German-speaking). Schools in Trentino and South Tyrol consistently did worse than those in all other North Italian regions and even lagged behind the Italian average, though it is not clear whether the sample of German-speaking students was representative because, again, the South Tyrolean Education Authority was entitled to eliminate weak schools from the sample, thereby undermining the validity of the assessment.[44] To some extent, two South Tyrolean anomalies may account for the traditionally poor performance of South Tyrolean students in international evaluations. Until recently, no pedagogical qualifications were required for teachers, not even for permanently employed teachers (insegnanti di ruolo / Stammrollenlehrer) appointed by the Education Authority in Bolzano (Bozen) through a so-called concorso. Besides, the admission criteria for the South Tyrolean concorsi have always been inconsistent. For example, a South Tyrolean student who studied German as a Foreign Language or Art History at an Italian university automatically obtained the teaching license for completely unrelated subjects, like History and Latin, at South Tyrolean high schools.

United States

Two studies have compared high achievers in mathematics on PISA with those on the U.S. National Assessment of Educational Progress (NAEP). Comparisons were made between students scoring at the "advanced" and "proficient" levels in mathematics on the NAEP and the corresponding performance on PISA. Overall, 30 nations had higher percentages of students at the "advanced" level of mathematics than the U.S. The only OECD countries with worse results were Portugal, Greece, Turkey, and Mexico. Six percent of U.S. students were "advanced" in mathematics, compared with 28 percent in Taiwan. The highest-ranked U.S. state, Massachusetts, would have placed just 15th in the world if compared with the nations participating in PISA. Thirty-one nations had higher percentages of "proficient" students than the U.S. Massachusetts was again the best U.S. state, but it would have ranked just ninth in the world.[45][46]

Comparisons with results from the Trends in International Mathematics and Science Study (TIMSS) appear to give different results, suggesting that U.S. states actually do better in world rankings.[47] This can likely be traced to the different material being covered and to the United States teaching mathematics in a style less harmonious with the "Realistic Mathematics Education" that forms the basis of the exam.[48] Countries that commonly use this teaching method score higher on PISA and lower on TIMSS and other assessments.[49]

Poverty

Stephen Krashen, professor emeritus at the University of Southern California,[50] and Mel Riddile of the NASSP attributed the relatively low performance of students in the United States to the country's high rate of child poverty, which exceeds that of other OECD countries.[26][27] However, individual US schools with poverty rates comparable to Finland's (below 10%), as measured by participation in reduced-price school lunch programs, outperform Finland; and US schools in the 10–24% reduced-price lunch range are not far behind.[51]

Participation in free or reduced-price school lunch programs is the only school-level poverty indicator available for US schoolchildren. In the United States, schools in which fewer than 10% of the students qualified for free or reduced-price lunch averaged PISA scores of 551, higher than those of any OECD country. This can be compared with the other OECD countries, for which figures on children living in relative poverty have been tabled:[27]

Country Percent of reduced school lunches (US)[27] / Percent of relative child poverty (other OECD countries)[52] PISA score[53]
United States < 10% 551
Finland 3.4% 536
Netherlands 9.0% 508
Belgium 6.7% 506
United States 10%–24.9% 527
Canada 13.6% 524
New Zealand 16.3% 521
Japan 14.3% 520
Australia 11.6% 515
United States 25–49.9% 502
Estonia 40.1% 501
United States 50–74.9% 471
Russian Federation 58.3% 459
United States > 75% 446
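The table's central comparison can be expressed as a minimal sketch in code. The group labels and data structure are ours; the scores are those tabled above:

```python
# (group label, PISA score) pairs taken from the table above.
scores = [
    ("United States, <10% reduced-price lunch", 551),
    ("Finland, 3.4% child poverty", 536),
    ("Netherlands, 9.0%", 508),
    ("Belgium, 6.7%", 506),
    ("United States, 10-24.9% reduced-price lunch", 527),
    ("Canada, 13.6%", 524),
    ("New Zealand, 16.3%", 521),
    ("Japan, 14.3%", 520),
    ("Australia, 11.6%", 515),
    ("United States, 25-49.9% reduced-price lunch", 502),
    ("Estonia, 40.1%", 501),
    ("United States, 50-74.9% reduced-price lunch", 471),
    ("Russian Federation, 58.3%", 459),
    ("United States, >75% reduced-price lunch", 446),
]

# The highest-scoring row is the low-poverty US stratum, which is the
# text's claim: such US schools outscore every OECD country tabled here.
top_group, top_score = max(scores, key=lambda row: row[1])
print(top_group, top_score)
```

Sorting the full list by score would likewise show each US poverty stratum interleaved among countries with comparable child-poverty rates, which is the pattern the table is meant to convey.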

Sampling errors

In 2013 Martin Carnoy of the Stanford University Graduate School of Education and Richard Rothstein of the Economic Policy Institute released a report, "What do international tests really show about U.S. student performance?", analyzing the 2009 PISA database. Their report found that U.S. PISA test scores had been lowered by a sampling error that over-represented adolescents from the most disadvantaged American schools in the test-taking sample.[54] The authors cautioned that international test scores are often "interpreted to show that American students perform poorly when compared to students internationally" and that school reformers then conclude that "U.S. public education is failing." Such inferences, made before the data has been carefully analyzed, they say, "are too glib"[55] and "may lead policymakers to pursue inappropriate and even harmful reforms."[56]

Carnoy and Rothstein observe that in all countries students from disadvantaged backgrounds perform worse than those from advantaged backgrounds, and that the US has a greater percentage of students from disadvantaged backgrounds. The sampling error in the PISA results lowered U.S. scores for 15-year-olds even further, they say. The authors add, however, that in countries such as Finland the scores of disadvantaged students tend to be stagnant, whereas in the U.S. the scores of disadvantaged students have been steadily rising over time, albeit still lagging behind those of their more advantaged peers. Even when the figures are adjusted for social class, the PISA scores of all US students would still remain behind those of the highest-scoring countries; nevertheless, the scores of US students of all social backgrounds have shown a trajectory of improvement over time, notably in mathematics, a circumstance PISA's report fails to take into account.

Carnoy and Rothstein write that PISA spokesman Schleicher has been quoted as saying that "international education benchmarks make disappointing reading for the U.S." and that "in the U.S. in particular, poverty was destiny. Low-income American students did (and still do) much worse than high-income ones on PISA. But poor kids in Finland and Canada do far better relative to their more privileged peers, despite their disadvantages" (Ripley 2011).[57] Carnoy and Rothstein state that their report's analysis shows Schleicher and Ripley's claims to be untrue. They further fault the way PISA's results have persistently been released to the press before experts have had time to evaluate them, and they charge the OECD reports with inconsistency in explaining such factors as the role of parental education. Carnoy and Rothstein also note with alarm that US Secretary of Education Arne Duncan regularly consults with PISA's Andreas Schleicher in formulating educational policy before other experts have had a chance to analyze the results.[58] Carnoy and Rothstein's report (written before the release of the 2011 database) concludes:

We are most certain of this: To make judgments only on the basis of national average scores, on only one test, at only one point in time, without comparing trends on different tests that purport to measure the same thing, and without disaggregation by social class groups, is the worst possible choice. But, unfortunately, this is how most policymakers and analysts approach the field.

The most recent test for which an international database is presently available is PISA, administered in 2009. A database for TIMSS 2011 is scheduled for release in mid-January 2013. In December 2013, PISA will announce results and make data available from its 2012 test administration. Scholars will then be able to dig into TIMSS 2011 and PISA 2012 databases so they can place the publicly promoted average national results in proper context. The analyses we have presented in this report should caution policymakers to await understanding of this context before drawing conclusions about lessons from TIMSS or PISA assessments.[59]

References

  1. ^ PISA 2009 Technical Report, 2012, OECD, http://www.oecd.org/dataoecd/60/31/50036771.pdf
  2. ^ Baldi, Stéphane; Jin, Ying; Skemer, Melanie; Green, Patricia J; Herget, Deborah; Xie, Holly (10 December 2007), Highlights From PISA 2006: Performance of U.S. 15-Year-Old Students in Science and Mathematics Literacy in an International Context (PDF), NCES, retrieved 14 December 2013, PISA 2006 reading literacy results are not reported for the United States because of an error in printing the test booklets. Furthermore, as a result of the printing error, the mean performance in mathematics and science may be misestimated by approximately 1 score point. The impact is below one standard error.
  3. ^ a b PISA 2009 Results: Executive Summary (PDF), OECD, 7 December 2010
  4. ^ a b ACER releases results of PISA 2009+ participant economies, ACER, 16 December 2011, archived from the original on 8 October 2014, retrieved 15 April 2016
  5. ^ a b c d e f PISA 2012 Results in Focus (PDF), OECD, 3 December 2013, retrieved 4 December 2013
  6. ^ CB Online Staff. "PR scores low on global report card" Archived 2015-01-03 at the Wayback Machine, Caribbean Business, 26 September 2014. Retrieved on 3 January 2015.
  7. ^ OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V), http://www.oecd-ilibrary.org/education/pisa-2012-results-skills-for-life-volume-v_9789264208070-en
  8. ^ a b c d PISA 2012 Results OECD. Retrieved 4 December 2013
  9. ^ a b Sedghi, Ami; Arnett, George; Chalabi, Mona (3 December 2013), "Pisa 2012 results: which country does best at reading, maths and science?", The Guardian, retrieved 14 February 2013
  10. ^ Adams, Richard (3 December 2013), "Swedish results fall abruptly as free school revolution falters", The Guardian, retrieved 3 December 2013
  11. ^ Kärrman, Jens (3 December 2013), "Löfven om Pisa: Nationell kris", Dagens Nyheter, retrieved 3 December 2013
  12. ^ Multi-dimensional Data Request, OECD, 2010, archived from the original on 14 July 2012, retrieved 28 June 2012
  13. ^ PISA 2009 Results: Executive Summary (Figure 1 only) (PDF), OECD, 2010, retrieved 28 June 2012
  14. ^ Walker, Maurice (2011), PISA 2009 Plus Results (PDF), OECD, archived from the original (PDF) on 22 December 2011, retrieved 28 June 2012
  15. ^ Learning for Tomorrow's World First Results from PISA 2003 (PDF), OECD, 14 December 2004, retrieved 6 January 2014
  16. ^ PISA 2003 Technical Report (PDF), OECD
  17. ^ Literacy Skills for the World of Tomorrow: Further Results from PISA 2000 (PDF), OECD, 2003, retrieved 6 January 2014
  18. ^ M. L. Wu: A Comparison of PISA and TIMSS 2003 achievement results in Mathematics. Paper presented at the AERA Annual Meeting, New York, March 2008.
  19. ^ "Waiting for "Superman" trailer". YouTube. 7 May 2010. Retrieved 8 October 2010.
  20. ^ Yong Zhao (10 December 2010), A True Wake-up Call for Arne Duncan: The Real Reason Behind Chinese Students Top PISA Performance
  21. ^ Gumbel, Peter (7 December 2010), "China Beats Out Finland for Top Marks in Education", Time, archived from the original on 10 December 2010, retrieved 27 June 2012
  22. ^ a b c Cook, Chris (7 December 2010), "Shanghai tops global state school rankings", Financial Times, retrieved 28 June 2012
  23. ^ Mance, Henry (7 December 2010), "Why are Chinese schoolkids so good?", Financial Times, retrieved 28 June 2012
  24. ^ "Archived copy" (PDF). Archived from the original (PDF) on 4 March 2016. Retrieved 15 April 2016.
  25. ^ Simola, Hannu (2005), "The Finnish miracle of PISA: Historical and sociological remarks on teaching and teacher education" (PDF), Comparative Education, 41 (4): 455–470, doi:10.1080/03050060500317810, S2CID 145325152
  26. ^ a b "The Economics Behind International Education Rankings" National Educational Association
  27. ^ a b c d Riddile, Mel (15 December 2010), PISA: It's Poverty Not Stupid, National Association of Secondary School Principals, archived from the original on 22 January 2014, retrieved 15 April 2016
  28. ^ Cleland, Elizabeth (29 December 2011). "What Americans Keep Ignoring About Finland's School Success – Anu Partanen". The Atlantic.
  29. ^ a b "Database – PISA 2009". Pisa2009.acer.edu.au. Archived from the original on 22 March 2016. Retrieved 15 April 2016.
  30. ^ http://ddp-ext.worldbank.org/EdStats/INDprwp08b.pdf
  31. ^ Initiatives, Educational (November 2006), "Student Learning in the Metros" (PDF), Educational Initiatives
  32. ^ Vishnoi, Anubhuti (7 January 2012), "Poor PISA ranks: HRD seeks reason", The Indian Express
  33. ^ Masani, Zareer (27 February 2008). "India still Asia's reluctant tiger". BBC News.
  34. ^ Cf. http://www.provincia.bz.it/bildung-sprache/deutschsprachige-schule/mitteilungen.asp?publ_action=300&publ_image_id=469121. Retrieved 11 April 2021.
  35. ^ Cf. INFO, December 2004 (i.e. a Circular Letter edited by the regional Education Authority): «In all fields, the South Tyrolean schools achieved a first-rate performance» (p. 2); Mr Höllrigl, then director of the Education Authority, and Mr Meraner, then head of the PISA board, in INFO, January 2005: «I am surprised that we have already become world leaders» (p. 12); Mr Meraner in the most-read daily Dolomiten, 18 February 2005: «We are the world champions even in Problem Solving».
  36. ^ Cf. the South Tyrolean weekly FF, 17 February 2005, p. 10: "Land klagt Lehrer [Regional Government's Action against Teacher]", and FF, 16 March 2006, in which Mr Durnwalder admits to the failed legal suit.
  37. ^ Cf. Dolomiten, 27 January 2005: Mr Hilpold misinformed the press on behalf of the regional government: «South Tyrol was assessed as a nation [Land]. It is due to the fact that we were assessed as a nation that we may compare our results with other nations.» Mr Meraner, director of the Pedagogical Institute, also wrongly claimed that the South Tyrolean overall result may be compared to that of other «nations» because, as he falsely stated, South Tyrol had a national result of its own.
  38. ^ Cf. the PISA report: Learning for Tomorrow’s World. First Results from PISA 2003. Paris, 2004. p. 469; online version: http://www.oecd.org/dataoecd/1/60/34002216.pdf. Retrieved 8 March 2012.
  39. ^ Cf. the PISA report: Learning for Tomorrow’s World. First Results from PISA 2003. Paris, 2004. p. 321; online version: http://www.oecd.org/dataoecd/1/60/34002216.pdf.
  40. ^ Cf. the EMS study published by the Austrian Ministry of Research in 2007: http://www.bmwf.gv.at/startseite/mini_menue/service/publikationen/wissenschaft/universitaetswesen/spiel_studie/. Retrieved 1 April 2012. Abridged version: https://docplayer.org/18915390-Evaluation-der-eignungstests-fuer-das-medizinstudium-in-oesterreich.html = Evaluation der Eignungstests für das Medizinstudium in Österreich - PDF Free Download (docplayer.org) – retrieved 10 January 2021.
  41. ^ E.g. in the year 2007: c.f. the parliamentary question of an opposition party: https://suedtiroler-freiheit.com/2007/08/16/landtagsanfrage-zu-den-medizinstudium-ausbildungsplaetzen. Retrieved 6 April 2021. Or the more recent article in an online newspaper: https://www.salto.bz/de/article/25082016/braucht-suedtirol-die-oesterreicher-quote. Retrieved 6 April 2021.
  42. ^ Cf. the interview in the Austrian daily Tiroler Tageszeitung, 3 November 2008, p. 4.
  43. ^ An abridged version of the South Tyrolean DESI report was published by the Pedagogical Institute on its site: http://www.provinz.bz.it/news/de/news.asp?news_action=5&news_article_id=138926. Retrieved 8 April 2021.
  44. ^ Cf. the reports published by INVALSI, or its predecessor SNV, Servizio nazionale di valutazione: http://www.invalsi.it/invalsi/index.php?page=snv. Retrieved 27 December 2009.
  45. ^ Paul E. Peterson, Ludger Woessmann, Eric A. Hanushek, and Carlos X. Lastra-Anadón (2011) "Are U.S. students ready to compete? The latest on each state's international standing." Education Next 11:4 (Fall): 51–59. http://educationnext.org/are-u-s-students-ready-to-compete/
  46. ^ Eric A. Hanushek, Paul E. Peterson, and Ludger Woessmann (2011) "Teaching math to the talented." Education Next 11, no. 1 (Winter): 10–18. http://educationnext.org/teaching-math-to-the-talented/
  47. ^ Gary W. Phillips (2007) Chance favors the prepared mind: Mathematics and science indicators for comparing states. Washington: American Institutes for Research (14 November); Gary W. Phillips (2009) The Second Derivative:International Benchmarks in Mathematics For U.S. States and School Districts. Washington, DC: American Institutes for Research (June).
  48. ^ "PISA Mathematics: A Teacher's Guide" (PDF). 13 August 2019.
  49. ^ Loveless, Tom. "International Tests Are Not All the Same". Brookings Institution.
  50. ^ quoted in Valerie Strauss, "How poverty affected U.S. PISA scores", The Washington Post, 9 December 2010.
  51. ^ "Stratifying PISA scores by poverty rates suggests imitating Finland is not necessarily the way to go for US schools". Simply Statistics. 23 August 2013.
  52. ^ "Child poverty statistics: how the UK compares to other countries", The Guardian. The same UNICEF figures were used by Riddile.
  53. ^ Highlights From PISA 2009, Table 3.
  54. ^ See, Martin Carnoy and Richard Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, 28 January 2013.
  55. ^ Valerie Strauss, "U.S. scores on international test lowered by sampling error: report", Washington Post, 15 January 2013.
  56. ^ Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", Economic Policy Institute, 28 January 2013
  57. ^ Schleicher was quoted by Amanda Ripley to this effect in her 2011 book, The Smartest Kids in The World (Simon and Schuster).
  58. ^ Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, 28 January 2013. Another scholar, Matthew di Carlo of the Albert Shanker Institute, criticized PISA for reporting its results in the form of national rankings, since rankings can give a misleading impression that differences between countries' scores are far larger than is actually the case. Di Carlo also faulted PISA's methodology for disregarding factors such as margin of error. See Matthew di Carlo, "Pisa For Our Time: A Balanced Look", Albert Shanker Institute website, 10 January 2011.
  59. ^ Carnoy and Rothstein, "What do international tests really show about U.S. student performance?", EPI, January 28, 2013.

Further reading

Official websites and reports

  • OECD/PISA website
    • OECD (1999): Measuring Student Knowledge and Skills. A New Framework for Assessment. Paris: OECD, ISBN 92-64-17053-7 [1]
    • OECD (2001): Knowledge and Skills for Life. First Results from the OECD Programme for International Student Assessment (PISA) 2000.
    • OECD (2003a): The PISA 2003 Assessment Framework. Mathematics, Reading, Science and Problem Solving Knowledge and Skills. Paris: OECD, ISBN 978-92-64-10172-2 [2]
    • OECD (2004a): Learning for Tomorrow's World. First Results from PISA 2003. Paris: OECD, ISBN 978-92-64-00724-6 [3]
    • OECD (2004b): Problem Solving for Tomorrow's World. First Measures of Cross-Curricular Competencies from PISA 2003. Paris: OECD, ISBN 978-92-64-00642-3
    • OECD (2005): PISA 2003 Technical Report. Paris: OECD, ISBN 978-92-64-01053-6
    • OECD (2007): Science Competencies for Tomorrow's World: Results from PISA 2006 [4]
    • OECD (2014): PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Volume V) [5]

Reception and political consequences

  • A. P. Jakobi, K. Martens: Diffusion durch internationale Organisationen: Die Bildungspolitik der OECD. In: K. Holzinger, H. Jörgens, C. Knill: Transfer, Diffusion und Konvergenz von Politiken. VS Verlag für Sozialwissenschaften, 2007.

France

  • N. Mons, X. Pons: The reception and use of Pisa in France.

Germany

  • E. Bulmahn [then federal secretary of education]: PISA: the consequences for Germany. OECD observer, no. 231/232, May 2002. pp. 33–34.
  • H. Ertl: Educational Standards and the Changing Discourse on Education: The Reception and Consequences of the PISA Study in Germany. Oxford Review of Education, v 32 n 5 pp 619–634 Nov 2006.

United Kingdom

  • S. Grek, M. Lawn, J. Ozga: Study on the Use and Circulation of PISA in Scotland. [6]

Books

  • H. Brügelmann: Vermessene Schulen - standardisierte Schüler. Beltz-Verlag, Weinheim (in German; English summary: https://www.academia.edu/15203894/Evidence-Based_Pedagogy).
  • S. Hopmann, G. Brinek, M. Retzl (eds.): PISA zufolge PISA. PISA According to PISA. LIT-Verlag, Wien 2007, ISBN 3-8258-0946-3 (partly in German, partly in English)
  • T. Jahnke, W. Meyerhöfer (eds.): PISA & Co – Kritik eines Programms. Franzbecker, Hildesheim 2007 (2nd edn.), ISBN 978-3-88120-464-4 (in German)
  • R. Münch: Globale Eliten, lokale Autoritäten: Bildung und Wissenschaft unter dem Regime von PISA, McKinsey & Co. Frankfurt am Main : Suhrkamp, 2009. ISBN 978-3-518-12560-1 (in German)
