
Dishonesty at the DfE

In the last few days TCF has covered:

  • Tory ministers telling lies about their so-called (but non-existent) community right-to-buy plans;
  • Tory ministers telling lies about childcare subsidies;
  • Tory ministers telling lies about the Big Lottery Fund.

So it seems appropriate to end the week with a note on a Tory minister being somewhat economical with the truth.

This time around it’s Schools Minister Nick Gibb, commenting in a Department for Education press release*, provocatively entitled ‘England’s 15-year-olds’ reading is more than a year behind the best’:

The gulf between our 15-year-olds’ reading abilities and those from other countries is stark – a gap that starts to open in the very first few years of a child’s education.

That sounds terrible, doesn’t it?  The Sun is quick off the mark with its reaction:

ENGLISH teenagers are 18 months behind Chinese pupils when it comes to reading, new research has revealed. Our 15-year-olds also trail behind those in countries such as Korea, Finland and Japan. The devastating findings show how far England’s education system has fallen behind its international competitors…

In its global rankings, England has slipped from seventh to 25th for reading.

For English pupils to reach Chinese standards, the proportion getting a GCSE in English would need to rise by 22 per cent.

Blimey, from over a year behind to eighteen months behind, in the time it takes the Sun to read the press release and ‘interpret’ it for its readers. 

And from 7th to 25th? By the time it gets to the Daily Mail, our lazy arsehole teenagers will be at least three years behind because of their liberal do-gooding leftie pinko good-for-nothing teachers, and we’ll probably have slipped behind Outer Mongolistan, or some other foreign place which shouldn’t by rights have any of their foreigners learning to read at all.

And how on earth is this country, devoid of all morals and standards after 13 years of Nu Lieboor educayshoon apparatchnicks, ever going to increase GCSE English grades by 22%?

We’re doomed. We’re surely doomed.

On the other hand, perhaps a little fact check may help.

1) The DfE press release claims:

GCSE pupils’ reading is more than a year behind the standard of their peers in Shanghai, Korea and Finland, research reveals today.

In fact, it’s not new research at all. 

What the DfE has actually just published is a ‘research report’ on research carried out by the National Foundation for Educational Research (NFER) and published in December 2010, which itself feeds into the most recent OECD Programme for International Student Assessment (PISA 2009).

That is, the DfE has spent some 10 months developing a report about a report commissioned for a wider, internationally validated study.

The DfE’s report on a report, as we will see, appears intentionally designed to create the impression that English 15-year-olds are doing very badly compared with their peers in other countries. It does so via a series of methodologically flawed processes and ‘mistakes’ which are unbecoming of a government department.

2) The DfE press release claims:

To match the attainment of pupils from Shanghai in the reading assessment, the proportion of England’s pupils achieving five A*-C grades (including English and maths) at the end of Key Stage 4 would need to increase by 22 percentage points.

This comes from “additional analysis” in the research report, which is concerned with:

[The benefit of t]ranslating the difference in average PISA points scores to an effect size is that we can apply the attainment gap between England and the comparison countries to measures we are familiar with, for example: pupils’ capped Key Stage 4 point scores and GCSE grades; proportion of pupils achieving 5 A* to C including English and maths.

Bizarrely, then, the reading tests used for the PISA study are being extrapolated to provide ‘findings’ on how far 15-year-olds’ GCSE results in all subjects would need to improve to meet China’s standards.
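
For the curious, here is a rough sketch in Python of the kind of arithmetic that can turn a reading-score gap into a ‘22 percentage points of GCSEs’ headline. Every number in it (the 61-point gap, the 100-point standard deviation, the 55% baseline pass rate, and the assumption that GCSE attainment behaves like a tidy normal distribution) is my own illustrative guess, not the NFER workings; the point is simply how many layers of assumption sit between a PISA reading score and a GCSE grade.

```python
# Illustrative only: not the NFER/DfE workings. All inputs are assumptions.
from statistics import NormalDist

pisa_gap_points = 61    # assumed Shanghai-England reading gap, for illustration
pisa_sd = 100           # assumed spread (standard deviation) of PISA reading scores
effect_size = pisa_gap_points / pisa_sd   # the gap expressed in standard deviations

baseline_5ac = 0.55     # hypothetical share of pupils getting 5 A*-C incl. English and maths

norm = NormalDist()
# Pretend GCSE attainment is a normally distributed latent score, shift every pupil
# up by the effect size, and see how many now clear the same threshold.
threshold = norm.inv_cdf(1 - baseline_5ac)
shifted_share = 1 - norm.cdf(threshold - effect_size)

print(f"effect size: {effect_size:.2f} standard deviations")
print(f"implied rise in 5 A*-C share: {(shifted_share - baseline_5ac) * 100:.0f} percentage points")
```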

Not only that, but the research report simply ignores the fact that the PISA study is NOT primarily a reflection of 15-year-olds’ performance in schools. The OECD is very clear about what the PISA study is actually for:

In all cycles, the domains of reading, mathematical and scientific literacy are covered not merely in terms of mastery of the school curriculum, but in terms of important knowledge and skills needed in adult life.

It is therefore either methodologically stupid or dishonest of the DfE to extrapolate in the way that they do from PISA scores to GCSE results.

3) The DfE press release claims:

Across all three strands, England has tumbled down the international tables in the last nine years – from 7th to 25th in reading; 8th to 28th in maths; and 4th to 16th in science.

This is incorrect and irresponsible.

The research study by NFER makes very clear (at para 3.2) that only 12 countries significantly outperform England on their PISA scores. A further twelve have mean scores above or the same as England’s (495), but the differences are not statistically significant.

Further, in the case of Chinese Taipei and Denmark (also on 495), they come ahead of England in the table (3.3) only because their names come earlier in the alphabet.

Finally, the idea that England has “tumbled down the international tables” ignores the fact that two of the country-regions that have apparently moved ahead of England did not take part in previous studies (NFER study, para. 3.5).
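
To make the distinction concrete, here is a trivial sketch (in Python, with the figures taken from para 3.2 as quoted above) of the difference between a raw table position and a position that counts only statistically significant gaps:

```python
# Figures from the NFER report (para 3.2), as discussed above.
significantly_ahead = 12            # countries that significantly outperform England
not_significantly_different = 12    # countries above or level with England, but not significantly so

# A raw table position counts everything listed above England, including ties
# broken by nothing more than alphabetical order (e.g. Chinese Taipei and Denmark).
raw_table_position = 1 + significantly_ahead + not_significantly_different

# A defensible "rank" counts only the countries whose lead is statistically significant.
defensible_position = 1 + significantly_ahead

print(raw_table_position)    # 25: the figure in the DfE press release and the Sun
print(defensible_position)   # 13: the most the data will actually support
```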

It is irresponsible of the DfE simply to ignore such an important method statement from the researchers, not least given the complexities of international comparison. This, together with its other dishonest readings of the NFER research, suggests that the DfE is far more interested in headlines showing how bad the English education system is than it is in accuracy.

4) The DfE press release provides a table professing to show how many years ahead of England 13 countries are: the 12 with significantly higher PISA scores, plus Iceland, which is not significantly higher and appears to have been added by mistake. Perhaps the DfE were simply seeing if anyone noticed.

The years range from Shanghai-China (1.5 years) down to the aforementioned Iceland (0.1 years), and are based on a fairly complex set of workings comparing ‘PISA points’ with the established English system of Key Stages (1-4). I won’t cover that in detail here, but simply note that this is the DfE’s own analysis, entirely independent of the PISA methodology.
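
For illustration only, here is a minimal Python sketch of the sort of conversion that appears to underlie the table. It assumes the OECD’s commonly cited rule of thumb of roughly 39 PISA reading points per school year; the score gaps fed into it are my own illustrative inputs, not the DfE’s.

```python
# A sketch of a "years of progress" conversion; the 39-points-per-year figure is an
# assumed rule of thumb, and the gaps below are illustrative, not the DfE's inputs.
POINTS_PER_SCHOOL_YEAR = 39

def years_behind(score_gap_points: float) -> float:
    """Convert a PISA reading score gap into notional 'years of schooling'."""
    return score_gap_points / POINTS_PER_SCHOOL_YEAR

print(round(years_behind(61), 1))   # 1.6, roughly the 1.5 years quoted for Shanghai-China
print(round(years_behind(5), 1))    # 0.1, roughly the figure quoted for Iceland
```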

But if DfE is going to do this kind of ‘years of progress’ calculation, it would help if it read the whole of the NFER research it was analysing, including the method note at para 1.5:

Countries were required to carry out the survey during a six-week period between March and August 2009. However, England, Wales and Northern Ireland were permitted to test outside this period because of the problems for schools caused by the overlap with the GCSE preparation and examination period. In England, Wales and Northern Ireland the survey took place in November and December 2009 [in November in England].

So in fact those children sitting the PISA test in England did so, in Year 11 (the final compulsory year of secondary education), at an average age up to eight months earlier than children in other countries, and only one half-term into their second GCSE/Key Stage 4 year.

Using the DfE’s own methods, but taking into account this study particularity, we could suddenly find England in fourth place in the table, not thirteenth, behind Hong Kong (supposedly 0.9 years ahead) but ahead of Singapore, Canada and New Zealand (0.7 years ahead).
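
Here is a back-of-envelope version of that adjustment in Python, using only the ‘years ahead’ figures quoted in this post and assuming, crudely, that eight months of missed schooling is worth eight-twelfths of a ‘year of progress’. A real correction would need to be done properly against the underlying scores; this simply shows how sensitive the DfE’s table is to a detail its authors ignored.

```python
# Back-of-envelope only: the "years ahead" figures are those quoted above from the
# DfE table (selected rows); the eight-month adjustment is an assumption, not a
# published correction.
ADJUSTMENT_YEARS = 8 / 12    # up to eight months' less schooling at the point of testing

dfe_years_ahead = {
    "Shanghai-China": 1.5,
    "Hong Kong": 0.9,
    "Singapore": 0.7,
    "Canada": 0.7,
    "New Zealand": 0.7,
    "Iceland": 0.1,
}

# Knock the adjustment off each gap and round to one decimal place, as the DfE table does.
adjusted = {country: round(years - ADJUSTMENT_YEARS, 1) for country, years in dfe_years_ahead.items()}
still_ahead = [country for country, years in adjusted.items() if years > 0]

print(still_ahead)   # ['Shanghai-China', 'Hong Kong']: the 0.7-year group drops to level pegging
```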

Of course, that wouldn’t suit the DfE’s scheme of things.

Conclusions

Overall, this attempt to ‘do down’ both teaching standards and 15-year-olds’ reading abilities is a shocking indictment of the Tories’ approach to education, and to government more generally.

It is interesting to note that, just as this press release was being put out on the basis of a totally malicious interpretation of previous PISA data, the Times Educational Supplement was discovering that Michael Gove had decided that schools will not participate in a PISA programme to assess children’s problem-solving skills. TES comments:

[One] possible explanation for England’s absence from the optional [problem-solving] test, being taken by 43 of the 66 territories participating in Pisa 2012, might be found in the framework for the assessment. It says “problem-solving competency” can be developed through “progressive teaching methods, like problem-based learning, inquiry-based learning” and project work. “The Pisa 2012 computer-based assessment of problem-solving aims to examine how students are prepared to meet unknown future challenges for which direct teaching of today’s knowledge is not sufficient,” it concludes.

They are words likely to make those like Mr Gove, who see knowledge as the foundation of reason, wince. And Mr Bangs argues that the whole basis of the test runs “contrary to the Government’s obsession with a narrow fact- based curriculum”. But the idea is not going away. In 2015, Pisa will build on it and introduce a new test of collaborative problem-solving. The DfE says it is too early to say whether England will take part in this test. But it is likely to prove even less attractive to Mr Gove, who has said that “time and effort spent on cultivating abstract thinking skills” denies children access to “essential” subject knowledge.

Perhaps even more interestingly, this desire on the part of OECD to assess wider cognitive skills amongst children actually echoes one finding of the NFER PISA study, which the DfE completely fails to mention:

England’s highest reading process score was attained on the reflect and evaluate subscale, with a mean of 504, nine scale points higher than its overall mean for reading… [T]his may suggest that, in England, pupils are relatively strong in skills such as making judgements about authorial techniques and determining the usefulness of a text for a particular purpose (reflect and evaluate) and relatively less strong in skills such as locating and selecting explicit information (access and retrieve) or using inference and deduction, and linking ideas within or across texts (integrate and interpret).

Broadly speaking, this suggests that English education may actually be doing quite well at enabling children to “think outside the box”, though perhaps at the cost of reading “accuracy” (at which Japan, for example, appears to be very strong).

In turn, Gove’s reluctance to pursue a formal assessment of this suggests that he has no desire to know what children and teachers might be good at, and how education methods might develop under his watch.  

On the other hand, he appears very content for his and Gibb’s department to keep on misusing data to pretend they’ve inherited a failing education system.

*I am grateful to Dorothy (@deevybee) for alerting me to this DfE press release in the first place with an excellent post making an initial assessment of how flawed the DfE’s analysis is, and how suspect its motivations.

  1. Guest
    October 18, 2011 at 2:47 pm

    While I agree with most of this post – and from experience know that the DfE’s interpretation of NFER/OECD research has been blinkered – the point about bringing the testing dates forward by five months doesn’t have quite the effect you note. Because PISA tests 15-year-olds specifically, by moving the date forwards you actually capture more pupils in the later stages of their education the earlier in the calendar year you test. (Although this sounds paradoxical, it is because a greater proportion of Year 11 pupils will be 16 the later in the year you test, and therefore fall outside the testing cohort. These places will then be filled by Year 10 pupils who have turned 15 that year.)

    Another point worth making about England’s ‘fall’ in PISA rankings is that the OECD have explicitly warned against comparing 2000/2003 PISA results with 2006/2009, which has been comprehensively ignored by DfE. This is because the response rate for the earlier years was so low that concerns were raised about how representative the samples were (it’s been noted, for example, that more independent and selective schools returned data for the 2000 sample, which may have contributed to what were acknowledged at the time as anomalous results, although the Government of the time understandably ignored these warnings to play up their significance). See this for more (especially para 2): http://www.oecd.org/dataoecd/33/8/46624007.pdf

