The last chapter from David I Smith and James KA Smith’s Teaching and Christian Practices that has got me excited is a discussion by Professor Kurt Schaefer of Calvin College on how we as Christians approach data – particularly data in its original meaning, namely something that is given. Professor Schaefer is an economics professor, and in his chapter he describes an attempt to approach an econometrics class with overt Christian practices. The problem with this, as with all technical subjects, is that there is no tradition of such topics being tackled from a specifically Christian perspective. As he says, the “discipline of econometrics is a combination of economic theory, data handling, statistics and computer programming. Virtually everything in these fields – and in many other fields beyond econometrics – emerged during the modern era, and emerged without any obvious connection to traditional Christian practices.” (p. 195)

Econometrics is the wing of economic study concerned with economic data and its interpretation and use, in order to answer the central economic questions: what choices do people and societies make, what choices could they make, and perhaps what choices should they make?

Kurt Schaefer points out that in the overall study of economics, econometrics has more recently been privileged over a serious study of the history of economic thought, with its historical, philosophical and anthropological facets – in other words, the metric, measurable, testable, model-able “technical” world has been given more time and importance than the philosophical reasons we believed those models to be important in the first place. As time has gone on in university economics courses, the mathematical and technical have driven the historical and philosophical to the margins. Even within econometrics, mathematical modelling has pushed the reflective and historical to one side.

So it is possible – and Professor Schaefer contends that it is more than merely possible – for young economists to enter the profession without a really clear understanding of the relationship of economics to its parent disciplines, philosophy and history. If modern economists thought of their subject as having any antecedent discipline today, it would likely be mathematics. The problem with that is that the tools economists mostly use are not those that will bring usable perspective and care to their datasets. Econometrics is about

“piecing together a credible, faithful explanation of events from information that is fragmentary, for which the investigator did not control the data-generating process, from which a number of credible inferences could be drawn” (p. 197)

What this is saying about data is in itself interesting – he compares econometrics to police detective work – because it means that the work of an econometrician is never “technical in the sense of an algorithmic process that generates guaranteed, self-evident conclusions; techniques only exist to serve, discipline, clarify and support the bigger project of meeting standards for warranted truth claims – of engaging in epistemic practices aimed at truth” (p. 198)

And what, then, are these epistemic practices that can be deployed in the service of data? The argument from Schaefer is that they consist in those Christian practices that are to do with knowing well. He is very sensitive in his argument about how to apply this, but eventually settles upon the Reformed practices of biblical exposition. The particular Reformed slant on this argument need not concern us here, except to say that he is forensic in ensuring that we take an approach from what is the special revelation of God (through the “book” of the Bible) and apply it to an aspect of the general revelation (through the “book” of the natural world in which we live and move) legitimately and without doing damage to the Reformed tradition’s self-understanding. He is from Calvin College, after all. His conclusion from that discussion is that:

Perhaps those epistemic practices that have proven to be helpful in knowing well will lead us toward analogous practices that can help establish good epistemic habits for knowing the world around us… perhaps the interpretive disciplines that help us read Scripture well could also be relevant to helping us to understand nature (and culture) well. (pp. 199-200)

The practices of “data interpretation” that he proposes are the trio of hermeneutics, exegesis and homiletics employed by Reformed bible scholars and preachers. Using this trio has led, over time, to as accurate an understanding as possible of the Scriptures and their application, fully respectful of the “data” being dealt with, aware of its importance and its origin, and yet dealing with a source that is only ever partial (at least in its historical and cultural information). Hermeneutics describes the rules of the game – what we have to consider in the act of interpretation, not the interpretation itself, in areas such as reliability, grammatical analysis, vocabulary and syntax study, historical and cultural analysis, and so on. These are the ground rules, avoiding category mistakes and ensuring that the “raw data” is treated with scholarship and humility. Exegesis takes that interpretive context and says, with epistemic humility but also with reasoned judgment, what the data actually says, “leading out the truth that exists in the Scriptures.” This exegetical work has to be completed well and thoroughly before homiletic work is done, lest the “rush to relevance” get in the way of accurate interpretation. Good exegesis has, as Martyn Lloyd-Jones always insisted, the effect of taking the “I” out of interpretation! Once it is completed, the application of the biblical passage can begin, from a number of different angles, so that we can find at last what the word of God is saying to our own situation. This homiletic work will lead us to the actions we need to take in response to the “data.”

This careful, thoughtful, humble approach to data suggests a number of practices we might use, if we accept that an approach rooted in “special revelation” can be carried over to our work with other forms of data – other texts, spreadsheets, political manifestos, etc. – that are part of the “general revelation.”

I found it a really thought-provoking approach, and it has a lot of “overlapping consensus” with the way that the best OFSTED inspectors approach and treat data. Schaefer takes his argument one stage further before applying it to his course on econometrics, establishing five principles that seek to answer the question “What interpretive practices and epistemic virtues in this tradition’s engagement with special revelation might resonate with the study of natural revelation and the sort of things that economists usually study?” (p. 202). The five principles are:

  1. The aim is always to lead the truth out of the revelation that surrounds us. This will take time, “demoting temporal efficiency,” and may mean that a “teaching” approach has to give way to a “research colloquium” approach. The standards of “citing… organising… evaluating… criticizing evidence should figure large” (p. 204)
  2. Fidelity to the data set under scrutiny should be taken as read. This means that “faithful econometrics…should involve familiarity with the peculiarities of any particular data set under study – its origin, blind spots, the research agendas that may have shaped its collection, and a detailed knowledge of the original situation of which those data form an imperfect abstraction.”
  3. Proper habits of interpretation follow the principle of “Scripture interpreting Scripture” or “the analogy of Scripture” – the idea that one scripture does not by itself have exegetical validity until set in the context of the whole. In econometrics, Schaefer argues for students doing their work in the context of the researchers who have gone before them in the same field, being aware of, and “where appropriate, bowing to, the work of others in the area of study”
  4. Limiting our inferences to that on which the “revelation” is clearly speaking. This means being firm and clear about what is proven or demonstrated, and being humble and not overstating the case where the evidence does not support it. This means for Kurt Schaefer that he spends “a good deal more time delving into the details of inference, especially the uses and abuses of statistical tests of significance, than is done in the standard course.” (A short sketch after this list illustrates one such abuse.)
  5. A communal approach to data is always better than an individual one. Work done openly in conversation with others will bring different perspectives. Tom Wright talks about “coveting other angles of vision”, which is an “echo of the Reformed ‘hermeneutic of humility’ regarding epistemic claims” (p. 206).
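On the fourth principle: the classic abuse of significance testing is to run enough tests that something, somewhere, comes out “significant” by chance. Here is a minimal simulation sketch of that abuse – my own illustration, not Schaefer’s classroom material; the group sizes and the 5% threshold are arbitrary choices for demonstration:

```python
# A minimal sketch of one classic abuse of significance testing:
# run enough tests on pure noise and some will come out "significant".
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
n_tests, alpha = 100, 0.05
false_positives = 0

for _ in range(n_tests):
    # Two groups drawn from the SAME distribution: any "effect"
    # found between them is, by construction, an illusion.
    group_a = rng.normal(loc=0.0, scale=1.0, size=30)
    group_b = rng.normal(loc=0.0, scale=1.0, size=30)
    _, p_value = ttest_ind(group_a, group_b)
    if p_value < alpha:
        false_positives += 1

# Expect roughly alpha * n_tests, i.e. about 5 spurious "findings".
print(f"'Significant' results from pure noise: {false_positives}/{n_tests}")
```

With a hundred tests on pure noise at the conventional 5% threshold, we should expect around five spurious “findings” – a reminder that a small p-value, on its own, proves much less than it appears to.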

What can be applied to the study and interpretation of econometric data, often generated from the “part” that has been measured, is no less applicable to the way we look at school data. English school data is some of the most comprehensive in the world, and for that reason it is easy to think that it has all the answers. Every child in England is a computer cipher, described by a UPN – a Unique Pupil Number – that follows the child throughout their school life. This seems “normal” to us in England, but in many societies it is seen as abusive, and to us too it ought to seem at least questionable. The comprehensive, big-data approach that we have got used to taking means that we harbour a suspicion that the data is unchallengeable. Here is an example, in which a child with particular Key Stage 1 outcomes (measured by “points” scored – itself an averaging out of children’s assessments, themselves an approximation of what children can do) can be predicted to get a certain score at the end of Key Stage 2. This is widely used, forming the basis of Fischer Family Trust data and of the IDSR published by OFSTED for each school around this time of year:

[Table: Key Stage 1 average point scores in the second column, with predicted Key Stage 2 outcomes in the three right-hand columns, given to two decimal places]

The numbers in the three right-hand columns of this table are given to two decimal places. The numbers in the second column are data approximations made on the basis of teacher judgments four years ago about children across England, turned into numerical scores and then used as a predictor of children’s outcomes – sorry, not children’s, pupils’ outcomes – based on the enormous dataset which we as schools give to the government.

School leaders routinely use this stuff to get a handle on target setting and to provide tracking data through a school year, yet it is completely ludicrous to generate 2DP data on the basis of what are fundamentally numerical approximations of teacher assessments that were themselves approximations – best-fit interpretations – of what a child could do. This is just one example of the tyranny of the data we use. (By the way, if you are a school leader who wants a reasoned approach to your data and struggles to know how to handle it, then you could do worse than contact James Pembroke at Sig +, who uses data with the appropriate epistemic care.)
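To see how hollow that two-decimal-place precision is, consider a minimal sketch – the point values and the linear prediction rule below are invented for illustration, not the DfE’s or FFT’s actual model:

```python
# A minimal sketch of "spurious precision": coarse teacher judgments,
# averaged and pushed through a prediction rule, emerge looking exact.
# The point values and the linear rule are invented for illustration.

# KS1 teacher assessments in three subjects, already an approximation
# of what the child can actually do, mapped onto a coarse points scale.
ks1_points = {"reading": 15, "writing": 13, "maths": 15}

# Averaging three coarse judgments yields decimals: 43 / 3 = 14.333...
# The decimals come from division, not from extra knowledge of the child.
aps = sum(ks1_points.values()) / len(ks1_points)

# A hypothetical linear predictor of a KS2 scaled score from KS1 APS.
predicted_ks2 = round(0.6 * aps + 95.0, 2)

print(f"KS1 average point score: {aps:.2f}")           # 14.33
print(f"Predicted KS2 scaled score: {predicted_ks2}")  # 103.6
```

The decimal places fall out of a division; they carry no information about the child that was not already in the coarse teacher judgments.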

Several things follow from this discussion:

  • Any kind of data that is given to us – handed to us from whatever source, whether OFSTED, the IDSR or simple attainment data in schools – cannot of and by itself tell a story. There is no “data story”. There is an interpretive exegesis that results from the application of a human interpreter to the data itself, improved (or not) by whether the interpreter understands the hermeneutic principles applicable to the field of study (in this case test and assessment data). The hermeneutic process establishes the principles by which we can safely deal with the data – the limits of interpretation and the conventions that data analysis has developed over time, made available as working tools to assist interpretation. The data can of course give a pointer to where the hermeneutic might begin, but it does not tell the story itself.
  • Secondly, the data that is thus “validated” through the hermeneutic process must then be submitted to exegesis and subsequent application. I have always believed – though scary reports some years ago about using computer-based algorithms to “manage” the data tend to undermine that belief – that OFSTED does this fairly well, and in its inspection processes takes seriously the school’s own hermeneutic and exegetical work.
  • Thirdly, and most importantly, the point made by Kurt Schaefer about the way that econometrics has been privileged over the history of economic thought should alert us to the very real possibility that the data, no matter how well it has been interpreted, cannot and must not be divorced from the question of teleology – the purpose of schooling as interpreted by the school and its community. Contrary to some schools of thought, there is no “one way” of interpreting data. There is likewise no “one way” of improving a school. People who think there is will eventually fail at the attempt. Every “way” of interpreting data stems from a strong position held by the interpreter that includes, amongst many other things, their view of the purpose of teaching and learning, their understanding of the role of curricular content and knowledge, their view of children and what they are for and what they need and deserve, and their understanding of what purpose schools serve in a liberal, democratic society. Many people take attainment and progress data as a proxy for school improvement. They are wrong – foolish in the extreme – to do so. Until we recognise our own standpoint with regard to data and its purpose, and have thought through what we believe and what we therefore bring to the exercise of interpreting it, we have little chance – and less right – to say what it means, or to plan actions based on it.
  • When data is a proxy for something, we need to take even more care. In secondary education, some of the data is real and worthwhile, as it is gleaned from actual outcomes achieved by actual young people who need their GCSE or A Level grades to progress to university or to another preferred future. The data has meaning for children and young people – probably the first time in their school career that school data and children’s data serve the same purpose. In primary schools, by contrast, nearly all the data is completely worthless and of no use to children at all. It stands simply as a proxy measure indicating attainment and progress in a narrow sample of educational outcomes to the local or national government, who gain the greatest use from it. That the data is worthless is shown by the fact that it is not criterion-referenced but norm-referenced; progress is measured from some point on one bell-curve to a point on another (different, since norm-referencing shifts the peak of the bell by definition) bell-curve – a point the toy sketch after this list makes concrete. This has been demonstrated beautifully by Tom Sherrington in a post I reblogged last month. But beyond that, there is a problem in saying that outcomes in English and maths are adequate proxies for school improvement. Gerald Grace, in discussing mission integrity in Catholic schools in England and Wales, argues that a school’s mission integrity can be threatened by the agenda of “school effectiveness” if by that term we mean a mono-focus on academic performance. This is not the only problem with it – there is an interpretive and statistical difficulty as well, as has been shown above.
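Here is that toy sketch. The numbers are invented, and this is not the actual scaled-score methodology – only an illustration of what norm-referencing does: a pupil whose raw performance genuinely improves can appear to have made no progress at all, because their position is measured against a cohort whose curve has moved with them.

```python
# A toy illustration of norm-referencing (invented numbers, not the
# real scaled-score methodology): a pupil's raw score rises, but their
# position on the bell curve does not, because the curve itself moved.
import numpy as np

rng = np.random.default_rng(7)

def percentile_rank(score, cohort):
    """Share of the cohort scoring at or below this score, as a percentage."""
    return 100 * np.mean(cohort <= score)

# Year 1: cohort raw scores centred on 60; our pupil scores 60.
cohort_y1 = rng.normal(loc=60, scale=10, size=10_000)
# Year 2: the whole cohort improves to centre on 70; so does our pupil.
cohort_y2 = rng.normal(loc=70, scale=10, size=10_000)

print(f"Year 1 percentile: {percentile_rank(60, cohort_y1):.0f}")  # ~50
print(f"Year 2 percentile: {percentile_rank(70, cohort_y2):.0f}")  # ~50
```

Ten raw marks of real improvement register as zero norm-referenced “progress”, because in both years the pupil sits at the middle of a curve whose peak has shifted.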

There is more to be said, and I shall probably return to this theme – not least because of the pressure this worthless proxy is placing upon the curricular and behavioural decisions of a school.

The idea, for instance, that doing more English and maths is the answer to low standards in English and maths is just rubbish, and has been proven to be so time and again, in a range of jurisdictions. Those school leaders trying to get more out of the school day by limiting things like play, eating together in harmony, extra-curricular trips and activities or the daily act of collective worship are contributing to the poverty of our children’s experience and life-chances, and thus to the poverty and retrogression of our country.


