Public Administration Select Committee - Minutes of Evidence HC 406
Taken before the Public Administration Select Committee
on Wednesday 12 September 2012
Mr Bernard Jenkin (Chair)
Examination of Witnesses
Witnesses: Jill Leyland, Vice President, Royal Statistical Society, and Jenny Church, Chair, Royal Statistical Society Statistics User Forum, gave evidence.
Q1 Chair: Welcome to this first session on statistics. I wonder if you could both identify yourselves for the record.
Jill Leyland: Hello, I am Jill Leyland. I am a Vice President of the Royal Statistical Society, and the Chair of its National Statistics Working Party.
Jenny Church: I am Jenny Church. I am the RSS Honorary Officer for Statistics Users-that is users of statistics outside government-and I chair the Statistics User Forum, which is an umbrella body for a wide variety of user groups. I can say more about the Forum if you would like me to later.
Q2 Chair: Thank you both for your evidence in writing. Perhaps you could each first say how you feel the Statistics Act is now working, what improvements it has led to and what evidence you have of those improvements.
Jill Leyland: Broadly, I think the Act is working okay. We think it could work better in some respects. I would perhaps like to make it clear from the start that, although certain parts of the Act are not as we would have wished them, we think the Act as a whole is workable and we would certainly not seek to suggest that there is a need for any new primary legislation, which would be a distraction. It has crystallised and reinforced some of the positive things about UK statistics. For example, the existence of the Code of Practice is definitely a good point, because, first, it lays out and makes very clear what good practice is, and then, of course, it can be used as a tool to make sure that statistics measure up to that. In that sense, it is working well.
There are other areas where there has been definite progress but not enough, for example, in paying attention to user needs outside central Government, which perhaps Jenny would want to say more about, in a moment. We are still looking for more improvements in communication. The disadvantages of the Act: obviously the dual role or, as we see it in some ways, the triple role of the Authority is a problem. It is dealt with to some extent, but it is something that needs to be managed carefully. It is a pity perhaps that the Act did not give more powers to the Office for National Statistics or for the Government Statistical Service generally to acquire administrative data sets. While it facilitates that, there is still a secondary legislation process to go through, which has proved extremely cumbersome.
Q3 Chair: We will come to that point later, but by "acquiring", do you mean setting up systems to gather that data or acquiring the data that other people have already gathered?
Jill Leyland: Acquiring the data that other people have already gathered-the legal procedures in that and the consultations. I think, too, that one of the problems is that we still do not really have the ability to plan as well as we could. Obviously, we have a decentralised and devolved statistical system, and there are many strengths in that arrangement, but it does not make for easy planning. We would certainly like to see an instruction to Departments that, for example, they are obliged to consult and pay attention to the National Statistician and/or the UK Statistics Authority when making major changes to their statistics or in formulating their statistical planning, which we think would give more effective influence to the centre, but so far that has not happened. Obviously, we appreciate that, under the UK system, Departments have to have the final word. Pre-release access is the other point, which we would like to see abolished.
Jenny Church: If I could perhaps just give some background and talk about the effect of the Act as far as users are concerned, users have very much welcomed the spotlight that the Act and the Code of Practice have placed on the need to engage with users outside central Government, of which there are a huge variety. The assessment process has been helpful, in that it has identified engagement as an area of weakness in the system, and that identification is leading to improvements. There is a lot still to be done, in that statistics can have value only if they are used, and used by the widest possible audience. Users still perceive that administrative data is underutilised, and that access to administrative data is still not as good as it could be. If there were better access, it would facilitate all sorts of planning in local government, among commercial organisations and so on, so there is enormous scope, which users feel is not entirely recognised yet, within the Government Statistical Service. As I say, the Act has put a spotlight on that and that has been helpful, but that now needs to move forward.
Q4 Paul Flynn: Just to follow up on that last sentence, can you give us a concrete example of where you think information is not being used, and the value of it, if it was used?
Jenny Church: We have actually been collecting examples, particularly where concerns about disclosure have impeded access to data. For example, the Valuation Office Agency has a very rich database on dwellings in this country and their attributes, which could be of great use in planning in all sorts of areas. Some of that data is now being released. Until recently, none of that data was available outside central Government.
Q5 Paul Flynn: That is interesting. The Authority carried out an assessment recently. Do you think there have been any improvements in the national statistics and official statistics as a result of the assessment? We have seen a chart that has been produced, but have there been any perceptible improvements, particularly in areas that people feel strongly about, such as inflation and migration?
Jenny Church: It is a very mixed picture, to be honest. There are examples of good practice, particularly, for example, with the census data, but prices data is an example of where a lot still needs to be done. In fact, quite recently a user group has been set up in the area of price statistics, CPI and RPI, and that is spearheading user concerns on prices. I do not know whether you would like to say more about that, Jill? Jill is very involved in that.
Jill Leyland: Yes. This is only quite recent. This is a subject that, as you may know, the Royal Statistical Society had concerns about, expressed in a letter, just over a couple of years ago, to Sir Michael Scholar. We have now eventually seen the way forward. We have a user group; it is a very lively user group. We have now been consulted on the inclusion of owner-occupied housing costs into the CPI. The consultation period has only just closed, so we do not know the results of that. We are getting a lot of cooperation from staff at the ONS, but the jury is out, for the moment, as to whether we will see real changes.
Q6 Paul Flynn: That was going to be my second question about the relationship with users. When the jury comes back in, what will they tell us?
Jill Leyland: As I said, the consultation period has just closed. The user group responded to it. We do not yet know what the Authority will decide, so we have to wait and see to what extent we have been listened to. That does not mean of course that users always get what they want, but they have to be treated seriously.
Q7 Paul Flynn: You referred approvingly to the census and the figures from it. Do you think there will ever be another census along the biblical lines of the present one, instead of having a census based on about a thousandth of the sample that is taken now, which would produce results that would be quicker and almost certainly more accurate than trying to question the entire nation?
Jenny Church: Users certainly support the ONS’s Beyond 2011 programme, looking at the alternatives to the census. What that has pinpointed is the difficulties in data-sharing within Government. Although, as Jill said at the outset, the Act has provided enabling legislation for data-sharing, it is quite cumbersome in terms of having to set up data-sharing orders for the ONS to have access to each separate set of data from other Departments. That is an issue that is going to have to be resolved, if we can move forward to getting the same data as the census, but from administrative sources.
Q8 Kelvin Hopkins: On the census point that Paul was making, I am a great lover of the census and I would deeply regret it, if and when it goes. What are the views of the Royal Statistical Society? Are there people who feel like I do?
Jill Leyland: I think so, yes. Obviously, there is a whole range of views. As Jenny said, we support the fact that this investigation is being done, because there may be better ways of doing this. Of course, internationally the trend has been to move away from censuses towards either a complete use of administrative data or to a mixed model, where you have a short census and then supplement that with additional information. One of the problems with the census is that it happens only every 10 years, and of course society and the economy move very quickly these days, so a 10-year period is really quite long. I agree with you: there are probably things that will be lost, but we still may see that it is better on balance.
Q9 Kelvin Hopkins: We will lose wonderful time series, which go back a long time.
Jill Leyland: Yes, I agree with you. I love looking at historical census data.
Q10 Chair: Will we not lose the ability to cross-check the data we think we are gathering? The size of populations in some local authorities was way different from expectations-in places like Newham, for example. How are we going to deal with that? Where is the benchmarking for the statistics that we check, unless we do a census?
Jill Leyland: That is a serious point that will have to be addressed. The Beyond 2011 programme is still in its early stages and we have not yet seen the plans, so we cannot really comment on them, but you are absolutely right: there will have to be some system in there to enable cross-checking, referencing and so forth.
Q11 Chair: On the user interfaces, particularly the electronic interfaces, do you think they are practical and friendly enough, and provide manipulable and useable data for the statistics-user community?
Jenny Church: Probably the short answer to that is no. Things have improved to some extent. One of the difficulties is that a lot of the Government Statistical Service has not quite appreciated the width and breadth of the audience for statistics that the web has produced. To make data available on huge spreadsheets, while okay for the expert user, is not friendly for the occasional user, for small businesses and so on.
Chair: I find I still write down the numbers I am interested in on a piece of paper.
Jenny Church: Well, quite. There is still not an appreciation of the needs of the non-expert, and that is something that we are working hard to promote. One thing I might mention is that recently the Forum has developed an online communications hub, StatsUserNet, with the aim of facilitating user-user dialogue and user-producer dialogue, using social networking software. The intention is to have a much more user-friendly interface. This is not actually to get the statistics themselves, but to interrogate the producers and to give views about what is being produced. That was launched only six months ago, but it already has 1,000 registrations and is growing every day. That again shows the potential breadth of the user community.
Q12 Lindsay Roy: Good morning, ladies. In any review of the process and the monitoring process, I am sure you will agree the key question is: so what, in terms of outcomes? UKSA’s work is limited to official statistics, and it has been suggested that the definition of "official statistics" is not clear enough in the legislation. How would you like to see official statistics defined?
Jill Leyland: It is hard to improve on the definition a lot, really. Okay, official statistics are everything produced by Government Departments or agencies working for them. I believe the UKSA thinks there are at least 200 bodies in that. There are two issues. First, there are other statistics of key interest that are not produced by central Government or any agency, or indeed by the devolved Administrations. The clearest example is perhaps some of the statistics produced by the Greater London Authority, which obviously are of wide interest to people. I do not think you can call those "official statistics" because, if you started there, you would have to put all local authority statistics under that banner.
Q13 Lindsay Roy: What is the banner that would be used then?
Jill Leyland: The banner has to remain the statistics of central Government and devolved Administrations, or agencies working for them. What we would like to see is some voluntary signing up to the Code of Practice by bodies such as the GLA. The Greater London Assembly members have indeed endorsed that idea unanimously, so we wait to see what Mr Johnson will say on that one. That is one area where statistics that are of key importance could be signed up to the Code of Practice, and maybe even assessed against it.
The other area is the tricky one where Departments produce statistics that they call "research statistics", "administrative data" or that come from management information. In practice, you cannot say that every bit of information produced by a Department is an official statistic, even if it is numerical. There therefore has to be a judgment call. The UK Statistics Authority is watching this area and does occasionally intervene. The example we cited was the statistics produced at the beginning of this year by the Department for Work and Pensions on benefit claimants analysed by nationality, which is obviously of key political interest, because of all the issues around migration and so forth. Things like that, which are of clear political interest, should ideally be treated as official statistics, but it is very hard to have a firm dividing line. It has to be a judgment call.
Q14 Lindsay Roy: Can you just clarify the rationale behind the distinction between official statistics, management information and research? Some people would regard that as diversionary.
Jill Leyland: No, I do not think it is diversionary. I think official statistics need to be trusted, and that means they have to meet certain standards. Now, one would hope that all Government information can be trusted, but it probably does not need to jump through so many hoops. Let me take one example of something the Office for National Statistics produced some years ago. They produced, I think with the help of the House of Commons Library, a historical series on consumer prices going back to 1750. That is of interest. One would hope it is of a certain quality, but there is no reason why that-which obviously is research and, to some extent, inference from historical information-should be considered as official statistics. It is an interesting piece of research and analysis.
Q15 Lindsay Roy: Are you saying then that we should accord a higher level of trust to official statistics?
Jill Leyland: We certainly should be able to accord a higher level of trust to official statistics, yes. The assessment process and the Code of Practice are really a system to ensure that we can place that high level of trust. It is not that we would distrust other Government information; it is simply that the bar is set higher for official statistics.
Q16 Lindsay Roy: Does the UK Statistics Authority have the right amount of control over the work of official statisticians and others outside of the Office for National Statistics?
Jill Leyland: We would certainly like to see it have more influence. I do not think, in the UK system, we can have control, because of the way that Departmental authority works, plus the devolved Administrations are also a separate issue. We would like to see more influence, for a number of reasons, partly simply for quality assurance and so forth but, perhaps more importantly, for planning. There is no actual planning for UK statistics as a whole; there is a certain amount of coordination, but there is no mechanism at the moment by which the National Statistician’s office or the UKSA sit down and say, "Now, what are the emerging needs for the UK? How do we channel and divert resources?" Resources spent on statistics vary by Department, so for example you have a substantial amount of money spent on agricultural statistics, which arguably is out of kilter with the amounts spent on other parts of the economy. As I said, we have to accept that Departments have control over their budgets and there are strengths in that, but they should be obliged to pay more attention than they do to the National Statistician and to the UKSA when making their plans.
Q17 Lindsay Roy: Does there need to be a strategic review, given the differentiation you have highlighted?
Jill Leyland: A strategic review of what statistics are needed?
Lindsay Roy: Yes, absolutely. You said, for example, in agriculture there is a plethora of information.
Jill Leyland: I think it would be good if there were such a review, but there is probably not much point in having it unless it will lead to actual changes.
Q18 Lindsay Roy: Is that not the whole point of any review?
Jill Leyland: Yes, indeed.
Q19 Lindsay Roy: It would not be for its own sake. There has to be a positive outcome. It goes back to the "so what?" factor I introduced at the beginning of my questioning.
Jill Leyland: It would be nice to feel that there would be a periodic review or indeed an ongoing review process that would lead to changes but, at the moment, that does not appear to be feasible, except possibly through influence.
Q20 Lindsay Roy: Why not?
Jill Leyland: Again, it is the UK system that Departments ultimately have control over their budgets. That is a privilege they guard quite jealously.
Q21 Lindsay Roy: Are you saying we need to break down the silo mentality?
Jill Leyland: It would be good if we could. Yes, I think that puts it rather well.
Q22 Paul Flynn: Respect for the quality of statistics has been of continuing great interest among many of my constituents who work in statistics, going back to 1987, when I raised this with the then Prime Minister. I had a letter from Mrs Thatcher to say what an unworthy thought it was to suggest that any Government Department would want to interfere and contaminate the purity of the statistics that were coming from my constituents, who were considerably worried about that possibility. Since that time, the whole point of setting up the Act was to improve public trust and, if there is not trust in the statistics, there is not much point in my constituents employing their great skills to make sure the statistics are as objective as they possibly can be. Part of this is the Authority itself, the idea of national statistics and the kitemark to ensure that people regard the national statistics as something that are worthy of greater trust. It does not seem to be accepted by the press; they do not announce that these are national statistics, put the kitemark on them and say that these are genuinely to be trusted; there is no question about their objectivity. Has this worked? You are suggesting a new description, I understand, of national statistics to give it that kind of authority. Is that right?
Jill Leyland: The problem, which I think is the question you raised, is that the distinction between national statistics and other official statistics is really not understood outside the cognoscenti, if you like. I think that you would agree with me from users’ viewpoint?
Jenny Church: Absolutely, yes.
Jill Leyland: It is confusing. What is a national statistic? Is it an official statistic or is it a statistic for the UK as a whole, versus one for Scotland or England? When the phrase "national statistics" was first invented, it was intended to identify those statistics of particular importance, but in practice now it is statistics that have been assessed and found to be compliant with the Code of Practice. In a sense, it is more of a kitemark. We think it would be perhaps better to call them something different or at least refer to them as something different-of course the term "national statistics" is in the Act, so cannot be changed-something like "code-compliant" or something of that sort.
Q23 Paul Flynn: Code-compliant?
Jill Leyland: Yes. I am sure somebody can think of a better term, but something that indicates that these have been found to be of a certain quality. If you did that, of course, you might find that people like the Greater London Authority would be keener to sign up to the Code.
Q24 Paul Flynn: It sounds like a meaningless piece of jargon to call them "code-compliant". You could call them "accredited statistics".
Jill Leyland: As I said, I am not very good at thinking of good phrases.
Q25 Paul Flynn: I could not think of a worse one than that, but I am sure we would need something like "accredited statistics", something that would ensure there was some force behind it. It was interesting what you said about the plethora of statistics about agriculture. It is part of the dependency culture of the agricultural industry, which has been fed and bedded with subsidies for half a century now. There is probably more attention given in this House to agriculture, which represents about 2% of the economy, than to the other 98%. To what extent is there an over-provision of statistics on agricultural matters, and to the neglect of which other industries?
Jill Leyland: It is hard for me to give a precise answer to that one. It would be something that the National Statistician could answer better than I.
Chair: I am sure the Library could help with that.
Jill Leyland: Some of these of course come from European requirements. Indeed, I remember some time ago Sir Michael Scholar actually wrote to the European Commission querying the need for certain statistics.
Q26 Paul Flynn: Any examples?
Jill Leyland: Certain agricultural statistics-I cannot remember exactly what they were, because this was some while ago. Certainly if one could wave a wand and get some of the money spent on agricultural statistics turned to newer sectors, it would be good.
Paul Flynn: The problem is Europe-wide, I am sure. Thank you very much.
Q27 Kelvin Hopkins: What do you see as the case for, or indeed against, a clear separation of the UK Statistics Authority’s regulatory role, as distinct from its role in supporting the ONS? It is unusual to have these two roles in one organisation.
Jill Leyland: It is unusual and it is not helpful. This is something where I think we would have liked the Act to be different, but it can work. Obviously the separation of powers within the Authority-the fact that there are two Deputy Chairs, one responsible for the Office for National Statistics and the other responsible for the rest-is a step towards it. It is very important, clearly, that the monitoring and assessment side, the scrutiny side of the Authority, is independent of the rest. There are two things: one is the practical independence of it; and then there is dealing with public perception. The practical one can be dealt with; the public perception is more difficult. You have still got a situation where they are in the same building, with the London office of the National Statistician and the London office of the ONS on the same floor. Perhaps that does not help either.
The only way you can deal with the perception is by just making it clear in practice that there is such a thing, which means that the UKSA must be scrupulous in ensuring that monitoring and assessment are genuinely independent, and that they are free to make criticisms of the ONS when needed. I would also just like to add that there is in a sense a triple role, not a dual role, because apart from the ONS and the scrutiny, there is also a responsibility, which has its problems, for the statistical service for the country as a whole.
Jenny Church: Obviously, it is important that the Authority is seen to be treating the ONS in exactly the same way as it treats the statistical services in other Departments. From a user point of view, there is evidence that indeed they do that. For example, last year when there was the error in construction output statistics, the Authority carried out an independent inquiry, published a statement and did not pull its punches about what needed to change within the ONS. The existence of the nonexecutive members of the board of the Authority is obviously very important in making sure that there is that independence and separation of roles within the Authority.
Q28 Kelvin Hopkins: Are there possible organisational changes going beyond that? We have Chinese walls at the moment, the two Deputy Chairs and so on. As it happens, the last two Chairs, the current one and his predecessor, are splendid people and totally trustworthy, but in a different world going beyond this, we might find political bias in Chairs and so on. Are there organisational changes that might strengthen the arrangement?
Jenny Church: It is quite difficult, I would say, to know how much further one could go within the confines of the Act, because it is quite clear that the Authority has this dual role. Jill mentioned the physical proximity-I suppose less physical proximity possibly.
Q29 Kelvin Hopkins: Separate buildings perhaps?
Jenny Church: Separate buildings, absolutely. That would, I am sure, help actually.
Jill Leyland: Fundamentally, the only safeguard is yourselves.
Q30 Lindsay Roy: Can we now turn to the very sensitive area of confidentiality? How well does the confidentiality requirement in the Act work to protect personal data?
Jill Leyland: As regards the ONS and official statisticians generally, it is in the absolutely fundamental DNA of any statistician that you protect personal confidentiality or the confidentiality of individual organisations. I do not think there ever was a problem with that. Very occasionally you get a slip-up, but that is extremely rare. We do not see any problem from that point of view, no.
Q31 Lindsay Roy: So there have been no significant issues raised recently about breach of confidentiality?
Jenny Church: Not as far as I am aware.
Jill Leyland: Not as far as I am aware.
Lindsay Roy: That is very reassuring, if that is the case.
Jenny Church: From the user point of view, I mentioned the fact that users would welcome more accessible data. Users do entirely sign up to the restrictions that there need to be on confidentiality. It is just that users feel that there might be a more risk-based approach to data access, in that you need to weigh the utility of the data against the risk of disclosure and the sensitivity of the information. It is not that we feel that confidentiality should be breached in any sense.
Q32 Paul Flynn: There are provisions under the Act to allow data-sharing, which have been used, as I understand, by giving lists of claimants in order to improve the quality of figures on population. There have also been suggestions that the powers in the Act are too restrictive and there could be wider scope for data-sharing that would be of value. Do you agree with that?
Jill Leyland: I do not know what the practical or the legal problems for this are, and I know it is a complex subject, because Departments may well have legal responsibilities to protect confidentiality themselves, which they have to pay due care to. From what we are aware of, it is a very cumbersome practice, I understand. I am sure the National Statistician can give you more information on that. I believe there have only been about five instances so far. I am told that, in one case, it took about two years to get agreement to access the data.
Q33 Paul Flynn: This is information that is in the public sector. It is held by one arm, by the statistics department. It could be extremely valuable in other parts of Government and is inaccessible because we are neurotic about confidentiality probably. There are practical ways that this information could be used and it is restricted now by fears that might be entirely irrational, perhaps, about this misuse.
Jill Leyland: Yes. I do not think there is a culture at the moment of such data-sharing in the UK. Departments possibly see a risk, if they have their own legal requirements, even though that risk may be very small, of a confidentiality breach, and there is nothing in it for them often, so it has proved difficult. Obviously, the ideal situation is that which happens in a number of other countries, where there is a general power for the national statistics office to access administrative data, adhering, of course, to all confidentiality requirements, but I presume that would not be possible without new primary legislation. I do not know.
Q34 Paul Flynn: There was a period when there was a great deal of excitement about memory sticks and other sources of information being lost-laptops and so on. There was one exercise involving half the population of the country, I think. Certainly members of my family were involved and told that private information had been lost. As far as I can remember, in the tens of millions of cases of information having gone astray, I do not know of any case when information was actually found and used for some disreputable purpose. Have we gone too far in trying to guard against information loss? There is obviously a need for confidentiality, but the danger is minute when the alarm is gigantic.
Jill Leyland: Yes, I think that is right. Some of the problems to which Jenny was referring come also not from straight potential disclosure of statistics, but the possibility that if you put this data set, that data set and another data set together, you might just be able to identify an individual, which I believe is referred to as "jigsaw disclosure".
Jenny Church: It is this risk-based approach that we feel needs to be adopted, because those sorts of exercises are extremely time-consuming and expensive. You have to weigh the risk of whether somebody would actually do that against the utility of the data set. One is only looking for access for statistical purposes, at the end of the day.
Q35 Chair: Finally, the whole question of pre-release access-the hoary old chestnut-how much have your concerns about pre-release been allayed by the shorter periods that are allowed now?
Jill Leyland: It is an improvement, but we would still like to see it abolished or reduced to a very short time period, a matter of two or three hours at best.
Q36 Chair: That is intended to allow officials and Ministers to collect their thoughts, but not to prepare an assault on something else or a massive defence of their position, in advance of the statistics appearing?
Jill Leyland: Yes. We do not see the need-the real need, as distinct from what the media would like-for Ministers to have, under normal circumstances, pre-release access, nor do we consider it fair that they should have, and the Opposition and other commentators should not have.
Q37 Chair: You would favour pre-release access if all interested parties got pre-release access, like the official Opposition?
Jill Leyland: How do you define an "interested party"? A member of the public could be interested. We would favour either zero or a matter of an hour or two’s pre-release access.
Q38 Chair: Can you give an example of how you think pre-release access has distorted a particular announcement, because the Government has had the opportunity basically to spin the announcement?
Jenny Church: I do not think it is necessarily a case of distorting. It is a question of the impact on public confidence that it might have been distorted. As far as users are concerned, most users are completely unaware of pre-release access and would be extremely surprised if they knew that it existed. Where there is that knowledge, it reduces public confidence. That is the issue, as far as users are concerned.
Jill Leyland: Yes, I agree. The perception of political interference is damaging. Surveys of public confidence in statistics-the last one that was done, and all previous ones- came out clearly showing that mistrust of statistics was very clearly influenced by perceptions that they were politically manipulated or distorted. Therefore, pre-release access plays to these fears. We accept that it is very rare for there to be abuse of the system or at least straightforward abuse of the system, but it does permit the possibility of an alternative press notice coming out at the same time about the figures, and that distorts the message and again harms it.
Q39 Chair: Can you think of any examples in countries where there is no pre-release access, where the lack of pre-release access has caused a difficulty, apart from inconvenience?
Jill Leyland: We have not heard of any. I think that is possibly a question you should put to the UK Statistics Authority. I believe the Statistics Commission did some work on this some while ago. My impression is that there has not been any serious difficulty.
Q40 Kelvin Hopkins: Just a general question about the abuse of statistics: I have to say that I am a lover of statistics, even an obsessive, and I have studied it and used to teach it a bit at a modest level, so I am always interested, but the abuse of statistics is so commonplace, particularly by politicians. I just wondered if your society, the Royal Statistical Society, polices or in a sense comments much on the abuses of statistics, or do you just stand back and let it all happen? The Government and Ministers are constantly talking about trade. They always talk about exports, but they never talk about imports. It is the balance of trade that is important, not just exports. It is simple things like that. Do you comment publicly on abuses of that kind?
Jill Leyland: We would love to, but as a society we do not have an unlimited budget or time. Also, there are organisations-Full Fact, for example-that are looking at that. What we are doing is our getstats campaign, which is a 10-year campaign, to really boost public understanding of statistics. I believe there have been some presentations to Members of Parliament about that. We would like to see an improvement in statistical literacy generally.
Kelvin Hopkins: I am a Member of the All-Party Statistics Group as well, and I keep a close eye on these things myself.
Chair: Thank you both very much for your time and your comments, which have been extremely useful. We are now going to interview Andrew Dilnot and his colleagues. Thank you very much.
Examination of Witnesses
Witnesses: Andrew Dilnot CBE, Chair, UK Statistics Authority, Jil Matheson, National Statistician, and Richard Alldritt, Head of Assessment, UK Statistics Authority, gave evidence.
Q41 Chair: May I welcome you to this session of the Public Administration Committee on statistics, and could I invite each of you to introduce yourselves for the record?
Andrew Dilnot: I am Andrew Dilnot, Chairman of the UK Statistics Authority.
Jil Matheson: Jil Matheson, National Statistician.
Richard Alldritt: I am Richard Alldritt, the Head of Assessment.
Q42 Chair: Thank you very much for being with us today. In what way do you think the Act has demonstrably improved the quality and integrity of statistics?
Andrew Dilnot: I think probably in quite a lot of ways. At the top of my mind, at the moment, because we have just got to the end of the assessment process, is this process of assessment. Here in the UK, we think we are probably unique in the world; the whole of the national statistics infrastructure has been subjected effectively to an audit. We have published 240 reports on all the national statistics. There have been close to 1,500 requirements produced by Richard and his colleagues, so that the whole infrastructure has been subjected to audit. There have been a number of themes that we could dwell on later, which have come up about areas where improvement needs to be made. That has been one important thing.
The second very important thing is that, in the five months that I have been doing this job, I have met a number of Ministers, former Ministers and senior civil servants, all of whom are aware, in what seems to me sometimes a slightly edgy way, of our existence. They know that the Code of Practice exists. They know that, if they breach the Code of Practice, and that is noticed by us or drawn to our attention, the Authority will have no hesitation in publicising the fact that there is a breach and rebuking where necessary. I get a sense that that is not uniformly popular among senior civil servants and politicians, but it is something of which they are very aware.
It is also the case that, in the statistical community, there is an awareness of the kinds of ways that we are trying to take things forward, the kind of emphasis that we are putting on engaging properly with users, engaging properly with communications. In all kinds of ways, we are seeing signs of improvement.
Jil Matheson: Just to add to that, from within the Government Statistical Service, two or three things have been noticeable. The first is just the fact of the Act, the fact of the Authority, the fact of the Code of Practice as a unifying framework for setting standards. That means that there has been increasing emphasis, not always easy to achieve, on understanding that statistics have a life beyond the Department in which they are produced-so a recognition of the public value of statistics, a recognition that more needs to be done, although a start has been made, on making that real. That is how we communicate and how we increase accessibility to statistics, and there is a binding together, if you like, of good professional practice.
Richard Alldritt: What assessment has told us-maybe we could have guessed this-is that statisticians are better at collecting and publishing statistics than at explaining their strengths and weaknesses, and what they are good for. That has increasingly been the focus of the cycle of assessment reports, where a lot of our work is focusing. It is at that end of adding value by explaining the strengths and weaknesses of the statistics.
Q43 Chair: In a word, do you think the Act has improved public confidence in statistics from where it would be otherwise, even if the trends are not very discernible?
Andrew Dilnot: There are at least two groups of people who we might look at. There is public confidence, and the last time that this was measured by us was in late 2009, published in 2010. At that stage, I do not think one could say there was evidence that things had got better. Alongside the analysis of the wider public’s confidence, we also did a survey of opinion formers-people who were professionally interested in statistics-and there seemed there to be evidence of some positive feelings about the Act, and a sense that integrity and quality were being enhanced. I think it is a reasonable question that we are happy to discuss with the Committee, whether it would make sense for the Authority to conduct a repeat of those exercises to see now, three years on from the last survey of the public, whether there is evidence of improvement. We are very aware, of course, that responses to that kind of survey are strongly affected by wider political considerations and the sense of trust in government more broadly, but I think my colleagues agree that this is something that we might well want to think about doing in the near future.
Richard Alldritt: If we look at the general public, we must look at the opinion formers as well. You get different messages: the general public’s view of official statistics tends to be driven by views of the Government and the public administration; and this smaller group of people who know more about it give you a much richer insight.
Q44 Chair: Mr Dilnot, could you perhaps point us to evidence of the more strategic approach you are taking to all these issues that you intimated you would pursue in your pre-appointment hearing?
Andrew Dilnot: More strategic than what? I can certainly point you to examples of the strategic approach that we are trying to take. One crucial set of issues, and this is something I talked about a lot in my pre-appointment hearing, is to do with communication and how we communicate statistics. We are rather a small group trying to help to guide a very large group of people. The Office for National Statistics itself has thousands of staff. The wider Government Statistical Service has many more thousands. One of the things that we have set in train, as a result of collaboration with both the National Statistician and the Head of Assessment, is some work specifically on communication. In either late May or early June, we brought in colleagues who were working on three sets of releases-there were releases on GDP, retail sales and the population estimates-to work alongside them to think about ways we could improve the quality of the output and its release. They have come back to us with some suggestions. That work is going on.
As we looked at that, it became clear to us that, through the assessment process, a whole stream of repeated conclusions come up, many of which are to do with engaging with users, communicating and making sure we are getting the best use. As we look at our colleagues, some of them are saying, "You keep telling us how to do this, but we are not absolutely sure how to do it," so we are forming a new group that is about to start working in early October, which we are going to call the Helpers Group. It is a group drawn both from Jil’s team and from Richard’s team, who will be available to help people throughout the Statistical Service-both the ONS and the GSS-to do these things, which are given such prominence in the Code of Practice, more effectively. That is one example.
A second is that the ONS has had a regular priority-setting process. After I arrived, we sat down and thought, "What are the core priorities?" One of the priorities is one that I mentioned at my pre-appointment hearing, which was the website, and communication more widely. We are already in the process of drawing up budgets for next year and the years thereafter. We are even enhancing the priority-setting process to be clear about what are the most important things for the whole statistical system and the way in which we can deliver those.
Q45 Chair: Wouldn’t it be beneficial for you to set out, in a single document, your strategic thinking for the Authority and the general direction that you wish to take it in, otherwise your strategy appears to be operating on a fairly ad hoc basis, however coherent it may be?
Andrew Dilnot: It certainly is important that, in our regular process of reporting, we should describe what our strategy is. The annual report for last year was published earlier this year. I certainly think that, in the annual report next year, I should write down what I think the strategy is. I have been in this job for five months, and one of the things I feel very strongly about all jobs is that, while one should arrive in a new job with a clear sense of what it is that you want to achieve-a vision that I hope I managed to set out in December, at my pre-appointment hearing-it is also very important to recognise that when you start a job you will learn things. The ONS and GSS are relatively complicated structures so, less than six months in, it would have been inappropriate to try to write a detailed strategy and publish it before that. That is something I certainly think we will be working on over the next few months and will be publishing at the end of the year in our report.
If the Committee would find it helpful, slightly in advance of that, for me to send a note describing the broad outlines of that, I would be very happy to do that. I should say that part of the strategy that I feel is very important is that we are aware of the relationship we have, through PASC, with Parliament. We are very aware that we are creatures of Parliament and we are grateful for the interest that this Committee takes in what we are doing, and delighted that so many inquiries are underway.
Chair: I think that would be extremely helpful. I would point out that this Committee has done a bit of strategic thinking itself about how Government does strategy. We are not looking for you to set a plan in stone; on the contrary, that is not strategy. That is a plan. Strategy is how you approach your planning to achieve your key objectives, how you choose your key objectives and how you approach your plans to achieve them. Those plans may need to continue to adapt as you gain experience in your role. I think it would be useful to do that quite early, certainly well in advance of your annual report next year. We would appreciate that.
Q46 Kelvin Hopkins: There are certain key statistics that the public, politicians and the media take much more interest in-inflation, migration, population, the census and so on-and obviously these require lots of attention by yourselves. Have improvements resulted as a direct result of the legislation, setting it all up?
Andrew Dilnot: Have they resulted as a direct consequence of the legislation? That is a question that, at one level, is difficult to answer. Inflation was one of the things that you mentioned. Richard and his colleagues did an assessment report on the measurement of inflation; we have also done a monitoring review on the communication of inflation, because my recollection is that the assessment report said that one of the things that we feel about the inflation measures is that more could be done on explaining to all users how they are constructed and what is going on. A monitoring review was then done.
In response to that monitoring review, or at least alongside that monitoring review, a programme of work has been conducted within the Authority, by ONS with input from the Consumer Prices Advisory Committee (CPAC), which has had a couple of big priorities. One was owner-occupied housing. A second was the formula effect, the gap between the RPI and the CPI. On owner-occupied housing, the National Statistician went out to consultation a while ago. That consultation has just about finished now, I think, so we are hoping that we will be able to get that for you soon. On the formula effect, I expect that is the direction we are going there as well. We will soon be able to go out to consultation on changes there. Now, we cannot say with certainty that those things would not have happened in the absence of the Act, but they have certainly been part of a process and a structure that reflects the Act.
Jil Matheson: If I may just add another example, which perhaps is directly related to the Act: the data-sharing orders. The provision of the Act that allows for data-sharing subject to Parliamentary approval has meant that, since the Act came into force, ONS has been able to access some data from other Government agencies to improve the quality of what we are doing. I will give you just one example. One of the data-sharing orders was for student data from the Higher Education Statistics Agency. Those data have directly fed into changes and improvements to the population estimates at a local level. They were also used as one of the sources for quality-assuring the census data, so a very direct change. It has not all been easy, in that data-sharing is one of the things that is very dear to my heart and subject to all kinds of confidentiality constraints but, nevertheless, we have used the legislation, as we would not have been able to access that data before.
Q47 Kelvin Hopkins: What about resources? You need resources to do these kinds of assessments. Do you have sufficient funding to do what you really want to do in this respect?
Andrew Dilnot: One should always be wary of saying, "I’ve got more funding than I need," or "as much funding as I need." The truth is that, in all walks of life, there is always more that we would like to do. On the other hand, the amount of money out of the total Authority budget that is spent on assessment is a small part of the whole. Something that I have felt very strongly, and have said repeatedly to Mr Alldritt, is that I do not want resource constraints to stop us doing assessment or monitoring work that matters to us. We think that the total cost of the assessment programme is about £1.5 million a year, so that is small relative to the ONS budget, small relative to the wider statistics budget, tiny relative to overall public spending and even tinier relative to the national economy. There may, on occasion, it seems to me, be constraints in terms of Mr Alldritt’s being able to get a hold of the staff that he wants to do the work, but we are not constrained in budgetary terms.
Richard Alldritt: If I could add to that, I do not think we should see assessment as likely to change the figures being published. I can think of a case where it did, which was road accident statistics, where I gave evidence to Parliament’s Transport Committee and there was some change in the figures being published, but mostly it is about whether the people who need these statistics are being told enough about what they are good for, what their strengths are and what they need to be careful about. In other words, are we doing enough to get the good out of the statistics? That is the focus. Issues of whether we are using the right methods and problems in that area might be identified by assessment, but they would not be resolved. We have to see where it fits in; it is tending to look at that end of the value, rather than methodological issues, which are approached in a different way.
Q48 Chair: Might I pick on a particular example here? There is a huge focus on GDP statistics-rightly so-and on construction statistics, and yet we know these statistics, when they first come out, really are provisional figures. When they are 0.1% or 0.2% either way, they really are of questionable value, and they are subsequently revised quite substantially, to the other side of zero on occasions. Shouldn’t there be a real effort to stop the press galloping off with massive conclusions on the basis that these are unassailable facts? Shouldn’t the UK Statistics Authority be making it clear that provisional statistics are that-provisional and subject to quite substantial checks?
Andrew Dilnot: We entirely agree with you on that, Chairman. "Questionable value?" I think they are of value. They are of enormous value and there is enormous demand for them, but you are absolutely right that it is vital that it is understood that they are estimates. I think I am right in saying that the UK produces the first estimate of GDP more quickly after the end of the period to which it relates than any other country in the world. We produce it 25 days after the end of that period.
Andrew Dilnot: It is very important that we should make it clear that there is considerable uncertainty around it.
Q49 Chair: These are numbers that move markets and affect political sentiment.
Andrew Dilnot: They do move markets and affect political sentiment, and so it is very important for us to communicate effectively the basis on which they are constructed and that they are estimates. One of the things that came out of the workshop we had on communication was a list of, I think, seven principles that ought to run through all our communication. I wrote down the list. The very first of these was uncertainty-that there is inherent uncertainty in estimates, particularly when they are produced at short notice. I do not think that means they become not useful, but it is absolutely vital that it is understood.
Q50 Chair: There comes a point, surely, when a 25-day assessment could actually be misleading.
Andrew Dilnot: It could be inaccurate. I am not sure that it is misleading, as long as we succeed in communicating with people that it is made after 25 days. I am pretty sure, if we were to suggest to the Bank of England, the Treasury and the markets, "Why don’t we delay this by another month to get to something with slightly more confidence?" that there would be an entirely understandable chorus of regret, because people want a number as soon as we can give any kind of number. In this, we have to be more effective communicators.
One of the things we have to do is make people realise that, in fact, in the case of GDP, the difference between -0.2%, 0% and +0.2% is very small. At the moment, it seems to me, the story about GDP in the last five years is that, until 2008, GDP had been rising for an unprecedentedly long time and pretty rapidly. We had a long period when it went up, then we had a short period of very dramatic decline, followed by a small bounce-back, and then for about two and a half years now it has been roughly flat. Whether the number is -0.2%, 0% or +0.2%, that is still the story. There is a tendency with many of our numbers for there to be too much focus on the very last number, the latest little bit of evidence, which is a point estimate, uncertain, of the first differential of the level. I think we should go on doing these numbers, but we all need to work much, much harder to help the whole community understand their basis.
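[Mr Dilnot’s point-that the latest quarterly figure is a small, uncertain first difference of a much larger level-can be illustrated with a short sketch. The index values and growth figures below are invented for illustration only; they are not ONS data.]

```python
# Illustrative only: a stylised GDP level index, not ONS data.
# Start the level at 100 and apply a latest-quarter growth figure of
# -0.2%, 0% or +0.2% to see how little the level itself moves.
level = 100.0
for growth in (-0.002, 0.0, 0.002):
    new_level = level * (1 + growth)
    print(f"growth {growth:+.1%} -> level {new_level:.2f}")
# Whichever of the three figures is published, the level stays within
# 0.2 of 100: the broad story ("roughly flat") is the same in each case.
```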
Q51 Chair: How are you going to achieve that?
Andrew Dilnot: In a whole number of ways. We are working very hard with the producers on communication. I have spoken at a whole series of events-heads of profession, heads of statistics, the Royal Statistical Society, the Government Statistical Service conference-about precisely these issues. We have our Helpers Agenda. We already have new drafts of the GDP first estimate, first release, and retail sales, where similar kinds of issues apply. We have to work hard on the communication.
We also have to work with journalists. We have to work alongside those who become the medium through which most people get access to statistics to try to make sure that they report them in the most sensible and best way.
Q52 Chair: For example, the BBC has a special obligation to report these statistics in an objective way. Do you feel they do so?
Andrew Dilnot: Do they do so? Not always.
Q53 Chair: Will we see a letter from the Chair of the UK Statistics Authority to the Director-General of the BBC?
Andrew Dilnot: It would not be the Director-General who was guilty, although it might be the Director-General to whom a letter might have to go. If the BBC flagrantly misrepresented some data, yes. My understanding is that there was an occasion when my predecessor did not write to the BBC, but wrote to a print outlet, and circumstances could occur where that would be appropriate. For the time being, most of what goes on is just not very good, rather than being an example of malpractice. We need to work hard with that. As you say, there are certain particularly significant and high-profile outlets where it is especially important.
Q54 Kelvin Hopkins: Just pursuing some of these points, one of the problems I think we have is that a huge proportion of the population is not numerate, even many in politics. Even in this wonderful House that we inhabit, there are those who are not sure about statistics, percentages for example. The Moser Report, some 12 years ago, found that more than 50% of the population were not effectively numerate, and he illustrated that by saying that more than 50% of the population do not understand what 50% means, which, in a sense, explains things. I think you are absolutely right, but you do then start to get into political territory because, if you explain things in a way in which the Government says, "Ah, yes, that supports our case," or the Opposition says, "They’re taking the Government’s side," you have a problem. In a sense, I can see you are always standing on the edge of this political maelstrom.
Richard Alldritt: We call this the "cliff edge" that statisticians must not cross. The problem is they do not approach it; they stand well back and do not do as much as we would like in terms of explaining to people, guiding them on what the statistics are or are not saying. When it is an issue like foreign workers in the economy or something like this, it is quite important to guide people to what sort of interpretation is appropriate. Yes, there is a line that must not be crossed. Our experience in the work that we have done is that government statisticians actually have a pretty fair idea of where that line is, but they stand a long way back from it.
Jil Matheson: I was just going to add to that point that that is my experience, too. Rarely have Government statisticians been accused of interpreting things in a political way. Much more common is that they do not interpret or explain at all, which is actually what opens up confusion and the potential for political misuse, if you like. Richard is absolutely right: I would want Government statisticians absolutely to understand that they are still civil servants, and absolutely understand that the value that they add is in really understanding the data and statistics, and helping a wide range of users, the public, to understand therefore what those statistics mean. "What is the statistical story?" is the phrase. Of course statistics matter and of course they are used in political debate, but our role is to facilitate that debate with clear understanding and explanation of statistics, rather than a bit of a black box that is "all very difficult, isn’t it?" We have a duty to explain.
Richard Alldritt: We cannot be too critical of the press reading between the lines, as it were, if the explanation is not good enough in the first place. To some extent, our focus is on explaining the thing in simpler, clearer terms.
Andrew Dilnot: It is true that people do not find numbers very easy, but actually most of us do not find numbers very easy. Once a number gets much beyond the size of the value of people’s houses, they find it difficult to understand. I have said already today that we spend £1.5 million a year on assessment and that is a very small number. At one level, it is a big number. In lots of contexts, it would be a big number. How can we make £1.5 million make sense to people? One way of thinking about it is what that is per person, and the answer is it is 2.5 pence per person, per year. That is what we are spending on auditing the statistics we use to run the whole country.
I have been thinking about things like representing numbers as a number per head. GDP at more than £1,500 billion means nothing to any of us, but by saying it is about £25,000 per person, you have a way of thinking about it. Once you have said it is £25,000 per person, you can think, "What does a reduction of 0.2% mean?" One per cent. would be £250 so 0.2% must be £50 a year, so there are ways of doing it. If we are thinking about population statistics, this is an area where actually people do seem to have a grasp of a big number, because most people do sort of know that the population of the UK is roughly 60 million. When we are talking about change in population, it is most helpful actually to use the number and say it has gone up by 200,000, rather than say it has gone up by 0.3% a year. Thinking about how people understand numbers when we communicate them is a crucial part of what we do. I do not think it needs to take us into politics.
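[The per-head arithmetic in the answer above can be written out in a few lines of Python. The population, budget and GDP figures are the round numbers used in the evidence-roughly 60 million people, £1.5 million a year on assessment, GDP of over £1,500 billion-not official estimates.]

```python
# Round figures quoted in the evidence (illustrative, not official data)
population = 60_000_000          # UK population, roughly 60 million
assessment_budget = 1_500_000    # assessment programme, about £1.5m a year
gdp = 1_500_000_000_000          # GDP, a little over £1,500 billion

# Assessment cost per person per year, in pence (about 2.5p)
pence_per_head = assessment_budget / population * 100

# GDP per person (about £25,000), and what small changes mean at that scale
gdp_per_head = gdp / population          # roughly £25,000 per person
one_per_cent = gdp_per_head * 0.01       # roughly £250 a year
nought_point_two = gdp_per_head * 0.002  # roughly £50 a year

print(pence_per_head, gdp_per_head, one_per_cent, nought_point_two)
```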
Chair: I am going to curtail this bit of the conversation, because we are going to come back to communications in a later session with you, because we treat this extremely seriously.
Q55 Paul Flynn: Do you recognise the picture of the world of statisticians as a priesthood of infinite integrity, acting as guardians of the sacred flame of truth, but under constant assault from all kinds of elements outside of infinite wickedness, who are trying to contaminate the flame? To give one example, the Department for Work and Pensions seems to have found a device for coming under the protection of the Act, by issuing some inflammatory information that they knew was going to be seized on by certain elements of the press. It was about the numbers of foreigners claiming benefit in the UK, which is a source of great delight to the tabloids. This was not introduced as a national or official statistic, but was put out as a research note. Is there some reason it did not have the normal protections that the Act was intended to put in place to ensure that these statistics were presented in a cool, rational, objective way? Is this a matter of concern?
Andrew Dilnot: I am in favour of the priesthood of all believers, in a good Protestant sort of way. I want everybody to think of themselves as competent to engage with statistics. That is the Nirvana to which we are looking. This particular case relates to the general question of the distinction between official statistics, national statistics and other stuff. My own view is that here we are in a slightly tricky position, because the Act gives us, as the Authority, the right to assess things that are national statistics. It also gives us the right, through a pretty laborious and bureaucratic process, to seek, in the case of other things that are official statistics, the right to assess them. That is something that cannot work anything like so quickly.
Then there are other things that can claim not to be official statistics at all, and on this distinction between official statistics and management information, the Authority has published some guidance, and Jil’s colleagues in the GSS are able to turn to that guidance. If they think they are being put under undue pressure, they can turn to Jil, who can turn to me. Nonetheless, there is a potential difficulty here. My own sense of the move that we would like to see is to move away from the very laborious process of our having to seek permission to assess something to a position where, actually, the Head of Assessment could come to me and say, "This is an area where we need to do some work." That would be a step forward.
Q56 Paul Flynn: When the Act was going through, the interest in the press was nil, apart from one person and that was you, who was writing about it. I constantly quoted you, because there was no one else who seemed to show any interest or to have any conception of what the Act was going to do. How successful has it been? Are we really in trouble because of this difficult definition? The idea of national statistics with a kitemark does not seem to have improved the public’s trust. Do we need a new definition? Shall we call them something else?
Andrew Dilnot: I think, "Shall we call them something else?" is quite an interesting question. My own sense is that I am new enough in this job-five months in-still to be able to bring a slightly external perspective, and one of those is that "national statistics" does not mean anything to me. It sounds as though it might mean a statistic that relates to the whole country or it might, because it is the last two words of the Office for National Statistics, mean that it has been produced by the Office for National Statistics. No politician, as far as I am aware, thinks, "I must make sure that I use only national statistics in my speech, otherwise that member of the Opposition will stand up and say, ‘Well, Minister, where did you get those numbers from?’"
It is something that we would be interested ourselves to think about and to talk with you about: whether there is a different description that actually sounds like something that, first, is a good thing and, secondly, is something that a Minister might actually want for the statistics that she or he was using. Yes, we have not come up with the perfect description, but we have been thinking about it and it is something we would like to think about some more, because our sense is just that "national statistics" does not mean enough to give people the incentive that they need to want to have that classification.
Q57 Paul Flynn: There was a suggestion of "code-compliant" from the previous witnesses. The other one that was suggested was "accredited statistics", which seems to suggest it carries some kind of authority with it.
Andrew Dilnot: I think "accreditation", "assured"-I don’t know and we should not try to draft it now. It is something that, over the last few months, I have a growing sense of the need for.
Q58 Chair: Is it something you are going to address?
Andrew Dilnot: Yes.
Richard Alldritt: Whatever we do on this, it will mean accredited against the Code of Practice for official statistics. That has to be the test. It was written as a Code of Practice for all official statistics, and we will need to look at this issue of, now that we have completed the cycle of national statistics, whether there are things like the DWP statistics that you mentioned that should be brought more into this process of accreditation.
Andrew Dilnot: We would have more chance of that happening if we have a description that people wanted of the statistics that they were creating and using. At the moment, I think "national statistics" just does not mean that to the wider user community.
Q59 Paul Flynn: Do you think the Authority has a sufficient degree of control over those statisticians elsewhere in Government, other than in the Office for National Statistics, which would be appropriate; or do you think that the Ministers in other Departments-you said there is some nervous relationship, which is all to the good and we welcome the news there is some tension between you and Ministers-will use all kinds of ways to get around the Act, if they find it is getting in the way of shaping public opinion to their will?
Andrew Dilnot: Certainly one would expect, I am enough of an economist to expect, any rules that you set up, any structures you set up, people will try to find ways around them. The Act does a pretty good job of that, and it is the case that statisticians in other Government Departments know that they can come to the National Statistician and her colleagues, who will bring it to me. It certainly is the case that, in Departments, senior civil servants and Ministers are thinking, "Exactly how do we negotiate our way through this?" They want to know how the Authority will approach these things. They have been grateful for the guidance. I have had a number of conversations with senior figures, including the Minister for the Cabinet Office and the Cabinet Secretary, about precisely these sets of issues and the way in which we need effectively to communicate to Ministers and civil servants where the lines are and the consequences of stepping over them. The consequences of stepping over them, I have said to both those individuals, will be public and loud. If we see behaviour that we should not see, we will not be shy about publicising it. That is a very important part of this system. We do not have authority, day by day, over everything that the statisticians in Government Departments are doing, but I think they do feel that we are there to protect them and that seems to me to be working reasonably well.
Jil Matheson: That is underpinned by one of the things that is now stronger, which is that all the chief statisticians and all the heads of profession in Departments now have in their performance agreements an objective about their professional role and their accountability to me on professional matters. There is a process point there. One other thing that I would mention is that there is now a clause in the Ministerial Code about statistics, and all those things, as Andrew has said, have raised the profile and the professional importance.
Q60 Paul Flynn: The degradation of the Ministerial Code under this Government is a subject of many anxieties on this Committee. Could I take up the point raised by the Chairman? There is also a view among Coalition MPs that-again, in the tradition of attacking the messenger, in order to defend the ineptocracy that this coalition has created in the country-they attack ONS and the BBC. Particularly, to take up the BBC, there are meetings regularly about the BBC here, which are attended by the flat-earthers in this House and the next House who are obsessed by the idea that the BBC is a Bolshevik conspiracy. You were asked about distortions. If you put them in some kind of order, where do you think the grossest distortions take place, if you go down from the Daily Mail to the BBC?
Andrew Dilnot: I do not think I have a clear view on that, certainly not as the Chairman of the UK Statistics Authority. My general view is that the media often does not portray statistics in the way in which we would like them to, but nor do politicians, by any means. Our job in the Authority is, first, as the source of them, to communicate and make sure that our colleagues communicate them well and, secondly, when there is flagrant abuse to rebuke that. It is not to get involved in the knockabout of political debate, but to intervene when clear misrepresentation is going on.
Chair: We sometimes don’t represent each other’s views very fairly either.
Q61 Paul Flynn: We see people are finding ways of getting around what the Act intended to do. Do we need the Act to be amended?
Andrew Dilnot: In this area, I don’t think we need the Act to be amended. There are some areas where that could be discussed, but I think what we need is clear understanding on the part of Ministers and senior civil servants, and we need to be consistent. In the conversation I have had with the Minister for the Cabinet Office and the Cabinet Secretary, I have said that I understand that both senior civil servants, some of whom have moaned at me and I am sure have moaned at Jil and Richard, and Ministers, on occasion, find the behaviour of the Statistics Authority extremely irritating. At one level, if it were not irritating, we would not be doing our job. As we do our job, we must act consistently and fairly, in a way that is open and accountable. As we do that, if we find, for example, the use of material that, in our view, is really a statistic being labelled as management information, but is being published and used in a way that is inconsistent with the Code of Practice, we will say so, and say so publicly.
Q62 Chair: You think we need clearer boundaries?
Andrew Dilnot: Well, I think we need clear boundaries. Actually, in the guidance that has been published, those boundaries are now pretty clear. We need it to be possible for conversations to occur across Government to make sure that they are clear. One of the things that both the Cabinet Secretary and the Minister for the Cabinet Office are considering is whether there are forums in which my colleagues or I should go along and speak, just for the avoidance of doubt, but I think the guidance is pretty clear.
Q63 Chair: Presumably Ministers and officials can check with you in advance of releases?
Andrew Dilnot: They can check with us that their understanding of the Code and of the guidance is appropriate. What I do not want us to get into is lots of pre-hearing rulings. I don’t think it is appropriate that people should say, "We were thinking of saying this. Is this mealy-mouthed or unacceptable?" It is entirely appropriate for them to have conversations with us that say, "Our understanding of the guidance is that something like this would be management information; something used in this way would be a statistic." I think we want to stick to principles rather than giving out rulings, but we could have more of those conversations and we have made it clear that we are open for that sort of business.
Jil Matheson: Certainly that is the kind of conversation that happens with me in my office, if there is any uncertainty.
Q64 Kelvin Hopkins: If I may put one question on the previous discussion, presumably there are different levels at which you could relate, particularly with Ministers. You could have a quick private telephone call to say, "Sorry, Minister, but I think you have misinterpreted that." Then there is the formal meeting and then there is public comment. Sir Michael Scholar, from time to time, felt obliged publicly to criticise Government and quite rightly so. Presumably you can operate at these different levels.
Andrew Dilnot: Yes, there is a full range of our interactions and interventions.
Richard Alldritt: We make a lot of allowance for the reality of Ministers being asked questions in the course of a public interview or something like that and half-remembering statistics, and we distinguish between that and a formal statement, where there has been opportunity for it to be reviewed and approved. This has to be realistic, and we do consider whether there has been opportunity to take professional advice before commenting.
Andrew Dilnot: There are two sets of things that I take particularly seriously. One is something that might be characterised as aiming to mislead or misrepresent things to the wider public; and the second, and I say this without wishing to appear sycophantic, is statements to the House. It seems to me that we must have very high standards for both those categories.
Q65 Kelvin Hopkins: My formal question is: how is the organisational structure of the Authority working? In particular, it has dual roles: as a producer of statistics, governing the ONS, but also as a regulator of statistics. In the previous session, we had a discussion about this. One hopes there is at least a Chinese wall between the two roles.
Andrew Dilnot: It is a relatively unusual structure and it imposes some disciplines on the Chairman and the structure as a whole. The way through this set of potential tensions that was devised when the Authority was set up was to have two Deputy Chairmen, and I think that was a very effective way through it. At the end of the last calendar year, tragically, Sir Roger Jowell, one of the deputy chairmen, died. In the last few months that Sir Michael, my predecessor, was Chairman, he acted as the Deputy Chair for Official Statistics. In the first few months that I was Chairman, I also had to do that, so we did not have both the Deputy Chairmen. We now have both Deputy Chairs: Adrian Smith and David Rhind. David Rhind is on the Official Statistics side, the job that Sir Roger Jowell used to do. Sir Adrian Smith is coming in after Lord Rowe-Beddoe’s retirement. We have both those Deputy Chairs in place and we have a structure of committees that separates that out.
I also have some wonderfully independent-minded colleagues, and we have some fairly robust and vigorous conversations about a whole range of issues, involving me, the Deputy Chairs, the Head of Assessment and the National Statistician. My own feeling is that these things are working pretty well. There is a tension but, so far, it has been a rather creative tension. The voices have not been raised very often, but there is a real discussion, a dialogue, and we certainly have at least Chinese walls set up in our committee structure and the way in which we work in the allocation of non-executive directors to the various committees.
Yes, it is unusual, and I am still learning about it. There are some things we are thinking about; over the next few months, we may make some relatively small changes to the precise way in which we operate as a result of things we have learned over the last four years. Also, now that we have got to the end of our first massive programme of assessment, the balance of the monitoring and assessment work may change, and more of it may become relatively less routine, and so we need to make sure that we have the right arrangements for that.
Q66 Chair: Sorry; could I interrupt for a second? Jil Matheson, I confess I think I saw a wry smile on your face at the talk of "robust exchanges". Could you give us an example, a flavour, of the sort of thing that can create this constructive tension between you? What the public wants to know is that there is this tension, because it is inevitable that this tension would occur, given the relationship between you. It would be interesting to know what sorts of things cause the tension.
Andrew Dilnot: I can think of one example and Jil might think of others while I am speaking. One example, in a strange kind of way, has been this thing we are calling the ‘Helpers Agenda’. We are trying to take forward a group of people who can help people, across the whole statistical service, do the things about which Richard and his grumpy cohorts are saying, "You’re not doing this." They say, "Well, we don’t quite understand." Richard, Jil and I talked about this very early on in my time, and it was actually quite tricky, because Richard of course wants it to be done, but does not want anything to do with it, because it is very important that, in future, Richard can come along and say, "Well, you may have been working with these helpers, but it is still rubbish."
We had quite a complicated discussion about who would be in charge and how we would get the governance structure, and it went on over a period of two or three months. On occasion, it became slightly tricky, but we have now resolved it, and the way we have resolved it is that, yes, some of the people who work on this will have been in Richard’s team; they will come out of it for a while. Richard will be involved in helping to work out exactly what we will do, but the programme will be managed by Jil and answerable to Jil, so that we are not in a position whereby, if one of these people then goes back to work for Richard in the future, we could be overturning it. As I say, it was reasonably robust, wasn’t it?
Richard Alldritt: I can probably give you another example. We write a lot of reports about the Office for National Statistics. We have done something like 50 assessment reports and quite a lot of what we call "monitoring reports". One that we are doing at the moment is about whether ONS is making public everything from its databases that it possibly can, in the spirit of the open government agenda. As we draft those reports and drafts are circulated-we show them always to the Department concerned-sometimes the drafting is not as clear and considerate as it might be, or indeed factually correct. That is part of the process, and those sorts of things cause robust exchanges.
Jil Matheson: Both are excellent examples and inevitable, but the point that I would make is that, actually, from my point of view, which is about representing the producers of statistics wherever they are in government, this is a helpful process; because one of the things that I want to see is the continuous raising of standards and continuous improvements in the kinds of things that Richard’s team have been focusing on, which are to do with communication, a life outside Departments and user engagement. They are extremely important messages for the GSS. What I also have to take into account is that of course Richard wants it done tomorrow, and sometimes it takes longer than that. That will inevitably lead to discussions about-you can make whatever recommendations you like-how feasible it is to expect producers to respond to them and how quickly. That is a builtin discussion.
Richard Alldritt: We do not start by considering what resources are available to the Department. We look at what they should be doing, and the resource issues are discussed after we have reported. That can cause tensions.
Q67 Kelvin Hopkins: There is just one other point. In our previous discussions, we talked about perhaps separating the functions, geographically at least, perhaps having separate offices rather than the same floor of the same building as I understand it is at the moment, so production and regulation might just be in separate buildings-just a little bit of geographical separation, to make the Chinese walls more effective.
Andrew Dilnot: I think I said at my pre-appointment hearing that that was something I would want to think about because, until April when I started, I had not. The way things are at the moment, the Assessment staff are in London, Newport and a small office in Edinburgh. In our London office, it is true that the address is the same as Jil’s office; it is on the other side of the whole square. My own feeling is that the separation that really matters is the separation in the Act and the separation in the structures that we have put in place, so that every member of both Jil’s team and Richard’s team-everyone who works at the ONS-knows what the relationships are. I don’t think that any further physical separation would help a great deal. Other people might have different views, but I have not heard that view strongly expressed from the Assessment side.
Richard Alldritt: I suppose it concerns me that people might think it is all too cosy, which is particularly frustrating given all the robust exchanges we have just been referring to. The symbolism of physical separation might help in some respect, but I do not think it would influence much that we actually do. I cannot quite see how it would, but if it gave people more confidence in us, it is a consideration.
Q68 Greg Mulholland: Just turning to the issue that I think, Jil, you have already mentioned about confidentiality, which clearly is a hugely important and sensitive one, my first question is: do you think that the confidentiality requirement in the Act has worked well to protect personal data? I know there are issues; I am going to come on to them in a minute. From the point of view of doing what the Act was trying to do, which was to protect confidential public information, do you think that has succeeded?
Andrew Dilnot: That was only one of the things it was trying to do: to protect confidential private information. My impression is yes. It is also my impression from more than 30 years of working in this sort of field that, sometimes even to a slightly irritating extent, official statisticians in this country are extremely committed to confidentiality. If I have anxieties about the confidentiality arrangements, it is not that they have been insufficiently tough and effective. I am not aware of any concerns that there has been disclosure of personal information. It is really pretty robust; it is not something that Assessment has been concerned about either, in terms of protecting. That is clearly vital for public confidence. If we are to collect data from the public, they have to believe that their personal data would be safe, and I think it is, in this country.
Jil Matheson: It is something that, as Andrew said, is very dear to our hearts. One of the additional points I would make is that it is not just the legislation. The legislation is there; it is helpful. The statement of what it says about criminal offence is extremely useful, but it needs to be supported by practice and, in ONS, there are very strong practices. To go back to the census example, the protection of personal information was central to the way that we carried out the census and how we could assure the public that we would do so. The Act was very much part of the assurance that we were able to give. It was not enough on its own, but it never would be. The fact that it was there was important.
Andrew Dilnot: I can give you an example of how much people care about this. Glen Watson was the Director of the Census, and has been appointed the new Director General of the ONS. After all the questionnaires had been received and logged, they were then shredded. Glen himself, after all the shredding, went around the warehouse where they had all been held, carefully checking in cupboards and down staircases to make absolutely sure that he could assure himself that we had done it. There is an astonishing culture of concern about it.
Q69 Greg Mulholland: That being the case, and you have sent a helpful section on this in your reply to the Chair’s letter, there are clearly issues the other way in terms of being too restrictive, perhaps even preventing new helpful statistics from coming forward and being shared. What kind of changes would you like to see in that area and would they actually require changes to the legislation?
Andrew Dilnot: There are two sorts of issues. One is about access for the public, academics or interested parties to the data. There, my sense is that things like the Virtual Microdata Laboratory have worked reasonably well. We have had a very heavy degree of use of that and it seems to me to be working well. Where I am a bit more anxious is on sharing administrative data, and in particular its use for statistics. A good example of this has come up in the last few months. My understanding, my reading of the Act, is that the Act only really helps us with legislation that occurred up to the Act. We do not have the same data access to legislation that occurs after our 2007 Act. A good example of this has been the new electoral registration legislation, which is going to be collecting some data that would be potentially astonishingly useful, in terms of improving accuracy and saving money, for any subsequent census-type activities. The electoral registration legislation explicitly excludes access to that data to anybody else, even within government. It is also the case that the data-sharing arrangements that we have at the moment are laborious. If data from another Department are helpful and useful for the creation of statistics, there is a very laborious process to be gone through before they can be made available.
Is a change in the legislation required here? We are not absolutely sure, and that is something that I think we would be interested in thinking about some more and working with the Committee on, but, in an environment where everyone can see the benefits of data-sharing, for it not to be possible in some cases-like the Electoral Registration and Administration Bill, which would have created this very useful data-or for it to be possible but very hard work, does not seem to be the right way forward.
Jil Matheson: If I may, the other part of this, which I have heard from some users, academics and so on, is not so much in the data-sharing area, but in the kinds of controls that we put on: what we do to the data before we release them to make sure that individuals are not identifiable; the disclosure controls. There is absolutely a balance there between protecting individuals and making sure that the data that were collected for a purpose are useful and useable for that purpose. There is some evidence that those kinds of disclosure controls are not being applied consistently across government, and there is some work going on looking at that to make sure we are clear that we have the right balance between being useful to users who want to analyse the data, while protecting the privacy and confidentiality of individuals.
Q70 Greg Mulholland: That is useful. Just picking up, finally, an interesting thing in your response, it is quite complex, as you state, with quite technical points. One point that you made was to say that "Data-sharing gateways, once they are created, permit sharing of data but do not create any obligation on the part of the data owner to cooperate with ONS." Have you got any examples? Has that happened regularly?
Jil Matheson: There are discussions at the moment actually about one of those. There was a fairly recent data-sharing order for some data held by DWP. There are still discussions about whether that data-sharing order is enabling or whether it is an obligation. There are certainly some views being expressed within DWP that there is other legislation that prevents them from sharing the data with ONS. That is a kind of example of the complexity, the difficulty and how time-consuming this area is.
Andrew Dilnot: It is a good example, if I may say so, of areas where we would like to feel able to come to the Committee, if we were having continuing problems.
Greg Mulholland: I think it is an interesting area, and we look forward to hearing further from you.
Q71 Chair: Again, you do not feel any requirement to amend the Act at present?
Andrew Dilnot: That is something we want to think about. We are very hesitant to suggest amendments to the Act, because we know how difficult that would be and how time-consuming, but we are also aware that this is an area where we may need to make such a suggestion.
Q72 Chair: Very briefly, there are two further areas to explore. Do you feel that the pre-release access issue remains an issue, despite the changes?
Andrew Dilnot: Yes, I absolutely do. I am looking at a table that I have in front of me of how many people get pre-release access to a number of items. At the very top is the crime in England and Wales release, which is made available to 142 people.
Q73 Chair: The harm that is done by this?
Andrew Dilnot: I think there are two sets of harm. One is that it brings the whole system into disrepute. The second is that, in my view, it clearly allows the possibility of attempts-probably not in England-to put pressure on statisticians to change what is in the release, or to think of some way of covering the release, so that either we can publish something else at the same time that might distract attention, or Ministers are given a particular advantage in spinning that material.
Q74 Chair: Are you satisfied only 140 people see the information?
Andrew Dilnot: 142 in advance.
Q75 Chair: Are you satisfied it is limited to 142 in reality?
Andrew Dilnot: No, absolutely not.
Q76 Chair: Is there any way you can check?
Andrew Dilnot: I don’t know. I think that is quite hard for us to do. We would need observational powers.
Q77 Chair: Certainly your predecessor wrote to the Treasury about an, albeit inadvertent, release of inflation information to a far wider group of people than intended.
Andrew Dilnot: It seems to me that, in the case of some of these potentially market-moving statistics, so far we are assured that there has been no personal gain but, in the end, there is that opportunity. So I think it just makes a nonsense. We are also in the position where, in England, it is down to 24 hours but, in Scotland and Wales, it is still five days. I was approached confidentially and anonymously over the summer by a Scottish statistician, who feels that during that period pressure is brought to bear by politicians to change what is in it. Yes, I think it does do material damage and I think it should be changed.
Q78 Chair: Is statistics a devolved matter? Do you have any power?
Andrew Dilnot: Yes, statistics is a devolved matter and the pre-release arrangements are up to the Scottish Government, as I understand it, and the Welsh.
Q79 Chair: Should they be the same? Should it be a UK responsibility?
Andrew Dilnot: I don’t have a view about whether it should. The Act means that we are not responsible for it; I regret that. Despite the Act making us not responsible, it is in the power of the relevant responsible Ministers to change it. We live in a world where, as I understand it, in the US the President is allowed half an hour’s pre-release access to the most market-sensitive data. In this country, 142 people get the crime data in England and Wales 24 hours in advance. For health and safety statistics, there are 50 people. Forty-two people get the multi-agency public protection arrangements annual report 24 hours in advance.
Q80 Chair: What about economic data that would perhaps be more useable?
Andrew Dilnot: My own view is that no pre-release access would be the appropriate thing. The Statistics Authority’s official view has been, in its report, that reducing the 24 hours to three hours and significantly reducing the number of people who have access would be the right thing. That would be a step in the right direction. My own sense is that we are unlikely to get a complete removal of this, but what I have agreed with a number of senior figures I have spoken to is that we will start by pressing on those that we think are the most absurd, and seeing if we can get pressure across the whole system to reduce the number of people who are getting it and to reduce the time. That is something where we can try to begin some of that process within the ONS, because some of the ONS figures themselves are released to a large number of people.
Richard Alldritt: Internationally, the Act has done a lot of good for the reputation of UK official statistics, except in this respect. This is the thing that is mentioned to us: you have got good arrangements, but it is a pity that you have institutionalised pre-release access, which everybody else is trying to drive out of the system.
Chair: We think we have friends in high places on this matter.
Q81 Kelvin Hopkins: One very brief last question: the Royal Statistical Society has called for the Consumer Prices Advisory Committee to be reconstituted as an advisory committee to the ONS. With the widening gap between RPI and CPI, and all sorts of controversy surrounding price indices, would you think that is a good idea? Would you support doing that?
Andrew Dilnot: I mentioned earlier that CPAC has had two major pieces of work, one of which is out to consultation. The next, on the formula effect, I hope, will be going out to consultation, under the National Statistician’s authority, very soon. At the end of those pieces of work, we will have come to the end of an obvious programme of work that CPAC and others have been doing. Yes, I think it would be appropriate for us to review the governance and constitutional arrangements for CPAC, and the way in which RPI and CPI in general are conducted. CPAC was set up by the Authority in 2009. We soon will have been through a whole programme of work, and I think it is appropriate that we pause and reflect on which aspects work well and which work less well.
Q82 Kelvin Hopkins: Bring it closer to the ONS though, so that it works with you, so to speak.
Andrew Dilnot: I wouldn’t want to prejudge that, Mr Hopkins. I think we should have a review of the whole governance and structure arrangements, and think about what is sensible. That is certainly something where we would want to make sure that we consulted this Committee, as we went forward. As I said earlier, we should be going out to consultation on the formula effect quite soon.
Chair: Thank you very much indeed for your evidence this morning. All three of you delivered with enthusiasm and commitment, as well as a deep knowledge of the subject. We take this responsibility extremely seriously on this Committee, and great things are expected of you in the years ahead. Could you pass to all your staffs our thanks for their commitment and to all the statisticians across Government? We value their independence and integrity very highly indeed, and we will want to know if that is being compromised by pressures upon them. We take your role extremely seriously for that reason. Thank you very much indeed for being with us this morning.