Select Committee on Public Accounts Forty-Ninth Report



INITIATIVES TO IMPROVE POLICY-MAKING

  1. Government departments and their agencies spend some £350 billion each year on a range of services and activities intended to benefit the public. If policies are not well designed and implemented, public services may be of poor quality; those intended to benefit may not do so, or a significant section of society may be excluded from the benefits; or a policy may achieve its objectives but at a cost which does not represent value for money[12] (Figure 2).
    Figure 2 (not reproduced)

  3. The Centre for Management and Policy Studies was formed as part of the Cabinet Office in June 1999 to promote improvements in policy formulation, for example by making it more evidence-based, giving greater emphasis to achieving tangible outputs, and learning lessons through systematic evaluations. In September 1999 the Cabinet Office published Professional Policy-Making for the Twenty-first Century,[13] setting out nine key characteristics to which policy-making should aspire. This was followed in November 2001 by the report Better Policy-Making,[14] which included 40 case examples of good practice in policy-making.[15]
  Adopting good practice

  5. Asked how it could be sure that the good practice it was advocating was being translated into action by departments, the Cabinet Office told us that there was some evidence of good progress but also a recognition that more needed to be done. Departments were getting better at consulting and informing stakeholders with an interest in a policy, but were less good at forward thinking and at using international evidence to learn from other countries. Better information on the impact of policy-making in delivering tangible outputs and improvements in public services was becoming available through the reporting of progress against Public Service Agreement targets.[16] The Cabinet Office also told us that it followed up some specific initiatives; for example, it had asked departments to report on their compliance with guidance on the need to consult more widely in developing policies.[17]
  6. The Cabinet Office and the Treasury have produced a substantial amount of guidance on policy-making. The Comptroller and Auditor General found, however, that it was not widely used by departments because it tended to be inaccessible, did not fit together well, and often reduced policy-making to a structured logical process which did not reflect reality. The Cabinet Office accepted that the guidance tended not to be user-friendly, but said that it was making it more accessible by establishing a website which would include practical examples of best practice in policy-making. It intended to monitor the number of departments accessing the website as well as the use made of the good practice. Websites were not enough on their own, however, as they were only a source of information, so the Cabinet Office was also establishing networks of policy makers to make it easier for them to consult one another and share good practice.[18]
  The role of the new central units

  8. In addition to the Policy Unit and the Performance and Innovation Unit, three new units - the Forward Strategy Unit, the Delivery Unit and the Office of Public Services Reform - were established as part of the Cabinet Office in June 2001. The Cabinet Office accepted that there was some scope for confusion, but emphasised that while the units were doing complementary work they had different responsibilities. The Delivery Unit was working with departments on achieving the targets in their Public Service Agreements. The Office of Public Services Reform was developing a departmental change programme to improve civil servants' programme delivery and project management skills. The Forward Strategy Unit took a longer-term perspective, considering issues to which government might need to respond in ten to fifteen years' time.[19]
  The use of experts in policy-making

  10. Rigorous data and analyses are needed to inform the government's decision-making on policies and to assess the outcomes policies are delivering. Without soundly based analysis, research and modelling, those involved in policy development and the delivery of services will be working in the dark, unaware of the impact policies are having. Some £400 million is spent by departments each year on policy-related research. A report[20] by the Cabinet Office's Performance and Innovation Unit in January 2000 identified a need for policy makers to make better use of the 1,800 specialists (economists, statisticians, actuaries, social researchers and operational researchers) employed by departments. The report also expressed concern about the distribution of specialists across departments, drawing attention to the relatively small numbers of economists and statistical experts devoted to some high-expenditure areas. The Comptroller and Auditor General found at the time of his examination in 2001 that the situation remained unchanged. The Cabinet Office considered that departments were now more successful in recruiting economists and statisticians, and also bought in extra advice for a particular purpose if in-house skills were not available; but for some skills, such as IT project management, departments were often competing in an expensive market. The Cabinet Office also said that a better system for managing knowledge was needed so that policy makers could more easily access the wider range of information and research now available, making policy-making more evidence-based.[21]
  Interconnections between policies

  12. Some departments, such as the Department of Health and the Department for Transport, Local Government and the Regions, have a broad range of policy responsibilities, and their activities impact on different groups in society. Policy developed in one part of such a department may have an indirect impact on other policies, either in the same department or in related departments and agencies. The consequence of not identifying and considering these connections is that policies may be developed in isolation from other priorities; resources may be misdirected and lessons not learned which might have wider application within departments or in related policy areas in other government organisations. The Committee has, for example, been concerned at bed blocking in the NHS, where people who do not need to be in hospital remain there because of inadequate arrangements for their care outside hospital, preventing the use of NHS beds by people who do need to be in hospital. The Comptroller and Auditor General nevertheless found no examples of the interconnections between policies being regularly reviewed. The Cabinet Office said that the need to consider carefully the interconnections between policies was now widely accepted by departments. The last spending review had included a number of cross-cutting reviews, and when policies were considered by Cabinet Committees their interconnections were also reviewed.[22]
  Peer review

  14. The Cabinet Office has recommended that departments subject themselves to peer review as one means of improving policy-making. Such reviews are intended to be constructively critical examinations of departments' key activities, carried out by an independent team with hands-on experience of the area under review, to identify practical ways to improve. The Cabinet Office told us that it had been subject to a peer review in the summer of 2000, which had led it to consider how it worked across all its responsibilities to achieve greater coherence and avoid duplication, and how it could become more engaged with departments to strengthen policy-making.[23]
IMPLEMENTING POLICIES

  16. Poor value for money and under-performance are likely where the implementation of policies has not been thought through and planned. For example, when the Child Support Agency was established in April 1993 to implement a major new social policy and to operate a new system of child maintenance, a very high-risk strategy was adopted: a totally new policy was brought in with new staff and a new computer system, all at the same time. Staff numbers had to be increased because the time required to carry out maintenance assessments had been underestimated and was twice as long as expected.[24]
  17. Typical problems encountered during implementation include: over-ambitious timescales, or resources not being available when required; those implementing policies not having the appropriate skills or training; poor project management, resulting in significant and uncoordinated policy or design changes which contribute to cost increases and time delays; and roles and responsibilities not being clearly defined, resulting in confusion and under-performance. Cost-effective implementation also requires departments to remain alert to new opportunities, such as developments in information technology, which might improve service delivery.[25]
  Risk management

  19. The Committee has in a previous Report[26] emphasised the importance of departments improving the way in which they manage risk, and we asked whether sufficient attention was now being given to risk management. The Cabinet Office said that it was working closely with the Treasury and the Interdepartmental Liaison Group on Risk Assessment (ILGRA) to develop a statement on the approach to risk management to be adopted across government. The Performance and Innovation Unit was carrying out a study on how best to promote improvements in departments' risk management. All Permanent Secretaries were now required to sign annual statements of internal control, intended to provide assurance that reliable risk management was in place.[27]
  20. The Comptroller and Auditor General found that departments were spending more time on risk identification and analysis in developing policies, but risk assessment and management still tended to be ad hoc rather than systematic. The Cabinet Office considered that while the importance of risk management was now more widely accepted by departments, more needed to be done so that staff better understood how to manage risk. This was something which the Cabinet Office's Delivery Unit and Office of Public Services Reform would be addressing by drawing departments' attention to examples of where risk had been managed effectively in rolling out projects and programmes.[28]
  21. There have been a number of instances where departments have not kept the public sufficiently informed about the degree of risk involved in some programmes or policies. For example, public concern over the health risks associated with BSE (Bovine Spongiform Encephalopathy) had been growing for some time before those risks were fully explained. We asked what departments would normally do to explain to the public likely risks arising from complicated scientific issues which might affect their health. The Cabinet Office said that the Performance and Innovation Unit study currently under way was considering how to communicate risk better to the public. The government's Chief Scientist was also reviewing how to handle scientific risk and, in particular, how to improve departments' capacity and skills to form better assessments of scientific risk.[29]
  22. The Department of Health said that for nearly a decade it had surveyed mothers of young children twice a year to ascertain their understanding of the risk of disease and their concerns about, for example, vaccines. It used this knowledge in deciding how to communicate health policies to the public. On the likelihood of risks becoming a reality, the Department told us that it worked very closely with the scientific community to ensure that it had a good understanding of scientific risks. We asked whether there was a danger that departments could be too reliant on scientific evidence: for example, the scientific advice for many years was that there was no risk to human health from BSE, and then the advice suddenly changed. The Cabinet Office considered that, however good the risk analysis, some judgement was ultimately required. It was important with all advice to the public that risks were clearly set out and explained so that people could make an informed choice based on the known evidence.[30]
  Consulting those who have to implement policies

  24. Assessing how a policy is likely to work in practice is a crucial stage in policy design: it should identify practical constraints which need to be overcome if the policy is to be successful, and can help to develop more accurate estimates of the likely costs and impacts of the policy. The Comptroller and Auditor General found, however, that those required to implement policies were consulted fairly late in the design process. We asked to what extent Executive Agencies, which often had to implement policies, were sufficiently involved in their design. The Cabinet Office told us that agency involvement in policy design varied considerably depending on their roles and responsibilities. A review of agencies, which reported to Ministers in March 2002, considered the range and scale of their activities, including the extent to which they were sufficiently engaged in policy development.[31]
  25. We asked to what extent frontline staff, and in particular general practitioners, had been consulted in developing the Meningitis C vaccination policy. The Department of Health said that it had worked closely with the British Medical Association and the Joint Committee on Vaccination and Immunisation, which included representatives from the Royal College of General Practitioners. The Department told us that with any new health programme it would discuss how it should be implemented with general practitioners and other frontline staff such as those working in community child health in respect of school health services.[32]
  26. Asked whether sufficient warning was given to general practitioners about changes in policy, the Department of Health said that in some cases the profession would be alerted three to six months in advance, but sometimes the interval might be fairly short, for example in winter when the incidence of Meningitis C increased and there was a need to immunise quickly. The Department said that there were immunisation co-ordinators in every health district, and that it sought to make sure they were consulted sufficiently early and had all the materials necessary to train those involved in implementing a new programme.[33]
  Terminating policies that are not cost-effective

  28. There may come a time when a policy has achieved its intended outcome and remedied the social or economic issue it was designed to tackle, or when it has become obsolete or ineffective. It may then be necessary to replace it with a new policy to reflect different circumstances, or more cost-effective to terminate the policy altogether. The Comptroller and Auditor General found that policy reviews and formal evaluations rarely resulted in a decision to terminate policies that were no longer effective; the tendency was for policies to be developed into new forms rather than terminated completely. Asked why this was so, the Cabinet Office said that the issues to which policies had to respond did not necessarily change, so the need for a policy often continued. Policies were, however, reviewed and adapted to reflect different circumstances and to improve their effectiveness, as in the case of the Single Regeneration Budget, intended to help deprived areas. This policy had had some beneficial impacts, but a full evaluation recommended that local people should be consulted so that the policy better reflected their needs. As a result the policy was refocused by the Neighbourhood Renewal Unit rather than terminated.[34]
  29. The Cabinet Office's Women and Equality Unit has a policy to promote women's entrepreneurship, working with the Small Business Service, the Department of Trade and Industry and the Treasury. The outputs were a benchmarking report comparing the state of knowledge about women's entrepreneurship and self-employment in the UK and Sweden, and a new website providing advice specifically for women. Asked how the policy could be considered effective, and whether it should have been terminated, the Cabinet Office said that the role of the Women and Equality Unit was to generate ideas which it would not necessarily implement itself; its work to promote women entrepreneurs was now being taken forward by the Small Business Service, and it was too soon to evaluate the success of the policy in helping women to set up businesses.[35] Asked how many policies had been terminated over the last two to three years because they did not provide value for money, the Cabinet Office said that it did not routinely collect such information.[36]
  Need for early warning indicators

  31. Early warning indicators, ranging from increases in letters from the public and lobbying by interest groups to detailed analyses of trends in the incidence of disease (such as those which informed the introduction of the Meningitis C vaccination programme in 1999, Figure 3), falling examination pass rates or increases in the demand for social support, may suggest a need to examine the effectiveness or appropriateness of a policy.
    Figure 3: Cases of Meningitis C, 1990-1999 (laboratory confirmed)

    Source: C&AG's Report, Modern Policy-Making: Ensuring Policies Deliver Value for Money, page 71.

  32. Asked how departments ensured such indicators were in place to monitor whether a policy was working, the Cabinet Office accepted that there was a need for better performance indicators to determine what policies were achieving. The Department of Health said that it received weekly feedback from all those involved in implementing the Meningitis C vaccination programme and daily reports from the suppliers of the vaccine. The Department for Education and Skills told us that the National Literacy Strategy had mechanisms in place to provide quick feedback from both local authorities and schools on all aspects of the strategy as it was rolled out. Asked whether the results of the Key Stage 2 examinations[37] (Figure 4) were a reliable indicator of the strategy's impact in improving literacy, the Department said that other indicators were also important, such as Ofsted's reviews of the quality of teaching and learning in the classroom.[38]

    Figure 4: National Curriculum English test results at age 11 in primary schools

    Source: C&AG's Report, Modern Policy-Making: Ensuring Policies Deliver Value for Money, page 78.

     

    Evaluating the impact of policies

  33. Evaluating the impact of policies is important for determining the extent to which they have met or are meeting their objectives, and whether those intended to benefit have done so. Evaluation can also help departments share good practice and learn lessons. The Comptroller and Auditor General found, however, that while departments were commissioning more evaluations, the evaluations needed to be more practical, and the extent to which lessons were learned was variable. The Cabinet Office said that major evaluations were taking place of programmes such as New Deal for Communities and Sure Start, but it accepted that more needed to be done. The Centre for Management and Policy Studies had set up a new training programme in policy evaluation, bringing policy makers and analytical staff together, as well as encouraging civil servants to share examples of good practice.

 


12   C&AG's Report, para 1.5

13   Cabinet Office, Professional Policy-Making for the Twenty-first Century, September 1999

14   Cabinet Office Centre for Management and Policy Studies, Better Policy-Making, November 2001

15   ibid, para 4; Qs 2, 39

16   Public Service Agreements were first introduced in 1998, setting out each department's objectives for the public services for which it is responsible, together with measurable targets to monitor the delivery of those objectives.

17   Qs 2, 8, 117, 127

18   C&AG's Report, para 2.21; Qs 45-50, 57-59

19   C&AG's Report, para 1.13; Qs 17-26

20   Performance and Innovation Unit, Adding it Up - Improving Analysis and Modelling in Central Government, January 2000

21   C&AG's Report, para 2.8; Qs 4-5, 126

22   C&AG's Report, paras 10, 2.6; Qs 5, 7

23   ibid, para 1.10 (Figure 12); Q 13

24   C&AG's Report, para 2.17

25   C&AG's Report, para 3.6

26   1st Report from the Committee of Public Accounts, Managing Risk in Government Departments (HC 336, Session 2001-02)

27   Qs 16, 63-69

28   C&AG's Report, para 2.17; Qs 60-61

29   Qs 35-36

30   Qs 36-37

31   C&AG's Report, paras 14, 2.12; Q 15

32   Qs 98-100

33   Qs 109-110

34   C&AG's Report, para 3.23; Qs 83-89, 92-96

35   C&AG's Report, Appendix 5, para 7; Qs 30-34

36   Qs 41-42

37   Key Stage 2 National Curriculum tests in English are taken by children at age 11. Level 4 is the standard expected for 11 year-olds. In 2001, 75 per cent of 11 year-olds reached this standard.

38   C&AG's Report, para 2.6; Qs 71-76, 79

 

© Parliamentary copyright 2002
Prepared 31 July 2002