Select Committee on Environmental Audit Minutes of Evidence



Supplementary memorandum from the Department for Environment, Food and Rural Affairs

FOLLOW-UP QUESTIONS TO THE DEPARTMENT BY THE COMMITTEE FOLLOWING THE EVIDENCE SESSION

1.   With reference to questions 62 and 64, you agreed to provide the following: (a) an explanation of why UK figures cannot be provided for all 15 headline indicators, (b) how far it is possible to get a standardised UK view and (c) if all the indicators were based on UK figures, would the final assessments for 2001 (red, amber or green) still stand?


  (a)  UK figures cannot be provided for all 15 headline indicators because in some cases data or long-term trends are not available or are not comparable between all constituent countries. When the headline indicators were formulated, the constituent countries were consulted regarding data availability and comparability.


  (b)  Currently, nine indicators use UK data, three use GB data and three use data from England or England and Wales (see Figure 1 for details of the individual indicators). It is not therefore possible at present to get a standardised view across the UK.

  The devolved administrations are developing their own sets of indicators. Scotland and Wales have recently published their first sets of indicators, influenced by the overall UK set (see Table 1 for comparisons of the indicator sets developed by each country). These were published after the Government's annual report in March 2002. Northern Ireland's set is still in consultation. DEFRA works closely with the other administrations.


  (c)  If it were possible to expand to UK coverage the six indicators that are currently available only for England, England and Wales or Great Britain, the assessments for 2001 (red, amber or green) would be unlikely to change. In all cases the trend is predominantly driven by the constituent countries already covered by the indicator, and would not be materially affected by the relatively small contribution, in terms of population size or number of dwellings, made by any missing countries.
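
  The following is a purely illustrative Python sketch of this weighting argument; the weights and trend values are hypothetical, not the Department's figures, and are chosen only to show why a large covered share dominates a combined UK trend.

    # Hypothetical illustration: a UK-wide trend as a population-weighted
    # average of constituent-country trends; all figures below are invented.
    covered_weight = 0.92   # share of UK population already covered by the indicator
    missing_weight = 0.08   # share contributed by the countries not covered

    covered_trend = 2.0     # hypothetical percentage change in the covered countries
    missing_trend = -1.0    # hypothetical percentage change in the missing countries

    uk_trend = covered_weight * covered_trend + missing_weight * missing_trend
    print(f"Combined UK trend: {uk_trend:.2f} per cent")  # 1.76 per cent
    # Even an opposite-signed trend in the small missing share does not reverse
    # the direction of the combined trend, so the traffic-light assessment holds.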

THE THREE NON-UK/GB INDICATORS ARE:

H7—Housing (England data only)

  Scotland: no comparable data are available.

  Northern Ireland: not available at present—will be available from the 2001 Housing Conditions survey.

  Wales: data on the percentage of unfit dwellings are available, but only for 1998.

H8—Crime (England and Wales data)

  Scotland: no comparable data are available due to differing crime definitions under Scottish law (only the total number of crimes is available).

  Northern Ireland: data are available for all three crime indicators for 1998 to 2000-01; these were not available when the indicators were first published. In addition, England, Wales and Northern Ireland together do not constitute a recognised administration.

H14—Land Use (England data only)

  Scotland: no comparable data are available.

  Northern Ireland: no comparable data are available.

  Wales: no comparable data are available.

THE THREE INDICATORS USING GB DATA ARE:

H4—Poverty and Social Exclusion:

  Northern Ireland: three of the four indicators are available.

  The four indicators contained within this headline indicator were selected from those in Opportunity for all: making progress[2] and were based on the best available data.

H6—Health:

  Northern Ireland: Life expectancy available; Healthy life expectancy not available.

H11—Road Traffic:

  Northern Ireland: vehicle kilometres available 1990 to 1998—no long-term trend.

 

See Annex A for a table of comparisons by country

 

 

2.   With reference to question 93, you agreed to clarify how often figures for renewable energy are calculated.

    —  DTI publish renewable energy data on an annual basis.

    —  Some components of the renewable energy total are collected more than once per annum.

    —  Data from large-scale hydroelectric producers are collected monthly, and "Auto producers" data (companies that generate electricity for the national grid as a by-product of their operations) are collected quarterly.

    —  Overall electricity sales data (from which the renewable figure is calculated) are collected monthly.

3.   Do the latest set of headline indicators paint the picture of a country progressing well on its sustainable development agenda?

  The barometer in the 2001 Report clearly shows that, on the basis of data published since the snapshot we took when we issued the Strategy in 1999, quality of life as measured by the headline indicators is improving. However, as I was at pains to point out at the press conference on 13 March, whilst there have been improvements in some areas I would not claim that everything is now on the right track.

  Since 1999, all or part of six headline indicators remain on the right track, including economic output, educational qualifications, greenhouse gas emissions, air quality, river water quality and vehicle theft/domestic burglary (but not violent crime, including robbery, the other component of the crime indicator). The trends for another four—investment, employment, poverty and social exclusion, and populations of woodland birds (one of the two components of the wildlife indicator)—have improved, and these have been given a green light in the change since Strategy assessment.

  A key test of whether progress is being made towards sustainable development is whether improvements are to be found across the three pillars of social, economic and environmental indicators. As with the first annual report, the latest report shows this to be the case. The 1999 Strategy—a better quality of life—makes clear that the Government's aim is for all the headline indicators to move in the right direction over time, or, where a satisfactory level has been reached, to prevent a reversal. Where a trend is unacceptable, the Government will adjust policies accordingly, and will look to others to join it in taking action.

  There is no sense in which the Government or I can be complacent about the situation. Indeed in no case do we believe it is yet right to change the long-term assessments of change since 1990 that were made in the year of the Strategy, despite the recent improvements. We are a long way from saying that everything we do - as a nation, as a government, as businesses, as communities and as individuals - is now sustainable; nevertheless I do believe the evidence shows we are a little closer than we were.

4.   You told the Committee (Q95) that it is the Government's statistical service which provides the red, amber, green assessments for the headline indicators. What guiding criteria do they use, if any? Is this applied every year to ensure a consistent assessment?

  The data in the headline indicators are predominantly from long-running time series collected by the Government Statistical Service (GSS), as part of National Statistics, or by environmental regulatory monitoring bodies. National Statistics provides a framework for statistics of quality, integrity and freedom from political interference. It provides an up-to-date, comprehensive and meaningful description of the UK's economy and society. National Statistics covers government data produced to the high professional standards set out in the Official Statistics Code of Practice.

  Data sets used for the indicators were selected using quality criteria to ensure that they were sufficiently reliable to monitor progress in the main themes of sustainable development identified in the strategy.

  The data are analysed by government statisticians within DEFRA in consultation with the Departments responsible for the data and the relevant policy areas. The traffic light assessments for the change since Strategy column were made as a result of such analysis and published following Ministerial approval.

  The traffic light system was first introduced in Quality of Life Counts in 1999 as a method of summarising progress. It made comparisons with baselines (eg 1970, 1990) to assess which trends have been moving in the right or wrong direction, or where there has been no significant change.

  Progress is judged in both the first and the latest annual report on two bases: Change since 1990 and Change since the Strategy (comparing the most recent data with data available at the time of the Strategy—usually one or two years prior to the Strategy).

5.   You have been encouraging local authorities to use sustainable development indicators on a voluntary basis. How many are doing so?

  Following the production of the national set of Quality of life counts, it was recognised that more local indicators were needed to better reflect local issues and to assist in the monitoring of regional and local strategies. In July 2000 a new handbook Local quality of life counts was published, offering ideas for measuring sustainable development and quality of life in local communities. In total, 5,000 Local quality of life counts handbooks were published. By October 2001, the distribution centre had issued 4,481.

  The Audit Commission have been running a pilot study of Local Authority quality of life indicators over the past 18 months. To date, 93 local authorities are involved (approximately a quarter of all authorities).

6.   The sustainable development strategy and indicators were published in 1999, the first annual report was published in January 2001, and the second in March 2002—Is DEFRA planning to settle into a more regular reporting routine now?

  The first annual report covers the period from the Strategy to the end of 2000. The second report covers the 12-month calendar year 2001. The next report, which will be published in the early months of 2003, will cover the year 2002.

  As a country, we are making progress on Sustainable Development. We were one of the first countries—in 1994—to produce a Strategy after the 1992 Earth Summit in Rio. When this Government came to power we set in train the development of a new Strategy, published in 1999—which placed as much emphasis on the social dimension as on the economic and environmental dimensions of sustainable development. We are world leaders in terms of annual reporting and the use of indicators to assess progress towards sustainable development.

7.   Many of the 150 core indicators are yet to be developed or require further development before we can expect systematic reporting. Could you please provide an update on the progress being made to get all the indicators operational. When can we expect the full data set to be up and running?

  Establishing the Quality of life counts was a very ambitious exercise, and the UK was the first country to produce such a comprehensive set of indicators of sustainable development.

  The headline indicators are now well established. When producing the wider core set of almost 150 indicators in Quality of life counts, we felt that it was important to highlight the need for indicators in areas where none are available, and to recognise that some of those we could establish are not ideal. The development of indicators is very much dependent on the availability of data, and on techniques to transform the data into meaningful indicators that will genuinely reflect progress.

  Various streams of work have been taken forward to fill the gaps or to improve existing indicators, for example on resource use, access to the countryside or other green space, and tourism, but work is ongoing.

  There is a commitment to undertake a full review of all the indicators by 2004. It should be recognised, however, that indicator development is a continually evolving process, as new issues arise or new data become available.


8.   Can you clarify why the full set of statistics, supporting the waste headline indicator (H15), are only collected every five years?

  The statistics for the headline indicator H15 on arisings and management of waste from households, industry and commerce, including construction and demolition, come from a variety of surveys by DEFRA, DTLR, the devolved administrations and the environment agencies. The frequency of collection reflects the likely degree of change from year to year and a balance between meeting policy, planning and other reporting needs and the costs and burden to respondents of collecting the data. For example, Waste Strategy 2000 (published May 2000) is due to be reviewed every five years. Future data collection will also need to take account of the draft EU Regulation on Waste Statistics; although the frequency of reporting has yet to be agreed, the original proposals would require data every three years, while the current proposal is for every two years.

  Information on household waste is available from an annual survey of local authorities, but this makes up only about a sixth of the 170-210 million tonnes of waste covered by the indicator. DTLR are currently undertaking the second in what is planned to be a regular series of surveys of construction and demolition waste. Data should be available later this year, which would allow a partial update of the indicator, since construction and demolition waste together with household waste make up 55-60 per cent of the indicator. Estimates of industrial and commercial waste in the indicator are based on an Environment Agency survey undertaken in 1998-99. The Agency and DEFRA are considering plans for the second survey, which is likely to be in 2002-03. The results of this survey will allow a full update of the indicator.
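
  As a rough, hedged illustration of the coverage figures quoted above, the short Python sketch below uses only the tonnage range and percentage shares given in this answer; the midpoints are taken purely for the arithmetic.

    # Hedged arithmetic sketch using the ranges quoted in the answer; midpoints only.
    total_waste = (170 + 210) / 2               # 170-210 million tonnes covered by the indicator
    household_share = 1 / 6                     # household waste: about a sixth of the total
    household_and_cd_share = (0.55 + 0.60) / 2  # household plus construction and demolition: 55-60 per cent

    household_tonnes = total_waste * household_share
    household_and_cd_tonnes = total_waste * household_and_cd_share

    print(f"Household waste: about {household_tonnes:.0f} million tonnes (updated annually)")
    print(f"Household plus construction and demolition: about {household_and_cd_tonnes:.0f} million tonnes")
    # The remaining roughly 40-45 per cent (industrial and commercial waste) is
    # refreshed only when the Environment Agency survey is repeated, which is
    # why a full update of the indicator is possible only every few years.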

9.   Can you confirm your statement (Q89) that there is no interim target for renewable energy for 2003? Our understanding is that there is a target of 5 per cent for 2003.

  The transcript highlights a certain amount of confusion surrounding the issue of interim energy targets, which remained unresolved as we moved on to other areas of the discussion. In paragraph 83 of the document I confirmed the "Renewables Obligation of 10 per cent of all energy, for electricity generation from renewable sources by 2010." Mr Jones then went on to enquire about interim renewable targets, to which I stated that there were no such confirmed targets—still referring to the Renewables Obligation.

  It has now become clear from reviewing this debate that the interim target to which Mr Jones alluded was that proposed in conjunction with the preceding Non-Fossil Fuel Obligation (NFFO) in the DTI's March 1999 consultation document on renewable energy. The delayed 2003 target of 5 per cent to which Mr Wilson referred in his evidence earlier this year was contingent on projects approved under NFFO gaining planning permission, which has proved a more gradual process than originally expected. It was never set as a measure of the Renewables Obligation to which I was referring at this point. The Renewables Obligation, which only came into effect in April this year, has no directly comparable interim target. However, under the Obligation, licensed electricity suppliers are required to provide increasing percentages of total supplies from renewable sources. The percentage is set to rise from 3 per cent for 2002-03 by annual stages to 10.4 per cent in 2010-11. The level set for 2004-05 is 4.9 per cent and for 2005-06 it is 5.5 per cent.

 


2   DWP (September 2001) Opportunity for all: making progress. Third annual report. Cm 5260. ISBN: 0-10-152602-

 


© Parliamentary copyright 2002
Prepared 11 July 2002