Local Government (Best Value) Performance Indicators and Performance Standards Order 2002


Sir Paul Beresford: For the last time?

Mr. Leslie: I give way.

Sir Paul Beresford: The Minister is learning fast. He answers the question that he wishes I had asked rather than the one that I actually asked. He is talking about comparisons between local authorities. There is a distinct drive against comparisons with the private sector that might induce a better performance from local authorities.

Mr. Leslie: Many local services are not provided by the private sector, so are not easily compared with it. I understand the hon. Gentleman's point. He presided over an era of compulsory competitive tendering as a Minister and hankers after it. If the official Opposition oppose the order or pray against it, it would be interesting to know whether they seek to return to such an era. The hon. Member for Cotswold may mention that point when he winds up.

I turn to the rationalisation of performance indicators for which the order provides. Performance indicators are not a phenomenon of best value alone. Local government has reported performance against a range of indicators since 1993, when the hon. Member for Mole Valley was a Minister and the Audit Commission was first given a power under the citizen's charter initiative to set national indicators for local government. By 1999-2000, the financial year immediately before the introduction of best value, the set of Audit Commission indicators had grown to a total of 241. Early in the life of best value, the Government acknowledged, in response to calls from local government for change, that there were too many centrally prescribed indicators. We were able to reduce the total number of indicators to only 97 for single-tier authorities in this financial year by focusing on output and outcome measures, and by ensuring that the indicators act as key predictors of performance.

I will come to the point raised by the hon. Member for Bath (Mr. Foster) shortly. Local government generally welcomed the developments in response to the consultation exercises conducted last November. The Government recognise that local authorities must meet their local priorities, and we have encouraged them to develop their own local indicators and targets to help them in that process.

On user satisfaction surveys, the views of local people who are the consumers and taxpayers of the services are fundamentally important. All providers must know what their customers want and expect from them. We therefore developed a set of public satisfaction and expectation surveys. By operating the surveys on a three-year cycle, authorities will be able to measure their performance over time. I hope that that answers the point raised by the hon. Member for Bath.

We want local authorities to aspire to be the best and to deliver high quality services. It is important that central Government have the ability to set standards and targets, but we want local authorities to take responsibility for themselves. That is why the Government are limiting the prescribed standards and targets. We want to encourage local authorities to set their own local targets, but to take account of local priorities so that they can decide their own pace of performance and improvement. The indicators reflect the promises that we made when the principles of best value were enacted. Indicators in the statutory instrument embrace many of the suggestions that hon. Members made in debate: the need to develop indicators that reflect cost, quality and public satisfaction; the need for locally determined indicators to reflect particular priorities in certain neighbourhoods; and the need to keep to a minimum the use of nationally determined standards and targets.

There is clear evidence that best value is working. Analysis of the performance data from the financial year 2000-01 shows that local authorities have embraced best value, and there is much to encourage us. There have been improvements in a range of services. Some 80 per cent. of all councils are showing improvement against a basket of indicators, and the gap between the best and worst performing councils is narrowing.

Mr. Clifton-Brown: The Minister obviously wants, as we all do, a narrowing of the gap between the best and the worst council against each indicator. Will the best value inspection service pay greater attention to councils that are performing badly? That is surely the way to ensure that we lift up the standards of poorly performing councils.

Mr. Leslie: I hope that that is obvious from all that the Government are doing. We want to focus particular effort on poorly performing authorities. Best value performance indicators are one of the ways in which we can tell which are the worst performing authorities, because they give us feedback and show us which authorities are not doing as well as others. We have made changes that have improved the performance of local authorities. That was echoed in the Audit Commission report ''Changing Gear'', which showed that 40 per cent. of services inspected were judged to be good or excellent and 50 per cent. were deemed likely to improve. There is still a long way to go, but local government's performance culture has in general changed for the better. Most councils are now more open about their performance, and authorities recognise the value of performance information in enabling them to be more in touch with their communities.

The hon. Member for Cotswold criticised best value indicator number 11 in schedule 3 on special educational needs statements and how quickly they are issued. Rather peculiarly, he gave that as an example of something that was overly bureaucratic and not necessary as a performance indicator. However, many parents are extremely concerned about how quickly SEN statements are issued, and it certainly matters to the children concerned whether and how quickly they get the extra support that they need. The measure is also an important management tool because it shows authorities how well they are performing in speeding the SEN statementing process, and where they need to target their resources more effectively if they are under-performing. I am, therefore, surprised that the hon. Gentleman criticised that indicator; in my view, it is extremely important.

The hon. Gentleman also talked about the cost of the best value regime in general. Again, I am surprised that he believes that it is somehow a waste of money for local authorities to strengthen their corporate structures and performance management systems. I believe that ensuring that we have good management, leadership and corporate structures in local authorities throughout the country is extremely important. Yes, there will be initial costs in getting things right, but I do not believe that they will necessarily recur every year. Indeed, the whole point of the order is to reduce the number of indicators from the previous system, which will help to reduce the costs imposed at national level.

Best value indicators will indicate poor performance. The hon. Gentleman asked when intervention powers were last applied. He will be familiar with the problems that the London borough of Hackney faced and the intervention that took place there. In addition, the recent Audit Commission report on Walsall, which the hon. Member for Mole Valley mentioned, recommended action and intervention, and those matters are being actively considered. Intervention powers can be used if necessary.

The hon. Member for Bath asked many questions, and I will try my best to answer as many as possible in the short time left. I ask my hon. Friends to bear with me, hopefully not for too much longer. The hon. Gentleman said that there had not been a 23 per cent. reduction in the numbers of best value performance indicators. When we mention that figure, we are talking about the number of indicators that apply to single-tier authorities. I can assure him that no council is now required to report against more than 97 performance indicators. That is the bottom line.

The hon. Gentleman also asked about the logic of reducing the number of performance indicators and the rationale that the Government have used for thinning some of them out. We have certainly tried to keep the performance indicators that are predictors of key national priorities, and we have tried our best to remove the more process-oriented measures. For example, on waste collection, the number of missed bins per 1,000 of population might tell us about an authority's management and contract arrangements, but it does not necessarily tell us about outcomes. I accept, however, that these are points for debate.

Another principle is that we want to move from nationally set performance indicators to more locally set PIs, and local authorities certainly have the ability to consider how they might monitor their own services. However, in many ways the most important principle used for thinning out PIs is that we do not want to duplicate information collection requirements. A vast amount of the data that the Government require is already provided by the Office for Standards in Education, and is contained in some of the financial returns received in revenue outturn forms, in the forwards assessment frameworks from health and social services departments, and in housing strategies. We do not always want to duplicate that information requirement in performance indicators. We want to ensure that we reduce bureaucracy. I accept that 189 indicators were burdensome, and reducing that figure to only 97 strikes the right balance.

The hon. Member for Bath asked about corporate health performance indicators. The changes to some of the indicators that he mentioned, including the community strategies target, were about reducing duplicative bureaucracy. He also asked why we were dropping the targets on maladministration. The Audit Commission and the comprehensive performance assessment will already pick them up, so dropping them avoids duplication.

Information about election turnout is available elsewhere for local government and local electors. On community safety targets, the hon. Gentleman also asked why we were dropping the target on road safety casualties. Performance indicator No. 3 in schedule 9 shows that it is still included and has not been dropped. On transport in general, I talked about how the surveys are collected every three years, so they do not necessarily appear in the order. On housing benefit, the information to which the hon. Gentleman referred is collected in other reports. On the word ''arisings'', I am given to understand that it is a technical term, and there is guidance on its use.

 