Best Value in Local Government
Progress on conservation benchmarking
Bob Kindred explains that all this is more than just the latest government buzzwords.
Best Value in Local Government will be coming to a local authority near you soon! The first round of the initiative is now well advanced. Thirty-seven local authorities are participating as pilot authorities, although not all are involved with conservation. In some cases, specific under-performing services or an inter-related group of services (such as housing and health) have been selected for review; in others, all the authority's functions are being reviewed systematically.
The pilots have formed benchmarking groups of interested, progressive 'shadow' authorities (usually between six and 12 in number) with whom benchmarking comparisons are then undertaken. These shadow authorities are not necessarily similar to the pilot in terms of resources, size or demographics, but all are anxious to exchange experiences on the processes they adopt and the prospect of improved outcomes in the level and quality of the services they provide. This should subsequently assist in arguing that services can be improved without resorting to Compulsory Competitive Tendering (CCT). At some point the Government will expect all local planning authorities to go through a similar process for conservation services.
Some authorities have already been defining their role, unit costs and customer expectations, and some have been undertaking localised (usually county-wide) benchmarking on issues such as buildings-at-risk, but this has not been done nationally and has usually involved authorities of similar size and composition.
The basis of the current national exercise (now nearing completion) may be helpful to many medium-sized Borough & District Councils, but it is clearly recognised that the process is not definitive and must be driven in part by the specific objectives which each local authority has set itself. They are not overall national objectives except in so far as they are enshrined in the current legislation, precedents and scattered disparate good practice guidance (of which more anon). Authorities setting out on Best Value next year may decide, in the light of the pilot round, to undertake benchmarking on an entirely different basis. The County and Metropolitan Conservation Teams will almost certainly approach the exercise differently.
Ipswich is the pilot for development control, land charges and building conservation outside London (Camden is the pilot for development control and building conservation in London). The shadows for conservation are: Carlisle; Lancaster; Chester; Bedford; Bath & North East Somerset and Swale. All the authorities are absorbing this considerable workload entirely in-house without extra resources and in addition to normal workload.
It would seem that, initially, most of these specific authorities came forward as a consequence of also undertaking Development Control Benchmarking where a strong statistical basis had been established, not because any of the services were felt to be underperforming.
To establish a statistical basis for some form of comparative benchmarking, the pilot and the shadows have compiled a set of baseline statistics in a detailed questionnaire to establish data about their organisational structure; priorities; staffing; heritage resources; costs; and internal and external administrative and other procedures.
As heritage resources and the local authority's response to them will often vary (sometimes considerably), the importance of establishing qualitative indicators emerged at an early stage. The pilot then had to identify issues from the statistical analysis to form the basis of discussion, and a conference of all the parties was held to discuss approaches to the provision of conservation services, ways in which better procedures and practices might be introduced, and how improved performance might be measured.
A final overall report of the discussions has been produced, but deliberately no general conclusions were reached about comparative performance. It was for the individual authorities in the benchmarking group to decide how to respond to the outcome and how to implement and resource any changes which might improve performance. In addition to the comparative work, some authorities have been undertaking customer surveys within their own administrative areas and/or with external agencies with which they work. Some of these matters are set out in more detail below.
In compiling statistics it also became clear that some authorities did not have detailed statistics for some fundamental activities (e.g. the numbers of applications per annum advertised under Sections 67 and 73) and on costs as defined by CIPFA.
There is something of a problem with CIPFA's approach to conservation statistics gathering which needs to be resolved. The CIPFA Planning & Development Statistics 1998-9 are still, for example, basing information on list entries not listed buildings (our survey specifically and emphatically required the latter). Clearly, a statutory list of ten list entries each comprising a terrace of 30 Georgian houses is a different entity to a list with 300 individual diverse historic buildings ranging from c. 1400 to 1970. CIPFA are also still lumping together disparate environmental improvement matters such as derelict land reclamation with mainstream building conservation activities.
The information sought falls roughly into the following groups, although not set out in full here:
Organisation: Location within the LPA; range of duties/principal functions of the service; reliance on others for advice; post numbers; etc.;
Heritage resources: Numbers of listed buildings; changes year on year etc.; ancient monuments; conservation areas; A4Ds etc.; historic parks & gardens; local list buildings; buildings-at-risk;
Supporting organisations: BPTs, Churches, Trusts etc.;
Conservation records: Computerisation, photographic records etc.;
Technical exchanges: Forums/joint working arrangements;
Development control and enforcement: Number of applications; staff responsibilities and roles etc.; numbers of enforcement notices/prosecutions etc.;
Conservation Advisory Area Panels: Operation, organisation, scope/remit; Diocesan Advisory Committees: involvement/consultation;
Spending resources: Grant aid scope, types, resource levels, administration/delegation;
Miscellaneous functions etc: Formal liaison arrangement with English Heritage/Parish Councils or Neighbourhood Forums/Residents Groups
Administrative procedures: Complaints, Staff welfare/Staff appraisal; Training;
Administrative costs: Staff costs, based on definitions contained within CIPFA Returns 1997/98;
Quality/service statement: Customer care/public meetings/consultations/customer charters/customer surveys/staff availability etc.
A questionnaire to grant recipients was also considered but not carried out, as the number would have been too small to provide a meaningful sample. In future, in conjunction with the provision of development control services, a customer questionnaire may be used to gauge how applicants (rather than agents) perceive the listed building consent process after a decision has been issued.
All customer questionnaires are separate from the benchmarking with the shadows and are intended to give an overview of external perceptions of the conservation service being provided. The Audit Commission seemed satisfied with these arrangements.
Development control clearly emerged as the highest priority for all the members of the benchmarking group. While other activities were given different priorities by individual authorities, technical advice and grant aid were secondary and more-or-less equal; policy matters were a middle-ranking concern; buildings-at-risk and enforcement had more-or-less equal lower ratings (perhaps reflecting their time-consuming, long-haul staff requirement); and environmental enhancement had the lowest rating - perhaps reflecting a lack of resources, or perhaps a job largely completed? English Heritage may not take much comfort from the buildings-at-risk priorities in the light of its recently relaunched strategy on this problem.
It is self-evident that, as the statutory requirements (by comparison with development control) are limited, the weak statistical base makes comparison difficult. The thrust of the pilot conservation benchmarking has therefore needed to concentrate on establishing good practice, sound qualitative measures and assessment of the 'added value' of conservation expertise; comparing processes and outcomes has proved the most productive approach.
Future qualitative issues
Several other issues which might stimulate debate are:
Organisation: How are service objectives to be prioritised? How is long-term v. short-term workload addressed? How does the authority view its statutory v. non-statutory conservation functions? Do broader, flatter organisational structures - i.e. effective delegation to officers - remove the process so far from members that its importance is not appreciated?
Development control and enforcement: How is the adequacy of applications determined at registration? How is the 'special regard' of S. 16(2) incorporated into the processing and determination of listed building consent? How best should conservation issues be reported to members? Does the statutory notification procedure work satisfactorily? How can effective negotiation to improve design quality be developed and evaluated? How is effective compliance achieved? How quickly is non-compliance followed up, and in what ways is it remedied?
Customer care: How best can public consultation improve design and conservation quality?
Clearly, some of the biggest difficulties in making comparisons stem from the ways in which local authority conservation services vary from one authority to another.
In assessing quality, feedback on customer satisfaction is important, but better good practice guidance and peer group review depend on the responsiveness of chief officers and members and a willingness to resource the outcome.
Good practice guidance
As building conservation is an integrated part of the local authority planning system, it is felt that, if further study is to be undertaken, the IHBC should review the good practice elements of Best Value in Local Government for conservation services. The IHBC has formally requested the views of the DETR on how it intends that the Institute should be directly involved in Best Value in conservation services. At the time of writing a response is still awaited.
Where do we go from here? That may be the hardest question of all!
Bob Kindred is Conservation Officer with Ipswich Borough Council.
Context 60 December 1998