Best Value in Local Government
Progress on conservation benchmarking
Bob Kindred explains that all this is more than just the latest government buzzwords.
Introduction
Best Value in Local Government will be coming to a local authority near you soon! The first round of the initiative is now well advanced. Thirty-seven local authorities are participating as pilot authorities, although not all are involved with conservation. In some cases, specific under-performing services or an inter-related group of services (such as housing and health) have been selected for review; in others, all the authority's functions are being reviewed systematically.

The pilots have formed benchmarking groups of interested progressive 'shadow' authorities (usually between six and 12 in number) with whom benchmarking comparisons are then undertaken. These shadow authorities are not necessarily similar to the pilot in terms of resources, size or demographics but all are anxious to exchange experiences on the processes they adopt and the prospect of improved outcomes in the level and quality of the services they provide. This should subsequently assist in arguing that services can be improved without resorting to Compulsory Competitive Tendering (CCT). At some point the Government will expect all local planning authorities to go through a similar process for conservation services.

Some authorities have already been defining their role, unit costs and customer expectations, and some have been undertaking localised (usually county-wide) benchmarking on issues such as buildings-at-risk, but this has not been done nationally and has usually involved authorities of similar size and composition.

The basis of the current national exercise (now nearing completion) may be helpful to many medium-sized Borough & District Councils, but it is clearly recognised that the process is not definitive and must be driven in part by the specific objectives which each local authority has set itself. These are not overall national objectives, except in so far as they are enshrined in the current legislation, precedents and scattered, disparate good practice guidance (of which more anon). Authorities setting out on Best Value next year may decide, in the light of the pilot round, to undertake benchmarking on an entirely different basis. The County and Metropolitan Conservation Teams will almost certainly approach the exercise differently.

Ipswich is the pilot for development control, land charges and building conservation outside London (Camden is the pilot for development control and building conservation in London). The shadows for conservation are: Carlisle; Lancaster; Chester; Bedford; Bath & North East Somerset; and Swale. All the authorities are absorbing this considerable workload entirely in-house, without extra resources and in addition to their normal duties.

It would seem that, initially, most of these specific authorities came forward as a consequence of also undertaking Development Control Benchmarking where a strong statistical basis had been established, not because any of the services were felt to be underperforming.

Comparative work
Following advice from the Audit Commission, Ipswich has approached the Best Value initiative by preparing a detailed service statement describing the operation of all its conservation activities, against which specific service targets have been set, to be met over a variety of timescales. This is intended primarily to define, for the local authority's own staff and members, what service the conservation team is providing.

To establish a statistical basis for some form of comparative benchmarking, the pilot and the shadows have compiled a set of baseline statistics in a detailed questionnaire to establish data about their organisational structure; priorities; staffing; heritage resources; costs; and internal and external administrative and other procedures.

As heritage resources, and the local authority's response to them, will often vary (sometimes considerably), the importance of establishing qualitative indicators emerged at an early stage. The pilot then had to identify issues from the statistical analysis to form the basis of discussion, and a conference of all the parties was held to consider approaches to the provision of conservation services, ways in which better procedures and practices might be introduced, and how improved performance might be measured.

A final overall report of the discussions has been produced, but deliberately no general conclusions were reached about comparative performance. It was for the individual authorities in the benchmarking group to decide how to respond to the outcome and how to implement and resource any changes which might improve performance. In addition to the comparative work, some authorities have been undertaking customer surveys within their own administrative areas and/or with external agencies with which they work. Some of these matters are set out in more detail below.

Timetable
The timetable set for the pilot exercise was ridiculously tight. It is set out below because it is a moot point whether it would be wise to try to work to such a tight timescale in future. Authorities must certainly gear up for it in advance, devote time to it, and ensure that they have a nominated officer who will respond to requests for information quickly. However, it is also very important that the process is done properly - something the Audit Commission recognises.

  1. Send out draft questionnaire, Service Statement and Confidentiality agreement for comment. Seven working days to return.
  2. Four working days to incorporate comments.
  3. Send out final questionnaire and Service Statement. Nine working days to complete.
  4. On receipt of completed questionnaires, Service Statements and enclosures, do a basic analysis in four working days.
  5. Send out all questionnaires, Service Statements and enclosures, plus the pilot's analysis, to all parties.
  6. Hold Benchmarking Conference with all parties within ten days.
  7. Within one week of Benchmarking Conference, issue Minutes/Conclusions of the meeting.
  8. Begin internal reporting of outcome with elected members (over eight to ten weeks).

Service Statements
Ipswich Borough Council has defined its service in a 3,000-word statement complete with targets for each of the service components. Whether this format will be followed by the other parties remains to be seen; it is horses for courses. The Service Statement was drafted over about three working days. Interestingly, it mirrors to a significant degree the guidance circulated by the (then) Department of National Heritage in 1995 to the new unitary authorities, stating what it expected those authorities to provide (although this was not available to us until after we had prepared ours - and of course Ipswich is not a unitary authority).

Benchmarking questionnaire
The questionnaire cannot be all-embracing - for example, no shadows in the current exercise (excluding London) are large urban authorities with ethnic minorities living in conservation areas, such as Bradford or Leicester. We also tried to anticipate systems which Ipswich does not have but some of the shadows do (e.g. we have no Parish Councils), and the different ways in which development control casework might be handled. The questionnaire contains about 150 questions, mainly seeking (a) statistics; (b) yes/no answers; or (c) detailed costs as defined by CIPFA. We recognised we could have asked for much more useful information, but not within the timescale and not without all the participants devoting much greater resources to it.

In compiling statistics it also became clear that some authorities did not have detailed figures for some fundamental activities (e.g. the numbers of applications per annum advertised under Sections 67 and 73) or for costs as defined by CIPFA.

There is something of a problem with CIPFA's approach to gathering conservation statistics which needs to be resolved. The CIPFA Planning & Development Statistics 1998-9 are still, for example, basing information on List Entries, not Listed Buildings (our survey specifically and emphatically required the latter). Clearly, a statutory list of ten list entries, each comprising a terrace of 30 Georgian houses, is a different entity from a list of 300 individually diverse historic buildings ranging from c. 1400 to 1970. CIPFA are also still lumping together disparate environmental improvement matters, such as derelict land reclamation, with mainstream building conservation activities.

The information sought falls roughly into the following groups, although not set out in full here:

Organisation: Location within the LPA; range of duties/principal functions of the service; reliance on others for advice; post numbers etc.;

Heritage resources: Numbers of listed buildings; changes year on year etc.; ancient monuments; conservation areas; A4Ds etc.; historic parks & gardens; local list buildings; buildings-at-risk;

Supporting organisations: BPTs, Churches, Trusts etc.;

Conservation records: Computerisation, photographic records etc.;

Technical exchanges: Forums/joint working arrangements;

Development control and enforcement: Numbers of applications; staff responsibilities and roles etc.; numbers of enforcement notices/prosecutions etc.;

Conservation Advisory Area Panels: Operation, organisation, scope/remit;

Diocesan Advisory Committees: Involvement/consultation;

Spending resources: Grant aid scope, types, resource levels, administration/delegation;

Miscellaneous functions etc.: Formal liaison arrangements with English Heritage/Parish Councils or Neighbourhood Forums/Residents Groups;

Administrative procedures: Complaints; staff welfare/staff appraisal; training;

Administrative costs: Staff costs, based on definitions contained within CIPFA Returns 1997/98;

Quality/service statement: Customer care/public meetings/consultations/customer charters/customer surveys/staff availability etc.

Customer surveys
In addition, Ipswich has been conducting two forms of customer survey:

  1. A general survey to all agents, asking about the frequency of contact; the nature of the contact (LBC applications, technical advice, grants etc.); the level of service (availability, courtesy and responsiveness, promptness and quality of advice); an assessment of the current and future priorities of the service (nine headings); an assessment of the adequacy of staff resources devoted to conservation; and eliciting additional general comments or impressions of the service.
  2. A specialist client survey to 16 national and local bodies with which the Council interfaces, asking the same general questions as in (1) above and others where appropriate, e.g. on grants where bodies have received them, and on statutory notification procedures to the national amenity societies.

A questionnaire to grant recipients was also considered but not carried out, as the number would have been too small to provide a meaningful sample. In future, in conjunction with the provision of development control services, a customer questionnaire may be used to gauge how applicants (rather than agents) perceive the listed building consent process after a decision has been issued.

All customer questionnaires are separate from the benchmarking with the shadows and are intended to give an overview of external perceptions of the conservation service being provided. The Audit Commission seemed satisfied with these arrangements.

Assessing quality
One clear factor to emerge from the benchmarking process is the absence of comparative statistics of the kind which exist for development control. Furthermore, quality monitoring of a non-statistics-based service is difficult when there are no sensible, meaningful, externally established measures. (What would a ratio of numbers of staff to numbers of listed buildings actually mean?) While it is clear that resources, both human and financial, varied significantly between the authorities in the benchmarking group, the same objectives and, to a large extent, the same processes applied.

Development control clearly emerged as the highest priority for all the members of the benchmarking group. While other activities were given different priorities by individual authorities, technical advice and grant aid were secondary and more-or-less equal; policy matters were a middle-ranking concern; buildings-at-risk and enforcement had a more-or-less equal lower rating (perhaps reflecting their time-consuming, long-haul staff requirement); and environmental enhancement had the lowest rating - perhaps reflecting a lack of resources, or perhaps a job largely completed? English Heritage may not take much comfort from the buildings-at-risk priorities in the light of its recently relaunched strategy on this problem.

It is self-evident that, as the statutory requirements (by comparison with development control) are limited, the weak statistical base makes comparison difficult. The thrust of the pilot Conservation Benchmarking has therefore needed to concentrate on establishing good practice, sound qualitative measures and an assessment of the 'added value' of conservation expertise; comparing processes and outcomes is the most productive approach.

Future qualitative issues
Some of the topics emerging from the pilot benchmarking are worth outlining here. One matter requiring further attention, in the light of the priority attached to development control, is the integration of conservation, development control and enforcement good practice; further work on this by the benchmarking group is being considered.

Several other issues which might stimulate debate are:

Organisation: How are service objectives to be prioritised? How is long-term v. short-term workload addressed? How does the authority view its statutory v. non-statutory conservation functions? Do broader, flatter organisational structures - i.e. effective delegation to officers - help, or is the process then so removed from members that its importance is not appreciated?

Development control and enforcement: How is the adequacy of applications determined at registration? How is the 'special regard' of S. 16(2) incorporated into the processing and determination of LBC applications? How are conservation issues best reported to members? Does the statutory notification procedure work satisfactorily? How can effective negotiation to improve design quality be developed and evaluated? How is effective compliance achieved? How quickly is non-compliance followed up, and in what ways is it remedied?

Customer care: How best can public consultation improve design/conservation quality?

Quality: How should a customer service plan best feed back into a conservation service? Is quality better achieved through focus groups/members/peer group review? How are targets for improving the service best formulated? In what ways do service users' views change the quality of the service provided? How does this relate to operational requirements and member priorities? What resource implications does this have?

Clearly some of the biggest difficulties in making comparisons are that local authority conservation services are:

  • not always systematic or comprehensive;
  • sometimes dependent on unanticipated opportunistic factors;
  • not linear and cyclical (as development control is, for example);
  • limited by varying skills and experience;
  • conditioned by political priorities and resources;
  • affected by organisational structure;
  • not necessarily required by statute; and
  • undertaken with subjective judgements on the quality of provision by peer groups, elected members, special interest groups and the general public.

In assessing quality, feedback on customer satisfaction is important, but better good practice guidance and peer group review depend on the responsiveness of chief officers and members and a willingness to resource the outcome.

Good practice guidance
In September, to the surprise of the authorities involved in both the Best Value Conservation and Development Control Benchmarking exercises (and without any communication with them), the DETR commissioned a study from the Planning Officers Society to produce good practice guidance on Best Value in planning. (It is assumed that this must include conservation.) As so much of the work had already been undertaken on both building conservation and development control, there was concern that all this hard work might be negated. The pilot group had been under the impression that developing good practice through benchmarking and consultation was precisely why they had all been hard at work. Furthermore, this work had nearly been completed by the time of the announcement!

As building conservation is an integral part of the local authority planning system, it is felt that, if further study is to be undertaken, the IHBC should review the good practice elements of Best Value in Local Government for conservation services. The IHBC has formally requested the views of the DETR on how it intends the Institute to be directly involved in Best Value for conservation services. At the time of writing, a response is still awaited.

Tailpiece
Initially, the Best Value exercise was approached with some trepidation. It seemed daunting, but we have emerged feeling quite confident about our service. Moreover, the benchmarking has proved a considerable stimulus to thinking about the service and about the further improvements that would be possible if properly resourced. None of this work could have been undertaken without the commitment and enthusiasm of the shadow authorities. A great deal was learned from them and we are most grateful for the effort put in by their dedicated officers.

Where do we go from here? That may be the hardest question of all!

Bob Kindred is Conservation Officer with Ipswich Borough Council.
