The efficiency of the Community Dental Service in England: a data envelopment analysis


Community Dent Oral Epidemiol 2000; 28: 274–80. Copyright © Munksgaard 2000. Printed in Denmark. All rights reserved.

ISSN 0301-5661

David Buck, Department of Dental Public Health and Oral Health Services Research, Guy's, King's and St. Thomas' Medical and Dental Institute (GKT), King's College, London, UK

Buck D: The efficiency of the Community Dental Service in England: a data envelopment analysis. Community Dent Oral Epidemiol 2000; 28: 274–80. © Munksgaard, 2000

Abstract – Objectives: To assess the efficiency with which health authorities' Community Dental Services provide dental care in England. Methods: A data envelopment analysis of inputs (hours worked by dental officers, therapists, hygienists and others) and outputs (screening, treatment, prevention) of the Community Dental Service (CDS) was conducted. Relative efficiency ratings of the CDS by health authority were further analysed in order to identify external factors which affect efficiency and are outside the control of the Community Dental Service. Results: The relative efficiency of the CDS varies widely in England – on average the CDS is operating at 75% of efficient levels compared to best-practice services. This could not be explained by plausible factors outside the CDS's control, such as differences in deprivation and urban–rural differences between health authorities. Conclusions: These results, if validated by further studies, should be disturbing, since many Community Dental Services appear to be under-performing. However, this data-driven study could not uncover the detailed context of an individual service's performance. A useful next step would be detailed case-studies of several "star" and under-performing services to search for deeper reasons underpinning relative performance levels.

Key words: community dental health services; economics; efficiency

David Buck, Department of Dental Public Health and Oral Health Services Research, Guy's, King's and St. Thomas' Medical and Dental Institute (GKT), King's College, Guy's Campus, London Bridge, London SE1 9RT, UK. e-mail: david.buck@kcl.ac.uk

Submitted 25 March 1999; accepted 19 November 1999

There are two main methods of delivering primary dental care in England: the General Dental Service (GDS) and the Community Dental Service (CDS). The GDS pays general dental practitioners (GDPs) on behalf of the Department of Health on the basis of a fixed fee-for-service system based on the relative time taken to perform different procedures. The fee scale is devised to deliver the average dentist a certain target income after practice costs. GDPs are also paid on a capitation basis for certain prevention services. The population are encouraged to seek primary dental care from the GDS and register with a GDP. Patients pay a proportion of treatment charges incurred – up to a ceiling – although many groups are exempt from dental charges altogether: children, students in full-time education, pregnant and nursing mothers and the unemployed, for example. The GDS is therefore a broad, general service intended to meet most of the dental needs of the general public and acting as a gatekeeper to the hospital services to which patients are referred for more complex dental treatment.

The CDS has a long and complex history which is not reviewed here. In brief, it has evolved from its origins as the School Dental Service (SDS) under the control of Education Authorities, in reaction to the change of "ownership" to the Department of Health and a succession of policy documents from the Department. As Gallagher & Gelbier (1) explain, the CDS is a much smaller service than the GDS and now has a number of distinct roles set out in Department of Health circular HC(89)2: the screening of schoolchildren at least 3 times in their school career; epidemiological surveys of oral health; health promotion; a safety net service for those unable to receive care under the GDS; and the provision of a specialised referral service from GDPs for treatments such as orthodontics or those involving general anaesthetics. Recent general policy has been to direct the CDS into acting primarily as a safety net service, to provide care for people with special needs and those who find it difficult to obtain appropriate care within the GDS. Children are now encouraged to make use of their GDP under the GDS where possible, and recent changes to GDPs' contracts encourage them to take responsibility for children's dental care.

However, some CDS services still retain school screening as their major role, whilst others have developed expertise in special needs care and yet others in prevention. In short, there is enormous variation in what the CDS actually does around the country (2). This great variety and lack of centralised control from the Department of Health is a strength of the CDS, since each health authority's service can adapt to local needs. However, this causes headaches for evaluators, since different services that concentrate on different outputs are difficult to compare on a like-for-like basis. Whilst critics of the CDS have been vocal down the years (3–4), there has not been a country-wide evaluation which takes into account the varied nature of the service. This paper's aim is to go some way towards rectifying the situation. It uses a relatively new technique, data envelopment analysis (DEA), to compare the "efficiency" of the English CDS by health authority in 1997–98. The next section outlines the principles of DEA and explains why it is suitable for the analysis of the CDS.

It should be noted here, however, that this paper sheds no light on the debate about the relative merits of the CDS versus the GDS. The GDS uses very different inputs and has a very different role to the CDS – despite some overlap. Analysing the GDS, either alone or in comparison with the CDS, is beyond the scope of this paper.

Material and methods

Economists have long been interested in measuring the efficiency of organisations and have used a battery of techniques, relying mostly on the econometric analysis of production functions, to do so. However, these techniques are restrictive in that they can only handle one measure of output. This is fine if you happen to be interested in the most efficient way to produce cars or other single products. In the health sector the single output would ideally be a uni-dimensional measure of "health". In reality, such a concept is notoriously difficult to define, measure and attribute to any particular action (although concepts such as the quality-adjusted life year [QALY] are beginning to gain currency). Consequently, most production function approaches to efficiency have used hospital patient bed-days or equivalents as measures of output. This is unsatisfying. If forced to fall back on measures other than "health", it needs to be recognised that healthcare units produce multiple outputs – just as most industrial firms do. The CDS, for example, "produces" screening, prevention and treatment to varying degrees with varying inputs across England. All should therefore be included as outputs in any analysis of efficiency. To concentrate on one alone is misleading and penalises CDSs which may be more efficient at producing the neglected output or outputs.

DEA allows this, which is its great strength over more traditional methods of assessing relative efficiency. It is a relatively new approach to measuring the efficiency of organisations, initially proposed by Farrell (5) but not developed until the late 1970s by Charnes et al. (6). Since then it has been extended and applied to many situations. DEA allows a comparison of the relative efficiency of different decision-making units (DMUs) whose general aims are the same, yet whose focus on specific objectives may be different, and who choose to organise the ways of meeting those objectives in different ways. DEA is therefore ideally suited to analysing the CDS, which produces a number of non-interchangeable outputs from different combinations of inputs. It has been extensively used in industry as a formal aid to benchmarking (7) but has also been used in analysing efficiency in the health care sector (8–10).

The essentials of DEA can be illustrated with a simple diagram. We assume that there are only two inputs in the CDS, i.e. numbers of hours worked by community dental officers (CDOs) and dental hygienists. Further, there is only one output, i.e. numbers of individuals treated. Extending this to more inputs, multiple outputs and many different levels of outputs is mathematically intensive but operates on the same basic principle. The essentials of DEA are represented by Figure 1, where the inputs are measured on either axis. A number of community dental services are represented by points A to L, representing the various combinations of CDO and hygienist hours used to produce a given number of individuals treated, say 250. Points A to E trace the most efficient frontier. This is because the services on this frontier use the least inputs to produce a given output – they are defined as 100% efficient. All other services lie within this frontier and are, by definition, less efficient. For example, service I uses inputs in the same combination as service C; it has the same "technology". However, it is using this technology to less effect. In fact, it is only OC/OI as efficient as it could be. Of course, service I could also improve its efficiency by copying and combining the working practices or "technology" of other efficient services – moving to a point closer to the frontier but not along the line OI. Similar efficiency scores can be computed for all other points inside the frontier.

Fig. 1. The essentials of data envelopment analysis

The construction of this artificial best-practice or benchmarking frontier from the efficient firms, and the placement of all other firms in positions relative to the frontier, is the essential insight of DEA. The relative efficiency of each DMU is assessed relative to the best-practice frontier. This helps us to identify inefficiency – if it exists – and where it is within the Community Dental Service. A further complication is that efficiency may vary with the scale of the organisation. For example, there may be a minimum number and types of staff required to provide any services within a small area. This can be controlled for by using a variable returns to scale (VRS) DEA. In essence, CDSs are sorted into similar-scale units and efficiency is then measured relative to the appropriate scale frontier.
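
To make the linear-programming mechanics concrete, the sketch below sets up the input-oriented DEA problem just described for a handful of invented CDSs, under both constant and variable returns to scale. It is an illustration only: the data are hypothetical stand-ins for the KC64 inputs and outputs, the helper dea_efficiency is not from any published package, and the study itself used Frontier Analyst rather than code of this kind.

```python
# Minimal DEA sketch (not the original Frontier Analyst calculation).
# Data are invented: rows are CDSs (DMUs); X holds two labour inputs,
# Y three KC64-style outputs.
import numpy as np
from scipy.optimize import linprog

X = np.array([[11_000, 4_000],    # dental officer hrs, other staff hrs
              [ 9_500, 2_600],
              [14_200, 6_100],
              [ 7_800, 3_900]], dtype=float)
Y = np.array([[30_000, 18_000, 16_000],   # screened, HE&P contacts, care episodes
              [28_000, 12_000, 15_500],
              [31_000, 25_000, 14_000],
              [20_000,  9_000, 11_000]], dtype=float)

def dea_efficiency(X, Y, o, vrs=False):
    """Input-oriented efficiency score for DMU o (CRS by default, VRS if requested)."""
    n, m = X.shape                 # number of DMUs, number of inputs
    s = Y.shape[1]                 # number of outputs
    c = np.r_[1.0, np.zeros(n)]    # minimise theta over [theta, lambda_1..lambda_n]
    # Inputs:  sum_j lambda_j * x_ij <= theta * x_io
    A_in = np.c_[-X[o].reshape(m, 1), X.T]
    # Outputs: sum_j lambda_j * y_rj >= y_ro  (written as <= for linprog)
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[o]]
    # VRS adds the convexity constraint sum_j lambda_j = 1
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1) if vrs else None
    b_eq = [1.0] if vrs else None
    bounds = [(None, None)] + [(0, None)] * n   # theta free, lambdas non-negative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[0]

for o in range(len(X)):
    print(f"CDS {o}: CRS = {dea_efficiency(X, Y, o):.3f}, "
          f"VRS = {dea_efficiency(X, Y, o, vrs=True):.3f}")
```

For a service inside the frontier, the value returned is the radial contraction factor – the OC/OI ratio in Figure 1.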

The end result of a DEA, then, is an efficiency score for each DMU, or each health authority's CDS in our case. This is useful in itself, especially as a way for managers and policy-makers within each health authority to benchmark their CDS against comparators. However, the efficiency scores are then amenable to further analysis (11). Regression analysis can be used to "explain" the efficiency scores. We are obviously interested in the determinants of efficiency, as much as efficiency itself.

Frontier Analyst (12) was used to perform the DEA calculations in this study. SPSS 8.0 (13) and Stata 5.0 (14) were used for the other statistical calculations.

Data

The majority of the data used for this study were kindly supplied by the Department of Health. Information from form KC64 – the return of Community Dental Service activity – was abstracted for the year 1997/8. KC64 collects data on the hours worked by dental officers, hygienists, therapists and "other" staff, and on their main activities in terms of screening, health education and prevention contacts and episodes of one-to-one patient care. For the period covered we therefore have information on: total hours worked by dental officers, total hours worked by all other staff (in the main hygienists and therapists), total numbers screened, total numbers of health education and prevention contacts, and total numbers of episodes of care. This allows a basic DEA analysis with two inputs and three outputs. Unfortunately, KC64 has no details of inputs other than labour. Efforts to track down financial information on the CDS have been unsuccessful: the Department of Health does not collect such data centrally and each health authority differs in its cost allocation methods. It would have been preferable to have such information, especially on capital inputs. However, since the CDS is a labour-intensive service, the lack of such information may well be less important than elsewhere. Nyman & Bricker (15) and Kooreman (10) also focused on labour inputs in their analyses of nursing home care.

Supplementary data on changes in the numbers of GDPs, population densities and Jarman deprivation scores by English health authorities were provided by various branches of the Department of Health.

Results

Descriptive statistics

The diversity of CDS services across England is illustrated in Table 1. Each of the input and output variables has a large standard deviation and range, indicating big differences of approach across health authorities, and/or large differences in scale between CDSs – variable returns to scale in economic jargon – a possibility investigated below.

Table 1. Descriptive statistics for the Community Dental Services in England

Variable              Mean     Std. deviation   Minimum   Maximum
Numbers screened      29,971   16,831           1,424     102,864
HE&P contacts         18,407   15,454           431       88,481
Care episodes         15,887   8,398            3,828     41,796
Dental officer hrs    11,679   5,417            2,397     35,020
Other staff hrs       4,397    3,451            261       19,691

Efficiency scores

Figure 2 shows a histogram of efficiency scores under the assumption of no scale effects, known as constant returns to scale (CRS). On the far right are the health authorities deemed to be fully technically efficient (12 in this case). If all the others clustered near these cases it would imply that, although there is some inefficiency (as defined), it is relatively minor. With the CDSs this is not the case, as Figure 2 demonstrates. The mean efficiency score is 63.5, indicating that on average CDSs are only 63.5% as efficient as they could be – the interquartile range being 49.8 to 73.7.

Fig. 2. Histogram of efficiency scores under CRS

Under the assumption of variable returns to scale (histogram not reported) more CDSs are deemed to be fully efficient (16 this time), as expected, and more are closer to being fully efficient (since there are more efficient frontiers and thus a higher likelihood of being closer to one). Although not reflected in the mean (67.3), this is reflected in the interquartile range (32.6 to 85.8). The scores from the two procedures are, however, closely related (Spearman's rho = 0.97; Kendall's tau-b = 0.89; both significant at the 1% level), as a simple graph of one against the other also indicates (not reported). The assumption of variable returns to scale therefore does make some difference to the scores, albeit minor. Further analysis for both constant and variable returns to scale scores is undertaken below.
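
For illustration, rank correlations of this kind can be computed as in the short sketch below; the scores used are invented, not the actual health authority results, so only the mechanics carry over.

```python
# Illustrative check of how closely CRS and VRS score rankings agree.
# The score vectors are made up; the paper's actual values gave rho = 0.97, tau-b = 0.89.
import numpy as np
from scipy.stats import spearmanr, kendalltau

crs_scores = np.array([63.5, 48.2, 100.0, 71.9, 55.4, 88.0])
vrs_scores = np.array([70.1, 52.6, 100.0, 85.8, 60.3, 100.0])

rho, p_rho = spearmanr(crs_scores, vrs_scores)
tau, p_tau = kendalltau(crs_scores, vrs_scores)   # SciPy's default variant is tau-b
print(f"Spearman's rho = {rho:.2f} (P = {p_rho:.3f})")
print(f"Kendall's tau-b = {tau:.2f} (P = {p_tau:.3f})")
```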

A priori at least, it seems there is a lot of variability in the relative efficiency of community dental services in England. Many appear highly inefficient when benchmarked against their peers.

The meaning and interpretation of efficiency scores

However, the results above could be doing some strictly "inefficient" CDSs a disservice if there are well-founded reasons for their increased usage of staff. Poor efficiency scores could well be due to factors outside their control. It is therefore important to take such factors into account as much as possible when interpreting the meaning of efficiency scores. Four influences are discussed below.

First, some CDSs will be in rural as opposed to urban areas. All other things being equal, this would be expected to increase staff hours due to the extra time and complexities involved in performing screening, preventive interventions and treating patients in rural areas. Not taking this into account may misleadingly make rural areas seem less efficient than urban ones. The socio-economic characteristics of patients are also likely to be important in explaining efficiency scores. Since patients from more "deprived" areas are likely to suffer poorer dental health and require more complex treatments – again, all other things being equal – it is hypothesised that high deprivation areas will have CDSs that spend more time with patients. They will therefore appear more inefficient than they actually are. Patients who are sedated or who receive general anaesthetics during treatment are also likely to require more time – especially given the recommendations of the "Poswillo Report" (16). This should also be taken into account. Finally, the activity of the GDS in an authority may also influence the efficiency of the area's CDS. For example, expanding GDS services in an area may have a knock-on effect on the CDS. In the short run this could be reflected in a difficulty in "finding" patients and hence a poor efficiency score. Of course, in the long run this should be resolved – in the case of appropriate cross-over of services and an efficient CDS – by reducing staff hours.

It is possible to take these factors into account by using proxies for the above as independent variables in a regression of the efficiency scores. If they are practically, as opposed to theoretically, important in explaining efficiency differences they should be individually significant in a regression. Two sets of such regressions are reported below. One is a probit regression – can these factors explain which CDSs are on the efficiency frontier? The other is what is known as a Tobit or censored regression – can the proposed explanatory variables actually explain individual efficiency scores?*

* As Kooreman (10) points out, efficiency scores constrain the most efficient CDSs to a maximum score of 100 by definition. Thus the score distribution is artificially curtailed, or censored, at the right-hand side (see Figure 2). Tobit regression takes account of this censoring – a normal OLS regression would be biased (18).

The proposed explanatory variables are as follows: the rural or "urban-ness" of each health authority is proxied by population density; the level of deprivation in an authority is proxied by its Jarman index score, which is based on a standardised index of 10 local conditions including unemployment levels, housing conditions, overcrowding and the proportion of lone-parent families (17); the extent of particularly intensive treatments is proxied by the proportion of episodes of care which involved sedation and/or general anaesthesia for each authority; and the local GDS context by the percentage change in numbers of GDPs in each authority during 1996–97 – a longer time period would have been preferred but was not available due to a recent change in the boundaries of health authorities.
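
The sketch below indicates how such a second-stage analysis could be set up: a probit of frontier membership followed by a Tobit of the scores themselves, right-censored at 100. Everything in it is illustrative – the data are randomly generated rather than the real scores and proxies, the variable names are hypothetical, and the right-censored Tobit log-likelihood is written out directly rather than taken from a library routine. The original analysis was carried out in SPSS and Stata.

```python
# Illustrative second-stage analysis: probit for "fully efficient or not",
# Tobit (right-censored at 100) for the efficiency score itself.
import numpy as np
import statsmodels.api as sm
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 100
# Hypothetical proxies: Jarman score, population density, % change in GDPs,
# % of episodes involving sedation/general anaesthesia.
proxies = np.column_stack([rng.normal(0, 20, n), rng.uniform(1, 50, n),
                           rng.normal(0, 5, n), rng.uniform(0, 20, n)])
exog = sm.add_constant(proxies)
scores = np.minimum(rng.normal(70, 20, n), 100.0)   # DEA scores, censored at 100
frontier = (scores >= 100.0).astype(int)            # 1 if on the efficiency frontier

# Probit: can the proxies predict which CDSs are on the frontier?
probit_fit = sm.Probit(frontier, exog).fit(disp=False)
print(probit_fit.get_margeff().summary())           # marginal effects (dF/dx)

# Tobit: can the proxies predict the scores themselves?
def tobit_negloglik(params, y, X, upper=100.0):
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    censored = y >= upper
    ll = np.where(censored,
                  norm.logsf((upper - xb) / sigma),            # P(latent score >= 100)
                  norm.logpdf((y - xb) / sigma) - log_sigma)   # density of observed score
    return -ll.sum()

start = np.r_[np.linalg.lstsq(exog, scores, rcond=None)[0], np.log(scores.std())]
tobit_fit = minimize(tobit_negloglik, start, args=(scores, exog), method="BFGS")
print("Tobit coefficients (const, proxies):", np.round(tobit_fit.x[:-1], 4))
```

With the real data, the marginal effects and coefficients would correspond to those reported in Tables 2–5, and the censored fraction to the 12 (CRS) or 16 (VRS) fully efficient authorities.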

Efficiency scores revisited: All change?

Tables 2 and 3 report results for probit analysis of the health authorities under the assumptions of CRS and VRS respectively. These analyses test whether the theoretically plausible variables mentioned above are useful in predicting which authorities will be efficient and which less than efficient.

Table 2. Probit model of CRS

Variable               dF/dx        95% confidence interval
Jarman score           0.0009783    -0.004107 to 0.006064
Population density     -0.0027799   -0.007624 to 0.002064
% change in GDPs       0.0066849    -0.008294 to 0.021664
% episodes intensive   -0.9636729   -2.27376 to 0.346416

Log likelihood = -34.973635; Pseudo R2 = 0.0468.

Table 3. Probit model of VRS

Variable               dF/dx        95% confidence interval
Jarman score           0.0037657    -0.001623 to 0.009155
Population density     -0.0050714   -0.010567 to 0.000424
% change in GDPs       -0.0002323   -0.017489 to 0.017024
% episodes intensive   -1.502594    -3.01353 to 0.008335

Log likelihood = -40.737393; Pseudo R2 = 0.0735.

Tables 2 and 3 tell the same story – none of the proposed influences on efficiency scores has any explanatory power in predicting which authorities' CDSs are on the efficiency frontier. Some of the entries in Table 2 are of the expected sign (CDSs in less deprived areas are more efficient; CDSs with more intensive episodes are less efficient) but the others are not (more densely populated areas are less efficient; areas with increasing numbers of GDPs are more efficient). What is most important, however, is that none of these variables is significant – as indicated by the confidence intervals including zero. Table 3's results are similar, with the exception that the indicator of changes in the GDS environment has its expected sign.

These results may not be too surprising: by concentrating only on the distinction between totally efficient CDSs and all others, a lot of information about relative efficiency between CDSs is "thrown away" which may be influenced by the variables suggested. This possibility is tested in the Tobit regressions of Tables 4 and 5 – can the efficiency scores themselves be better predicted by the suggested influences?

Table 4. Tobit model of CRS

Variable               Coefficient   95% confidence interval
Jarman score           0.0244874     -0.3275557 to 0.3765306
Population density     -0.2342992    -0.539266 to 0.0706677
% change in GDPs       -0.1651786    -1.280322 to 0.9499643
% episodes intensive   7.800525      -69.26641 to 84.86746
Constant               68.25175      58.71749 to 77.786

Log likelihood = -412.51917; Pseudo R2 = 0.0056; 12/100 censored observations.

Table 5. Tobit model of VRS

Variable               Coefficient   95% confidence interval
Jarman score           0.0427035     -0.3302476 to 0.4156546
Population density     -0.2551395    -0.5777504 to 0.0674713
% change in GDPs       -0.5916231    -1.777667 to 0.5944212
% episodes intensive   -3.378165     -84.91464 to 78.15831
Constant               74.75283      64.58086 to 84.9248

Log likelihood = -402.42055; Pseudo R2 = 0.0068; 16/100 censored observations.

Once again, it seems not. All variables are insignificant in both regressions. The only exception to this is the constant – not strictly a variable. This simply implies that the efficiency of the average CDS is likely to be around 68% of the best under CRS (95% CI: 58.7–77.8) and 75% of the best under VRS (95% CI: 64.6–84.9).

Discussion

What do these results tell us? It seems, from Figure 2, that many CDSs are not performing as well as they might. More importantly, the econometric analysis shows that theoretically plausible explanations for why efficiency scores should differ between CDSs do not explain performance in practice. It may be that many CDSs are indeed highly inefficient – producing far less output than they could, given the staff employed.

Before this conclusion is reached, however, other avenues need to be explored in more depth. There are certainly other "variables" that affect efficiency which have not been addressed in this study. One obvious example is capital spending, for which no systematic data are available. It seems likely that a better-equipped CDS would be more efficient than one with older equipment, given the same staff inputs and patient mix. Another factor is changing referral patterns, about which little is known. It is also well known that there are inherent differences between dentists in the speed at which they work and that productivity changes with age (19). "Efficiency" itself, of course, should also ideally include measures of quality and final outputs rather than intermediate outputs, such as numbers screened or episodes completed. What is really needed are good measures of the short- and long-term changes in quality of life that CDSs provide – it may be that being more "inefficient" in terms of intermediate outputs goes hand in hand with a greater overall impact on patients' quality of life. For example, slower, more considered care may be more efficient in terms of health outcomes than it seems in terms of intermediate outputs. Until research is undertaken and data collected on final outcomes from interaction with the CDS we will not know. Finally, anecdotal evidence also points to problems with the underlying data in the KC64 forms. Several dental professionals have openly expressed doubt to the author about the accuracy of the returns, which seem not to be taken seriously by certain members of the CDS. How important this may be is difficult, if not impossible, to judge without detailed analysis of the returns and anonymous interviews with those who fill them in.

Until these and other avenues are explored, and the results presented here are legitimately questioned through further research, they will remain provocative. Although this study uses crude measures of outputs and inputs, it is the most comprehensive to date, both in coverage and in design – taking into account as it does the substantial variation in the role of the CDS throughout England. It finds that, on average, the CDS is only 75% as efficient as it could be.

An obvious next step would seem to be a more in-depth comparative study of those CDSs which are successful in translating their inputs into outputs with some that are not. This would help perform two needed functions: the spread of best-practice techniques around the country; and the exploration of context-specific factors which "explain" inefficiency yet which cannot be identified in a broader data-driven study such as this. This study should also be revisited in the future to identify any changes in the efficiency of CDSs through time and the reasons for them. At present CDSs are paid for through block contracts received from health authorities, based on historical patterns of service. There is therefore little incentive to be efficient. Changes in the type of contract and an increased use of benchmarking procedures – using the sort of information presented here and more in-depth studies – may have an impact on efficiency. This could be tested in a future study.


Acknowledgements

The author would like to thank colleagues in the Department, particularly Dympna Kavanagh, Iona Loh, David Gibbons and Tim Newton, for their comments. Thanks also to the anonymous reviewers whose input improved certain aspects of the paper, and to Sharon Lea of the Department of Health for providing the bulk of the data for this study and exhibiting remarkable patience in response to constant requests. This research was unfunded.

References

1. Gallagher J, Gelbier S. Dentistry and the NHS. In: Downer MC, Gelbier S, Gibbons DE, editors. Introduction to dental public health. London: FDI World Dental Press; 1994. p. 35–55.
2. Wragg K, Anderson R. The difference between the caries experience of 11 to 13 year old children in three communities in Derbyshire. Community Dent Health 1988;5:29–38.
3. Palmer JD, Berry DC. Evaluation of dental services in Somerset. Br Dent J 1978;145:371–3.
4. Milsom K. School dental screening: what value? Br Dent J 1995;178:322.
5. Farrell MJ. The measurement of productive efficiency. Journal of the Royal Statistical Society, Series A 1957;120:253–81.
6. Charnes A, Cooper WW, Rhodes E. Measuring the efficiency of decision making units. European Journal of Operational Research 1978;2:429–44.
7. Hawdon D, Hodson M. The use of data envelopment analysis in benchmarking. Business Economist 1996;27(3):23–39.
8. Nunamaker TR. Measuring routine nursing service efficiency: a comparison of cost per patient day and data envelopment analysis models. Health Services Research 1983;18:183–204.
9. Grosskopf S, Valdmanis V. Measuring hospital performance. A non-parametric approach. Journal of Health Economics 1987;6:89–107.
10. Kooreman P. Nursing homes in the Netherlands: a non-parametric efficiency analysis. Journal of Health Economics 1994;13:301–16.
11. Lovell CAK. Production frontiers and productive efficiency. In: Fried HO, Lovell CAK, Schmidt SS, editors. The measurement of productive efficiency: techniques and applications. Oxford: Oxford University Press; 1994.
12. Frontier Analyst. Glasgow: Banxia Software; 1998.
13. SPSS 8.0. Chicago (IL): SPSS Inc.; 1998.
14. Stata 5.0. College Station (TX): Stata Corporation; 1998.
15. Nyman JA, Bricker DL. Profit incentives and technical efficiency in the production of nursing home care. Review of Economics and Statistics 1989;56:586–94.
16. Department of Health (UK). General anaesthesia, sedation and resuscitation in dentistry. Report of an expert working party prepared for the Standing Dental Advisory Committee. London: Department of Health; 1990.
17. Jarman B. Identification of underprivileged areas. Br Med J 1983;286:1705–9.
18. Greene WH. Econometric analysis. Englewood Cliffs (NJ): Prentice-Hall; 1993.
19. Brennan DS, Spencer AJ, Szuster FS. Productivity among Australian private general dental practitioners across a ten year period. Int Dent J 1996;3:139–45.