Operational Conditions in Regulatory Benchmarking – A Monte-Carlo Simulation – Stefan Seifert & Maria Nieswand
Operational Conditions in Regulatory Benchmarking – A Monte-Carlo Simulation
Stefan Seifert & Maria Nieswand
Workshop: Benchmarking of Public Utilities, November 13, 2015, Bremen
Agenda
1 Motivation and Literature
2 Methodologies
3 The DGP
4 Simulation Design and Performance Measures
5 Initial Results
6 Conclusion and Outlook

Stefan Seifert & Maria Nieswand – Benchmarking of Public Utilities
Motivation
Regulatory Approaches for Electricity DSOs (Source: Agrell & Bogetoft, 2013)
Motivation
• Benchmarking is widely used in regulation, in sectors where environmental factors play an important role
• Accuracy of estimates influences revenue caps, industry performance, firm survival, and ultimately customers via prices
• Methodological advances to account for environmental factors and heterogeneity
• Non-parametric approaches: z-variables in 1-stage DEA (Johnson and Kuosmanen, 2012), conditional DEA (Daraio & Simar, 2005 & 2007), …
• Parametric approaches: Latent Class (Greene, 2002; Orea & Kumbhakar, 2004), Zero-inefficiency SF (Kumbhakar et al., 2013), …
• Semi-parametric approaches: StoNEzD (Johnson & Kuosmanen, 2011), …
• BUT: Regulatory models typically based on standard DEA or SFA
Motivation
Aim of this study
• Systematic performance evaluation of Latent Class SFA, StoNEzD, and conditional DEA in the presence of environmental factors
• Generalization of results via Monte-Carlo simulation
→ Guidance for regulators in choosing an estimator given industry structure and industry characteristics

Scope of this study
• Consideration of different model set-ups imitating real regulatory data
• Cross sections with variation in sample size, in noise and inefficiency distributions, and in the true underlying technology
• Consideration of different cases of impact of environmental variables
Related Literature
Monte Carlo simulation studies
• Basic MC evidence in the original research papers
• Andor & Hesse (2014): StoNED vs. SFA vs. DEA
• Henningsen, Henningsen & Jensen (2014): multi-output SFA
• Krüger (2012): order-m vs. order-α vs. DEA
• Badunenko, Henderson & Kumbhakar (2012): KSW bootstrapped DEA vs. FLW
• Badunenko & Kumbhakar (forthcoming): persistent and transient inefficiency in SFA

Few studies focusing on environmental variables
• Cordero, Pedraja & Santín (2009): z-variables in DEA
• Yu (1998): z-variables in DEA and SFA
Methodologies
Methodology – Notation
• Production function: y_i = f(x_i, z_i) · exp(v_i − u_i), for observations i = 1, …, n
• Input x_i used to produce output y_i
• Deviation from the frontier: ε_i = v_i − u_i, with noise v_i and inefficiency u_i ≥ 0
• Expected inefficiency E[u_i]
• Environmental factors: vector z_i of environmental factors with impact on the frontier
Methodology – conditional DEA
• DEA with firm-specific reference sets (Daraio & Simar, 2005, 2007), conditioning on the realization of z_i
• Estimation of the reference set: kernel estimation around z_i
• The frontier reference point is the conditional frontier evaluation (output-oriented, for comparability)
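The conditional-DEA idea can be sketched as a small linear program: only units whose z lies inside a bandwidth around z_i enter the reference set. This is an illustrative toy, not the authors' implementation; the VRS formulation, the window-style conditioning, the bandwidth h, and the data are all assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def cdea_output_eff(x, y, z, i, h):
    """Output-oriented VRS efficiency of unit i, with the reference
    set restricted to units whose z is within bandwidth h of z[i]."""
    ref = np.where(np.abs(z - z[i]) <= h)[0]   # conditional reference set
    n = len(ref)
    # Variables: [theta, lambda_1..lambda_n]; maximize theta
    c = np.r_[-1.0, np.zeros(n)]
    A_ub = [np.r_[0.0, x[ref]],                # sum_j lambda_j x_j <= x_i
            np.r_[y[i], -y[ref]]]              # theta*y_i <= sum_j lambda_j y_j
    b_ub = [x[i], 0.0]
    A_eq = [np.r_[0.0, np.ones(n)]]            # VRS: lambdas sum to one
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]                            # theta = 1: on the frontier

x = np.array([1.0, 2.0, 2.0])
y = np.array([1.0, 2.0, 1.0])
z = np.array([0.5, 0.5, 0.5])   # identical z: reduces to ordinary DEA
theta = cdea_output_eff(x, y, z, i=2, h=1.0)   # dominated unit: theta = 2
```

With distinct z-values and a small h, each firm is benchmarked only against environmentally similar peers, which is the point of the conditioning.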
Methodology – Latent Class
• LC SFA accounts for unobserved factors and heterogeneity in technologies (Greene, 2002; Orea & Kumbhakar, 2004)
• Consideration of J classes, each with a class-specific shape of the frontier
• Endogenous selection of class membership: multinomial logit model with z as covariates
Methodology – Latent Class
Estimation: ML or MSL – the likelihood is a function of
• the parameters of the technology, with a pre-specified functional form
• the parameters describing class membership
• The posterior class-membership probability P(j|i) can be calculated from the priors and the class likelihoods
• This membership probability can then be used to weight either the efficiency scores or the frontier reference points
Weighted frontier reference point: the P(j|i)-weighted average of the class frontiers at x_i
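The posterior weighting can be sketched numerically. Everything below is made up for illustration (class frontiers, logit coefficients, error spreads); only the mechanics — prior times class likelihood, normalized, then used to weight the class frontiers — follow the slide.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, J = 5, 2
x = rng.uniform(1, 2, n)
z = rng.uniform(0, 1, n)

# Hypothetical class frontiers (log-linear) and error spreads
def f(j, x):
    return (0.8 if j == 0 else 1.1) * np.log(x)

sigma = np.array([0.1, 0.2])
y = 1.0 * np.log(x) + rng.normal(0, 0.1, n)

# Prior membership from a multinomial logit in z (coefficients assumed)
delta = np.array([0.0, 1.5])
expo = np.exp(np.outer(z, delta))
prior = expo / expo.sum(axis=1, keepdims=True)          # n x J

# Class likelihoods: normal density of the residual under each class
lik = np.column_stack([norm.pdf(y - f(j, x), scale=sigma[j])
                       for j in range(J)])

# Posterior membership: P(j|i) proportional to prior_ij * L_ij
post = prior * lik
post /= post.sum(axis=1, keepdims=True)

# Weighted frontier reference point: sum_j P(j|i) * f_j(x_i)
frontier = sum(post[:, j] * f(j, x) for j in range(J))
```

The same `post` matrix could instead weight class-specific efficiency scores, the other option mentioned above.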
Methodology – StoNEzD
StoNEzD for normal-half-normal noise / inefficiency
1. Stage (QP): estimation of the average function
• No functional form assumed (but piece-wise linear)
• The effect of z is common to all firms
2. Stage: decomposition of the first-stage residuals
• Method-of-moments estimator to derive σ_u and σ_v
• Shift of the average function by the expected inefficiency E[u] = μ = σ_u √(2/π) to derive the frontier estimate
Frontier reference point: the shifted first-stage fit
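The second-stage decomposition can be sketched with the textbook moment formulas for the normal-half-normal model; this is not the authors' code, and the residuals below are synthetic.

```python
import numpy as np

def decompose(residuals):
    """Method-of-moments split of first-stage residuals into sigma_u and
    sigma_v for the normal-half-normal model, plus the shift E[u]."""
    e = residuals - residuals.mean()
    m2, m3 = np.mean(e**2), np.mean(e**3)
    # For eps = v - u, the third central moment is
    # m3 = -sqrt(2/pi) * (4/pi - 1) * sigma_u^3  (negative under inefficiency)
    c = np.sqrt(2 / np.pi) * (4 / np.pi - 1)
    sigma_u3 = max(-m3 / c, 0.0)      # wrong skewness -> sigma_u = 0
    sigma_u = sigma_u3 ** (1.0 / 3.0)
    sigma_v2 = max(m2 - (1 - 2 / np.pi) * sigma_u**2, 0.0)
    e_u = sigma_u * np.sqrt(2 / np.pi)   # shift: E[u] = sigma_u * sqrt(2/pi)
    return sigma_u, np.sqrt(sigma_v2), e_u

rng = np.random.default_rng(0)
eps = rng.normal(0, 0.1, 20000) - np.abs(rng.normal(0, 0.2, 20000))
sigma_u, sigma_v, e_u = decompose(eps)   # recovers roughly 0.2 and 0.1
```

Shifting the first-stage fit down the inefficiency axis by `e_u` turns the average function into the frontier estimate, as described above.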
Methodology – Comparison of cDEA, LC SFA and StoNEzD for the production function
| | cDEA | LC SFA | StoNEzD |
|---|---|---|---|
| Type | Non-parametric | Parametric | Semi-parametric |
| Error / inefficiency | Deterministic | Stochastic | Stochastic |
| Shape | Constrained | Parametrically constrained | Constrained |
| Scaling assumption | Necessary | Possible | Possible |
| Convexity of T | Yes | No | Yes |
| Reference set | Observation-specific | All observations, weighted | All observations |
| Effect of z on frontier | Observation-specific | Grouped, but observation-specific via weighting | General effect |
The DGP
Data Generating Process
• DGPs are created to replicate real-world regulatory data
• General relationship: y = f(x, z) · exp(v − u)
• Sample size: varying n, plus 4% additional observations twice as large in terms of inputs
• Inputs: 4 correlated inputs, for small and for large firms
Data Generating Process
• Functional form of f: translog
• Inefficiency and noise: half-normal u ~ |N(0, σ_u²)| and normal v ~ N(0, σ_v²)
• Noise-to-signal ratio varied across scenarios
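A toy version of such a DGP can be written in a few lines. All parameter values here are illustrative, a single input and a Cobb-Douglas form stand in for the 4-input translog, and σ_v/σ_u is only one common way to define noise-to-signal.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100
sigma_u, sigma_v = 0.2, 0.1                      # illustrative values

x = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # single-input stand-in
z = rng.uniform(0.0, 1.0, size=n)                # environmental factor
u = np.abs(rng.normal(0.0, sigma_u, size=n))     # half-normal inefficiency
v = rng.normal(0.0, sigma_v, size=n)             # normal noise

# Cobb-Douglas stand-in for the translog, with z shifting the frontier
y = x**0.8 * np.exp(0.3 * z) * np.exp(v - u)

noise_to_signal = sigma_v / sigma_u
```

Varying `n`, `sigma_u`, and `sigma_v` over such draws is exactly the grid the simulation design below walks through.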
Data Generating Process
Environmental factors: 4 different distributions considered – 1 symmetric, 3 skewed, one of them correlated with the inputs
Simulation Design and Performance Measures
Simulation Design
Scenarios
• So far only two scenarios: Baseline (BL) and High Impact (HI)
• Only one z-variable considered in each, with variation in its impact
• Each scenario estimated with variation in sample size, σ_u, and σ_v, for each estimator
Simulation Design
Implementation
• Replications: 100 × 9 × 5 = 4,500 data sets for 3 estimators
• R samples of u and v for each scenario; x, y and z are constant over one scenario
• Samples with strong deviations from the DGP are discarded (wrong correlations among the inputs, wrong skewness of the composed error)

StoNEzD
• Implemented with the sweet-spot approach (Lee et al., 2013)
• MoM with the third moment set to −0.0001 if wrong skewness occurs

Latent Class
• CD estimation with 2–4 classes; the max-BIC model is reported
• 5 repetitions with "randomized" starting values

cDEA
• Least-squares cross-validation, Epanechnikov kernel
Performance measures
Performance evaluation
• Evaluated at frontier reference points, corrected for E[u]

Performance measures
• Bias: equally weighted deviation in percentage points; bias > 0 indicates overestimation of the frontier and hence of inefficiency
• MSE: average squared deviation, placing higher weight on larger deviations
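The two measures can be written down directly; the frontier values below are made-up numbers for illustration.

```python
import numpy as np

def bias_pp(est, true):
    """Equally weighted deviation in percentage points; > 0 means the
    frontier (and hence inefficiency) is overestimated on average."""
    return 100.0 * np.mean(est - true)

def mse_pp(est, true):
    """Average squared deviation in pp; larger deviations weigh more."""
    return np.mean((100.0 * (est - true)) ** 2)

true_f = np.array([1.00, 1.10, 1.20])
est_f = np.array([1.02, 1.08, 1.25])
b = bias_pp(est_f, true_f)   # mean of (+2, -2, +5) pp
m = mse_pp(est_f, true_f)    # mean of (4, 4, 25)
```

Note how the +2 and −2 pp deviations cancel in the bias but both count in the MSE, which is why the study reports both.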
Initial Results
Initial Results
Generally…
• LC most often outperforms cDEA and StoNEzD
• Distribution of z does not seem to matter concerning bias
• Correlation of z & x has only little effect (BL4 vs. the others)
• Also magnitude of environmental effect seems to play a minor role (HI vs BL)
Initial Results
• LC SFA
  • Performs generally well; stable and efficient
  • Tendency to overestimate the frontier in higher-noise cases
• cDEA
  • Highly sensitive to noise
  • Underestimates the frontier in small samples, overestimates it in larger samples
• StoNEzD
  • General underestimation of the frontier, which is favorable for firms
  • Performs well with low inefficiency and small samples
  • But has problems with high inefficiency
  • …and does not seem to be generally efficient
Outlook
Conclusion and Outlook
• Additional scenarios
  • Scenarios with multiple z variables
  • Scenarios with heterogeneity in technologies induced by the zs
  • Misspecified scenarios?
• Estimation
  • Improving the optimization routines – estimations still fail even though the estimated model is the true underlying model
  • Suggestions?
Thank you for your attention.
DIW Berlin – Deutsches Institut für Wirtschaftsforschung e.V., Mohrenstraße 58, 10117 Berlin, www.diw.de
References
• Agrell, P. and Bogetoft, P. (2013). Benchmarking and Regulation. CORE Discussion Papers 2013008.
• Andor, M. and Hesse, F. (2014). The StoNED age: The departure into a new era of efficiency analysis? A Monte Carlo comparison of StoNED and the oldies (SFA and DEA). JPA, 41(1):85–109.
• Badunenko, O. and Kumbhakar, S. (2015). When, Where and How to Estimate Persistent and Time-Varying Efficiency in Panel Data Models. Working paper.
• Cordero, J. M., Pedraja, F., and Santín, D. (2009). Alternative approaches to include exogenous variables in DEA measures: A comparison using Monte Carlo. Computers & Operations Research, 36(10):2699–2706.
• Daraio, C. and Simar, L. (2005). Introducing Environmental Variables in Nonparametric Frontier Models: A Probabilistic Approach. JPA, 24(1):93–121.
• Daraio, C. and Simar, L. (2007). Conditional nonparametric frontier models for convex and nonconvex technologies: A unifying approach. JPA, 28(1):13–32.
• Greene, W. H. (2005). Reconsidering heterogeneity in panel data estimators of the stochastic frontier model. Journal of Econometrics, 126(2):269–303.
• Haney, A. B. and Pollitt, M. G. (2009). Efficiency analysis of energy networks: An international survey of regulators. Energy Policy, 37(12):5814–5830.
• Johnson, A. and Kuosmanen, T. (2011). One-stage estimation of the effects of operational conditions and practices on productive performance: Asymptotically normal and efficient, root-n consistent StoNEzD method. JPA, 36(2):219–230.
• Jondrow, J., Knox Lovell, C. A., Materov, I. S., and Schmidt, P. (1982). On the estimation of technical inefficiency in the stochastic frontier production function model. Journal of Econometrics, 19(2–3):233–238.
• Krüger, J. J. (2012). A Monte Carlo study of old and new frontier methods for efficiency measurement. EJOR, 222:137–148.
• Kuosmanen, T. (2012). Stochastic semi-nonparametric frontier estimation of electricity distribution networks: Application of the StoNED method in the Finnish regulatory model. Energy Economics, 34(6):2189–2199.
• Lee, C.-Y., Johnson, A. L., Moreno-Centeno, E., and Kuosmanen, T. (2013). A more efficient algorithm for convex nonparametric least squares. EJOR, 227(2):391–400.
• Orea, L. and Kumbhakar, S. C. (2004). Efficiency measurement using a latent class stochastic frontier model. Empirical Economics, 29(1):169–183.
• Yu, C. (1998). The effects of exogenous variables in efficiency measurement – A Monte Carlo study. EJOR, 105(3):569–580.