Donald P. Moynihan


EXPERIENCES WITH PERFORMANCE MANAGEMENT IN THE UNITED STATES
PRESENTATION: TOWARDS A MORE RESULT-ORIENTED FLEMISH PUBLIC SECTOR, JANUARY 10, 2014

PART I: OVERVIEW

US Experience – background
Errors in understanding performance management
Expectations about implementation
The politics of performance management
Lessons: how do we encourage purposeful use?

OVERVIEW

BACKGROUND

NATIONAL GOVERNMENT-WIDE CHANGES

Government Performance and Results Act - GPRA (1993-2010)

 Program Assessment Rating Tool (2002-2008)

 GPRA Modernization Act (2010-)

State-level variations on these models

DOCTRINAL LOGIC FOR CHANGE

20 YEARS OF LEARNING?

Some lessons on how it went
Partly from study of the topic
Reflected in some policy changes, especially the GPRA Modernization Act

EXPECTATIONS ABOUT IMPLEMENTATION

OECD 2012 survey
Seems to be less use of performance data than in the past

Performance targets not consequential

General sense of disappointment: we have systems in place, but they have not delivered the desired results

IS THE IDEA OF PERFORMANCE MANAGEMENT RUNNING OUT OF STEAM?

We define performance systems by the benefits we hope will occur (more rational budgeting, more efficient management)

The gap between our aspirations and the observed effects of these rules is usually large, resulting in disappointment

More grounded and accurate description: performance systems are a set of formal rules that seek to disrupt strongly embedded social routines

EXPECTATIONS PROBLEM

We speak of governments “doing” performance management
What do we mean?
Rules about measuring and disseminating data

CONFUSION: ADOPTION VS. IMPLEMENTATION

INATTENTION TO THE USE OF DATA

Performance data by itself does not do much

Implementation of performance management means using the data

Why focus on performance information use?
Difficult to connect public actions to outcomes
Intermediate measure of effectiveness: performance information use
Without it, the good things we want don’t happen

There are different types of use

Passive – minimal compliance with procedural requirements

Purposeful – improve key goals and efficiency

Political – advocate for programs
Perverse – behave in ways detrimental to goals (goal displacement and gaming)

THE FOUR TYPES OF USE

Can observe if agencies comply with requirements (passive use), but not other types of use

Performance systems encourage passive use, not purposeful

EFFECT OF PERFORMANCE REFORMS

THE POLITICS OF PERFORMANCE MANAGEMENT

APOLITICAL PERFORMANCE REFORMS?

Performance data associated with neutrality

Offers objective account of the world, and will engender consensus

Reduces the role of politics by offering an alternative basis to make arguments

This is part of the political appeal
Has implications for adoption and implementation

Elected officials motivated by symbolic values
Primary focus on adopting information reporting requirements, not broader change

POLITICS OF ADOPTION

ACTUAL PATTERN OF CHANGE

We fail to understand the nature of performance data

We assume data are:
Comprehensive
Objective
Indicative of actual performance
Consistently understood
Prompt a consensus

ONE BASIC REASON FOR CONFUSION

Examine same programs but disagree on data

Agree on data but disagree on meaning
Agree on meaning, but not on next action steps/resources

Not clear on how data links to budget decisions

THE AMBIGUITY OF PERFORMANCE DATA

Actors will select and interpret performance information consistent with institutional values and purposes

Greater contesting of performance data and less potential for solution seeking in forums featuring actors with competing beliefs

THE SUBJECTIVITY OF PERFORMANCE DATA

IMPLICATIONS: POLITICAL USE

Performance data is socially constructed by individuals subject to personal biases, institutional beliefs, and partisan preferences
It has qualities of ambiguity and subjectivity
These qualities make performance management likely to operate as part of the political process, not as an alternative to it

EVIDENCE OF ADVOCACY

“Spinning” (Hood 2006): claim credit when things go well, deny responsibility when things do not
Advocacy by agents seeks to avoid blame and respond to “negativity bias”:
Disproportionate citizen dissatisfaction with missed targets (James 2011)
Political officials pay more attention to high and low performers (Nielsen and Baekgaard 2013)
More bureaucratic explanations of failed performance (Charbonneau and Bellavance 2012)

STAKEHOLDERS

Political support for an agency associated with performance information use (Moynihan and Pandey 2010)
May worry less about blame; freedom to experiment

Belief that stakeholders care about performance or performance measures associated with bureaucratic use (Moynihan and Pandey 2010)

More performance information use when: stakeholders perceived as more influential, more in conflict, and when there is more networking with stakeholders (Askim, Johnsen, and Christophersen 2008; Moynihan and Hawes 2012)

Assumption: Use performance data to reduce information advantage that agencies have over center & elected officials

Reality: Some evidence of partisan biases in implementation

As long as agencies play role in defining, collecting, and disseminating information, they retain information asymmetry

PRINCIPAL-AGENT ARGUMENT

AN EXAMPLE: PROGRAM ASSESSMENT RATING TOOL (PART)
Bush-era questionnaire used by the Office of Management and Budget to rate programs from ineffective to effective

Four sections: program purpose and design, strategic planning, program management, and program results/accountability

Burden of proof on agencies
Almost all federal programs evaluated

HOW MIGHT POLITICS AFFECT PART IMPLEMENTATION?

Ostensibly neutral reforms may serve—or may be seen as serving—political ends:

Partisan reformers may implement reforms differently if programs/agencies are ideologically divergent

Managers of ideologically divergent programs may perceive bias (whether or not a reform effort is biased against their programs)

WAS PART POLITICAL?

Designed to be good government, politically neutral reform, and qualitative studies do not report overt partisanship, but…

More liberal agencies and programs get lower scores (Gallo and Lewis 2012; Gilmour and Lewis 2006)

PART scores only related to President’s budget proposals for liberal programs (Gilmour and Lewis 2006)

DID POLITICS AFFECT RESPONSE TO PART?

Liberal agencies, though smaller, had significantly more PARTs completed

Two types of effort:
Observable: self-reported effort in completing PART – higher for managers in liberal agencies (Lavertu, Lewis and Moynihan 2013)

Discretionary: performance information use – lower for managers in liberal agencies (Lavertu and Moynihan 2012)

WHY WOULD PART IMPOSE A GREATER ADMINISTRATIVE BURDEN ON LIBERAL AGENCIES?

Liberal agencies likely concerned about making their programs look as good as possible, given preference divergence

Potentially greater scrutiny of liberal programs, requiring more costly agency data collection and reporting

LESSONS: HOW DO WE ENCOURAGE PURPOSEFUL USE?

WHEN DOES PERVERSE USE OCCUR?

Goal displacement – e.g. cream-skimming
Data manipulation – including outright cheating
Becomes more likely when:
Data is self-reported
Task is complex and hard to measure
High-powered incentives attached to measures
Especially in contracting: job-training programs, tuition programs
Policymakers have imperfect knowledge of perversity, amend contracts after problems occur

Quarterly performance reviews
Goal leaders
Chief operating officers / performance improvement officers
High-priority goals
Cross-agency priority goals
For a summary, see Moynihan 2013

NEXT GENERATION PERFORMANCE SYSTEM? GPRA MODERNIZATION ACT OF 2010

Create learning forums: routine discussions of performance data with supervisors/peers associated with use (Moynihan and Lavertu 2012)

GPRA Modernization Act: quarterly performance reviews

Not just routines, also a learning culture:
Tolerates error
Rewards innovation
Brings together multiple perspectives
Gives discretion to users

Tradeoff between learning and accountability
Accountability evokes defensive reactions and gaming

CONTINUING CHALLENGE: HOW TO MAKE USE OF PERFORMANCE DATA

You might want to measure everything but you can’t manage everything

Problem with PART – equal attention to all goals

Modernization Act: focus on important targets, areas of opportunity (high priority goals, cross-agency priority goals)

LOOK FOR ACTIONABLE DATA

FOSTER GOAL CLARITY

Clear goals increase performance information use (Moynihan and Pandey 2010); may not be easy if the service has many different aspects
Tension between:
Few enough measures to generate attention
Enough measures to avoid encouraging workers to ignore unmeasured aspects

Appeal to altruistic motivations, not extrinsic reward (Moynihan, Wright and Pandey 2012)

Select goals that motivate
Clear line of sight between goals and actions
Celebrate achievement
Connect to beneficiaries

APPEAL TO ALTRUISM

Performance data tells you if a measure moved up or down; evaluations tell you what affects performance

Discussion of evaluations should be incorporated into performance management

Assign evaluation funding for new policies
Example: the Washington State Institute for Public Policy provides meta-analyses of research on different policies and return-on-investment estimates to policymakers

INTEGRATE PROGRAM EVALUATION AND PERFORMANCE MANAGEMENT

Leadership commitment associated with use (Dull 2009; Moynihan and Lavertu 2012)

How do you create commitment?
Reputation: public commitments and responsibility (high-priority goals)

Create leadership positions with oversight for performance (COOs, PIOs, goal leaders)

Select leaders based on ability to manage performance

INDUCE LEADERSHIP COMMITMENT

Welcome your feedback and questions

Performance Information Project: http://www.lafollette.wisc.edu/publicservice/performance/index.html

dmoynihan@lafollette.wisc.edu

CONCLUSION

REFERENCES

Askim, Jostein, Åge Johnsen, and Knut-Andreas Christophersen. 2008. Factors behind organizational learning from benchmarking: Experiences from Norwegian municipal benchmarking networks. Journal of Public Administration Research and Theory 18(2): 297–320.

Charbonneau, Etienne, and François Bellavance. 2012. Blame Avoidance in Public Reporting. Public Performance & Management Review 35(3): 399–421.

Gallo, Nick, and David E. Lewis. 2012. The Consequences of Presidential Patronage for Federal Agency Performance. Journal of Public Administration Research and Theory 22(2): 195–217.

Dull, Matthew. 2009. Results-model reform leadership: Questions of credible commitment. Journal of Public Administration Research and Theory 19(2): 255–84.

Hood, Christopher. 2006. Gaming in targetworld: The targets approach to managing British public services. Public Administration Review 66(4): 515–21.

James, Oliver. 2011. Managing Citizens’ Expectations of Public Service Performance: Evidence from Observation and Experimentation in Local Government. Public Administration 89(4): 1419–35.

Gilmour, John B., and David E. Lewis. 2006. Assessing performance budgeting at OMB: The influence of politics, performance, and program size. Journal of Public Administration Research and Theory 16: 169–86.

Lavertu, Stéphane, and Donald P. Moynihan. 2013. Agency Political Ideology and Reform Implementation: Performance Management in the Bush Administration. Journal of Public Administration Research and Theory.

Moynihan, Donald P., and Daniel Hawes. 2012. Responsiveness to Reform Values: The Influence of the Environment on Performance Information Use. Public Administration Review 72(S1): 95–105.

Lavertu, Stéphane, David Lewis, and Donald Moynihan. 2013. Government Reform, Political Ideology, and Administrative Burden: The Case of Performance Management in the Bush Administration. Forthcoming in Public Administration Review.

Moynihan, Donald P. 2008. The Dynamics of Performance Management. Washington, DC: Georgetown University Press.

Moynihan, Donald P. 2013. The New Federal Performance System: Implementing the GPRA Modernization Act. Washington, DC: IBM Center for the Business of Government.

Moynihan, Donald P., and Sanjay Pandey. 2010. The big question for performance management: Why do managers use performance information? Journal of Public Administration Research and Theory 20(4): 849–66.

Moynihan, Donald P., Sanjay Pandey, and Bradley E. Wright. 2012. Prosocial values and performance management theory: The link between perceived social impact and performance information use. Governance 25(3): 463–83.

Moynihan, Donald, and Patricia Ingraham. 2004. Integrative leadership in the public sector: A model of performance-information use. Administration & Society 36(4): 427–53.

Moynihan, Donald P., and Stéphane Lavertu. 2012. Does Involvement in Performance Reforms Encourage Performance Information Use? Evaluating GPRA and PART. Public Administration Review 72(4): 592–602.