Page 1: Use the orange arrow at the top left of the GoToMeeting window to open all of your controls.

Page 2:

• Expect to see an open window on the presenter’s computer, hear audio and see a real-time camera image of the presenter.

• If you have a question during the presentation, simply type it into the box labeled “Questions” and press the “Send” button. The presenter will answer questions following the presentation as time permits.

• We record each webinar; expect an e-mail with a download link in the coming days.

Page 3: The Financial Times Executive Education Rankings – A 360° View

Prepared for the UNICON Research Committee, September 2015

Tom Cavers, Jim Pulcrano, Jenny Stine

Page 4: Agenda

• Background
• Key Findings and Recommendations
• Market Perspectives
• Schools and Experts’ Perspectives
• Challenges & Disconnects
• Detailed Recommendations
• Closing Thoughts

Page 5: A Few Preparatory Remarks

• This report was sponsored by UNICON; however, the viewpoints and materials represented here are those of the authors only
• Just prior to completion of this research, Pearson sold the FT to Nikkei
• Early into the research (in early 2015), Business Week decided to drop its biannual Executive Education and EMBA rankings – making the FT’s the only active EE ranking today

Page 6: Ingoing Questions

Executive Education Rankings

If we were to do a 360° review of the FT’s EE rankings… what would we learn?

1. How do the FT’s EE rankings really work? (Covered in a separate webinar)
2. What do the rankings say about quality?
3. Do customers of executive education use the rankings, and if so, how?
4. How do business schools perceive the rankings, and what would they change?
5. What do the experts have to say about the rankings?

Page 7: Research Methodology

Market Surveys & Interviews
• Open participants – 713 surveyed and 5 interviewed
• Learning & development professionals (custom customers) – 279 surveyed and 9 interviewed*

School & Expert Surveys & Interviews
• Leaders at business schools – 94 surveyed (86 from UNICON schools; 8 from other EQUIS schools) and 12 interviewed
• Experts – 14 interviewed, including both current and former rankings executives as well as the FT

Secondary Research
• Detailed research into the FT’s rankings methodology and history
• Analysis of historical data, including correlation with the MBA rankings, swings in ranking data, etc.
• Research into academic literature, articles and best practices on university rankings in general

* The vast majority of L&D professionals were from Europe (231). Over 70% were responsible for selecting providers of custom programs.

Page 8: Key Findings from our Surveys and Interviews

• Customers of executive education are aware of rankings, and use them

• Many business schools are dissatisfied with rankings – but at the same time, they are overwhelmingly responsible for communicating them

• Experts stress that rankings are not quality measures and also may fail to capture – and may even stifle – innovation

• The rankings can be useful, but must be understood by those who use them

• The rankings can be improved, and this will require collaboration between the FT, schools and customers

Page 9: Authors’ High-Level Recommendations

• Be more aware of the limitations of the rankings

• Engage with the FT to make sure the rankings criteria meet your needs

• Consider and respect the value the rankings provide to some customers

• Use the information in this report to participate in them more effectively

• Understand the effort that the FT puts into the rankings

• Collaborate with business schools and customers to build new and better rankings

• Address issues of transparency, diversity, and customer value

Open & Custom Customers · Business Schools

Page 10: Market Perspectives

Page 11: Overall, Strong Awareness

• Vast majority of both Open and Custom clients are aware
• Awareness highest in Europe

Page 12: Considered Valuable by the Majority of Stakeholders

• Over two-thirds of ALL stakeholders believe rankings are at least “moderately valuable”

• Business Schools value rankings much less than clients

• Clients value rankings significantly more when they do not know the school well

Page 13: Many Open Customers/Participants Use It, Especially in Europe

• Among Open customers, 45% check the rankings before attending a program
  - In Europe, two-thirds; outside Europe, only about one-third

• However, it is not a major factor in their decision:
  - 55% had already chosen the school!
  - 24% used the list to find a school
  - 18% used the ranking to support the decision

Page 14: L&D Customers – Sophisticated Users

If you’re good at your job, you know who is good, and you don’t need the rankings for this. We certainly don’t choose based on FT rankings.

It’s never been something that I used in a decision. It might have navigated me where to look or it might have helped justify something after the fact, but it has never been core to the decision.

• L&D Customers place importance on the rankings; it’s not just about being in the top 10 – changes in position, as well as just being in the rankings, are considered important (more so than accreditation, by the way)

• All stressed in our interviews that the rankings were never core to their decisions

Source of quotes: L&D interviews

Page 15: L&D Customers’ Myriad Uses

A Starting Point in the Search
When we’re starting the search for a partner for a new program, we start with the rankings to see who should be invited to the RFP. It’s a starting point, but you won’t win based on the rankings.

An Educational Resource
When I moved from strategic consulting to an L&D role I was completely lost, and the FT was a very valuable resource at that time. Through the FT I learned what the important criteria were. For me, it was a way of seeing what the global norm was in executive education. I kept it right on my desk so that I could easily refer to it as needed.

A Guide for Foreign Markets
After a while I didn’t need the ranking for the big names in the US and Europe. But [the ranking] was still vital for us when we started to work with schools in Asia. For Asian schools, I had no idea who was doing what, so I went to the FT rankings.

A Tool to Support Decisions
I find them useful with procurement. They help me justify why we’re choosing this particular provider and why it costs so much.

Source of quotes: L&D interviews

Page 16: Schools and Experts’ Perspectives

Page 17: Schools – Not Everyone Participates

• 51 of the 94 schools we surveyed participate in the FT EE rankings

• Among those that do NOT:
  - 55% chose not to participate
  - 29% have never participated
  - 14% formerly participated

Our students don’t care if we participate. In our application process we ask, ‘how did you find the program?’ And one of the responses is by ranking. The response to this is ‘none’.

For starters, we are too small to qualify. But even if we did qualify, our offerings are all in a particular niche so the general rankings aren't that relevant.

We are a boutique ExecEd business. We meet the minimum requirements, but would find it difficult to impose the process on our select clients

Our schools' business model is different of the traditional, so FT criteria is not relevant

Source of quotes: Business School Exec Ed Leader interviews and “open responses” to surveys

Page 18: Of Greatest Importance to European Schools

We don’t participate in FT because in Latin America it is less important – the programs are mostly American and the comparison between schools will be difficult.

No, we have not considered participating... both FT and others are too US/Euro centric for us to be able to be fairly represented. We are a small school (comparatively) in terms of numbers, and remote geographically-speaking.

Source of quotes: Business School Exec Ed Leader interviews and “open responses” to surveys

Page 19: Satisfaction Is Mixed at Best Amongst Schools

Those that DO participate are not very satisfied:
- Open: >40% not at all/slightly satisfied
- Custom: >60% not at all/slightly satisfied

Page 20: For Some, an Important Marketing Tool to Signal Quality

I can use this information to differentiate myself from other regional competitors since no one else in my region is even ranked by FT. I don't think it is necessarily the winning criterion for clients but it is something.....

The ranking demonstrates that we are the strongest regional player and that we are recognized globally. This helps us attract participants, who might otherwise travel to Europe or the US for training, and helps justify our premium pricing.

When we talk with companies about their executive development needs, they almost always want to know what other companies are doing. Exec Ed rankings provide confidence to potential clients that we have done valuable work with other companies.

Custom rankings are used to ratify a decision to work with a school – it is one of several factors including clients, accreditation, awards – they are used by senior HR executives and other executives....Rankings also feed into the general awareness of a school.

For open programs, we know that reputation is one of three crucial factors for individuals to choose a program. The others are faculty and cutting-edge content.... We assume that rankings are a big part of reputation.

We brag about it all of the time. We talk about having the Triple Crown [three accreditations], plus the FT. We even have the four logos on our letterhead.

We promote our ranking by the FT on every occasion possible.

Source of quotes: Business School Exec Ed Leader interviews and “open responses” to surveys

Page 21: Schools Communicate the Rankings Broadly

I find out about the rankings from the schools themselves. The schools tell us, not the FT!

The rankings are great branding and help with awareness. The head of the FT rankings, Della, is also very knowledgeable about the industry and a good speaker. Her comments carry a lot of weight.

Source of quotes: Business School Exec Ed Leader interviews and “open responses” to surveys

Page 22: Differing Views on Ranking Position

Rankings are a stamp of credibility. It says something that we are in the rankings.

What is probably most important is that you are in the top 10. Not whether you are 3 or 5. Being in the top 10 is really important.

Some schools choose not to participate because of the risk they won’t be in the top 20.

[T]he rankings signal belonging to a top group of schools internationally, and recognition as an ‘elite institution’, and hence signal distinction from other schools.

Some Say Ranking Position Matters

Others Say It’s Important Just to Be In the Rankings

When looking at the results we focus even more on the sub-rankings, as these usually make us look better.

Source of quotes: Business School Exec Ed Leader interviews

Page 23: A Quality Tool – But with Caveats

We’d be very cautious of making any change just based on the rankings – the need for this change came through other areas as well. You could be 20th on something, but there might be a very small margin between you and #s 1-19 because it is a very competitive criterion – there has to be other research and data to support making a change, not just the rankings.

We continually evaluate the FT data as part of our ongoing effort to improve quality, even within our most powerfully-performing programs.

We use rankings as a form of customer feedback – it helps us analyze where we can improve the service to get higher up.

• 29 of the 51 schools participating had made “programmatic changes” based upon the FT’s EE ranking

• However, in our interviews, some important qualifications were made

Source of quotes: Business School Exec Ed Leader interviews

Page 24: Some Datapoints on the Schools’ Process

4+ Days to Participate · Not That Difficult

A Variety of Staffing Approaches
• 35% marketing director and marketing department
• 33% with the executive director (21% had executive directors, program managers and marketing working together, while 12% made it a shared responsibility among senior leaders)
• 10% delegated the process entirely to program managers
• 21% relied on a range of offices
• Only 3 schools had dedicated roles/offices

Few Schools Gather Data
• 80% of all schools used a CRM program
• However, only 2 schools query participants about rankings in exit surveys
• And only 16 schools survey about rankings in their internal market research

Page 25: Many Pointed Critiques in Schools’ Survey Comments

Favors Big, Traditional Schools · Survey Needs Improvements · More Transparency Required

• Update survey to reflect issues important to executives and corporations in 2015
• Find some way to smooth out the occasional dramatic shifts in results
• Review questions and make sure the survey is understandable to a person outside learning and development
• Auditing process to all ranked schools to verify data
• Flexibility on the $2 million cut-off… in countries with a weak currency
• Now that the FT is in the exec ed business itself, outsource the entire process to an honest broker
• Remove bias towards old-fashioned programs run in residential, on-campus facilities
• For large schools 20 custom clients might represent their top 20% of all custom; for many schools it represents 100% of their custom, and that introduces sample bias
• The OE ranking should be called the AMP/GMP ranking – we don’t offer these types of programs
• Emphasize quality over volume (participant/client experience impact)
• Find a way to share source data to allow people to do meaningful benchmarking
• Find an honest way to show the statistical significance / spread of results at every level of granularity
• Provide more info on criteria and rationale for weights
• Share raw data

Source: “open” responses to surveys

Page 26: A Lot of Ideas to Update the Rankings As Well

• Rank within specialty areas – Strategy, Leadership, Innovation, Finance, and functional expertise in IT, Supply Chain Management, etc.
• Make the scaling simpler and only publish schools’ positions within bands, rather than each position. The difference between some schools is so insignificant that it is daft to think of the 5th-ranked school as better than the 6th, for example.
• Stop ranking 1, 2, 3, and instead qualify schools at levels, i.e. top tier, middle, etc. How do you really differentiate between #17 and #18? Their research processes are not this precise. Is there really a measurable difference? Rank by regions.
• Sub-rankings to compare more similar programs (in terms of staff and financial resources)

Source: “open” responses to surveys

Page 27: Rankings Experts Also Have Concerns

It has to do with reputation and status, but very little to do with quality....The ranking does function for improvement because scales have been set. If we improve those things, it can get better by the scale set by the rankings and that might lead to actual improvement. But, if you define it as improved learning or teaching assessment, then it doesn’t define these.

Not a Quality Metric · A Black Box · Stifles Innovation?

Many of the things they measure have nothing to do with quality (for example, percent international faculty).

They are too much of a black box. To the extent a school wants to use them for improvement, and also given the amount of time it takes a school to respond, it would be great to understand more.

When I teach the rankings, I put the entire table up and I ask: Do you think this is transparent? Even if you spend an hour you won’t see it as transparent.

Non-traditional executive education – firms working the customized space ...are pushing the executive education frontiers, and they get no credit for this [because innovation is not included in the rankings criteria]. By not including leadership consultancies in the rankings, you are limiting opportunity for customers, and it also doesn’t push business schools to do better.

Today, innovations only come from the small schools, usually unranked, not the top-ranked schools.

They should all publish all of their data, but they won’t because there aren’t statistically significant differences.

Source of quotes: Rankings Experts interviews

Page 28: Challenges & Disconnects

Page 29: Challenges/Disconnects

• In the course of our work, we found there are some fundamental disconnects between what the FT EE rankings measure and what EE providers are offering

• In addition, we believe there are some significant challenges – some of the FT’s making and some that are inherent to this industry – in how the FT has chosen to measure schools

• These issues fall into 3 main buckets:

Relevance · Methodology · Credibility

Page 30: Could the Rankings Have Greater, Industry-wide Relevance?

The biggest issue I have with the FT rankings is that the original survey was developed in 1998 and launched in 1999. The survey has more or less stayed intact over the preceding 16 years, yet our industry has dramatically shifted in that period. Are the questions the FT asks really the relevant questions for the future of Exec Ed? Is the survey itself ready for a refresh to reflect the digital age?

Are the criteria keeping up with the times?

Is there too much bias to international schools?

FT is so skewed to favor exec. ed. groups that have "international" participants that it's of little interest to us

Could the rankings be more inclusive?

Shorter courses are a mega-trend in corporate course demands from clients.

Are 3+ day programs the right thing to measure?

Increasingly there is a mixture of executive education offered by faculty centers and our EE department, and in our organization structure we don't have [custom and open] separated. In fact our staff works in both kinds of programs more and more, which enhances the cooperation and exchange of knowledge.

The USD 2 million cut-off for consideration should vary by country. It is far easier to obtain this level in a developed country.

The FT requires accreditation. Is this keeping out good mid-tier schools? Or new schools in Asia?

Source of quotes: Business School Exec Ed Leader interviews

Page 31: Could the Methodology Be Stronger?

Challenges with Satisfaction Data
The problem with the executive education ranking is that it asks an absolute question and answers it in a relative way. What did you think about the teaching at X school? And then the FT presents that against other schools who’ve been asked the same question.

The “Bullwhip Effect”
The highest we got was a [single digit]; the lowest was in the high 30s. You know you haven’t radically changed programs to account for that jump.

Source of quotes: Business School Exec Ed Leader interviews

Page 32: Could the Methodology Be Stronger?

The 10-point Scale
Everyone rates schools at 8, 9 or 10 – all good scores. I understand that the data gets normalized [into z-scores], but I think it functions in effect like a 3-point scale.

There’s a problem with the way the questions in the surveys are scored – the average score is 9/10. The difference in data between schools is almost meaningless. In addition, the scale has very different interpretations from one region to the next.

Sampling
Our biggest challenge is getting enough response data for the EE custom rankings. Schools need to have at least 5 clients to be included, and if it is low, then there can be a lot of variability in the rankings. This tends to smooth out if the school has a large number of clients.

[The] FT asks for 2 [open] programs in detail. This allows institutions to cherry-pick a couple of standout programs while much of their offering is so-so. A better ranking would base itself on a wider number of programs.

Weighting (esp. Custom)
Companies in the custom survey that have more than one executive education program get greater weighting – why? We think if we get 100% of the company’s business, that’s a good sign...

[The FT] also attaches weight to things – like the “seniority” of the program sponsor for custom programs; whether they are described as “strategic,” “operational” or “functional”; how many exec. ed. providers a company works with; ask “are you planning to repeat” – without asking WHY (or WHY NOT).

Source of quotes: Business School Exec Ed Leader interviews
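To illustrate the “3-point scale” concern quoted above, here is a minimal Python sketch (the school names and scores are invented for illustration, not FT data) of how z-score normalization treats ratings that cluster between 8 and 10: the values get spread out, but the ordering still rests on very small raw differences.

```python
# Minimal sketch of z-score normalization applied to satisfaction scores that
# cluster near the top of a 10-point scale. Scores are invented for illustration.
from statistics import mean, pstdev

raw_scores = {"School A": 9.1, "School B": 8.9, "School C": 9.3, "School D": 8.8}

mu = mean(raw_scores.values())
sigma = pstdev(raw_scores.values())

# z-score each school relative to the (tiny) spread of the group.
z_scores = {school: (score - mu) / sigma for school, score in raw_scores.items()}

# Ranking by z-score is identical to ranking by raw score: normalization widens
# the numbers but cannot add information the compressed scale never captured.
for rank, (school, z) in enumerate(sorted(z_scores.items(), key=lambda kv: -kv[1]), start=1):
    print(f"{rank}. {school}: raw {raw_scores[school]:.1f}, z {z:+.2f}")
```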

Page 33: What Could Be Done to Ensure Credibility?

What goes into the questions?
Some questions are ambiguous, and there tends to be a bias towards traditional styles of delivery (content focus, face to face).

Why not share more information?
Regarding lack of transparency, I think the FT could do more. They could put all 30 questions on their site. They could put their rules on the site and advice on how to interpret the questions. They could add some statements about their rules around transparency of data, etc.

Can the FT remain objective?
Of course rankings are about ads and traffic. But at the same time they also help customers make informed decisions and elevate quality.

Why no audits?
[To improve the ranking,] more auditing to ensure fairness and to verify data.

Source of quotes: Business School Exec Ed Leader interviews

Page 34: Detailed Recommendations

Page 35: Recommendations for Schools

Assuming the Rankings Stay the Same…

1. Align your rankings strategy and tactics with your school’s strategy
2. Take the rankings seriously & be proactive (if you believe they are useful to your school’s strategy)
3. Put someone in charge of the rankings as part of his or her job
4. Focus on and survey the open programs that you are going to have ranked
5. Innovate outside the programs that you have offered for the rankings
6. Be explicit about what you offer to participants by criteria
7. Pay attention to your L&D counterpart and her needs
8. Take a long-term view
9. If you do decide to opt out, find other ways to generate awareness
10. Proactively share your quality metrics with the FT and the world

Page 36: Recommendations for Open & Custom Clients

Assuming the Rankings Stay the Same…

1. Have your own strategy for how to use the rankings
2. Develop your own criteria before you rely solely on those of the FT
3. Go beyond just looking at the final column in the rankings – examine the criteria important to you and your participants
4. Do not assume a halo effect between rankings
5. Understand what accreditation of a school means for you
6. Let the Financial Times know what you need

Page 37: Recommendations for the FT

Modernization
1. Reconsider the broad, current definition of a “program”
2. Update the criteria
3. Rank within specialty areas – Strategy, Leadership, Innovation, etc.
4. Provide alternative rankings for large and small custom providers
5. Consider creating an online tool for users to create their own weightings/rankings
6. Consider adopting the Princeton Review approach and publishing more types of rankings, perhaps updating each less frequently
7. Consider having the participant and customer surveys in additional languages

Methodology
8. Update to a Likert scale
9. Use a random and/or more representative sampling methodology
10. Simplify the ranking down to a few variables
11. Find another way to smooth out the occasional dramatic shifts in results
12. Audit schools on a rotating, regular basis (as with the MBA ranking)
13. Apply purchasing parity to the revenue minimum
14. Instead of a numerical ranking, consider qualifying schools by level (e.g., top tier, middle tier, etc.)

Transparency
15. Specify in greater detail which types of programs are being ranked
16. Be more explicit about what the major criteria mean and share the questions behind them
17. Provide the ranking formula
18. Face the FT/IE Corporate Learning Alliance issues head on and outsource more of the process

Page 38: Closing Thoughts

Page 39: Closing Thought: What If…

What if the rankings reflected the richness and diversity of thought within this industry?
What if the rankings truly reflected quality?
What if the rankings actually encouraged innovation?
What if the rankings were a truly useful tool to help find programs?

• What if the FT, clients and business schools all gathered around a table?
• What if the rankings were re-imagined to reflect today’s market?
• What kind of ideas would bubble up?

What if the rankings were open source?

Page 40: Additional Resources

Visit the UNICON website for:
• Webinar and infographic on “How the Rankings Work”
• Copies of all the ranking magazines since inception (PDFs)
• Historical ranking data for open, custom and MBA (Excel)

Page 41: About UNICON

UNICON is a global consortium of 110 business-school-based executive education organizations. Its community of member organizations is engaged in accelerating the development of leaders and managers, thereby enhancing performance in public and private organizations globally through executive development initiatives. UNICON’s primary activities include conferences, research, benchmarking, sharing of best practices, staff development, recruitment/job postings, information-sharing, and extensive networking among members, all centered on the business and practice of executive education. To learn more, please visit http://uniconexed.org/.

Page 42: Appendix

Page 43: Geographical Distribution of Survey Responses

[Charts: Open Surveys (English & Spanish version), L&D Survey, School Surveys]

Page 44: Custom Criteria Detail/Definitions

The first 10 criteria are supplied by companies that commissioned courses; the last five by business schools. These criteria are presented in rank form, with the leading school ranked number one. The final two criteria are for information only, and do not inform the ranking. Figures in brackets show the weight each criterion contributes to the overall ranking. The weighting accorded to the first nine criteria, from preparation to value for money, accounts for 72 per cent of the total ranking’s weight. It is determined by the level of importance that clients attach to each.

Preparation (8.3): level of interaction between client and school, the extent to which clients’ ideas were integrated into programmes, and effectiveness of the school in integrating its latest research.
Programme design (8.4): flexibility of the course and the willingness of schools to complement their faculty with specialists and practitioners.
Teaching methods and materials (8.0): extent to which teaching methods and materials were contemporary and appropriate, and included a mix of academic rigour and practical relevance.
Faculty (8.5): quality of teaching and the extent to which faculty worked together to present a coherent programme.
New skills and learning (8.4): relevance to the workplace of skills gained, the ease with which they were implemented, and the extent to which the course encouraged new thinking.
Follow-up (6.8): extent and effectiveness of follow-up offered after the course participants returned to their workplaces.
Aims achieved (8.6): extent to which academic and business expectations were met, and the quality of feedback from individual participants to course commissioners.
Facilities (7.0): rating of the learning environment’s quality and convenience, and of supporting resources and facilities.
Value for money (8.0): clients’ rating of the programme’s design, teaching and materials in terms of value for money.
Future use (8.0): likelihood that clients would use the same school again for other customised programmes and whether they would commission again the same programme from the school.
International clients (5.0): percentage of clients with headquarters outside the business school’s base country and region.
International participants (3.0): extent to which customised programmes have participants from more than one country.
Overseas programmes (4.0): international reach of the school’s customised programme teaching.
Partner schools (3.0): quantity and quality of programmes developed or taught with other business schools.
Faculty diversity (5.0): according to nationality and gender.
Total responses: number of individual surveys completed by the school’s clients. Figures in brackets indicate the number of years of survey data counted towards the ranking.
Custom revenues: income from customised programmes in 2013 in $m, provided optionally by schools. Revenues are converted into US$ using the average dollar currency exchange rates for 2013.
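As a quick check on the passage above, the nine client-weighted criteria (Preparation 8.3 through Value for money 8.0) do sum to 72, and the overall custom ranking is, at its core, a weighted combination of per-criterion results. The Python sketch below illustrates only that weighting step; the per-criterion scores are invented placeholders, and the FT’s full pipeline (steps mentioned elsewhere in this deck, such as z-scoring and multi-year data) is not reproduced here.

```python
# Published custom-programme criterion weights from the definitions above.
# The example scores further down are invented placeholders, NOT FT data.
custom_weights = {
    "Preparation": 8.3, "Programme design": 8.4, "Teaching methods and materials": 8.0,
    "Faculty": 8.5, "New skills and learning": 8.4, "Follow-up": 6.8,
    "Aims achieved": 8.6, "Facilities": 7.0, "Value for money": 8.0,
    "Future use": 8.0, "International clients": 5.0, "International participants": 3.0,
    "Overseas programmes": 4.0, "Partner schools": 3.0, "Faculty diversity": 5.0,
}

client_weighted = ["Preparation", "Programme design", "Teaching methods and materials",
                   "Faculty", "New skills and learning", "Follow-up",
                   "Aims achieved", "Facilities", "Value for money"]

# The nine client-supplied criteria account for 72 of 100 weight points (72%).
print(round(sum(custom_weights[c] for c in client_weighted), 1))  # 72.0
print(round(sum(custom_weights.values()), 1))                     # 100.0

# Hypothetical normalized per-criterion scores (0-1) for one school.
example_scores = {criterion: 0.8 for criterion in custom_weights}

# Weighted composite: each criterion's score scaled by its published weight.
composite = sum(custom_weights[c] * example_scores[c] for c in custom_weights)
print(f"Composite score: {composite:.1f} of a possible {round(sum(custom_weights.values()))}")
```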

Page 45: OE Criteria Detail/Definitions

The first 10 criteria are supplied by programme participants; the next six from each business school. These criteria are presented in rank form, with the leading school ranked number one (apart from women participants, which are shown as a percentage). Revenue data are provided for information only and do not inform the ranking. Figures in brackets show the weight each criterion contributes to the overall ranking, as determined by participants on the programmes. The weighting accorded to the first 10 criteria, from preparation to facilities, accounts for 80 per cent of the total ranking’s weight. It is determined by the level of importance participants attach to each.

Preparation (7.7): provision of advance information on programme content, and the participant selection process.
Course design (8.6): flexibility of the course and appropriateness of class size, structure and design.
Teaching methods and materials (8.3): extent to which teaching methods and materials were contemporary and appropriate, and included a suitable mix of academic rigour and practical relevance.
Faculty (8.7): quality of the teaching and the extent to which teaching staff worked together to present a coherent programme.
Quality of participants (8.0): extent to which other participants were of the appropriate managerial and academic standard, the international diversity of participants, and the quality of interaction among peers.
New skills and learning (8.7): relevance of skills gained to the workplace, the ease with which they were implemented, and the extent to which the course encouraged new ways of thinking.
Follow-up (7.3): level of follow-up offered by the school after participants returned to their workplaces, and networking opportunities with fellow participants.
Aims achieved (8.6): extent to which personal and professional expectations were met, and the likelihood that participants would recommend the programme.
Food and accommodation (6.6): rating of their quality.
Facilities (7.5): rating of the learning environment’s quality and convenience, and of supporting resources and facilities.
Female participants (2.0): percentage of female course participants.
International participants (3.0): amalgamation of the percentage of participants from outside the business school’s base country and region.
Repeat business and growth (5.0): amalgamation of growth in revenues and percentage of repeat business.
International location (3.0): extent to which programmes are run outside the school’s base country and region.
Partner schools (3.0): quantity and quality of programmes taught in conjunction with other business schools.
Faculty diversity (4.0): diversity of school faculty according to nationality and gender.
Open-enrolment revenues: income from open programmes in 2013 in $m, provided optionally by schools. Revenues are converted into US$ using the average dollar currency exchange rates for 2013.
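The same arithmetic check applies on the open-enrolment side: summing the published weights confirms the 80/20 split between participant-supplied and school-supplied criteria described above. A minimal sketch:

```python
# Published open-enrolment (OE) criterion weights from the definitions above.
participant_weights = [7.7, 8.6, 8.3, 8.7, 8.0, 8.7, 7.3, 8.6, 6.6, 7.5]  # Preparation .. Facilities
school_weights = [2.0, 3.0, 5.0, 3.0, 3.0, 4.0]  # Female participants .. Faculty diversity

print(round(sum(participant_weights), 1))                        # 80.0 -> 80% of total weight
print(round(sum(school_weights), 1))                             # 20.0
print(round(sum(participant_weights) + sum(school_weights), 1))  # 100.0
```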

Page 46: Is There Alignment on Criteria?

FT OE:
1. New skills and learning
2. Faculty
3. Course design
4. Aims achieved
5. Teaching methods and materials
6. Quality of participants
7. Preparation
8. Facilities
9. Follow-up
10. Food and accommodation
11. Repeat business and growth
12. Faculty diversity
13. International participants
14. International location
15. Partner schools
16. Female participants

Participant:
1. New skills and learning
2. Course design
3. Faculty
4. Quality of participants
5. Teaching methods and materials
6. Aims Achieved/Objectives met
7. % International participants
8. Selection/Preparation
9. Faculty diversity
10. Follow-up (post-program)
11. International location
12. Facilities
13. Repeat business and growth
14. Partner schools
15. % Female participants
16. Food and accommodation

Company Rep:
1. Course design
2. Faculty
3. Teaching methods and materials
4. Aims Achieved/Objectives met
5. New skills and learning
6. Quality of participants
7. % International participants
8. Faculty diversity
9. Follow-up (post-program)
10. Selection/Preparation
11. International location
12. Repeat business and growth
13. % Female participants
14. Facilities
15. Partner schools
16. Food and accommodation

* Only Open program criteria were included in the survey

Strong Alignment on Top 6
Weaker Alignment on Middle Tier
Relevance/weight of bottom tier / school-provided stats?
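One hedged way to put a number on the alignment question this slide raises is a Spearman rank correlation between the three orderings. The sketch below is our illustration only (the report does not compute this); the orderings are transcribed from the slide, with labels normalized to the FT’s criterion names.

```python
# Illustrative check of alignment between the FT OE weight order and the survey
# importance orders from page 46. The correlation analysis is our own sketch,
# not part of the report; labels are normalized to the FT criterion names.

ft_order = [
    "New skills and learning", "Faculty", "Course design", "Aims achieved",
    "Teaching methods and materials", "Quality of participants", "Preparation",
    "Facilities", "Follow-up", "Food and accommodation", "Repeat business and growth",
    "Faculty diversity", "International participants", "International location",
    "Partner schools", "Female participants",
]

participant_order = [
    "New skills and learning", "Course design", "Faculty", "Quality of participants",
    "Teaching methods and materials", "Aims achieved", "International participants",
    "Preparation", "Faculty diversity", "Follow-up", "International location",
    "Facilities", "Repeat business and growth", "Partner schools",
    "Female participants", "Food and accommodation",
]

company_rep_order = [
    "Course design", "Faculty", "Teaching methods and materials", "Aims achieved",
    "New skills and learning", "Quality of participants", "International participants",
    "Faculty diversity", "Follow-up", "Preparation", "International location",
    "Repeat business and growth", "Female participants", "Facilities",
    "Partner schools", "Food and accommodation",
]

def spearman(order_a, order_b):
    """Spearman rho for two complete rankings of the same items (no ties)."""
    pos_b = {item: i for i, item in enumerate(order_b)}
    n = len(order_a)
    d_squared = sum((i - pos_b[item]) ** 2 for i, item in enumerate(order_a))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

print(f"FT OE vs Participant:       rho = {spearman(ft_order, participant_order):.2f}")
print(f"FT OE vs Company Rep:       rho = {spearman(ft_order, company_rep_order):.2f}")
print(f"Participant vs Company Rep: rho = {spearman(participant_order, company_rep_order):.2f}")
```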