Are MBA Rankings All They’re Cracked Up To Be?

Depending on who you ask, MBA rankings are either a great leveller that sorts quality programs from the rest, or a waste of time useful only for comparing apples and oranges. As with many things, the truth lies somewhere in between.

Australia’s highest-profile ranking – AFR BOSS magazine’s biennial sorting of MBA and EMBA programs – was released recently. It is an intensely competitive space, with business schools around Australia going all out to maximise their result. This year the University of Sydney took honours on both lists.


Internationally, a swag of publishers undertake annual or biennial benchmarking projects, including The Financial Times, The Economist, Forbes, Times Higher Education, QS and Bloomberg Businessweek. Locally, the Graduate Management Association of Australia (GMAA) has undertaken a biennial five-star ranking project for many years. This year’s GMAA ranking has been delayed to allow for a re-working of the methodology underpinning the project.

Rankings have been a hallmark of the global MBA ecosystem since 1988, when Businessweek launched its first comparative analysis of top US MBAs. They are a great marketing tool for the publishers that compile them and for the business schools that achieve good results. Many students use the rankings as a key part of deciding which school to attend.

Growing Disquiet

Over recent years criticism of the way rankings are compiled and used has grown louder. It reached a crescendo earlier this year with the release of two major reports that pinpointed the shortcomings of the benchmarking system.

A United Nations Global Compact report (funded by Aviva Investors) recommended a major overhaul of the way rankings are produced and, specifically, a de-prioritising of the post-degree salary outcomes that traditionally have made up a large part of the weighting.

A second report, compiled by the Association of MBAs (AMBA) and the Business Graduates Association (BGA), found that while nine out of 10 business school stakeholders believe rankings influence demand for individual programs, only one in 10 think rankings reflect the true performance of MBA programs ‘very well’. Strikingly, approximately a third (34%) do not think rankings reflect an MBA’s performance ‘very well’ or ‘at all well’.

For people considering embarking on an MBA, the rankings system can be confusing and misleading. It is important they understand that not all rankings are created equal, and not all results reflect a program’s strengths and weaknesses. My advice is simple: approach every ranking and every claim of superiority with a high degree of scepticism, and focus on what YOU want from the program, not what others achieved. Here are just a few of the reasons why.


A notable absence from this year’s AFR BOSS rankings was Melbourne Business School (MBS). The MBS MBA is widely regarded as one of the top programs, not just in Australia but globally. It is easy to mount an argument that a ranking that does NOT include MBS is significantly flawed and misrepresentative.

The voluntary nature of many of the rankings means that some schools can, and do, choose not to participate. MBS has recently rationalised its rankings participation, taking part only in the Financial Times and The Economist.

“Every time you participate in a rankings survey, you need to ask alumni to take part and commit time to the process. There are so many different rankings that participating in all of them would put a strain on that relationship.” – Laura Bell, Academic Registrar at Melbourne Business School

The data-gathering exercise for most rankings falls back on business schools. Each ranking can take hundreds of hours of preparation and data gathering, so participating in every ranking would require an investment of thousands of hours. For most schools, it is simply unfeasible to take part in more than one or two.

Just as important as who is on each list is who didn’t have the time or inclination to participate. If you’re considering a particular school and worried it is not on a particular ranking, don’t assume it didn’t make the cut. Ask the school directly why it isn’t listed.

Methodology issues

The United Nations Global Compact report took a deep dive into the issues around MBA benchmarking and highlighted many of the structural problems with methodology. It noted that many of the ranking criteria currently in use were developed years or decades ago and rely on simple measures such as salary and salary progression, with less emphasis placed on what is taught and learned at the schools. The broad criteria used by the major ranking publications are shown below (Source: John A Byrne).

In my view the AFR BOSS methodology is among the most effective, despite not including salary outcomes as part of the ranking algorithm. While a pay increase is not the sole reason students do an MBA, it is an important component of the outcome, particularly for self-funded students who need some basis for determining the ROI of their degree.

Results of the AFR BOSS ranking are based on findings from two surveys – one from schools, one from alumni – to ensure the programs are evaluated from both perspectives. Data from alumni is weighted at 55 per cent, and school data at 45 per cent.

The AMBA/BGA report showed that a significant number of MBA stakeholders are unconvinced about the evidence-collection techniques employed by rankings agencies. One Business School leader said: ‘Sometimes, the methodologies of rankings are ambiguous, or the questions are open [to] interpretation, making it difficult to know whether Schools are providing comparable answers.’

Will Dawes, Research and Insight Manager at AMBA & BGA, said: ‘This perceived lack of transparency and resulting ambiguity could be associated with wider cynicism about the accuracy of MBA rankings. Much of the feedback generated in the survey points to low trust in the way responses are collected, with stakeholders suggesting that self-reporting of evidence could potentially lead to distorted results.’

Student Expectations

With more than 110 different MBA and EMBA programs now on offer in Australia, the MBA as a degree has been democratised over the last decade. Students can now gain the qualification 100 per cent online for just over $7,000 – nearly $120,000 less than the country’s most expensive program (MBS’ Senior Executive MBA).

Ultimately, the most important factor in assessing any MBA program is the meeting of an individual’s expectations.

A young engineer in regional Australia hoping to upskill to take on more responsibility in the family firm and a highly paid accountant in Melbourne hoping to jump into the C-suite want vastly different things from their MBA. They may also have vastly different financial resources, prior learning, career and lifestyle goals, and professional and personal support networks.

How does any ranking system determine what is the ‘best’ program for them?

If they approach it with the right attitude and discipline, both will benefit from an MBA regardless of the program they do.

Until rankings begin to align the outcomes of an MBA with the expectations of students, they will continue to draw criticism.

Ben Ready founded MBA News in 2014 and is the Managing Editor. He is a former business and finance journalist with Australian Associated Press (AAP) and Dow Jones Newswires in London. Ben completed his MBA in 2012 and was awarded the QUT GMAA Entrepreneurship Prize. He is also the founder and Managing Director of RGC Media & Mktng.