The Bitter Truth

There's a lot of heartburn regarding B-school surveys. A look at some myths and facts.

Although initially intimidated, I soon felt happy about it. They were all wrong; they didn’t have a clue about how B-school rankings are done. Many of them, sitting in their cosy, air-conditioned offices, were criticising the rankings only because their own institutes had fared poorly in some survey. Then there were those who were so shell-shocked by their institute’s rankings that they had opted out of most surveys. So, when my turn came to speak, I confidently parried most of the accusations.

Out there in Bangalore, among people who were mostly hostile towards the media, I decided to take the bull by the horns: I would write a piece dispelling these myths about B-school rankings the next time Outlook did its annual survey. Hence, here I am, attempting to tell readers what’s so right with B-school surveys. You may feel I am a biased party but, hopefully, you won’t be so sure once you have read this, for the article will also spell out what more can be done.

At the outset, I must point out that the history of B-school rankings in India is short—not more than seven years old. It took the US a few decades to perfect this science; in India, we are still groping, experimenting and innovating in a bid to get there. I must also add that any survey will leave a bad taste in some mouths. For example, a senior administrator from the Kellogg business school told me how their college had been ranked an abysmal 40 in one survey. Imagine, Kellogg being ranked so low! Now, it’s time to churn out a few facts to correct some fallacies.

Myth 1: B-school surveys are subjective.
Some are, some aren’t. The Outlook-Cfore rankings are based on objective parameters. We look at six broad parameters (with different weightages), which include over three dozen sub-parameters. All this is supplemented by satisfaction surveys among stakeholders like recruiters, faculty and students (see methodology on Page 86). Over the past few years, our methodology has evolved to include new parameters, change the weightages if required, and make the process more scientific.
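
To give readers a concrete picture of how such a weighted composite works, here is a rough sketch; the notation and the aggregation shown are purely illustrative assumptions on my part, not Cfore’s exact formula.

\[
S_i \;=\; \sum_{j=1}^{6} w_j\, p_{ij}, \qquad \sum_{j=1}^{6} w_j = 1,
\]

where \(p_{ij}\) is institute \(i\)’s normalised score on broad parameter \(j\) (itself built up from that parameter’s sub-parameters and the stakeholder satisfaction surveys), \(w_j\) is the weightage assigned to parameter \(j\), and institutes are ranked in descending order of the composite score \(S_i\).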

The problem: schools with low scores blame the parameters or the weightages. We have had several requests—in some cases admonitions—to change them. The reason: the directors asking for the changes were sure their college’s ranking would improve with such tinkering. But since the parameters were finalised after long discussions—and critical inputs from experts—we see no need to change them just because some schools want higher rankings.

What’s important is that we believe the survey should be useful to students. It should help them choose the better colleges, and the parameters and weightages are decided on that basis. A student is interested, above all, in placements and how much money s/he will earn after graduation—so these become important. Similarly, high satisfaction levels among recruiters will help students—so such scores get a higher weightage. We also want schools to improve in areas like intellectual capital and industry interface, so these two parameters become critical.

Myth 2: The rankings are non-transparent.
All the magazines that conduct them spell out their methodologies. Nothing is hidden from the public; everyone knows how the rankings have been arrived at. In our case, we go a step further. Most of the data relating to our survey—even the scores against each sub-parameter—is posted on indiabschools.com (the same website from which colleges can download the forms to participate in the survey). Each school can therefore look up its scores and compare them with others’.

Many colleges send their representatives to our agency to discuss various issues once the data has been posted on the site. They sit with the Cfore managers to find out why they scored less on some parameters, what it is that other colleges have that they don’t, what the difference is between, say, their faculty members and those of institutes that scored higher, and similar issues. We encourage such discussions. Obviously, not everyone is happy with what they find.

Myth 3: It’s all about the money, honey.
This is the major grouse most critics have against B-school surveys: that they are conducted to make huge profits from the ads released by various schools, that they are rigged and any institute can improve its rank by issuing more ads, that they have no credibility. Let me accept that we do receive unusually high advertising. But let me categorically state that there is a huge—and thick—Chinese wall between the editorial and marketing departments.

In the case of Outlook, the marketing people have no idea of the rankings till the issue has been printed. In fact, none of the scores—in general or for specific schools—are shared with them. They are not even fully aware of the various pieces we write in the B-school rankings issue. In a nutshell, the rankings are done by Cfore and shared with the editorial team, and that’s where it stops. I get regular queries from colleges that wish to know their rankings in advance—the request is always refused.

Myth 4: B-school rankings don’t make a difference.
Well, if that were true, I don’t understand why it upsets so many directors, faculty members and even students. But, here, let me draw on voices from the heads of various institutes. "B-school surveys are like a mirror, you can see your face and know things like where to comb your hair. It’s a great reality check," says Pritam Singh of MDI, Gurgaon. Adds A.K. Sengupta of SIES, Navi Mumbai, "The rankings allow us to reflect, to know how to improve and set benchmarks for ourselves."

Pritam Singh says that in all the institutes he has worked with, as well as in MDI, internal discussions are held after each survey. "We look at what were our weak areas, what more can be done to improve our ranking. So, in a way, it creates pressure on the faculty and also gives a new dream to it," he explains. Others admit that it allows them to figure out the shortcomings in three broad areas—physical ambience (infrastructure), academic ambience (intellectual capital) and industry interface.

A number of institutes have improved because of the B-school rankings. For example, many Mumbai-based institutes were quite comfortable with their visiting faculty and didn’t recruit teachers, so as to cut costs. Once they realised it was denting their ranking, they increased the number of faculty members. Even Sengupta admits that his faculty strength has gone up from eight to 30, both to improve rankings and to make SIES a better school that can compete with the best in India and abroad.

Last year, MDI came fourth in one survey. It realised that it was lacking in one area—global placements. This year, the institute sent people to Hong Kong, Singapore and some EU nations to improve on that parameter. Similarly, SIES started an academic journal. "Since intellectual capital also included sub-parameters like ‘papers published’ and other research work, it helped us in more than one way. It forced faculty members to write papers and helped improve our research work," says Sengupta.

I was even more amazed by another insight I gained on my Bangalore trip. The head of a leading school revealed why he had opted out of all B-school surveys. "We were ranked quite low—and inconsistently. So, we decided to take a break, improve on various parameters and then participate so that we could be ranked higher," he admitted. As usual, there will always be directors who feel that theirs is the best college and they don’t need to do anything—that it’s the surveys that are lopsided and need to be changed.

To conclude, B-school surveys have a long way to go. We still need to learn and improve, become more rigorous and scientific, and join hands with leading institutes and the regulators to strengthen the ranking process. Still, such surveys are more objective and transparent than most people believe. They also help improve the general standards of B-schools.
