The Ministry of Human Resource Development (MHRD) must be commended for releasing “India Rankings” (2016), the first-ever effort by the government to rank higher education institutions (HEIs) in the country. These rankings will become an annual feature, and it is expected that both public and private institutions, and certainly students and parents, will find them useful. For now, however, participation by HEIs in the initiative remains voluntary, and participation levels need to improve. Also, as the report itself acknowledges, there are question marks over the quality of data submitted by the participating institutions. In fact, rankings under two categories — architecture and general degree colleges — could not be released owing to a lack of reliable data and/or low levels of participation. Questions have also been raised about the methodology used to prepare the rankings for the other four categories.
In recent years, it has become almost impossible to ignore university rankings of various kinds, whether global, regional or subject-specific in scope, especially those prepared by Times Higher Education (THE) and Quacquarelli Symonds (QS). The MHRD’s decision to launch its India-wide university rankings was, in fact, partly a reaction to the poor performance of Indian institutions — including the IITs and the central universities — in the world university rankings prepared by THE and QS. The stated intent of the government was to prepare India-centric ranking parameters that were sensitive to metrics such as access to higher education and social inclusion. Interestingly, a close reading of the National Institutional Ranking Framework (NIRF) shows that India-specific parameters do not carry a pronounced weightage.
The decision to prepare India-centric rankings was initially criticised by many higher education commentators. However, since these rankings are here to stay, it makes more sense to focus on improving the NIRF, even while acknowledging that any ranking exercise will suffer from some limitations. For example, there have been very credible criticisms of the methodologies used by THE and QS over the years; some of these flaws have been corrected, but newer problems have cropped up.
Coming to the MHRD rankings more specifically, apart from the problems already identified, there have been other criticisms of the report. Without naming specific institutions, Maheshwar Peri expressed surprise at the inclusion of some not-so-good business schools among the top 50 institutions in the “management” category and the exclusion of others that deserve to be there. He has also expressed concern that the rankings will (mis)lead prospective students towards business schools whose record on placements and average salaries is unimpressive.
It is also a matter of concern and surprise that the IITs have chosen to participate in the rankings under the “engineering” category. Though they are recognised as engineering schools first, they compete under the category of “universities” in the THE and QS world and regional rankings. To the extent that they aspire to compete globally as universities, it is strange that they should compete as engineering institutions in “India Rankings”.
It is also worth asking whether “India Rankings” will be used for other purposes, especially in the context of recent announcements by the government. In his budget speech, the finance minister announced a plan to help 10 public and 10 private universities become world-class institutions. More recently, the PM noted that these universities would have “complete autonomy in academic, administrative and financial matters”. The question is: how will the government go about selecting these elite institutions? It is possible that “India Rankings” (2016), and its later editions, will be used to identify and select the group of universities that will be granted such autonomy.