The National Institutional Ranking Framework (NIRF), launched by the Ministry of Education in 2015, has undergone a transformative decade. It has evolved from a nascent government initiative with a clear domestic mandate into a central pillar of India's higher education ecosystem.
Over the past ten years, NIRF has demonstrably achieved many of its objectives, motivating Higher Education Institutions (HEIs) to raise their academic and research standards to a global level. It now also serves as a key benchmark for government funding and policy formulation, though it continues to grapple with significant methodological and data-related challenges. This article reviews the achievements from the stakeholders' perspectives, analyses the challenges, and suggests a road map for the future.
Genesis of NIRF and objectives
The National Institutional Ranking Framework (NIRF) was born from the recommendations of a Core Committee established by the Government of India. Its purpose has been to evaluate Indian HEIs in alignment with national social objectives and in the context of the challenges faced by the Indian education sector.
NIRF was established with three broad goals: benchmarking the quality of Indian HEIs; guiding students and parents in admissions; and enabling informed education policy and funding allocations by the Government of India. The framework's initial design was not focused on global comparison, but placed a deliberate emphasis on metrics such as outreach and inclusivity, uniquely tailored to India's socio-political context. NIRF was conceived to create a performance culture and build world-class educational institutions in India.
Evolution of NIRF in the last decade
NIRF has undergone significant evolution in the last ten years, expanding its scope and refining its methodology to better reflect the changing landscape of Indian higher education. Starting with four disciplines in 2016, it expanded to seven with the addition of Medical, Dental, and Law, and by 2025 the number of categories had grown to seventeen, including Open Universities, Skill Universities, State Public Universities, and the United Nations' Sustainable Development Goals (SDGs), signaling a clear move toward evaluating social and environmental responsibility.
The methodology continues to be structured around five broad parameters: Teaching, Learning, and Resources (TLR); Research and Professional Practices (RP); Graduation Outcomes (GO); Outreach and Inclusivity (OI); and Perception (PR), each with specific sub-parameters and a weighted score. In addition, the 2025 framework introduced metrics on the implementation of the National Education Policy (NEP) 2020, such as the integration of Indian Knowledge Systems (IKS) and the promotion of multiple entry and exit options for students.
Initially, the methodology relied heavily on metrics like student numbers, publication counts, and financial data. Now, responding to complaints about the gaming of research publications by some HEIs, NIRF has incorporated a penalty for retracted papers, a first among ranking frameworks worldwide, to uphold the ethics and integrity of academic research. As per the NIRF 2025 report, committees of experts examined the data submitted by HEIs minutely to identify and address possible outliers, aberrations, and anomalies.
Progress on several fronts
Though participation is voluntary, the number of unique participating institutions scaled up from 2,734 in 2016 to 7,692 in 2025, owing to the influence the rankings wield on institutional reputation, student admissions and, most critically, government funding and policies concerning the grant of autonomy. In the last few years, public awareness of NIRF has increased substantially, with the result that students and parents increasingly refer to NIRF ranks for admission decisions.
Regulators such as the UGC and the government have linked funding and institutional autonomy to NIRF outcomes. Due to the framework's focus on research and projects, the share of NIRF-ranked Indian institutions in global research output has gone up, and the number of HEIs that have found a place in global rankings such as QS and Times Higher Education has increased significantly.
Trend of participation of HEIs in NIRF
During NIRF 2025, of the 8,863 unique institutions that registered, about 87% completed the data-uploading process to qualify as participants. Institutions that had already participated in 2024 were more likely to complete the process and confirm participation: while 92% of the 2024 participants did so, only 67% of newly registered institutions did. A similar trend was seen in 2023 and 2024, when 65% and 66% of newly registered institutions, respectively, did not complete the participation process.
About 37% of degree-granting universities participated, compared to 14% of colleges. Though a large number of engineering colleges and business schools showed up, only a minuscule percentage of arts, science, and commerce colleges did.
An analysis of the participation of NAAC-accredited institutions reveals that about 90% of such universities participated in NIRF, compared to 64% of colleges. A deeper analysis may be needed to identify the reasons for the reluctance of the remaining institutions to participate. NIRF, being an annual review, helps institutions understand where they stand on the major parameters of academic performance, enabling them to improve.
Sensitivity of ranks: Influence of parameters
An analysis of the parameter-wise scores of the NIRF 2025 top 100 institutions in the Overall category shows that Research and Professional Practices (RP) has the highest influence on an institution's overall rank, with a Spearman correlation coefficient of 0.89 and a large spread between maximum and minimum values. A similar trend is noticed across all other categories as well.
The second-highest impacting parameter is Perception (PR), where the scores range from a low of 0.14 to a maximum of 100, with the largest standard deviation (a measure of how far individual scores deviate from the average). Though the weightage for this parameter is only 10%, its correlation with the overall score is as high as 0.85. Since the survey captures only 10 names in each category from the respondents, established institutions have an undue advantage, while lesser-known institutions end up with very low scores. As per the NIRF 2025 report, most of the survey respondents, both peers and employers, are concentrated in the Overall category, followed by Engineering, Colleges, and Management, with only a handful of respondents focusing on other categories, casting doubts on the survey methodology. Moreover, it is a subjective, qualitative assessment of the institutions.
Interestingly, Teaching, Learning, and Resources (TLR) and Graduation Outcomes (GO), which are expected to be the key performance areas of HEIs, have much less influence on the overall score.
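The parameter-level influence described above can, in principle, be reproduced from the published NIRF score tables. The sketch below assumes those tables have been extracted into a CSV with one row per institution; the file name and column names are hypothetical, and the sketch illustrates the method rather than any official NIRF computation.

```python
# Illustrative sketch: estimating each parameter's association with the overall
# NIRF score for the top-100 Overall category, assuming the published score
# tables have been extracted into a CSV with these (hypothetical) columns.
import pandas as pd
from scipy.stats import spearmanr

df = pd.read_csv("nirf_2025_overall_top100.csv")  # hypothetical extract of the score tables

parameters = ["TLR", "RP", "GO", "OI", "PR"]
for p in parameters:
    rho, _ = spearmanr(df[p], df["Overall_Score"])   # rank correlation with the overall score
    spread = df[p].max() - df[p].min()               # dispersion between best and worst scores
    print(f"{p}: Spearman rho = {rho:.2f}, std dev = {df[p].std():.1f}, spread = {spread:.1f}")
```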
Stakeholders' concerns about the parameters
As per the NIRF 2025 report, 58% of management institutions (530 out of 909) reported "Nil" research publications. The same trend was observed in earlier years as well. A question raised by some experts in this regard: while research is important and can certainly help improve the quality of education in business schools, is the weightage given to RP disproportionately high?
The question applies to all teaching-intensive institutions, which are in the majority today. On the other hand, all the sub-parameters under TLR are input-centric, with little focus on the teaching-learning process and learning outcomes. The framework overlooks crucial aspects such as classroom observations, student evaluations, and alumni feedback, thereby providing an incomplete picture of an institution's educational performance.
An analysis of the 2025 allocation of B.Tech seats at the IITs by the Joint Seat Allocation Authority (JoSAA) threw up an interesting observation. Though IIT Madras has consistently secured the first rank in the Engineering category for the last 10 years, it was preferred by only 6% of the top 100 JEE rankers, compared with 19% for IIT Delhi (ranked second), while the vast majority of 73% preferred IIT Bombay, which was ranked third.
However, an analysis of the NIRF data submitted by these institutions on placement and higher studies reveals a marked difference among the three IITs in the average median salary of placed graduates over the last three years, which may be the reason for this seeming anomaly. Curiously, the Perception (PR) scores of the three IITs do not reflect this difference either: IIT Bombay secured the lowest PR score of 85.04 among the three, while IIT Madras scored 100. The dominating effect of RP on the overall score, coupled with the emphasis on overall ranks, seems to be obscuring the other parameters, which may be more relevant for different stakeholders.
This raises a pertinent question. Different parameters matter differently to various stakeholders: placement records matter more to students, research to government funding agencies, and talent to hiring companies. Can one NIRF rank, with fixed weightages, fit all stakeholders?
Concerns on the methodology
One of the major concerns has been the need for more transparency about the methodology. In 2025, Sustainable Development Goals was introduced as a new category, but the methodology and parameters were not shared on the NIRF website even after the release of the NIRF 2025 ranks. Likewise, the methodology for negative marking for retracted research papers has not been communicated to participating institutions. The long-pending demand to make the perception-survey methodology more transparent is yet to be addressed.
Large year-on-year rank swings (sometimes over 50 ranks) are mostly observed in mid-tier and emerging institutions, which dents the credibility of the ranking system. RP metrics (publications, citations, patents) and changes in TLR resources are found to be the key drivers of rank volatility.
Much of the data is self-reported, and verification mechanisms remain limited, particularly for aspects such as placement figures. Without standardised reporting practices, the credibility of the rankings suffers, as their reliability depends on the accuracy and consistency of the data underpinning the evaluation process.
Road map for the future
To cover the vast majority of arts, science, and commerce colleges, a category may be created for them. A new category for private institutions may also be created, as the challenges they face are different. In addition to quantitative data, incorporating qualitative metrics such as student satisfaction surveys, employer feedback, and alumni outcomes can provide a more holistic view of an institution's performance.
The One Nation One Data platform proposed by the Government of India needs to be implemented quickly to maintain the integrity and consistency of data. Independent audits of institutional data may also be conducted to improve credibility.
Longitudinal tracking of institutions' progress on all parameters from year to year may be presented as dashboards rather than as a static PDF document. To cater to the diverse requirements of stakeholders, a choice-based ranking system may be designed, as suggested by the Dr. Radhakrishnan Committee report (2023), wherein scores on the broad parameters are published so that stakeholders can assign their own weightages and arrive at rankings as per their choice (a simple illustration follows below). To avoid the rat race for ranks and the attendant gaming practices, institutions may be grouped into bands rather than assigned individual ranks.
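To make the choice-based idea concrete, the sketch below shows how a stakeholder-chosen set of weightages could re-rank institutions from their published parameter scores. The institution names, scores, and weightages are invented purely for illustration; in practice the inputs would be NIRF's published parameter scores.

```python
# Illustrative sketch of a choice-based ranking: a stakeholder supplies their own
# weightages over the published parameter scores and obtains a personalised ordering.
# All names and numbers below are made up for illustration.
import pandas as pd

scores = pd.DataFrame(
    {"TLR": [78.2, 82.5, 69.4], "RP": [91.0, 74.3, 55.8], "GO": [88.7, 90.1, 72.6],
     "OI": [62.3, 70.8, 81.2], "PR": [95.0, 60.4, 35.7]},
    index=["Institute A", "Institute B", "Institute C"],
)

# Example: a student who cares mainly about teaching and graduation outcomes
student_weights = {"TLR": 0.35, "RP": 0.10, "GO": 0.35, "OI": 0.10, "PR": 0.10}

# Weighted aggregate score per institution, then sort to get the personalised ranking
weighted = sum(scores[p] * w for p, w in student_weights.items())
print(weighted.sort_values(ascending=False))
```

A funding agency could plug in a research-heavy set of weightages into the same scores and obtain a different, equally legitimate ordering, which is the essence of the choice-based proposal.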
Though minor changes have been incorporated, the basic structure of NIRF has remained the same for the last 10 years. To gain more acceptance from all stakeholders and wider participation by institutions, there is a need for more dissemination through seminars and for seeking feedback. After consultations with major stakeholders, suitable parameters may be devised to capture the relevant data.
In order to attract international students and collaborations, there is a need to promote NIRF globally.
Conclusion
NIRF's first decade has seen broad adoption and tangible improvements in the benchmarking of Indian higher education. It stands as a critical pillar for policymaking, student admissions, institutional strategy, and quality enhancement. Continued evolution of the framework, adapted to India's educational diversity and the NEP 2020 vision and combined with global best practices, will solidify its credibility and utility among all stakeholders. Addressing the challenges of data integrity, transparency, and methodological robustness is critical to making NIRF a more credible and internationally acceptable ranking system.
(Prof. O.R.S. Rao is the Chancellor of the ICFAI University, Sikkim. Views are personal.)