REMEMBER Eric “the Eel” Moussambani from Equatorial Guinea? He earned the world’s admiration when he splashed to stay afloat in the last lap of the solo 100m freestyle heat at the Sydney 2000 Summer Olympics. The then 23-year-old, who practised in a lake eight months before the games, was allowed to participate as a wild-card entry for athletes from poor countries by the International Olympic Committee (IOC).
There is an Olympic-type competition in academia too — the world university rankings. No wild-card entry there, but it takes as much grit and self-belief as Moussambani had to make the ranks.
Over the decades, I have worked at universities in parts of Asia and the West. It is almost self-evident that universities from poorer economies will not qualify for the heats. Libraries are poorly stocked, instructional technologies are dated and high-impact research is scarce, so little “new knowledge” is generated. Educational goals focus on “community development” and “nation building” rather than industry innovation and a global outlook.
Where undergraduates are mostly passive and accepting (the lecturer is always right), rather than active and questioning, their employability is compromised by inadequate skills in problem-solving and decision-making. Where bureaucrats trump the academics, and academics dominate the students, many Moussambani-like universities can only aspire to meet some of the performance indicators, if at all, to qualify for ranking.
Two widely cited world university rankings are the London-based Times Higher Education (THE) and Quacquarelli Symonds (QS). THE’s ranking of the top 400 universities uses 13 performance indicators grouped into five areas: teaching (worth 30% of the ranking score), research (30%), citations (30%), international outlook (7.5%) and industry income (2.5%).
QS’ ranking of the top 800 universities uses six indicators: peer review of reputation (40%), faculty-student ratio (20%), citations of research papers (20%), employers’ reputation (10%), international student ratio (5%), and international staff ratio (5%).
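Behind both tables sits simple arithmetic: a weighted sum of indicator scores. As a rough sketch only (not QS’s actual methodology, which normalises raw data far more elaborately), the six published QS weights above can be applied to indicator scores already scaled to 0–100; the university and its scores below are hypothetical:

```python
# Illustrative sketch of a QS-style composite score: a weighted sum of
# indicator scores, each assumed to be pre-normalised to a 0-100 scale.
# The weights are the published 2014 QS weights quoted in the text; the
# real methodology's normalisation of raw data is more elaborate.

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "employer_reputation": 0.10,
    "international_students": 0.05,
    "international_faculty": 0.05,
}

def composite_score(indicator_scores, weights=QS_WEIGHTS):
    """Weighted sum of indicator scores (each on a 0-100 scale)."""
    return sum(weights[name] * score for name, score in indicator_scores.items())

# Hypothetical university: strong reputation, weak internationalisation.
example = {
    "academic_reputation": 80,
    "faculty_student_ratio": 60,
    "citations_per_faculty": 50,
    "employer_reputation": 70,
    "international_students": 20,
    "international_faculty": 20,
}
print(composite_score(example))  # ≈ 63.0 out of 100
```

The sketch makes the article’s later point concrete: because reputation carries 40% of the weight, a university with a strong brand but weak internationalisation can still score respectably, which is one reason social scientists call the reputation survey methodologically contentious.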
THE’s ranking excludes universities that produce fewer than 200 research papers a year. This effectively excludes all Malaysian teaching-only universities. Likewise, nearly all Malaysian universities will register a low score for “learning environment” (30%) and “international outlook” (7.5%), given the race-based policies, the Sedition Act, and the Universities and University Colleges Act (UUCA) — all of which undermine transparent governance, academic freedom and a campus atmosphere that is socially engaging and intellectually stimulating.
The QS World Ranking, however, gives greater weightage to peer review of reputation, which is subjective and methodologically flawed, according to social scientists. Five universities that the Ministry of Higher Education recognises as research universities are listed in QS’ top 100 in Asia — Universiti Malaya (or UM, ranked 32), Universiti Kebangsaan Malaysia (UKM, 56), Universiti Sains Malaysia (USM, 57), Universiti Teknologi Malaysia (UTM, 66) and Universiti Putra Malaysia (UPM, 76).
Universities can also apply to QS for star ratings based on more indicators (about 50). QS introduced the five-star system in 2012 to cover universities that otherwise would not qualify for a ranking, which seems fair. But universities are required to pay a one-time fee of US$9,850 for an initial star rating, and an annual licence fee of US$6,850 to use the QS stars in their promotional materials.
Three Malaysian universities are QS star-rated: UM (5 stars), Universiti Malaysia Perlis (UniMAP, 3 stars) and Universiti Teknologi Petronas (UTP, 3 stars). The last two, however, did not qualify for the QS World Ranking, which is odd.
As you can see, university rankings are inherently imperfect. To show the variability in THE’s and QS’ rankings, I looked at how the universities that I had attended were placed.
QS ranked USM (my alma mater) 309 and UM (where I dropped out), 151. Macquarie University (where I got my PhD) was ranked 254 and University of Wollongong (UOW, where I teach), 283.
Intrigued, and with no disrespect to UM and USM, I looked at THE’s ranking tables. UOW was in the 276-300 bracket, and Macquarie, 301-350. UM and USM did not qualify.
Behind each number, there is usually a story in context. Hence, I read the university league tables against their ranking criteria: the methodologies, limitations, samples of respondents, opinion surveys and the objective measures used to control the inherent subjectivity in interpreting qualitative data.
Other rankings include the Academic Ranking of World Universities, published by Shanghai Jiao Tong University, Webometrics, based in Spain, and the Center for World University Rankings, based in Jeddah, Saudi Arabia.
Universities today operate like commercial enterprises. More students mean more money. Administrators use ranking tables and star ratings to market their university programmes, attract reputable scholars, and rationalise higher budget allocations from the government. Academics use them to attract research grants from industry partners, and students use them to decide their university of choice.
When correctly understood, the ranking criteria can provide universities with a framework to find ways to improve their performance, standards of teaching, quantity of quality research, and educational outcomes that are relevant to local conditions. When misinterpreted, the rankings can be spun to mislead the public.
To dismiss the rankings as inconvenient truths — as the president of Universiti Malaya Students Association did when he called for a boycott of university rankings — is to ignore years of critical audits by research organisations. Equally shortsighted is to be carried away by a temporary positive ranking, such as UM’s improvement in the QS ranking.
Given the mix of rancour and respect for rankings, the International Ranking Expert Group, founded in 2004 by the Unesco European Centre for Higher Education in Bucharest, and the Institute for Higher Education Policy in Washington DC, met in Berlin in 2006 to formulate 16 principles of best practices in university rankings.
One of the Berlin principles noted: “Not all nations or systems share the same values and beliefs about what constitutes ‘quality’ in tertiary institutions, and ranking systems should not be devised to force such comparisons.”
I single out several principles that academics and journalists should consider when interpreting the rankings: “Specify the linguistic, cultural, economic and historical contexts of the educational systems being ranked; provide consumers with a clear understanding of all of the factors used to develop a ranking; and offer them a choice in how rankings are displayed.”
Reporting the ranking as fact, without clarifying the context, is simply misleading. Toning down the negative and tuning up the positive likewise does no university any good. Vice-chancellors, academics and student unions should mine the ranking tables for information that identifies the weak areas in our universities, and examine ways to raise the standard of teaching, the number of high-impact research papers, the quality of student intake and the retention of reputable scholars. That would serve far better than throwing the UUCA at students and the Sedition Act at academics who see good reason to exercise their basic rights, and their obligation, to add a dissenting voice on issues of national interest.
A university is only as good as its academic staff, students and administrators. Quality research output and quality undergraduate teaching go together. The learning environment, academic freedom, university governance, academic ethos and autonomy: all these shape its constituents’ thinking and character.
In time, we become part of the university system. We reflect its values and intellectual culture.
Have our universities instilled in academics a certain ethos, and nurtured in their graduates values and worldviews that they can confidently carry to the international stage? Are our vice-chancellors, university bureaucrats, student unions and certain sections of academia so beholden to political affiliations, and made so complacent by the lack of peer scrutiny, that they feel no need to legitimise their academic standing?
The government aims to have at least three universities listed within the top 100 in the world, and one among the top 50, by 2020. In reality though, as long as there is no competition from within and outside the system, as long as university education remains politicised and racialised, pushing reputable academics to leave for places where their expertise is well-received and rewarded, Malaysian universities, six years hence, are likely to remain as lowly ranked as they are today.
Eric Loo teaches journalism at the University of Wollongong in New South Wales, Australia. He worked as a journalist and taught journalism in Malaysia from the late 1970s to 1986.
This article first appeared in Forum, The Edge Malaysia Weekly, on October 27 - November 2, 2014.