LSE's positions in the main UK and global university rankings (league tables) can be found in this League tables chart.
University rankings are a vexed subject in higher education. Their simplifications and methodologies have attracted many criticisms. But they are here to stay; and in many parts of the world they are treated as one of the most important tools available to prospective students assessing the relative merits of the universities to which they are considering applying. Whether rankings have uses beyond student recruitment is even more contentious, but because research quality - for example - is arguably a little easier to compare than quality of teaching, rankings are increasingly used as a shorthand assessment of universities' merits in all they do. So we at LSE do take careful note of what the rankings purport to say about us.
LSE and league tables: an overview 2004-2012
As of mid-2012, LSE has seen pleasing improvements over the last couple of years in its standing in all the main global rankings: those produced by Times Higher Education, QS and Shanghai Jiaotong University. We have also seen good rises in the domestic UK rankings. But we remain concerned that all of the global rankings - by some way the most important for us, given our highly international orientation - suffer from inbuilt biases in favour of large multi-faculty universities with full STEM (Science, Technology, Engineering and Mathematics) offerings, and against small, specialist, mainly non-STEM universities such as LSE.
Methodologies and commercial arrangements have changed over the years, and the following selection of articles we have written in the past does not necessarily fully reflect the situation today. However, we continue to offer the articles here, as many of the insights produced by the research we and others have done into the methodologies of rankings remain true.
September 2010: Times Higher university rankings bring big changes for UK universities
The 2010 Times Higher Education (THE) World University Rankings, published on 16 September 2010, entailed major changes in position for almost all British universities. Only three now feature in the top 10 and only five in the top 50. LSE maintains its position as the 11th UK university in the global table, but as a consequence of the general downward pressure its ranking has fallen from equal 67th to 86th.
The 2010 table uses a new data provider, Thomson Reuters, and a new methodology. Part of THE's aim in making these changes was to correct a perceived past bias in favour of UK institutions caused by the use of a relatively small and UK-heavy sample of academic peer opinion. The downward effect on UK universities as a group was therefore not unexpected. But the extent of the change was a surprise.
THE and Thomson Reuters also described the changes earlier in 2010 as designed in part to reflect better the strengths of small and specialist institutions such as LSE. LSE cannot hide its disappointment that the changes have not had the desired effect. We understand that the compilers found it impossible to accumulate all the data needed from universities worldwide in order to adjust fully for size of institution and academic portfolio. Unadjusted figures for funding per student, for example, tend to show that universities offering teaching and research in medical sciences and engineering (which require heavy investment in equipment) spend much more per student than those which offer only social sciences, arts or humanities.
Jonathan Adams, director of research evaluation at Thomson Reuters, said that LSE's position: '...although a fair reflection of their status as a world class university, will always be influenced by their specialist focus. The most heavily weighted single indicator, citation impact, was normalized to reflect subject mix. Similar subject-level weighting could not be applied to some other indicators because, for example, of the variable quality of financial data from many countries. Following this first year with a new methodology, there will be development in consultation with institutions and we expect their feedback to encourage us to apply similar subject modifications to all indicators. This will undoubtedly reveal the effectiveness of the LSE and similar based institutions working in less strongly funded areas.'
University rankings depend heavily on the choice of indicators and the weightings assigned to them. Other factual measures such as our performance in the last UK Research Assessment Exercise, our graduates' starting salaries and the large numbers of applicants for degree study places reinforce our belief that LSE remains at the forefront globally of teaching and research in the social sciences.
September 2010: LSE in QS World University Rankings 2010
LSE was again ranked as one of the best universities in the world for social sciences and management, in the latest rankings published by another university ratings provider, QS, on 8 September 2010.
LSE ranked fourth in these, its specialist areas - up from fifth in 2009. LSE also ranked very highly - joint fourth - on employer reputation.
LSE's position in the full table of the world's top 200 universities, at 80, continues to suffer from the bias against specialist institutions in social sciences, arts and humanities which the QS rankings exhibit.
Times Higher Education (THE) stopped using QS as data provider after the 2009 ranking was published.
February 2010: Times Higher Education accepts LSE unfairly treated in its world university rankings
Shortly after Times Higher Education's (THE) world rankings for 2009 were published, THE announced that it would no longer collaborate with the data provider QS in its annual rankings exercise. Instead, it has switched to collaborating with Thomson Reuters and has pledged to find a fairer method of assessing universities' strengths and weaknesses.
The THE have accepted that the QS methodology "took no account of subject mix when calculating the number of citations per member of academic staff, penalising institutions...such as the London School of Economics" (THE, 10 December 2009).
Writing in the THE on 14 January 2010, Jonathan Adams of Thomson Reuters went further: "The London School of Economics is generally agreed to be an outstanding institution globally... [It] is intellectually vibrant and delivers excellent teaching...[but] The LSE stood at only 67th in the last Times Higher Education-QS World University Rankings - some mistake surely? Yes, and quite a big one."
At LSE we warmly welcome this somewhat belated vindication of our complaints about the previous methodology. We shall collaborate with THE and Thomson Reuters as they work out how to assess universities' research quality without either unfair distortion or special pleading.
October 2009: Further background on the Times Higher Education - QS World University Rankings
We set out our concerns with the previous THE-QS methodology in some detail in 2008, see LSE Investigations Into THE_QS League Tables.
Between 2004, when the THE tables were first published, and 2006, LSE ranked in the top 20 of the main table. In 2007, the LSE fell in the main table to number 59 and it has remained at about that level since. This change did not reflect any drop in the LSE's standards of excellence in research and teaching. Instead, it reflected a change in the way the tables were drawn up.
Previously, the LSE's very high scores in areas such as levels of international staff and students fed directly into the overall rank. From 2007 onwards, the calculation gave less weight to these figures and as a result the LSE's position slipped.
LSE's rankings in the main THE table from 2007 onwards were also depressed by an artificially low assessment of our research quality - indeed, one which is strikingly out of line with the LSE's very strong showing in the latest UK Research Assessment Exercise, the results of which were published on 18 December 2008: see Research Assessment Exercise 2008.
THE acknowledged in 2007 that the changes had had "a particularly chastening effect" on the LSE, as an "exceptional university".
Following the publication of the 2007 table, we worked with THE and QS to understand the apparent anomaly. We were not able to replicate exactly the citations data produced by QS. In fact our investigations suggested that LSE's research output was greatly underestimated by the method used.
In addition, we concluded that the main THE/QS methodology systematically disadvantaged specialist social science institutions such as LSE. The methodology used only citations from peer-reviewed journals, which are heavily employed in the hard sciences, medicine, engineering and similar disciplines. The social sciences make much more use of books and other channels to publish their research, but these were not counted. And where journals are used, citations in the social sciences are far fewer, and typically start to appear several years later than hard science citations, often missing the time window used by QS.
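The arithmetic behind this complaint can be illustrated with a minimal sketch. The figures and field averages below are invented for illustration only, and the calculation is not the actual QS or Thomson Reuters formula; it simply shows how dividing by a world average for each field ("field normalisation") changes the picture that raw citations-per-paper counts give.

```python
# Illustrative only: invented world-average citations per paper, by field.
# (Hard-science papers typically accumulate far more journal citations
# than social-science ones, as the article describes.)
WORLD_AVG = {"medicine": 12.0, "social_science": 3.0}

def raw_impact(citations, papers):
    """Citations per paper, ignoring subject mix - the criticised approach."""
    return citations / papers

def normalised_impact(citations, papers, field):
    """Citations per paper relative to the world average for that field."""
    return (citations / papers) / WORLD_AVG[field]

# A specialist social-science institution vs a medical school (made-up data).
social = normalised_impact(citations=450, papers=100, field="social_science")
medical = normalised_impact(citations=1100, papers=100, field="medicine")

# Raw impact makes the medical school look far stronger (11.0 vs 4.5)...
assert raw_impact(1100, 100) > raw_impact(450, 100)
# ...but relative to its field's norms the social-science institution
# is actually the stronger performer (1.5 vs about 0.92).
assert social > medical
```

On these hypothetical numbers, an unnormalised citation measure ranks the medical school well ahead, while a field-normalised measure reverses the order, which is the kind of distortion the passage above attributes to the QS methodology.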
THE acknowledged in 2007 that "the methodology we use is designed mainly to capture excellence in multipurpose universities in the rich world. We are seeking better ways...of comparing the achievements of specialist and postgraduate institutions with those of full-spectrum universities".
With publication of the October 2009 rankings, the THE described LSE as the top specialist social science university in the world – explicitly recognising for the first time the difference between generalist universities and more focused or specialist institutions such as LSE. We welcome the subsequent switch of partner from QS to Thomson Reuters, the clear acceptance of the failings of the QS methodology and the commitment by the THE to produce rankings of higher quality in future.