Ranking of Universities:  Methods, Limitations, Challenges   Middle East Technical University Ankara, December 3,  2010 Anthony F.J. van Raan   Center for Science and Technology Studies (CWTS) Leiden  University
Leiden University, oldest in the Netherlands (1575), member of the League of European Research Universities (LERU). Leiden, historic city (2nd, 11th C.), strong cultural (arts, painting) & scientific tradition; one of the largest science parks in the EU
 
 
First and major challenge: - Establish, as well as possible, the research performance in the recent past in quantifiable terms (objectivity, reliability); - If you succeed in quantifying performance, you can always make a ranking
Two methodological approaches: (1) ‘broader’ peer review >  expert survey > reputation assessment   (2)  bibliometric indicators   > performance and impact measurement On the department or research program level (1) and (2) are strongly correlated
Conceptual problems of peer review/surveys: * Slow: ‘old’ reputation vs. performance at the research front now * Mix of opinions about research, training, teaching, political/social status
Conceptual problems of bibliometric analysis: 1. Evidence of research performance such as prizes and awards, earlier peer review, invitations as keynote speakers, etc. is not covered. However, in many cases these ‘non-bibliometric’ indicators correlate strongly with bibliometric indicators. 2. Focus on journal articles: differences across disciplines
Citing Publications > Cited Publications: from other disciplines, from emerging fields, from research devoted to societal, economic and technological problems, from industry, from international top groups. These are all f(t)! > Sleeping Beauties
WoS/Scopus sub-universe (journal articles only, >1,000,000 p/y) within the total publication universe; coverage: natural/medical sciences >>> applied, social sciences, humanities
WoS/Scopus sub-universe (journal articles only, >1,000,000 p/y) within the total publication universe; Refs > non-WoS; non-WoS publications: books, book chapters, conference proceedings, reports, non-WoS journals
Cited publications P(A) with citations C(A); field reference set P(f) with citations C(f). CPP = C(A)/P(A); FCS = C(f)/P(f). Field-specific normalization: CPP/FCSm = [C(A)/P(A)] / [C(f)/P(f)], plus document-type normalization; no self-citations, also not in C(f)!
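As an illustration only (a minimal sketch, not the CWTS implementation), the arithmetic behind CPP, FCSm and the crown indicator CPP/FCSm can be written as follows; the per-publication field reference values (field/document-type/year means) and the exclusion of self-citations are assumed to be given as input.

```python
# Minimal sketch of the crown indicator; inputs are hypothetical.

def crown_indicator(publications):
    """publications: list of dicts with 'cites' (self-citations already excluded)
    and 'field_mean' (the C(f)/P(f) reference value for that publication)."""
    total_cites = sum(p["cites"] for p in publications)   # C(A)
    total_pubs = len(publications)                        # P(A)
    cpp = total_cites / total_pubs                        # C(A)/P(A)
    # FCSm: mean of the field reference values over the unit's publications
    fcsm = sum(p["field_mean"] for p in publications) / total_pubs
    return cpp, fcsm, cpp / fcsm                          # CPP, FCSm, CPP/FCSm

pubs = [
    {"cites": 12, "field_mean": 6.0},   # hypothetical paper in a high-citation field
    {"cites": 3,  "field_mean": 2.5},   # hypothetical paper in a low-citation field
    {"cites": 0,  "field_mean": 4.0},
]
cpp, fcsm, crown = crown_indicator(pubs)
print(f"CPP = {cpp:.2f}, FCSm = {fcsm:.2f}, CPP/FCSm = {crown:.2f}")
```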
Diagram: bibliographic coupling of journals (Ja1–Ja4 with Jb1–Jb5)
Economics journals 2008: journal map, n=332 (JCS/FCS); http://www.neesjanvaneck.nl/journalmap
Economics journals 2008 Journal density map, n=332
Diagram: bibliographic coupling of fields (Fa1–Fa4 with Fb1–Fb5)
World map: science as a structure of 200 related fields (size: world average). Enhance the visibility of engineering, social sciences and humanities by proper publication-density normalization
Urban Studies 2006-2008 Concept map, n=700
Urban Studies 2006-2008 Concept density map n=700 More stable, more comprehensible
F(t1) > F(t2): knowledge dynamics described by a continuity equation, which relates the change of a quantity inside any region to its density and flow?
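For reference, the standard form of such a continuity equation, with rho the density of the quantity (e.g. publications or concepts in a map region) and j its flow; the slide only poses the question of how to apply it to knowledge dynamics:

```latex
\frac{\partial \rho}{\partial t} + \nabla \cdot \mathbf{j} = 0
```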
University – Departments – Fields. ‘Bottom-up’ analysis: input data (assignment of researchers to departments) necessary; > detailed research performance analysis of a university by department. ‘Top-down’ analysis: field structure is imposed on the university; > broad overview analysis of a university by field
 
Indicators divided into 5 categories, with fraction of final ranking score: Teaching: 0.30; Research: 0.30; Citations: 0.325; Industry income: 0.025; International: 0.05. Fractions are based on ‘expert enthusiasm’ and confidence in the data
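A minimal sketch of how such a weighted composite score could be formed from the category weights quoted above; the category scores below are hypothetical and only illustrate the aggregation step, not the ranking's actual data processing.

```python
# Hedged sketch: combining normalized category scores (0-100) into a final
# ranking score, using the THE 2010 category weights quoted on the slide.
WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.325,
    "industry_income": 0.025,
    "international": 0.05,
}

def composite_score(scores):
    """scores: dict of category -> normalized score (0-100)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights sum to 1
    return sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)

# Hypothetical university profile, for illustration only.
example = {"teaching": 70, "research": 80, "citations": 90,
           "industry_income": 50, "international": 60}
print(round(composite_score(example), 1))  # -> 78.5
```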
 
 
 
Indicators divided into 5 categories, with fraction of final ranking score: Academic Reputation: 0.40; Employer Reputation: 0.10; Stud/staff: 0.20; Citations: 0.20; International: 0.10
 
 
Indicators divided into 5 categories, with fraction of final ranking score: Nobel Prize Alumni/Awards: 0.10 + 0.20; HiCi staff, N&S, PUB: 0.20 + 0.20 + 0.20; PCP (size normalization): 0.10. Citations only in HiCi (but citation per staff measure)!
 
 
Leiden Ranking Spring and Autumn 2010 500 Largest Universities Worldwide 250 Largest Universities in Europe Spring  2010: 2003-2008(P)/2009(C) Autumn 2010: 2003-2009 (P and C)
Main Features of Leiden Ranking 2010: * Best possible definition of universities by applying the CWTS address unification algorithm * New indicators next to the standard CWTS indicators, including university-industry collaboration (autumn) * Field-specific ranking and top-indicators (15 major fields) > Benchmarking
P=2004-2008, C=2004-2009; top-250 largest universities, ranked by CPP (first 30 entries shown):

University | Country | CPP | MNCS2 | P | CPP/FCSm
UNIV LAUSANNE | CH | 12.23 | 1.49 | 6950 | 1.55
UNIV DUNDEE | UK | 11.82 | 1.42 | 4413 | 1.51
LONDON SCH HYGIENE & TROPICAL MEDICINE UNIV LONDON | UK | 11.59 | 1.64 | 4232 | 1.62
JG UNIV MAINZ | DE | 11.41 | 1.32 | 5015 | 1.25
UNIV OXFORD | UK | 11.25 | 1.67 | 26539 | 1.63
UNIV CAMBRIDGE | UK | 11.15 | 1.70 | 25662 | 1.63
KAROLINSKA INST STOCKHOLM | SE | 11.09 | 1.36 | 16873 | 1.34
UNIV BASEL | CH | 11.02 | 1.55 | 8483 | 1.46
ERASMUS UNIV ROTTERDAM | NL | 11.01 | 1.49 | 12408 | 1.49
UNIV GENEVE | CH | 10.90 | 1.45 | 9342 | 1.47
UNIV COLL LONDON | UK | 10.85 | 1.48 | 26286 | 1.46
IMPERIAL COLL LONDON | UK | 10.47 | 1.55 | 21967 | 1.49
UNIV SUSSEX | UK | 10.33 | 1.60 | 3841 | 1.67
UNIV EDINBURGH | UK | 10.25 | 1.54 | 13188 | 1.54
UNIV ZURICH | CH | 10.16 | 1.46 | 13824 | 1.44
LEIDEN UNIV | NL | 10.02 | 1.43 | 12513 | 1.37
QUEEN MARY COLL UNIV LONDON | UK | 9.92 | 1.44 | 4586 | 1.45
UNIV HEIDELBERG | DE | 9.71 | 1.35 | 15445 | 1.32
STOCKHOLM UNIV | SE | 9.55 | 1.43 | 6427 | 1.50
VRIJE UNIV AMSTERDAM | NL | 9.51 | 1.40 | 12201 | 1.40
HEINRICH HEINE UNIV DUSSELDORF | DE | 9.50 | 1.29 | 6636 | 1.25
UNIV DURHAM | UK | 9.44 | 1.69 | 5848 | 1.65
ETH ZURICH | CH | 9.41 | 1.63 | 15099 | 1.64
UNIV GLASGOW | UK | 9.34 | 1.41 | 10435 | 1.45
KINGS COLL UNIV LONDON | UK | 9.33 | 1.38 | 13680 | 1.33
UNIV SOUTHERN DENMARK | DK | 9.30 | 1.29 | 4786 | 1.34
MED HOCHSCHULE HANNOVER | DE | 9.29 | 1.22 | 5233 | 1.16
LMU UNIV MUNCHEN | DE | 9.23 | 1.38 | 16995 | 1.30
UNIV AMSTERDAM | NL | 9.18 | 1.41 | 15492 | 1.36
UNIV BORDEAUX II VICTOR SEGALEN | FR | 9.16 | 1.24 | 4354 | 1.22
Large, Broad European University (chart: impact ranking vs. publication ranking, top 25% / bottom 25%). Focus: in the top 25% of both publication output and citation impact
Smaller Specialized European University (chart: impact ranking vs. publication ranking, top 25% / bottom 25%). Specialized in Economics and related fields (ECON, MATH, PSYCH). Among the top 25% in citation impact, but in the lower 50% of publication output
Based on this updated and extended ranking approach: benchmarking studies of universities. Comparison of the ‘target’ university with 20 other universities of choice:
 
 
 
Current and recent benchmark projects Manchester, Leiden, Heidelberg, Rotterdam, Copenhagen, Zürich,  Lisbon UNL, Amsterdam UvA,  Amsterdam VU, Southampton Gent, Antwerp, Brussels VUB, UC London, Aarhus
EU Multi-Ranking Universiteit Leiden
Leiden University
Delft University of Technology
METU Ankara
Turkey 2001-2004
Turkey 2005-2008
Thank you for your attention. More information: www.cwts.leidenuniv.nl
 
 
External WoS coverage. Example: Uppsala 2002-2006
More about rankings and benchmarking
Ranking by Top-10%
 
 
Aarhus University Size: University specific
Aalborg University
 
 
Diseases of the Neurosystem: Dept Output in Fields / Dept Impact from Fields (CPP/FCSm). Knowledge use by very high impact groups
CPP/FCSm vs. MNCS1, MNCS2
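The contrast behind the comparisons in the figures below: CPP/FCSm is a ratio of averages, while MNCS is an average of per-publication ratios. The toy sketch illustrates that difference with invented numbers; the distinction between MNCS1 and MNCS2 (which concerns the treatment of very recent publications) is not modelled here.

```python
# Illustration only: "ratio of averages" (CPP/FCSm) vs. "average of ratios" (MNCS).

def cpp_fcsm(cites, field_means):
    return (sum(cites) / len(cites)) / (sum(field_means) / len(field_means))

def mncs(cites, field_means):
    return sum(c / f for c, f in zip(cites, field_means)) / len(cites)

# Hypothetical publication set: one paper in a very low-citation field
# gets a large per-paper ratio and pulls MNCS up, but barely moves CPP/FCSm.
cites       = [2, 1, 30]
field_means = [10.0, 0.2, 10.0]
print(f"CPP/FCSm = {cpp_fcsm(cites, field_means):.2f}")  # -> 1.63
print(f"MNCS     = {mncs(cites, field_means):.2f}")      # -> 2.73
```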
Figure 1: Relation between CPP/FCSm and MNCS1 for 158 research groups. Figure 2: Relation between CPP/FCSm and MNCS2 for 158 research groups
Figure 3: Relation between CPP/FCSm and MNCS1 for the 365 largest universities in the world. Figure 4: Relation between CPP/FCSm and MNCS2 for the 365 largest universities in the world
Figure 5: Relation between CPP/FCSm and MNCS1 for the 58 largest countries. Figure 6: Relation between CPP/FCSm and MNCS2 for the 58 largest countries
Figure 7: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 8: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals
Only journals with CPP/FCSm and MNCS1 < 2.5. Figure 9: Relation between CPP/FCSm and MNCS1 for all WoS (non-AH) journals. Figure 10: Relation between CPP/FCSm and MNCS2 for all WoS (non-AH) journals
Figure 11: Relation between CPP/FCSm and MNCS1 for 190 researchers (large UMC) with > 20 WoS publications 1997-2006 (citations counted up to 2006). Figure 12: Relation between CPP/FCSm and MNCS2 for 190 researchers (large UMC) with > 20 WoS publications 1997-2006 (citations counted up to 2008).
Performance assessment
Application of Thomson-WoS Impact Factors for research performance evaluation is irresponsible: * Much too short citation window * No field-specific normalization * No distinction between document types * Calculation errors/inconsistencies in numerator/denominator * Underlying citation distribution is very skewed: IF value heavily determined by a few very highly cited papers
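A tiny numerical illustration of the last point, with hypothetical citation counts: in a heavily skewed distribution, the mean underlying an impact-factor-like average is driven almost entirely by a single highly cited paper.

```python
from statistics import mean, median

# Hypothetical citation counts for 10 papers in one journal over the IF window.
cites = [0, 0, 0, 1, 1, 2, 2, 3, 5, 86]
print(f"mean = {mean(cites):.1f}, median = {median(cites)}")
# -> mean = 10.0, median = 1.5: the average says little about the typical paper.
```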
 
 
 
 
Chart: CPP/FCSm (0.0-3.0) over time, 1996-2005, for the institute as a whole, relative to the world average.
Same chart, with breakdown by performance class: A: 46% (half of which 0 citations), B: 10%, C: 15%, D: 10%, E: 19%.
Publications from 1991, …, 1995: time lag & citation window
CPP/FCSm < 0.80: performance significantly below international average, class A; 0.80 < CPP/FCSm < 1.20: performance about international average, class B; 1.20 < CPP/FCSm < 2.00: performance significantly above international average, class C; 2.00 < CPP/FCSm < 3.00: performance in international perspective is very good, class D; CPP/FCSm > 3.00: performance in international perspective is excellent, class E.
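These class boundaries translate directly into a small helper function (a sketch; boundary handling follows the inequalities as written above):

```python
def performance_class(crown):
    """Map a CPP/FCSm value to the classes A-E defined above."""
    if crown < 0.80:
        return "A"   # significantly below international average
    if crown < 1.20:
        return "B"   # about international average
    if crown < 2.00:
        return "C"   # significantly above international average
    if crown < 3.00:
        return "D"   # very good in international perspective
    return "E"       # excellent in international perspective

print([performance_class(x) for x in (0.5, 1.0, 1.5, 2.5, 3.4)])
# -> ['A', 'B', 'C', 'D', 'E']
```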
University departments fields
Same institute, now broken down into its 5 departments (Neuroscience, Cancer research, Genomics, Clinical research, Cardio-vascular): chart of CPP/FCSm (0.0-3.0) vs. P, relative to the world average
Breakdown by fields: Chemistry at research group level, field (CPP/FCSm)
Cluster: field = set of publications with thematic/field-specific classification codes; again applicable to new, emerging, often interdisciplinary fields; fine-grained structure of science
MeSH delineation vs. journal classification: the problem of the ‘right’ FCSm… FCSm based on ISI journal categories vs. FCSm based on PubMed (MeSH) classification
 
 
 
Finding 1: Size-dependent cumulative advantage for the impact of universities in terms of total number of citations. Quite remarkably, lower-performance universities have a larger size-dependent cumulative advantage for receiving citations than top-performance universities.
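Such a size-dependent cumulative advantage is typically examined by fitting a power law C ≈ a·P^b on a log-log scale and checking whether the exponent b exceeds 1. The sketch below only shows that procedure on invented (size, citations) pairs; it is not the analysis behind this finding.

```python
import math

# Invented (university size P, total citations C) pairs, for illustration only.
data = [(1000, 5000), (2000, 12000), (4000, 30000), (8000, 70000)]

# Least-squares fit of log C = log a + b * log P (slope b is the scaling exponent).
xs = [math.log(p) for p, _ in data]
ys = [math.log(c) for _, c in data]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
b = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
     / sum((x - x_mean) ** 2 for x in xs))
print(f"scaling exponent b = {b:.2f}")  # b > 1 suggests cumulative advantage with size
```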
 
 
Finding 2: For the lower-performance universities the fraction of not-cited publications decreases with size.  The higher the average journal impact of a university, the lower the number of not-cited publications.  Also, the higher the average number of citations per publication in a university, the lower the number of not-cited publications.  In other words,  universities that are cited more per paper also have more cited papers.
 
Finding 3: The average research performance of a university, measured by the crown indicator CPP/FCSm, does not ‘dilute’ with increasing size. The large top-performance universities are characterized by ‘big and beautiful’: they succeed in keeping a high performance over a broad range of activities. This is an indication of their overall scientific and intellectual attractive power.
 
 
 
 
Finding 4: Universities with low field citation density and low journal impact have a size-dependent cumulative advantage for the total number of citations. For lower-performance universities, field citation density provides a cumulative advantage in citations per publication. Top universities publish in journals with higher journal impact compared to lower-performance universities. Top universities perform a factor of 1.3 better than bottom universities in journals with the same average impact.
 
Finding 5: The fraction of self-citations decreases as a function of research performance, of average field citation density, and of average journal impact.