Citation analysis plays an ever-increasing role in evaluating scientific achievement. Scientific journals, individual researchers, research groups, research institutes, universities and whole countries are evaluated on the basis of their scientific publications and the citations those publications receive.
Citation analysis is a quantitative method for identifying the important and essential literature of a field on the basis of how often a publication is cited in other publications. Using citations as a quality indicator nevertheless carries problems. Above all, citations measure popularity rather than quality: each citation is a researcher's subjective choice, which can be affected by many factors other than the quality of the cited article. The citations an article receives reveal the positive or negative attention it has attracted rather than indicate its quality. Citation counts also favour mainstream research and established paradigms, while research that challenges prevailing thought is not necessarily noticed straight away. Indeed, many important scientific breakthroughs have gone unnoticed for decades after publication, while some results that attract criticism, or are even proven wrong, may receive a great number of citations.
All evaluation methods based on citation analysis depend on the contents and quality of the databases from which the citation information is drawn. When considering such indicators, attention should always be paid to which database the calculations are based on, because even the same indicator takes different values in different databases. The number of citations to a particular article varies between databases, and the citing references they record are not always exactly the same. Especially in macro-level evaluations, such as those of research groups, departments and countries, it is important to investigate carefully how the indicators are calculated and what reference data the calculations rest on. It is also worth keeping in mind the interpretative limitations of the indicators and the problems related to them. Evaluations therefore often require several indicators and citation information from more than one database, and they should be complemented with expert assessment.
Citation databases each employ different data collection strategies, which affects both the publications the databases contain and the number of citations those publications receive. Different databases concentrate on different material: journals, books, conference proceedings and other literature. A database normally contains citation information only for the journals it carries. The number of citations also depends on how long a time span of citations the database covers and how often its citation information is updated. The citation information in every database contains some errors: citations may be missing or registered twice. The contents of all databases also change continuously, as database providers add new journals to their collections, remove previously held ones, and may update citation information for older publications as well.
The most important multidisciplinary databases containing citation information are Web of Science (WoS) by Thomson Reuters and Scopus by Elsevier. Citations can also be retrieved from Google Scholar (GS), keeping in mind the limitations of that database: GS contains a large number of non-scientific citations. Citation counts can vary considerably between Web of Science, Scopus and especially GS.
In addition, some field-specific databases contain citation information, such as Chemical Abstracts (SciFinder), CiteSeerX and MathSciNet.
Comparison of Web of Science, Scopus and Google Scholar:

| Feature | Web of Science | Scopus | Google Scholar |
|---|---|---|---|
| Availability | subscription based | subscription based | freely accessible |
| Number of journals | 22 000 peer-reviewed journals | 23 500 peer-reviewed journals | information not publicly available |
| Other contents | conference proceedings, books | conference proceedings, professional magazines, patents and book series | books, pre-prints, theses and dissertations, and webpages |
| Main disciplines | Natural Sciences, Technology, Social Sciences, Fine Arts and Humanities | Physics, Technology, Health Sciences, Biosciences, Fine Arts and Humanities | information not publicly available |
| Time span | from 1900 (Science), from 1956 (Social Sciences) and from 1975 (Arts and Humanities) | records back to 1788 | information not publicly available |
| Updates | weekly | daily | not publicly available, but roughly monthly |
| Collection policy | public | public | not publicly available; contracts with the most significant publishing houses |
| Citation analysis | Citation Report tool | View citation overview tool | search results with a 'Cited by' link listing all publications that cite the publication in question |
| Time span of citation information | from 1900 (Science), from 1956 (Social Sciences) and from 1975 (Arts and Humanities); citation statistics available at Oulu University Library for the whole period, but the citing articles only from 1975 | cited references dating back to 1970 | information not publicly available |
| | Web of Science | Scopus | Google Scholar |
|---|---|---|---|
| Indicators | Journal Citation Reports: Article Influence (AI), Eigenfactor, H-index, Immediacy Index, Impact Factor (IF) | H-index, Raw impact per publication (RIP), SCImago Journal Rank (SJR), Source normalized impact per paper (SNIP), Field-Weighted Citation Impact | H-index |
| Tools | Journal Citation Reports, Eigenfactor, ScienceWatch | SciVal, SCImago Journal and Country Rank, CWTS Journal Indicators | Publish or Perish |
| University rankings | Shanghai Ranking, i.e. Academic Ranking of World Universities (ARWU); review of the state of scientific research in Finland by the Academy of Finland | Times Higher Education World University Rankings, QS World University Rankings, Webometrics | Webometrics |
| Researcher profile | ResearcherID | Scopus Author Identifier (also Scopus Affiliation Identifier) | Google Scholar Profile |
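The H-index, the one indicator reported by all three services, is simple to compute from a list of per-publication citation counts: a researcher (or journal) has index h if h of their publications have each been cited at least h times. A minimal sketch, with made-up citation counts for illustration:

```python
def h_index(citations):
    """Return the largest h such that h publications
    have at least h citations each."""
    # Sort citation counts in descending order, then find the last
    # rank at which the count still meets or exceeds the rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Example: five papers with these citation counts (illustrative only).
# Three papers have at least 3 citations each, but only three papers
# have 4 or more, so the index is 3.
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

The Journal Impact Factor reported in Journal Citation Reports rests on similarly simple counting: citations received in a given year by items the journal published in the two preceding years, divided by the number of citable items published in those years.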
From the perspective of citation analysis, what matters most is the number of records enhanced with cited references and the total number of cited references included in the database used. In addition to contributing citation data, citation-enhanced databases may serve as platforms providing analytical tools for bibliometric analysis. Both of these contributions are beset with methodological and technical difficulties, including limited coverage of the scholarly literature, inconsistent and inaccurate data, and limited facilities for browsing, searching and analyzing data. Most of these difficulties arise because bibliographic databases are primarily designed for information retrieval; bibliometric analysis is only a secondary use of these systems.
Both Web of Science and Scopus have master records with cited references, and they show the bibliographic and reference details of the citing records. Web of Science is available in different versions, whose years of coverage and included citation indexes depend on the subscribing institution's licence; it nevertheless always includes all the cited references for every record created, irrespective of publication year and subscription. Scopus includes cited reference information for papers published from 1970 onward.
Google Scholar's coverage is unknown and may be uneven across different fields of science. Additionally, older publications are poorly covered, and non-scholarly literature and citations are included.