- 20 February 2017
See also Altmetrics | Bibliographic citation software | Citation analysis | Eugene Garfield | ImpactStory | InCites research analytics | Scopus vs. Web of Science | Webometrics
"...the word "bibliometrics" first appeared in print in 1969 in Alan Pritchard's article, "Statistical bibliography or bibliometrics?" in the December issue of J Documentation..." — Hertzel, 2003
Bibliometrics, as a discipline, dates back to the mid-20th century and the work of Eugene Garfield. The term refers to the measurement and aggregation of publishing and citation data patterns in order to quantify the impact of scholarly activities. Compared to other evaluation methods, bibliometrics offers important advantages and can be used to generate quantitative indicators of collaboration and interdisciplinarity. As analytical tools improve, they can also be used to develop indicators of ‘quality’ and ‘excellence’. These analyses are supported by indicators of varying complexity developed over the last few years. Robust bibliometric analysis requires an understanding of the strengths and limitations of the tools that accomplish data tracking in a digital world. A newer field related to bibliometrics is altmetrics (see also webometrics), which examines new metrics of the social web.
Bibliometrics is not the only term used to refer to the quantitative study of document-related usage and its processes. Informetrics and librametry (coined by Ranganathan, 1948) have also been used. Scientometrics, technometrics, sociometrics and econometrics are fields that overlap with bibliometrics, while webometrics and cybermetrics are newer areas that focus on the communication of information online. Historically, bibliometrics was called “statistical bibliography”. Citation analysis, a term associated with Eugene Garfield, is a related practice built on citation indexing.
According to the chief engineer at Google Scholar, non-elite journals (traditionally, those that have not been cited much) are having a growing impact in the Internet age and are being cited more often, due to the availability of information on the web.
- Bibliometrics: belongs to research in the overlapping but separate areas of "library and information science" (LIS). Its purpose is to index citations in scholarly fields; in the sciences, bibliometrics is also referred to as scientometrics. Simply put, bibliometrics is the quantitative analysis of bodies of literature (such as journal articles, monographs and patents) and their references: citations and co-citations. Using quantitative analysis and statistics to discern patterns of publication in the sciences is a relatively new area. One purpose of bibliometrics is to quantify journal impact relative to other journals in a similar field. Researchers use metrics to determine author influence and to identify relationships between two or more authors. A common way to do this research is to use the Social Science Citation Index, the Science Citation Index and/or the Arts and Humanities Citation Index on Thomson's Web of Science (WoS) or the Web of Knowledge. Although bibliometric research can be done on Google Scholar via its cited-by feature, GS is widely criticized as having inflated citation counts; Peter Jacso calls into question its reliability for this kind of work. Bibliometric analysis of literature allows the study of the foundations of a discipline and (as a robust quantitative approach) augments the findings of more subjective literature reviews. When applied to patent data, this analysis allows the investigation of firm and inventor networks by describing the linkages that are evident in citations to other individuals, firms and technologies. Bibliometrics can be extended to illustrate the most influential citations, how they are related, how strong their relationships are, and how far removed from, or central to, other groupings they are. In other words, the relationships inherent in the intellectual structure of a field or patent space can be rendered graphically.
Co-citation studies can reveal what topics, themes, and research methods are central, or peripheral, to a field, and how they have changed over time.
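The co-citation idea described above can be sketched in a few lines of code: two works are co-cited whenever the same paper cites both, and tallying those pairs over many citing papers reveals which works cluster together. A minimal illustration, using entirely hypothetical paper and author names:

```python
from collections import Counter
from itertools import combinations

# Hypothetical data: each citing paper mapped to the works it references.
citing_papers = {
    "paper1": ["GarfieldA", "PriceB", "PritchardC"],
    "paper2": ["GarfieldA", "PriceB"],
    "paper3": ["PriceB", "PritchardC"],
}

def co_citation_counts(papers):
    """Count how often each pair of works is cited together in one paper."""
    counts = Counter()
    for refs in papers.values():
        # Sorting makes each pair a canonical (a, b) key regardless of order.
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

counts = co_citation_counts(citing_papers)
# ("GarfieldA", "PriceB") appears together in paper1 and paper2 -> count of 2
```

Real co-citation studies run this kind of tally over whole citation indexes and then map the resulting pair strengths graphically to show central and peripheral topics.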
- Research assessment and analytics: the profession of librarianship is constantly evolving. As the information landscape changes, librarians of all stripes must adapt and expand the range of services they provide to remain relevant and valuable to their user communities. One recent change in this regard is the growing demand for bibliometric support, as well as research assessment and analytics, at research institutions. These analytics-based activities (such as altmetrics) seek to understand who produces what research at a given institution, who they collaborate with to produce this knowledge, what their research is about, and what impact it has within their fields and the academy generally. In response, librarians at academic institutions are offering research assessment services. New roles are open to academic librarians in analyzing research within their organizations and in facilitating communication among those who want to incorporate research assessment into their liaison work. The need for research assessment services should be determined in conjunction with the librarians and information professionals who already provide them, and opportunities should be identified to expand these services at other institutions.
Use of bibliographic information in research
Powell and Connaway (2004) suggest the use of bibliographic information in research can be outlined as follows:
- Improving the bibliographic control of a literature
- Identifying a core literature, especially in the journal literature
- Classifying a literature
- Tracing the spread of ideas within, and the growth of, a literature
- Designing more economical information systems and networks
- Improving the efficiency of information handling services
- Predicting publishing trends
- Describing patterns of book use by patrons
Bibliometrics and scientometrics
Bibliometrics and scientometrics are two closely-related fields that aim to measure scientific publications and science in general. Much of the research in this area involves citation analysis, or examining how scholars cite one another in publications. Author citation data can reveal a great deal about scholarly networks and communication, linkages between scholars, and the development of areas of knowledge over time. Modern scientometrics is based on the work of Derek J. de Solla Price and Eugene Garfield.
The field of scientometrics – the science of measuring and analyzing science – took off in 1947 when Derek J. de Solla Price was asked to store a complete set of the Philosophical Transactions of the Royal Society temporarily in his house. He stacked the volumes in chronological order and noticed that the heights of the stacks fit an exponential curve. Price went on to analyze other kinds of scientific data and concluded in 1960 that scientific knowledge had been growing steadily at a rate of 4.7 percent annually since the 17th century. The upshot was that scientific output was doubling roughly every 15 years.
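Price's two figures are consistent with each other, which a one-line calculation confirms: a quantity growing at rate r per year doubles every ln(2)/ln(1+r) years.

```python
import math

annual_growth = 0.047  # Price's estimated 4.7% annual growth rate

# Years needed for exponential growth at this rate to double the total.
doubling_time = math.log(2) / math.log(1 + annual_growth)
# doubling_time is roughly 15.1 years, matching the "every 15 years" figure
```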
As with other scientific approaches, scientometrics and bibliometrics have their limitations. Critics have pointed to deficiencies in the journal impact factor (JIF) as calculated from the Web of Science: journal citation distributions are highly skewed towards established journals; impact factor properties are field-specific and can be manipulated by editors, for example by changing editorial policies; and the calculation process is essentially non-transparent. There is a growing view that, for greater accuracy, journal-level metrics must be supplemented with article-level metrics and peer review. Thomson Reuters replied to such criticism in general terms by stating that "no one metric can fully capture the complex contributions scholars make to their disciplines, and many forms of scholarly achievement should be considered."
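For context on the metric being criticized, the standard two-year JIF is a simple ratio: citations received in year Y to a journal's items from the previous two years, divided by the number of citable items it published in those two years. A sketch with hypothetical numbers:

```python
def journal_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year journal impact factor for year Y: citations received in Y
    to items published in Y-1 and Y-2, divided by the number of citable
    items published in Y-1 and Y-2."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 600 citations received in 2016 to its 2014-2015
# articles, of which 240 counted as citable items.
jif = journal_impact_factor(600, 240)  # -> 2.5
```

The skew criticism above follows directly from this formula: because the JIF is a mean over a highly skewed citation distribution, a handful of heavily cited articles can dominate the ratio while most articles in the journal are cited far less.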
Bibliometrics for the academic unit
- Abramo G, D'Angelo CA, Viel F. Assessing the accuracy of the h- and g-indexes for measuring researchers' productivity. JASIS. 2013.
- Brown T. Journal quality metrics: options to consider other than impact factors. Am J Occup Ther. 2011;65:346–350.
- Bauer HH. Science in the 21st century: knowledge monopolies and research cartels. J Sci Explor. 2004;18(4):643–660.
- Borgman CL, Furner J. Scholarly communication and bibliometrics. Ann Rev Info Sci Tech. 2002;36:2–72.
- Bornmann L. Citation counts for research evaluation: standards of good practice for analyzing bibliometric data and presenting and interpreting results. Eth Sci Envir Polit. 2008;8:93–102.
- Brodman E. Methods of choosing physiology journals. Bull Med Libr Assn. 1944;32:479-83.
- Butler L. Using a balanced approach to bibliometrics: quantitative performance measures in the Australian Research Quality Framework. Ethics Sci Environ Polit. 2008;8:83–92.
- Cheung WWL. The economics of post-doc publishing. Eth Sci Envir Polit. 2008;8:41–44.
- Colaco M, Svider PF, Mauro KM, Eloy JA, Jackson-Rosario I. Is there a relationship between NIH funding and research impact in academic urology? J Urology. 1 March 2013.
- Gross PLK, Gross EM. College libraries and chemical education. Science. 1927;66:385-9.
- Cronin B. Bibliometrics and beyond: some thoughts on web-based citation analysis. J Information Science. 2001;27:1–7.
- Culyer T. Measuring IHPME’s impact: bibliometrics and beyond. Some thoughts for the Institute of Health Policy, Management & Evaluation. 2013?
- Franceschet M. A comparison of bibliometric indicators for computer science scholars and journals on Web of Science and Google Scholar. Scientometrics. 2010;(3):243-258.
- Garfield E. The history and meaning of the journal impact factor. JAMA. 2006;295:90–93.
- Giske J. Benefitting from bibliometry. Ethics Sci Environ Polit. 2008;8:79–81.
- Glassman NR, Sorensen K. Citation management. J Elec Res Med Libr. 2012;9(3):223-231.
- Harnad S. Validating research performance metrics against peer rankings. Ethics Sci Environ Polit. 2008;8:103–107.
- Henderson M. The quantitative crunch: the impact of bibliometric research quality assessment exercises on academic development. Campus-Wide Info Systems. 2009;26(3):149-67.
- Higgins O, Sixsmith J, Barry MM. A literature review on health information-seeking behaviour on the web: a health consumer and health professional perspective. Stockholm: ECDC; 2011.
- Hirsch JE. An index to quantify an individual’s scientific research output. Proc Natl Acad Sci. 2005;102(46):16569–72.
- Hoeffel C. Journal impact factors. Allergy. 1998;53(12):1225.
- Lawrence PA. The mismeasurement of science. Curr Biol 2007;17(15):R583–R585.
- Ismail S. Bibliometrics as a tool for supporting prospective R&D decision-making in the health sciences: strengths, weaknesses and options for future development. RAND, 2009.
- Kear R, Colbert-Lewis D. Citation searching and bibliometric measures: Resources for ranking and tracking. Coll Res Libr. 2011;72(8):470-474.
- Kousha K, Thelwall M. Google Scholar citations and Google Web/URL citations: a multi-discipline exploratory analysis. J Am Soc Inform Sci Tech. 2007;58(7):1055–1065.
- Lawrence PA. Lost in publication: how measurement harms science. Ethics Sci Environ Polit. 2008;8:9–11.
- Li X, Thelwall M, Giustini D. Validating online reference managers for scholarly impact measurement. Scientometrics. 2011;91(2):461-471.
- Martín-Martín A, Orduña-Malea E, Ayllon JM, López-Cózar ED. The counting house: measuring those who count. Presence of Bibliometrics, Scientometrics, Informetrics, Webometrics and Altmetrics in the Google Scholar Citations, ResearcherID, ResearchGate, Mendeley & Twitter. arXiv preprint arXiv:1602.02412. 2016 Feb 7.
- Monastersky R. The impact factor, once a simple way to rank scientific journals, has become an unyielding yardstick for hiring, tenure, grants. Chron High Educ. 2005;52:A12.
- Ogden TL, Bartley DL. The ups and downs of journal impact factors. Ann Occup Hyg. 2008;52(2):73–82.
- Pickard KT. Impact of open access and social media on scientific research. J Participat Med. 2012 Jul 18;4:e15.
- Powell RR, Connaway LS. Basic research methods for librarians (4e). Westport, CT: Libraries Unlimited, 2004.
- Priem J, Hemminger BM. Scientometrics 2.0: toward new metrics of scholarly impact on the social web. First Monday. 2010;15(7).
- Pritchard A. Statistical bibliography or bibliometrics?. J Doc. 1969;25(4):348-349.
- Ranganathan SR. Librametry and its scope. JISSI: International Journal of Scientometrics and Informetrics. 1995;1(1):15-21.
- Rossner M, Van Epps H. Show me the data. J Cell Biol. 2007;179(6):1091–92.
- Seglen PO. Why the impact factor of journals should not be used for evaluating research. BMJ. 1997;314(7079):497–502.
- Spearman CM, Quigley MJ, Quigley MR, Wilberger JE. Survey of the h index for all of academic neurosurgery: another power-law phenomenon? J Neurosurg. 2010 May 14.
- Taylor M, Perakakis P, Trachana V. The siege of science. Ethics Sci Environ Polit. 2008;8:17–40.
- Thelwall M. Bibliometrics to webometrics. J Information Science. 2008;34(4):605-621.
- Todd PA, Ladle RJ. Hidden dangers of a ‘citation culture’. Ethics Sci Environ Polit. 2008;8:13–16.
- Tsikliras AC. Chasing after the high impact. Ethics Sci Environ Polit. 2008;8:45–47.
- Van Noorden R. Metrics: a profusion of measures. Nature. 2010;465:864.
- Vaughan L, Shaw D. A new look at evidence of scholarly citation in citation indexes and from web sources. Scientometrics. 2008;74(2):317–330.