
International Journal of Library and Information Science (IJLIS)
ISSN: 2277 3533 (Print), ISSN: 2277 3584 (Online)
Volume 1, Issue 1, January-April (2012), pp. 01-14
IAEME: www.iaeme.com/ijlis.html

JOURNAL IMPACT FACTOR (JIF) IN THE DIGITAL ERA


N. TAMILSELVAN, Chief Librarian & Head, Rathinam Technical Campus, Coimbatore
Dr. S. BALASUBRAMANIAN, Principal, Rathinam Technical Campus, Coimbatore

ABSTRACT
This paper traces journal impact measures and how their evolution culminated in the journal impact factor (JIF) produced by the Institute for Scientific Information. It shows how the various building blocks of the dominant JIF (published in the Journal Citation Reports - JCR) came into being, and argues that these building blocks were all constructed fairly arbitrarily, or for purposes different from those that govern the contemporary use of the JIF. The result is a faulty method, widely open to manipulation by journal editors and to misuse by uncritical parties. The discussion examines some solutions offered to the bibliometric and scientific communities, given how widely this indicator is used at present.

KEYWORDS: Journal Impact Factor

INTRODUCTION
Over the last three decades, librarians and bibliometricians have progressively come to rely on the journal impact factor (JIF). Moreover, interest in this indicator and its derivatives has grown exponentially in the scientific community since 1995. Many researchers have observed that the indicator is driving the publishing strategies of scientists who want to maximize their average impact factor and how, similarly, journal editors aspire to raise their JIF by using strategies that sometimes diverge considerably from widely held beliefs about the basic ethics of science (see, e.g., [SMITH, 1997]). Moreover, it is not uncommon to find these indicators being used to promote researchers (see, e.g., [FUYONO & CYRANOSKI, 2006]). In response, bibliometricians have increasingly tried to tame the beast by suggesting numerous improvements aimed at increasing the validity of the JIF as a quantitative measure. Despite this growing interest, there is, apart from GARFIELD's historical accounts [2006] and intellectual biographies [BENSMAN, 2007], a notable scarcity of contributions to the conceptual history of this important indicator.

METHOD DEVELOPED BY UNITED STATES UNIVERSITY LIBRARIANS
The literature on the use of journal impact measures uniformly concludes that GROSS & GROSS [1927] were the first to develop this method (see, e.g., [ALLEN, 1929; MCNEELY & CROSNO, 1930; GROSS & WOODFORD, 1931; HENKLE, 1938; BRODMAN, 1944; GARFIELD, 1955; RAISIG, 1960]). Gross and Gross sought to address the rising problems of small colleges at a time when "one of the biggest of these [was] the problem of adequate library facility." GROSS & GROSS [1927] raised a question that is still highly relevant today: "What files of scientific periodicals are needed in a college library successfully to prepare the student for advanced work, taking into consideration also those materials necessary for the stimulation and intellectual development of the faculty?" Gross and Gross rhetorically considered the compilation of a list of relevant journals using a subjective approach; this strategy was intended to highlight the advantages of their more objective method: "One way to answer this question would be merely to sit down and compile a list of those journals which one considers indispensable. Such a procedure might prove eminently successful in certain cases, but it seems reasonably certain that often the result would be seasoned too much by the needs, likes and dislikes of the compiler." Thus, the first use of journal impact calculation aimed to facilitate the task of journal selection using objective quantitative methods, which remains a core aspect of the marketing of the most visible commercial product to emerge from this work: Thomson Scientific's (formerly the Institute for Scientific Information, or ISI) Journal Citation Reports (JCR). Moreover, an important feature of this method is that, from the outset, it was developed specifically to cater to the needs of US librarians. For example, in a study of mechanical engineering, MCNEELY & CROSNO [1930] stated: "It will be noted that the list contains three American, one English, two German publications, and one French publication. The result is that the English language publications predominate, but it is assumed that such should be the case for American libraries." As a consequence, the scientific investigator, the editor, and the librarian have one thing in common: they are required to base their decisions on certain objective measures of journal quality. Over the past 50 years, these measures of journal quality, also known as bibliometric indicators, have emerged as the chief quantitative measures of the quality of the research papers published, of their authors, and of the institutions with which these researchers are associated.


IMPACT FACTOR
The impact factor is one of the quantitative tools for ranking, evaluating, categorizing, and comparing journals. It is a measure of the frequency with which the "average article" in a journal has been cited in a particular year or period. The annual impact factor is a ratio between citations and recent citable items published. Thus, the impact factor of a journal is calculated by dividing the number of current-year citations to the source items published in that journal during the previous two years by the number of those source items. The impact factor, often abbreviated IF, is a measure reflecting the average number of citations to articles published in science and social science journals. It is frequently used as a proxy for the relative importance of a journal within its field, with journals with higher impact factors deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information (ISI), now part of Thomson Reuters. Impact factors are calculated yearly for those journals that are indexed in Thomson Reuters' Journal Citation Reports.

OBJECTIVES
Gross and Gross first reported the use of counting references to rank scientific journals. It was Garfield and Sher of the Institute for Scientific Information (ISI) who first suggested how reference counting could measure impact, but the term "impact factor" was not used until the 1961 Science Citation Index (SCI) was published in 1963. The ISI, which was founded by Eugene Garfield, is a Philadelphia-based company and is presently owned by the Thomson Corporation of Toronto. The aim of creating the Journal Impact Factor (JIF) was to help select journals for the SCI. The inventors recognized a core group of highly cited large journals that needed to be covered in the SCI; however, they felt that in this way a small but important group of review journals would go unrecognized. As a result, the JIF was created to compare journals regardless of their size. A by-product of the SCI was the Journal Citation Reports (JCR), which was first published in 1975. From 1975 to 1989, the JCR appeared as supplementary volumes to the annual SCI. From 1990 to 1994 it appeared on microfiche, and in 1995 a CD-ROM edition was launched. The current JCR has two editions, covering journals in science and technology and in the social sciences; together they cover a total of 8,400 journals, with 5,876 journals in the science and technology edition alone. The journals are first scanned using optical character recognition software. To store a research paper in its database, ISI employees highlight the following indicators/fields: author, address, journal title, volume, year, and page number. Next, a computer takes a few bytes of information from each highlighted field to build an identifying code, or 'tag', that is unique to that paper. A similar data capture and tagging process is applied to the references at the end of the paper. Algorithms then compare the citation tags with the article tags already in the database, and each successful match counts as a citation.
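The tagging-and-matching step described above can be pictured with a short Python sketch. This is only an illustration of the general idea: the field choices, tag format, and matching rule here are assumptions, not a description of ISI's actual (proprietary) system.

```python
# Illustrative sketch only: the real ISI/Thomson data-capture pipeline and its
# tag format are proprietary; the fields and formatting below are assumptions.

def make_tag(first_author, journal, volume, year, first_page):
    """Build a compact identifying 'tag' from a few bibliographic fields."""
    return f"{first_author[:4].upper()}|{journal[:6].upper()}|{volume}|{year}|{first_page}"

# Tags for source articles already captured in the database.
article_tags = {
    make_tag("Garfield", "Science", 122, 1955, 108),
    make_tag("Gross", "Science", 66, 1927, 385),
}

# Tags built from the reference list of a newly scanned paper.
reference_tags = [
    make_tag("Garfield", "Science", 122, 1955, 108),  # matches -> counts as one citation
    make_tag("Smith", "BMJ", 314, 1997, 498),         # no match in this toy database
]

citation_matches = sum(1 for tag in reference_tags if tag in article_tags)
print(citation_matches)  # -> 1
```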


The ISI has three standardized measures for tracking the citations and articles received over time. These measures are the impact factor, the immediacy index, and the cited half-life.

JOURNAL IMPACT FACTOR
Librarians and information scientists have been evaluating journals for at least 75 years. Gross and Gross conducted a classic study of citation patterns in the 1920s. Others, including Estelle Brodman with her studies of physiology journals in the 1940s and subsequent reviews of the process, followed this lead. However, the advent of the Thomson Reuters citation indexes made it possible to compile statistical reports by computer, not only on the output of journals but also on citation frequency, and in the 1960s the journal "impact factor" was invented. After using journal statistical data in-house to compile the Science Citation Index (SCI) for many years, Thomson Reuters began to publish Journal Citation Reports (JCR) in 1975 as part of the SCI and the Social Sciences Citation Index (SSCI). Informed and careful use of these impact data is essential; users may be tempted to jump to ill-formed conclusions based on impact factor statistics unless several caveats are considered.

HISTORY AND MEANING OF THE JOURNAL IMPACT FACTOR
Garfield first mentioned the idea of an impact factor in Science in 1955. With support from the National Institutes of Health, the experimental Genetics Citation Index was published, and that led to the 1961 publication of the Science Citation Index. Irving H. Sher and Garfield created the journal impact factor to help select additional source journals. To do this they simply re-sorted the author citation index into a journal citation index. From this simple exercise, they learned that initially a core group of large and highly cited journals needed to be covered in the new Science Citation Index (SCI). Consider that, in 2004, the Journal of Biological Chemistry published 6,500 articles, whereas articles from the Proceedings of the National Academy of Sciences were cited more than 300,000 times that year. Smaller journals might not be selected if selection relied solely on publication count, so the journal impact factor (JIF) was created. The Journal Citation Reports provide a selective list of journals ranked by impact factor for 2004, along with the total number of articles published in 2004, the total number of articles published in 2002 plus 2003 (the JIF denominator), the citations to everything published in 2002 plus 2003 (the JIF numerator), and the total citations in 2004 for all articles ever published in a given journal. Sorting by impact factor allows for the inclusion of many small (in terms of total number of articles published) but influential journals. Obviously, sorting by total citations or by other provided data would result in a different ranking. The journal impact factor was created in the early 1960s by Eugene Garfield and Irving H. Sher to help select a core group of highly cited journals for the Science Citation Index. From its onset it was intended solely to compare journals regardless of their size.


A journal's impact factor is based on two elements: (a) a numerator, the number of citations in the current year to any items published in the journal in the previous two years, and (b) a denominator, the number of substantive articles published in those same two years.

DEFINITION
The impact factor is a measure of the number of citations to articles within scientific journals. It is used to gauge the relative importance of a scientific journal within its field. Eugene Garfield, the founder of the Institute for Scientific Information (ISI), which is now part of Thomson, devised the impact factor. Thomson Scientific calculates impact factors annually for the journals it indexes.
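Restating the numerator and denominator above as a formula (a plain restatement of the definition, not an official Thomson Reuters notation), the impact factor of a journal for year y is:

```latex
\mathrm{IF}_{y} \;=\;
\frac{\text{citations received in year } y \text{ to items the journal published in years } y-1 \text{ and } y-2}
     {\text{number of citable items the journal published in years } y-1 \text{ and } y-2}
```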

CALCULATION FOR JOURNAL IMPACT FACTOR
A = total cites in 1992
B = 1992 cites to articles published in 1990-91 (this is a subset of A)
C = number of articles published in 1990-91
D = B/C = 1992 impact factor

METHOD OF CALCULATION
For a given period (yearly, half-yearly, quarterly, or monthly), the journal impact factor of a journal is the average number of citations received per paper published in that journal during one or more preceding periods. For example (calculating the journal impact factor yearly), if a journal has an impact factor of 5 in 2009, then its papers published in 2007 and 2008 received 5 citations each on average in 2009. The 2009 impact factor of a journal would be calculated as follows:
A = the number of times items published in 2007 and 2008 were cited during 2009 by journals, books, patent documents, theses, project reports, newspapers, conference/seminar proceedings, documents published on the internet, notes, and any other approved documents
B = the total number of "citable items" published by that journal in 2007 and 2008 ("citable items" are usually articles, reviews, proceedings, notes, or any other documents peer-reviewed before publication)
2009 impact factor = A/B
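The same arithmetic can be written out in a few lines of Python. The function name and the sample counts are hypothetical; they simply make the A/B ratio above concrete.

```python
def journal_impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year impact factor: citations received in the census year to items
    published in the two preceding years, divided by the number of citable
    items published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 750 citations received during 2009 to its 2007-2008
# items, and 150 citable items published in 2007-2008.
if_2009 = journal_impact_factor(citations_to_prev_two_years=750,
                                citable_items_prev_two_years=150)
print(if_2009)  # -> 5.0, i.e. the 2007-2008 papers were cited 5 times each on average
```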


New journals, which are indexed from their first published issue, will receive an impact factor immediately after indexing. For example, a journal that published its first issue in June 2011 can receive a journal impact factor from July 2011 onwards. The journal impact factor relates to a specific time period, and it is possible to calculate it for any desired period. The Journal Citation Reports (JCR) show rankings of journals by journal impact factor and, if desired, by discipline, such as mechanical engineering or human resource management.

HOW THE IMPACT FACTOR IS CALCULATED
Scientists have been trying to classify and evaluate the relative importance of scientific and social science journals for many years. One way to do so is to use citation patterns, as developed by Gross and Gross in the 1920s. On this basis, Thomson Reuters created two citation indexes (the Science Citation Index and the Social Sciences Citation Index) whose statistical reports could be compiled directly by computer software. This led to the creation of what is now commonly called the impact factor, which reflects not only the output of journals but also the citation frequency of their articles. The Journal Citation Reports were then created in 1975. The impact factor is a tool that can be used to compare journals. Concretely, it measures the number of times a notional "average article" is cited during a specific period of time. In other words, the impact factor is the ratio between the number of citations made to this average article and the number of items that could be cited because they have been published. The calculation is thus done by dividing the number of citations received in the current year by the number of articles published in the journal during the previous two years. The impact factor is calculated every year for each journal (starting from its second year of publication), and the results for each journal are published in the Journal Citation Reports.

SCIENTIFIC JOURNAL IMPACT FACTOR CALCULATION
A = number of citations in year X to items published in years X-1 and X-2
B = total number of publications in years X-1 and X-2
A/B = IF = impact factor for year X
The impact factor is particularly useful for putting the raw number or frequency of citations into perspective: otherwise, bigger journals would have more articles and more citations, but would not necessarily be as relevant as smaller journals with fewer publications. However, it has its own limitations and should be used with care by informed readers. One important limitation is that the impact factor of a journal does not necessarily indicate the number of citations of a specific article. Another is that the impact factor can depend on many variables, such as the number of references in the average article or the types of articles published and cited (reviews, letters, original articles). The information it provides should thus be used wisely.

TOOLS AND METHODS TO USE WITH IMPACT FACTORS


Journal Citation Reports (JCR) includes not only the impact factor but also a number of related metrics which can be used to evaluate journals (Mulford Library Help Sheet). Additionally, the JCR website provides information on how to compare journals more effectively with respect to such factors as review articles, self-citations, and journal format changes. A research study also strongly recommends the use of additional mathematical criteria to assess journal quality; it concluded that assessing the methodological quality of clinical research articles should take into account the impact factor of the publishing journal in conjunction with citation rates, circulation rates, and low manuscript acceptance rates. However, while such mathematical computations are useful, they can never replace peer review or reading and evaluating the quality of individual published scientific articles.

IMPACT FACTOR AS GIVEN BY EUGENE GARFIELD
It is a measure of the frequency with which the "average article" in a journal has been cited in a given period of time. The impact factor for a journal is calculated based on a three-year period, and can be considered the average number of times published papers are cited up to two years after publication. For example, the 2010 impact factor for a journal would be calculated as follows:
A = the number of times articles published in 2008-2009 were cited in indexed journals during 2010
B = the number of articles, reviews, proceedings, or notes published in 2008-2009
2010 impact factor = A/B
(Note that the 2009 impact factor is actually published in 2010, because it cannot be calculated until all of the 2009 publications have been received; likewise, the 2010 impact factor is published in 2011.)

TITLE CHANGE
A user's knowledge of the content and history of the journal studied is very important for appropriate interpretation of impact factors. Situations such as those mentioned above, and others such as a title change, are very important, and often misunderstood, considerations. A title change affects the impact factor for two years after the change is made. The old and new titles are not unified unless the titles are in the same position alphabetically. In the first year after the title change, the impact factor is not available for the new title unless the data for the old and new titles can be unified. In the second year, the impact factor is split: the new title may rank lower than expected and the old title may rank higher than expected, because only one year of source data is included in each calculation. Title changes for the current year and the previous year are listed in the JCR guide.

Unified 1992 impact factor calculation for a title change:
A = 1992 citations to articles published in 1990-91 (A1 + A2)
A1 = those for the new title
A2 = those for the superseded title
B = number of articles published in 1990-91 (B1 + B2)
B1 = those for the new title
B2 = those for the superseded title
C = unified impact factor (A/B)
C1 = A1/B1 = JCR impact factor for the new title
C2 = A2/B2 = JCR impact factor for the superseded title
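The unified and split calculations above can be compared with a small numeric sketch; all counts below are invented for illustration only.

```python
# Hypothetical counts for a journal that changed its title during 1991.
a1, a2 = 60, 540    # 1992 citations to 1990-91 items: new title vs. superseded title
b1, b2 = 80, 120    # items published in 1990-91: new title vs. superseded title

unified_if = (a1 + a2) / (b1 + b2)   # C  = A/B   -> 3.0
new_title_if = a1 / b1               # C1 = A1/B1 -> 0.75 (ranks lower than expected)
old_title_if = a2 / b2               # C2 = A2/B2 -> 4.5  (ranks higher than expected)

print(unified_if, new_title_if, old_title_if)
```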

IMPORTANT POINTS RELATED TO THE JOURNAL IMPACT FACTOR
- The journal impact factor cannot be calculated for new journals: because the impact factor of a journal is calculated by dividing the number of current-year citations to items published in the previous two years by the number of those items, it can only be calculated after a minimum of three years of publication.
- The journal impact factor is a quotient factor only, not a quality factor.
- The journal impact factor is not related to the quality of content or the quality of peer review; it is only a measure of the frequency with which the "average article" in a journal has been cited in a particular year or period.
- Journals that publish more review articles tend to obtain the highest impact factors.

THOMSON REUTERS JOURNAL SELECTION PROCESS
Thomson Reuters is committed to providing comprehensive coverage of the world's most important and influential journals to meet its subscribers' current awareness and retrospective information retrieval needs. Today Web of Science covers nearly 12,000 international and regional journals and book series in every area of the natural sciences, social sciences, and arts and humanities. But comprehensive does not necessarily mean all-inclusive.


EDITORIAL CONTENT
As mentioned above, an essential core of scientific literature forms the basis for all scholarly disciplines. However, this core is not static: scientific research continues to give rise to specialized fields of study, and new journals emerge as published research on new topics achieves critical mass. Thomson Reuters editors determine whether the content of a journal under evaluation will enrich the database or whether the topic is already adequately addressed in existing coverage. With an enormous amount of citation data readily available to them, and their daily observation of virtually every new scholarly journal published, Thomson Reuters editors are well positioned to spot emerging topics and active fields in the literature.

USING THE THOMSON REUTERS IMPACT FACTOR
The Thomson Reuters impact factor is one of the evaluation tools provided by the Thomson Reuters Journal Citation Reports (JCR). Many features of the JCR can be applied to the real-world task of journal evaluation, and the specific needs of the user ultimately determine which of those components is the most appropriate for the task.

BRADFORD'S LAW
Doomsday predictions about the exponential growth of scientific literature have not come to pass. While the growth has been slower than forecast, it nevertheless warrants concern. Even though the reality of the current situation is not nearly as frightening as had been anticipated, the need to be selective in journal management is all the more imperative. As Bradford's Law predicts, a small percentage of journals accounts for a large percentage of what is published, and an even smaller percentage accounts for what is cited. In other words, there are diminishing returns in trying to cover the literature exhaustively. Careful selection is, therefore, an effective way to avoid "documentary chaos." This term, coined by Samuel C. Bradford, the former librarian of the Science Museum in London, refers to the anxiety that one feels in contemplating the information explosion. Recognizing the need of readers to scan the most significant journals published was the raison d'etre for Current Contents. It is understandable that publishers are concerned that their journals be selected by Thomson Reuters for inclusion in its database. Indeed, it is sometimes argued that the survival of a particular journal depends on the Thomson Reuters decision to cover it in Current Contents. A journal's ultimate success depends upon its quality, distribution, and many other competitive factors, including cost and timeliness. Any one of these factors, including coverage by Thomson Reuters, can make the difference between success and failure.


SCI SEARCH STRATEGIES
There are four different ways to search the printed SCI, each focusing on a different strategy. A search might begin with the Citation Index, which is arranged alphabetically by first author. All publications cited during the designated indexing period are listed under the first author's name, just as they appear in the journals. All authors of articles recorded by Thomson Reuters during the period, on the other hand, are indexed in the Source Index. This index lists the full title of each paper; use the Source Index to find out what an author has published. To research a topic by title word or subject, use the Permuterm Subject Index (PSI). This is essentially a title-word or natural-language index; the CD-ROM version has augmented this capability through SCI's KeyWords Plus, based on recurring words or phrases appearing in a paper's list of cited references, and also includes author keywords. The Corporate Index is arranged geographically. It identifies papers published at a specific institution, and the printed listings are organized both alphabetically and geographically. All of these indexing approaches can be combined when using the online and CD-ROM versions to find, for example, the papers published on a given topic at a particular university or company.

With Journal Citation Reports:

- Librarians can support, evaluate, and document the value of their library research investments.
- Publishers can determine journals' influence in the marketplace, review editorial policies and strategic direction, monitor competitors, and identify new opportunities.
- Authors and editors can identify the most appropriate, influential journals in which to publish.
- Researchers can discover where to find the current reading list in their respective fields.
- Information analysts and bibliometricians can track bibliometric and citation trends and patterns.

With Journal Citation Reports, you can:


- Sort journal data by clearly defined fields: Impact Factor, Immediacy Index, Total Cites, Total Articles, Cited Half-Life, or Journal Title.
- Sort subject category data by clearly defined fields: Total Cites, Median Impact Factor, Aggregate Impact Factor, Aggregate Immediacy Index, Aggregate Cited Half-Life, Number of Journals in Category, or Number of Articles in Category.
- View a journal's impact with a five-year Impact Factor trend graph.
- Understand a journal's citation influence and prestige with Eigenfactor Metrics: five-year metrics that consider scholarly literature as a network of journal-to-journal relationships.
- Visualize impact factor by journal category with impact factor box plots.
- Rank journals in multiple categories.
- See how journal self-citations affect the impact factor.
- Use full integration with ISI Web of Knowledge to link from Web of Science to JCR Web; from JCR journal records to ulrichsweb.com and recent Current Contents Connect tables of contents; and to and from your library's OPAC.

VALIDITY

- The impact factor is highly discipline-dependent. The percentage of total citations occurring in the first two years after publication varies highly among disciplines, from 1-3 percent in the mathematical and physical sciences to 5-8 percent in the biological sciences.
- The impact factor could not be reproduced in an independent audit.
- The impact factor refers to the average number of citations per paper, but citations do not follow a normal distribution; they follow a Bradford distribution, as predicted by theory. Being an arithmetic mean, the impact factor is therefore not a valid representation of this distribution and is unfit for citation evaluation (a toy illustration follows this list).
- In the short term, especially in the case of low-impact-factor journals, many of the citations to a certain article are made in papers written by the author(s) of the original article. This means that counting citations may be independent of the real "impact" of the work among investigators. Garfield, however, maintains that this phenomenon hardly influences a journal's impact factor. Moreover, a study of author self-citations in the diabetes literature found that the frequency of author self-citation was not associated with the quality of the publications. Similarly, journal self-citation is common in journals dealing with specialized topics that have high overlap in readership and authors, and is not necessarily a sign of low quality or manipulation.
- Journal ranking lists constructed on the basis of the impact factor correlate only moderately with journal ranking lists based on the results of expert surveys.
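The point about skewed distributions can be seen with a toy example. The citation counts below are invented, but the pattern, in which a few highly cited papers dominate the mean, is typical of citation data.

```python
import statistics

# Invented citation counts for ten articles in a hypothetical journal:
# two highly cited papers dominate, most articles are rarely cited.
citations = [0, 0, 1, 1, 1, 2, 2, 3, 15, 75]

print(statistics.mean(citations))    # 10  -> what an impact-factor-style average reports
print(statistics.median(citations))  # 1.5 -> what the typical article actually receives
```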

EDITORIAL POLICIES WHICH AFFECT THE IMPACT FACTOR
A journal can adopt editorial policies that increase its impact factor.

- Journals may publish a larger percentage of review articles, which generally are cited more than research reports. Review articles can therefore raise the impact factor of a journal, and review journals will often have the highest impact factors in their respective fields.
- Conversely, journals may choose not to publish minor articles, such as case reports in medical journals, which are unlikely to be cited and would reduce the average citation per article.
- Journals may change the fraction of "citable items" compared to front matter in the denominator of the IF equation. Which types of articles are considered "citable" is largely a matter of negotiation between journals and Thomson Scientific, and as a result of such negotiations, impact factor variations of more than 300% have been observed. For instance, editorials in a journal are not considered citable items and therefore do not enter into the denominator of the impact factor; however, citations to such items still enter into the numerator, thereby inflating the impact factor (see the numeric sketch after this list). In addition, if such items cite other articles (often even from the same journal), those citations will be counted and will increase the citation count for the cited journal. This effect is hard to evaluate, because the distinction between editorial comment and short original articles is not always obvious: "letters to the editor" might refer to either class.
- Several methods, not necessarily with nefarious intent, exist for a journal to cite articles in the same journal, which will increase the journal's impact factor.

RESPONSES

Because "the impact factor is not always a reliable instrument" in November 2007 the European Association of Science Editors (EASE) issued an official statement recommending "that journal impact factors are used only - and cautiously - for measuring and comparing the influence of entire journals, but not for the assessment of single papers, and certainly not for the assessment of researchers or research programmes". In July 2008, the International Council for Science (ICSU) Committee on Freedom and Responsibility in the conduct of Science (CFRS) issued a "Statement on publication practices and indices and the role of peer review in research assessment", suggesting some possible solutions, e.g. considering penalising scientists for an excessive number of publications per year. In February 2010, the Deutsche Forschungsgemeinschaft (German Foundation for Science) published new guidelines to evaluate only articles and no bibliometric information on candidates to be evaluated in all decisions concerning "...performance-based funding allocations, postdoctoral qualifications, appointments, or reviewing funding proposals, [where] increasing importance has been given to numerical indicators such as the h-index and the impact factor". This decision follows similar ones of the National Science Foundation (US) or the Research Assessment Exercise (UK).[citation needed]

MISUSES OF THE IMPACT FACTOR AS A SOLE CRITERION


- Promotion and tenure decisions (based on the impact factors of the journals in which an author has published)
- Journal selection by researchers for article submissions
- University administrators rating or ranking academic and research programs within and across institutions
- Establishment of journal reputations by their publishers to attract subscriptions and participation by top authors



An impact factor indicates, to some extent, the quality of a journal as a whole. However, the impact factor alone does not indicate the quality of individual articles within a journal, the overall quality of the research performed by authors publishing in journals with impact factors, or the prestige of associated academic departments, research programs, or institutions.

CONCLUSION
The impact factor is a very useful tool for the evaluation of journals, but it must be used with discretion. Considerations include the amount of review or other types of material published in a journal, variations between disciplines, and item-by-item impact. The journal's status with regard to coverage in the ISI databases, as well as the occurrence of a title change, is also very important. The preceding sections have outlined several tools that can support such evaluation. Hoeffel and Garfield expressed the situation succinctly: "Impact factor is not a perfect tool to measure the quality of articles but there is nothing better and it has the advantage of already being in existence and is, therefore, a good technique for scientific evaluation. Experience has shown that in each specialty the best journals are those in which it is most difficult to have an article accepted, and these are the journals that have a high impact factor. These journals existed long before the impact factor was devised. The use of impact factor as a measure of quality is widespread because it fits well with the opinion we have in each field of the best journals in our specialty." Finally, Garfield cautioned that the use of the impact factor to weigh the influence of a paper "amounts to a prediction, albeit coloured by probabilities".

REFERENCES
1. Gannon F. The impact of the impact factor. EMBO Rep 2000; 1:293.
2. Martin-Sempere MJ, Rey-Rocha J and Garzon-Garcia B. Assessing quality of domestic scientific journals in geographically oriented disciplines: Scientists' judgements versus citations. Res Eval 2002; 11:149-54.
3. Buchtel HA. Libraries and the Academy. Cortex 2001; 37:455-6.
4. Davis PM. Where to spend our e-journal money? Defining a university library's core collection through citation analysis. Vol. 2. Baltimore, USA: Johns Hopkins University Press; 2002. p. 155-66.
5. http://www.sciencedirect.com
6. Resource: Science Gateway
7. Eugene Garfield, PhD, Thomson Scientific, 3501 Market St, Philadelphia, PA 19104; garfield@codex.cis.upenn.edu.
8. Impact Factor for Scientific Journals - 2008 [PDF].
9. ISI website.
10. Impact Factor for Scientific Journals - 2007 [PDF, 1.0 MB].
11. Garfield E. 2006. The history and meaning of the journal impact factor. JAMA 295(1):90-93. Available at http://jama.ama-assn.org/cgi/reprint/295/1/90
12. Pringle J. 2008. Trends in the use of ISI citation databases for evaluation. Learned Publishing 21(2):85-91. Available through the UT Libraries catalog at http://utmost.cl.utoledo.edu/record=b2593314
13. Roeser RJ. 2007. The use and misuse of America and the JCR impact factor. International Journal of Audiology 46(10):553. Available through the UT Libraries catalog at http://utmost.cl.utoledo.edu/record=b2593314
14. Current Contents print editions, July 18, 1994, when Thomson Reuters was known as the Institute for Scientific Information (ISI).
15. Garfield E. The impact factor. Current Contents (25):3-7, 20 June 1994.
16. Garfield E. Prestige versus impact: Established images of journals, like institutions, are resistant to change. Essays of an Information Scientist. Philadelphia: ISI Press, 1989. Vol. 10. p. 263-4.

