Posts Tagged ‘Google Scholar’
According to this study of Social Sciences publications, Google Scholar provides “vastly larger citation counts than either Scopus or Web of Science when all results are taken into account, but only slightly larger counts when only scholarly journals are considered”.
The study also addresses the issue of citation counting, noting that “it is relatively easy to falsify citing references to research and create ‘search engine spam’ which artificially inflates citation counts within Google Scholar. While it is unclear as to whether this is occurring deliberately and, if so, to what extent, it remains an issue which should engender cautious use of search engine citation data”.
The study concludes that “Google Scholar may not be as reliable as either Scopus or Web of Science as a stand-alone source for citation data”.
Elaine M. Lasda Bergman. Finding Citations to Social Work Literature: The Relative Benefits of Using Web of Science, Scopus, or Google Scholar. The Journal of Academic Librarianship, Available online 23 October 2012
According to this study, PubMed and Google Scholar searches often identify different articles. The articles retrieved by Google Scholar were more likely to be classified as relevant, had higher citation counts, and were published in higher-impact-factor journals.
Nourbakhsh, E., Nugent, R., Wang, H., Cevik, C. and Nugent, K. (2012), Medical literature searches: a comparison of PubMed and Google Scholar. Health Information & Libraries Journal, 29: 214–222.
Skepticism of Google Scholar is merited. Google Scholar is lacking as a scholarly search tool because, first and foremost, it is not an abstracting and indexing service like the bibliographic databases frequently recommended by librarians. Those databases have their literature indexed, often by humans, allowing it to be categorized with a controlled vocabulary and subject headings. Google Scholar is a search engine, and as such it searches the full text, bibliographic information, and metadata of electronic documents. The computer programming that allows this to happen lacks the objective eye of a human indexer; consequently, data is sometimes interpreted incorrectly and questionable sources pass through its algorithms. Google Scholar’s methods of document retrieval are contrary to librarians’ understanding and expectation of information organization. Google Scholar’s inability or unwillingness to elaborate on which documents its system crawls, and the uncertain quality of its performance, provide further reasons for information professionals and researchers to be wary of this tool, especially when so many quality databases exist and seem to sufficiently meet scientific information needs.
Gray, Jerry E. Scholarish: Google Scholar and its Value to the Sciences. Issues in Science and Technology Librarianship, Summer 2012. Available from: http://www.istl.org/12-summer/article1.html
Is Google Scholar useful for bibliometrics? A webometric analysis
(2012) Scientometrics, 91 (2), pp. 343-351.
Google Scholar, the academic bibliographic database provided free of charge by the search engine giant Google, has been suggested as an alternative or complementary resource to commercial citation databases like Web of Knowledge (ISI/Thomson) or Scopus (Elsevier). In order to check the usefulness of this database for bibliometric analysis, and especially research evaluation, a novel approach is introduced. Instead of names of authors or institutions, a webometric analysis of academic web domains is performed. The bibliographic records for 225 top-level web domains (TLD), 19,240 university and 6,380 research centre institutional web domains have been collected from the Google Scholar database. About 63.8% of the records are hosted in generic domains like .com or .org, confirming that most of the Scholar data come from large commercial or non-profit sources. Considering only institutions with at least one record, one-third of the other items (10.6% of the global total) are hosted by the 10,442 universities, while 3,901 research centres account for an additional 7.9% of the total. The individual analyses show that universities from China, Brazil, Spain, Taiwan or Indonesia are far better ranked than expected. In some cases, large international or national databases or repositories are responsible for the high numbers found. However, in many others, the local contents, including papers in low-impact journals, popular scientific literature, and unpublished reports or teaching support materials, are clearly overrepresented. Google Scholar lacks the quality control needed for its use as a bibliometric tool; the larger coverage it provides consists in some cases of items not comparable with those provided by other similar databases.
Sunday, April 1, 2012 | 3:00 AM
Most researchers are familiar with well-established journals and conferences in their field. They are often less familiar with newer publications or publications in related fields – there are simply too many! Today, we’re introducing Google Scholar Metrics: an easy way for authors to quickly gauge the visibility and influence of recent articles in scholarly publications.
To get started, you can browse the top 100 publications in several languages, ordered by their five-year h-index and h-median metrics. You can also search for publications by words in their titles. For example, [design], [international law], [salud], and [otolaryngology]. To see which articles in a publication were cited the most and who cited them, click on its h-index number.
Scholar Metrics currently covers many (but not all) articles published between 2007 and 2011. It includes journal articles only from websites that follow our inclusion guidelines as well as conference articles and preprints from a small number of hand-identified sources. For more details, see the Scholar Metrics help page.
Here is hoping Google Scholar Metrics will help authors worldwide as they consider where to publish their latest article.
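For readers less familiar with the two metrics mentioned above, here is a minimal sketch in Python of how an h-index and h-median can be computed from a list of per-article citation counts. This illustrates the standard definitions only; it is not Google’s actual implementation, and the citation counts in the usage example are invented.

```python
def h_index(citations):
    """Largest h such that h articles have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this article still clears the threshold
        else:
            break
    return h

def h_median(citations):
    """Median citation count of the articles in the h-core."""
    core = sorted(citations, reverse=True)[:h_index(citations)]
    if not core:
        return 0
    mid = len(core) // 2
    if len(core) % 2:
        return core[mid]
    return (core[mid - 1] + core[mid]) / 2

# Example: five articles cited 10, 8, 5, 4 and 3 times.
# Four articles have at least 4 citations each, so h = 4;
# the h-core is [10, 8, 5, 4] and its median is 6.5.
print(h_index([10, 8, 5, 4, 3]))   # 4
print(h_median([10, 8, 5, 4, 3]))  # 6.5
```

Scholar Metrics applies this over a five-year publication window, hence the “five-year h-index” wording in the announcement.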
A new study, to be published in Scientometrics, shows that Google Scholar “lacks the quality control needed for its use as a bibliometric tool”.
- universities from China, Brazil, Spain, Taiwan or Indonesia are far better ranked than expected
- in some cases, the local contents, including papers in low impact journals, popular scientific literature, and unpublished reports or teaching supporting materials are clearly overrepresented
Aguillo, Isidro. Is Google Scholar useful for bibliometrics? A webometric analysis. Scientometrics, in press.
The four most popular search engines PubMed/MEDLINE, ScienceDirect, Scopus and Google Scholar are investigated to assess which search engine is most effective for literature research in laser medicine. Their search features are described and the results of a performance test are compared according to the criteria (1) recall, (2) precision, and (3) importance.
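The recall and precision criteria used in such performance tests have standard definitions, sketched below in Python; the document sets in the example are invented for illustration and are not taken from the study.

```python
def recall(retrieved, relevant):
    """Fraction of all relevant documents that the search engine retrieved."""
    return len(retrieved & relevant) / len(relevant)

def precision(retrieved, relevant):
    """Fraction of the retrieved documents that are actually relevant."""
    return len(retrieved & relevant) / len(retrieved)

# Hypothetical example: an engine returns documents {1, 2, 3, 4},
# while the relevant set for the query is {2, 3, 5}.
retrieved = {1, 2, 3, 4}
relevant = {2, 3, 5}
print(recall(retrieved, relevant))     # 2 of 3 relevant docs found
print(precision(retrieved, relevant))  # 2 of 4 results are relevant
```

The third criterion, “importance”, has no single standard formula; studies like this one typically approximate it via citation counts or journal impact.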
As expected, the search features provided by PubMed/MEDLINE with a comprehensive investigation of medical documents are found to be exceptional compared to the other search engines.
However, the most effective search engine for an overview of a topic is Scopus, followed by ScienceDirect and Google Scholar.
With regard to the criterion “importance”, Scopus and Google Scholar are clearly more successful than their competitors.
All in all, Scopus is the most effective search engine if one requires only an overview of the topic. For a widespread and in-depth investigation in the area of life science and closely related topics, PubMed/MEDLINE is more appropriate.
Tober, Markus. PubMed, ScienceDirect, Scopus or Google Scholar – Which is the best search engine for an effective literature research in laser medicine? Medical Laser Application, Volume 26, Issue 3, August 2011, Pages 139-144.
The latest project of Microsoft Research now seems ready to enter the market of free publication search engines… and to compete with Google Scholar, PubGet, FreeFullPDF, etc.
A quick look at it tonight gave me a good first impression:
- a nice interface: nothing original here, but vital features are present
- interesting analytic options are offered: co-author graphs, citation analysis, etc.; see http://academic.research.microsoft.com/About/Help.htm
- content: they claim to have 27 million publications; a quick search gave me fewer hits than in PubGet (“Benfluorex”: 49 in MAS, 105 in PubGet)
Something to investigate further, and to follow…
Comparison of PubMed and GS results for clinical topics in respiratory care.
“Our results suggest that PubMed searches with the Clinical Queries filter are more precise than with the Advanced Search in Google Scholar for respiratory topics. PubMed appears to be more practical to conduct efficient, valid searches, for informing evidence-based patient-care protocols, for guiding the care of individual patients, and for educational purposes”
“GS is inappropriate as the sole alternative for clinicians. (…) For now, the optimal application of Google Scholar may be as an adjunct resource, for known authors and articles, or perhaps for initial searches to quickly find a relevant article”.
Anders, Michael E & Evans, Dennis P. Comparison of PubMed and Google Scholar literature searches. Respiratory Care, May 2010, Vol. 55, N°5, pp. 578-583
A long history…
Peter Jacso, one of the best experts in STM abstract databases, gives his opinion… In his latest publication, he compared three tools: Web of Science (WoS), Scopus and Google Scholar (GS).
A few findings and opinions:
- it is quite likely that more and more administrators will request librarians and other information professionals to churn out metrics-based research evaluation ranking lists about individuals, departments, and colleges
- I am in favor of using metrics-based evaluation. (…) However, because of the shortcomings of these special databases for evaluating individual researchers (as opposed to citation-based subject searching), I am also very much against replacing peer-based evaluation with bibliometric, scientometric and/or informetric indicators alone when ranking individual researchers, groups of researchers, institutions and countries: neither the traditional bibliometric indicators (total number of citations, average number of citations per publication), nor the newer ones that combine quantitative and qualitative measures in a single number, such as the original h-index and its many, increasingly refined variants, should stand in for peer review
- I also have concerns about the level of search skill and the time needed from librarians and other information professionals to engage (…) in these very time-consuming and sophisticated procedures. (…) Still, even such a highly qualified group can leave some methodological issues unexplained, make mistakes in the search process, and/or in the compilation of data, and/or in the data entry process
- on Google Scholar-based metrics: the hit counts and the citation counts delivered by Google Scholar are not worth the paper they are printed on. Its metadata remains a metadata mega mess (Jacso, 2010), and its citation matching algorithm is worse than those of the cheapest dating services
Jacso, Peter. Savvy Searching. Online Information Review, 34 (6), pp. 972-982.