Posts Tagged ‘Scopus’
According to this study on Social Sciences publications, Google Scholar provides “vastly larger citation counts than either Scopus or Web of Science when all results are taken into account, but only slightly larger counts when only scholarly journals are considered”.
The study also deals with the issue of citation counting, saying that “it is relatively easy to falsify citing references to research and create “search engine spam” which artificially inflates citation counts within Google Scholar. While it is unclear as to whether this is occurring deliberately and, if so, to what extent, it remains an issue which should engender cautious use of search engine citation data”.
The study concludes that “Google Scholar may not be as reliable as either Scopus or Web of Science as a stand-alone source for citation data”.
Elaine M. Lasda Bergman. Finding Citations to Social Work Literature: The Relative Benefits of Using Web of Science, Scopus, or Google Scholar. The Journal of Academic Librarianship, Available online 23 October 2012
UK-based start-up Mendeley has announced that the number of queries to its database from external applications has surpassed 100 million per month. More than 240 applications for research collaboration, measurement, visualisation, semantic markup and discovery – all developed in the past year – receive a constant flow of data from Mendeley.
The information fuelling this ecosystem has been crowdsourced by the scientific community itself, somewhat like Wikipedia. Using Mendeley’s suite of document management and collaboration tools, in just three years its global community of 1.9 million researchers has created a shared database containing 65 million unique documents.
This, according to recent studies, covers 97.2 to 99.5 percent of all research articles published. Commercial databases by Thomson Reuters and Elsevier contain 49 million and 47 million unique documents respectively, but access to their databases is licensed to universities for tens of thousands of dollars per year.
Altmetric tracks tens of thousands of article mentions a month across Twitter, the scientific blogosphere and publishers including The Guardian, the NYT and New Scientist. It assigns scientific papers a score derived from this data. Around 10 – 15% of all new papers added to PubMed each month are covered (Altmetric covers articles not found in PubMed too).
Searching in the SciVerse Hub or on ScienceDirect while the app is active will rank articles by their Altmetric score. Relevant information is also shown under the results themselves.
Tap through to see the actual tweets, snippets of blog posts, Mendeley & CiteULike reader counts, and links to news sites.
The four most popular search engines PubMed/MEDLINE, ScienceDirect, Scopus and Google Scholar are investigated to assess which search engine is most effective for literature research in laser medicine. Their search features are described and the results of a performance test are compared according to the criteria (1) recall, (2) precision, and (3) importance.
As expected, the search features provided by PubMed/MEDLINE with a comprehensive investigation of medical documents are found to be exceptional compared to the other search engines.
However, the most effective search engine for an overview of a topic is Scopus, followed by ScienceDirect and Google Scholar.
With regard to the criterion “importance”, Scopus and Google Scholar are clearly more successful than their competitors.
All in all, Scopus is the most effective search engine if one requires only an overview of a topic. For a widespread and in-depth investigation in the area of life science and closely related topics, PubMed/MEDLINE is more appropriate.
Tober, Markus. PubMed, ScienceDirect, Scopus or Google Scholar – Which is the best search engine for an effective literature research in laser medicine? Medical Laser Application, 26(3), August 2011, pp. 139–144.
A former student who enjoyed Scopus at her university calls for a personal subscription model.
“I didn’t know what I had till it was gone. During my PhD at the University of Pittsburgh, I had access to the Scopus database of citation data. I proceeded to use it for various citation analyses. I graduated and moved on, swapping that university affiliation for a collection of at least five others. None of these has access to Scopus. I miss it! I need it to do my research! (…)
Dear Scopus, you know what would make this a lot easier? The ability to buy a personal subscription. I’d buy one. I have money for that. Unfortunately, you only offer access through institutional subscriptions. I’ve talked to librarians at many of my institutions, and they aren’t keen to buy an institutional Scopus subscription… they view it as an upstart “European” also-ran to ISI Web of Science. You and I both know that isn’t true, but it is a really steep hill for me to climb to convince them, as an individual postdoc researcher. Let me have a free personal trial, let me buy a personal subscription.”
Piwowar, Heather. Scopus is better than ISI Web of Science for bulk article-level metrics. Research Remix, Online. Posted on May 8, 2011.
Kristin Whitman has started a series of studies on Web of Science and Scopus for the (excellent) patent community Intellogist.
“The question “which is better” is really unanswerable – first you need to decide what “better” means,” she says.
Some of her findings:
- Coverage: number of active or inactive titles (2/28/2011)
  - Scopus: 29,566 titles, of which 15,175 are unique
  - Web of Science: 18,843 titles, of which 4,452 are unique
  - Common to both: 14,391 titles
- Coverage: type of journals
  - Scopus: 93% scholarly journals – 3% trade – 2% reports – 2% book series
  - WoS: 98% scholarly journals – 2% book series
- Coverage: country of publication breakdown
  - Scopus: US, 30%; UK, 18%; NL, 8%; DE, 7%; FR, 3%; etc.
  - WoS: US, 38%; UK, 17%; DE, 7%; NL, 6%; FR, 3%; etc.
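The unique/common title counts above are internally consistent set arithmetic: 15,175 Scopus-unique plus 14,391 common gives the 29,566 Scopus total, and 4,452 plus 14,391 gives the 18,843 WoS total. A toy sketch of the same computation, using hypothetical title lists rather than the real (much larger) Scopus and WoS coverage lists:

```python
# Hypothetical journal-title sets; the real Scopus/WoS lists hold tens of thousands.
scopus = {"Journal A", "Journal B", "Journal C", "Journal D"}
wos = {"Journal C", "Journal D", "Journal E"}

common = scopus & wos          # titles covered by both databases
scopus_unique = scopus - wos   # Scopus-only titles
wos_unique = wos - scopus      # WoS-only titles

# Each database's total = its unique titles + the common titles.
assert len(scopus) == len(scopus_unique) + len(common)
assert len(wos) == len(wos_unique) + len(common)

print(len(common), len(scopus_unique), len(wos_unique))  # → 2 2 1
```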
To be continued…
Whitman, Kristin. Web of Science vs. Scopus: Which is Better? Intellogist, Online.
Oliver Beauchesne, from the US/Canada-based Science-Metrix, has built a nice visualization of science collaboration: a world map of scientific collaboration.
It is based on Scopus data. For those interested in how scientists are connected geographically, a number of companies already promise to help map the geographic reach of an individual or a discipline, including Springer’s AuthorMapper, Transinsight’s GoPubMed and BioMedExperts.
Van Noorden, Richard. Picture post: world map of scientific collaboration. The Great Beyond (Nature), Posted on January 27, 2011.
Research Trends, the bibliometric newsletter published by Elsevier and based on Scopus data, has been moved to a nice new WordPress platform.
It offers a fresh look and some social features: article ratings, sharing to an impressive range of social tools (except 2collab, which is pretty funny for an Elsevier product!), etc.
A long history…
Peter Jacso, one of the best experts in STM abstract databases, gives his opinion. In his latest publication he compared three tools: Web of Science (WoS), Scopus and Google Scholar (GS).
A few findings and opinions:
- It is quite likely that more and more administrators will ask librarians and other information professionals to churn out metrics-based research evaluation ranking lists for individuals, departments and colleges.
- I am in favor of using metrics-based evaluation. (…) However, because of the shortcomings of these special databases for evaluating individual researchers (as opposed to citation-based subject searching), I am also very much against replacing peer-based evaluation with bibliometric, scientometric and/or informetric indicators when ranking individual researchers, groups of researchers, institutions and countries – whether by the traditional bibliometric indicators (total number of citations, average number of citations per publication) or by the newer ones that combine quantitative and qualitative measures in a single number, such as the original h-index and its many, increasingly refined variants.
- I also have concerns about the level of search skill and the time needed for librarians and other information professionals to engage (…) in these very time-consuming and sophisticated procedures. (…) Still, even such a highly qualified group can leave some methodological issues unexplained, or make mistakes in the search process, in the compilation of data and/or in the data entry process.
- Google Scholar-based metrics: the reason for this indifference is that the hit counts and citation counts delivered by Google Scholar are not worth the paper they are printed on. Its metadata remain a mega mess (Jacso, 2010), and its citation-matching algorithm is worse than those of the cheapest dating services.
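The h-index that Jacso mentions is simple to state: a researcher has index h if h of their papers have at least h citations each. A minimal sketch, using hypothetical citation counts (the three databases would typically return different counts for the same author, which is exactly Jacso’s concern):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:   # the paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one researcher's six papers,
# as might be exported from WoS, Scopus or Google Scholar.
print(h_index([25, 8, 5, 4, 3, 1]))  # → 4
```

Because the index depends only on the per-paper citation counts, any disagreement between databases on those counts (duplicate records, false matches) propagates directly into the metric.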
Jacso, Peter. Savvy Searching. Online Information Review, 34 (6) pp. 972-982.