Archive for December 2010
There is plenty of enthusiasm for search engines like Google from researchers and the general public alike.
Google and Google Scholar are well known for the breadth of the information they search: Google brings in news, factual and opinion-related content, while Google Scholar emphasises scientific content across many disciplines.
But do these search tools give as comprehensive a picture of a particular research field as a specialist database?
This is the question that the team behind ETDEWEB (the Energy Technology Data Exchange – World Energy Base), a specialised scientific database of energy-related information, set out to answer by studying user search results.
The ETDE team compared the results of 15 energy-related queries performed on all three systems – ETDEWEB, Google and Google Scholar – using identical words/phrases.
More than 40,000 search result records from the three sources were evaluated. The study concluded that ETDEWEB is a significant resource for energy experts discovering relevant energy information. Across the 15 searches, nearly 90 per cent of the results in ETDEWEB were not shown by Google or Google Scholar.
Google is certainly a widely used and valuable tool for finding significant ‘non-specialist’ information, and Google Scholar does focus on scientific disciplines.
If a user’s interest is scientific and energy-specific, ETDEWEB continues to hold a strong position in the energy research, technology and development (RTD) information field and adds considerable value in knowledge discovery.
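The core of the study’s overlap analysis can be sketched with plain set arithmetic: for one query, take the set of records each system returned and ask what share of the specialist database’s results appear in neither general-purpose source. The function name, record identifiers and counts below are all invented for illustration; the actual study evaluated more than 40,000 records across 15 queries.

```python
# Hypothetical sketch of a result-overlap analysis for one query.
# Record IDs are invented; a real study would use normalised document
# identifiers (e.g. deduplicated titles or DOIs) to match records.

def unique_fraction(specialist, general_sources):
    """Fraction of specialist-database results found in none of the general sources."""
    seen_elsewhere = set().union(*general_sources)
    unique = specialist - seen_elsewhere
    return len(unique) / len(specialist)

etdeweb = {"rec1", "rec2", "rec3", "rec4", "rec5"}
google = {"rec1", "doc9"}
scholar = {"doc7"}

# 4 of the 5 ETDEWEB records appear in neither Google source.
print(unique_fraction(etdeweb, [google, scholar]))  # prints 0.8
```

In practice the hard part is not the set arithmetic but record matching — deciding when a Google hit and a database record describe the same document.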
Cutler, Debbie (ETDE). Database versus search engine. Research Information, Dec. 2010 / Jan. 2011. online:
Katherine Allen kindly reported on my Online presentation about Science 2.0 in InfoToday…
Allen, Katherine. “Science 2.0”: is there a role for InfoPros? InfoToday, online, posted 16 December 2010.
Open access has become very popular over the last few years. It is evident in the increasing number of scientific journals being made available free to readers on the Internet, and the increasing number of institutions that are building repositories to house the electronic versions of open-access articles written by scholars at their institutions.
The academic and research communities seem to support this movement and their right to obtain easy and free access to publicly funded scientific information.
But, how often do researchers actually use such free publications as readers and how often do they choose to publish in an OA journal or institutional repository?
How trustworthy do they consider those journals and repositories? Would they prefer that OA repositories be more selective?
Although today about 10–15 per cent of scientific peer-reviewed journals are OA, and several declarations encourage institutions to build OA repositories, there is still a long way to go, especially where OA repositories are concerned.
This research tries to determine why acceptance and growth of open access, and of open access repositories in particular, have been so slow. Among its findings:
- OA repositories do not follow any standard procedures for selecting the articles they include.
- The vast majority of survey participants state that they would be open to contributing to OA repositories that followed the selection procedures used in high-reputation subscription-based journals.
- The majority of participants seem well disposed towards acting as strict reviewers for an OA repository.
While we might expect the scientific community to be accustomed to using open access publications, scientists and researchers still seem a little cautious. However, this research shows that they welcome changes that might lead to more credible publications, even if that means their own work will undergo rigorous review.
OA repositories are certainly far more established now than they were in past decades. Still, to win over the scientific community as a whole, steps must be taken to ensure the quality of published information.
Roxana Theodorou. OA Repositories: the Researchers’ Point of View. Journal of Electronic Publishing, Volume 13, Issue 3, December 2010.
This paper introduces two journal metrics recently endorsed by Elsevier’s Scopus: SCImago Journal Rank (SJR) and Source Normalized Impact per Paper (SNIP). SJR weights citations according to the status of the citing journal and aims to measure journal prestige rather than popularity.
It presents the main features of the two indicators, comparing them with each other and with a journal impact measure similar to Thomson Reuters’ journal impact factor (JIF).
The journal impact factor, developed by Eugene Garfield as a tool to monitor the adequacy of coverage of the Science Citation Index, is probably the most widely used bibliometric indicator in the scientific, scholarly and publishing community. However, its extensive use for purposes for which it was not designed has drawn a series of criticisms and prompted efforts to adapt the measure to new user needs.
In January 2010, Scopus endorsed two such measures that had been developed by their partners and bibliometric experts: the SCImago Research Group, based in Spain (…), and the Centre for Science and Technology Studies (CWTS), based in Leiden, the Netherlands (…). The two metrics that were endorsed are SCImago Journal Rank (SJR) and Source Normalized Impact per Paper (SNIP).
Compared to other main fields, the life sciences and health sciences tend to show the highest SJR and RIP (raw impact per paper) values. Compared to the basic, JIF-like RIP, SJR tends to widen the differences between journals and enhances the position of the most prestigious journals, especially – though not exclusively – in the life and health sciences.
The fact that Scopus introduced these two complementary measures reflects the notion that journal performance is a multi-dimensional concept, and that there is no single ‘perfect’ indicator of journal performance.
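To make the contrast concrete, the sketch below compares a raw, JIF-like citations-per-paper count with a prestige-weighted score in which each citation counts in proportion to the current score of the citing journal — the general idea behind SJR. This is only an illustration under invented data: the actual SJR formula adds field normalisation, a damping factor and other refinements, and runs over the full Scopus citation graph.

```python
# Toy comparison of raw impact per paper (RIP-like) with a
# prestige-weighted score (SJR-like). All journal names and citation
# counts are invented for illustration.
# citations[a][b] = number of citations journal a gives to journal b.
citations = {
    "A": {"B": 10, "C": 2},
    "B": {"A": 3, "C": 1},
    "C": {"A": 5, "B": 5},
}
papers = {"A": 10, "B": 10, "C": 10}

# Raw impact per paper: total citations received / papers published.
received = {j: sum(src.get(j, 0) for src in citations.values()) for j in papers}
rip = {j: received[j] / papers[j] for j in papers}

# Prestige-weighted score: iterate so that a citation from a
# highly-scored journal is worth more (simplified power iteration,
# no damping factor).
score = {j: 1.0 for j in papers}
for _ in range(50):
    new = {j: sum(score[src] * cites.get(j, 0)
                  for src, cites in citations.items())
           for j in papers}
    total = sum(new.values())
    score = {j: v / total for j, v in new.items()}  # normalise each round

print({j: round(v, 2) for j, v in rip.items()})
print({j: round(v, 3) for j, v in score.items()})
```

The point of the exercise is that the two measures need not rank journals identically: under the weighted scheme, who cites you matters, not just how often you are cited.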
Lisa Colledge, Félix de Moya‐Anegón, Vicente Guerrero‐Bote, et al. SJR and SNIP: two new journal metrics in Elsevier’s Scopus. Serials: The Journal for the Serials Community, Volume 23, Number 3, November 2010, pages 215–221.