Science Intelligence and InfoPros

Little things about Scientific Watch and Information Professionals

Archive for the ‘02: Analysis’ Category

Big data needs Big science

with one comment

A great article by the Webtrends CEO:

We are living in “the age of big data”. (…) No industry is untouched by big data, which is notably transforming the way social networks work today. However, the key factor that will determine success for companies in this age is not simply big data, but big science. (…)

The World Economic Forum’s report on data equated it with an asset such as gold. Others have declared that data is “the new oil.” But, as with gold or oil, data has no intrinsic value. (…) Gold requires mining and processing before it finds its way into our jewelry (…) Oil requires extraction and refinement before it becomes the gasoline that fuels our vehicles. Likewise, data requires collection, mining and, finally, analysis before we can realize its true value for businesses, governments, and individuals alike.

According to IDC, the amount of data that companies are wrestling with is growing at 50 percent per year — or more than doubling every two years. Many organizations are rich in data but poor in insight. That’s where big science comes in. (…)

The collection and mining of massive amounts of digital data currently defines the term big data. Those are processes that businesses largely handle. However, the analysis of that data — that magic ingredient of algorithms and advanced mathematics that bridges the gap between knowledge and insight — is big science. It is where the value is. It is the future. (…)

Put simply, the analysis that big science brings to the table makes big data relevant. I envision big science combining with big data to create big opportunities in three significant ways: real-time relevant content, data visualization, and predictive analytics. (…)
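A quick back-of-the-envelope check of the IDC figure quoted above (my own arithmetic, not from the article): 50 percent annual growth does indeed compound to more than double over two years.

    growth = 1.5 ** 2   # 50% per year, compounded over two years
    print(growth)       # 2.25, i.e. "more than doubling every two years"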

Read the full article at:

Yoder, Alex. Big data is worth nothing without Big science. CNET, May 15, 2012. Available from: http://news.cnet.com/8301-1001_3-57434736-92/big-data-is-worth-nothing-without-big-science/ [Accessed 15 May 2012]

Written by hbasset

May 15, 2012 at 8:52 pm

Posted in 02: Analysis


Journal article mining and semantic search: practices and promises

leave a comment »

An impressive research study, sponsored by the Publishing Research Consortium, which includes interviews with key people from Pfizer, CERN, Mendeley, the British Library, TEMIS, Elsevier, Springer, Nature, Wiley, etc.

Smit, Eefke and van der Graaf, Maurits. Journal Article Mining: a research study into Practices, Policies, Plans… and Promises. PRC, June 2011, 153 pp.

This study, commissioned by the PRC, offers the first comprehensive look at what publishers and others are doing, and plan to do, in both data and text mining of the scholarly, mainly journal, literature. It contains a wealth of fascinating detail from a number of viewpoints, drawn from 29 interviews and 190 detailed responses to a survey.

Written by hbasset

July 21, 2011 at 7:42 pm

What impact are your online resources having? Calculate your ROI

leave a comment »

This toolkit is designed and offered by the JISC to provide a guide to measuring the impacts of online scholarly resources.

It will help content creators, publishers and other information professionals understand the reach of their digital assets.

They can use the kit to help guide them through different aspects of measuring impact, both qualitative, such as focus groups, and quantitative, such as web metrics.
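On the quantitative side, a headline ROI figure usually boils down to benefit over cost. Here is a minimal sketch using the standard formula, with invented numbers; none of this is taken from the toolkit itself:

    # ROI = (benefit - cost) / cost, with hypothetical figures for illustration
    annual_cost = 12_000.0       # licence + hosting per year (assumed)
    downloads = 40_000           # downloads per year (assumed)
    value_per_download = 0.50    # estimated value of one download (assumed)

    benefit = downloads * value_per_download
    roi = (benefit - annual_cost) / annual_cost
    print(f"ROI: {roi:.0%}")     # ROI: 67%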

http://microsites.oii.ox.ac.uk/tidsr/welcome

Written by hbasset

May 24, 2011 at 8:08 pm

Posted in 02: Analysis


Social Media: can “sentiment” be analyzed automatically?

leave a comment »

An excellent white paper by SYNTHESIO.

Summary:

The web has made it possible for brands to discover what people are saying about them online. The next step is finding out whether people are talking positively or negatively about the brand, and why.

How is “sentiment” calculated by automatic software?

Sentiment is not analyzed via artificial intelligence (…). Rather, it is analyzed via a systematic process that involves the use of a sentiment lexicon. This lexicon assigns a degree of positivity or negativity to each word by itself, which is then used to give meaning to the article as a whole.
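A minimal sketch of what such a lexicon-based process can look like; the lexicon and its scores below are invented for illustration, and real systems are far larger:

    # Toy lexicon-based sentiment scorer (illustrative only).
    LEXICON = {"great": 2, "love": 2, "good": 1, "slow": -1, "bad": -2, "awful": -3}

    def sentiment(text: str) -> float:
        """Mean polarity of the lexicon words found in the text."""
        words = text.lower().split()
        scores = [LEXICON[w] for w in words if w in LEXICON]
        return sum(scores) / len(scores) if scores else 0.0

    print(sentiment("the service is great but the app is slow"))  # 0.5 -> mildly positive

Note that such a bare lexicon would score “not good” as positive, which is exactly the kind of flaw the white paper goes on to discuss.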

The white paper explains the various flaws of current automatic sentiment analysis of online media, and how Natural Language Processing (NLP) and semantic technologies will improve these systems.

But according to Synthesio, “the best option is currently combining both machine and man”.

Synthesio. The Truth About Natural Language Processing. March 2011. White Paper, 10 pages. Available online: http://synthesio.com/corporate/wp-content/uploads/2010/11/SYNTHESIO-NLP.pdf

Written by hbasset

April 14, 2011 at 8:19 pm

Bibliometrics can be fun!

leave a comment »

With his world map of scientific collaboration, Olivier Beauchesne, from the US-Canada based Science-Metrix, has built a nice visualization of how research teams connect across the globe.

It is based on Scopus data. For those interested in looking at how scientists are connected geographically, a number of companies already promise to help map the geographic reach of an individual or a discipline. These include Springer’s AuthorMapper, Transinsight’s GoPubMed and BioMedExperts.
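The computation behind such a map is conceptually simple: for every paper, count a link between each pair of geocoded author cities, then draw arcs weighted by frequency. A rough sketch with invented records (Beauchesne's actual pipeline used Scopus affiliation data and geocoding):

    from collections import Counter
    from itertools import combinations

    # Geocoded institution cities per paper (hypothetical data)
    papers = [
        ["Boston", "Montreal"],
        ["Boston", "Montreal", "Paris"],
        ["Paris", "Geneva"],
    ]

    links = Counter()
    for cities in papers:
        for a, b in combinations(sorted(set(cities)), 2):
            links[(a, b)] += 1           # one collaboration link per city pair

    for (a, b), weight in links.most_common():
        print(f"{a} -- {b}: {weight}")   # arc brightness ~ weight on the map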

Van Noorden, Richard. Picture post: world map of scientific collaboration. The Great Beyond (Nature), January 27, 2011. http://blogs.nature.com/news/thegreatbeyond/2011/01/picture_post_world_map_of_scie_1.html

Written by hbasset

February 8, 2011 at 9:02 pm

BioSumm: a summarizer for biology-related texts

leave a comment »

BioSumm is a flexible and modular framework which analyzes large collections of unclassified biomedical texts and produces ad hoc summaries oriented to biological information.

BioSumm is neither a traditional summarizer nor an extractor of dictionary terms. It is designed as a summarizer oriented to the biological domain. Thus, its summaries have both the expressive power of traditional summaries and the domain specificity of the output of a dictionary entry extractor.
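As a rough illustration of the idea (not BioSumm's actual algorithm), an extractive summarizer can bias ordinary sentence scoring toward a domain dictionary, so that biology-dense sentences rise to the top:

    import re

    BIO_TERMS = {"protein", "gene", "enzyme", "receptor", "mutation"}  # stand-in dictionary
    DOMAIN_BOOST = 3.0  # extra weight for dictionary terms (assumed)

    def summarize(text: str, n: int = 1) -> list[str]:
        """Return the n sentences with the highest domain-biased word score."""
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        def score(sentence: str) -> float:
            words = re.findall(r"[a-z]+", sentence.lower())
            return sum(DOMAIN_BOOST if w in BIO_TERMS else 1.0 for w in words) / (len(words) or 1)
        return sorted(sentences, key=score, reverse=True)[:n]

    doc = ("The study was conducted in 2009. "
           "The mutation alters the receptor protein and blocks the enzyme.")
    print(summarize(doc))  # picks the biology-dense second sentence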

The GUI is freely downloadable:
https://dbdmg.polito.it/twiki/bin/view/Public/BioSumm

Update (31/01/2011): New address:
http://dbdmg.polito.it/wordpress/research/bioinformatics/biosumm/

Written by hbasset

January 27, 2011 at 9:20 pm

Posted in 02: Analysis


Real-time science!

leave a comment »

STM publisher Springer has announced the launch of a new free analytics tool, http://realtime.springer.com, which provides multiple visualisations of the usage generated worldwide by Springer’s online products, including journals, books, images and protocols.

Realtime.springer.com aggregates the raw data on downloads of Springer journal articles and book chapters in real time from all over the world, and displays them in a variety of interactive visualisations such as:

  • a map showing where the downloads are coming from
  • a constantly updating keyword tag cloud
  • and a visualisation of total downloads.

In addition, a search feature shows a chart of the downloads and the ‘Top Five Most Downloaded’ list for every journal or book.
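Conceptually, this kind of service is streaming aggregation: each download event carries a location and some keywords, and the counters behind the map, the tag cloud and the totals are updated as events arrive. A minimal sketch with made-up event fields (the real feeds come from SpringerLink, SpringerImages and SpringerProtocols):

    from collections import Counter

    # Hypothetical download events; a live feed would be consumed incrementally
    events = [
        {"country": "DE", "keywords": ["oncology", "genomics"]},
        {"country": "US", "keywords": ["genomics"]},
        {"country": "DE", "keywords": ["materials"]},
    ]

    by_country = Counter()   # drives the world map
    tag_cloud = Counter()    # drives the keyword tag cloud
    total = 0                # drives the total-downloads display

    for event in events:
        by_country[event["country"]] += 1
        tag_cloud.update(event["keywords"])
        total += 1

    print(by_country.most_common(), tag_cloud.most_common(2), total)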

The results provide book authors and journal editors with information on how intensively their content is used.

They gain insight into what topics are trending at the moment, and which areas of the world are currently looking at what type of topics in Springer books and journals.
Librarians get a clear overview of where Springer content is used across many fields.

Realtime.springer.com currently receives input from the information platform SpringerLink, with nearly five million documents from about 41,000 eBooks, 1,160 book series, 2,524 journals and 173 eReference works. Additionally, the tool receives feeds from the SpringerImages database, with more than 2.7 million images, and from SpringerProtocols, the database of reproducible laboratory protocols in the life and biomedical sciences.

Written by hbasset

December 8, 2010 at 9:46 pm

Posted in 02: Analysis


Visualisation tools conference with Microsoft, Elsevier, etc. speakers

leave a comment »

The International Council for Scientific & Technical Information (ICSTI) has announced that its Winter Meeting will be held on February 6-7, 2011, followed on February 8, 2011 by a Workshop on the theme ‘Multimedia and Visualisation Innovations for Science’.

The Workshop will be open to both members and non-members. Both events will be hosted by Microsoft, an ICSTI member, on its Redmond, WA campus.

Multimedia and visualisation tools and technology are seen to offer tremendous opportunity for accelerating scientific discovery.

This workshop will feature leading-edge innovations in science-oriented web multimedia, large-scale data exploration and visualisation, speech and object recognition, image indexing and analysis, human/computer interaction and virtual environments, among other topics.

Presentations will be made by technology, science, and information professionals across the broad spectrum of academia, government, business, and industry.

Visit http://www.icsti.org

Written by hbasset

November 17, 2010 at 8:21 pm

Solutions to Information overload

leave a comment »

The solution to data overload is to provide decision makers with “Intelligent Information”: better organised and structured information rapidly conveyed to the users’ preferred devices, say Thomson Reuters experts in a new study.

Today, as info pros are overwhelmed by exploding data volumes, they tend to employ an overly intuitive decision-making style when faced with unorganised information.

The study indicates that when faced with unsorted, unverified “raw” data, 60% of decision makers will make “intuitive” decisions that can lead to poor outcomes.
On the other hand, when professionals are given more organised, better-structured information that has context, they are able to apply a more rational style that results in better and more consistent decisions.

In order to realise the full potential found in increasing amounts of information, professionals need more intelligent information and better tools, not merely more information, the study concluded.

Thomson Reuters study proposes solutions to the problems of information overload. Information World Review, 12/07/10.
http://www.iwr.co.uk/business/3010343/Thomson-Reuters-study-proposes

Written by hbasset

August 2, 2010 at 5:06 pm

Posted in 02: Analysis


Thomson InCites: new release

leave a comment »

Via one convenient, web-based platform, InCites delivers objective measures of institutional research performance, allowing professionals to make strategic choices that effectively further their research, budgetary, hiring, and market positioning goals.

InCites now creates institution and author profile reports that provide top-level snapshots of performance, and it gives users the ability to dive into the core data for deeper analysis.

It also supports customization of the data and offers flexible tools to manage, refine, share, and save reports.

InCites users can now easily create a picture of their research output and impact at the author and department levels, in addition to the institution level available in previous versions.

Press release: http://thomsonreuters.com/content/press_room/sci/New-Version-InCites

Web site:
http://incites.thomsonreuters.com/

Written by hbasset

July 1, 2010 at 4:47 pm

Posted in 02: Analysis
