Scientific Research Results Visibility and Assessment
It is important for every researcher that their research results are easily discoverable, visible to other researchers and readers, and receive as much attention as possible, since these factors strongly influence how the quality of the research is assessed, which nowadays is largely determined by the citation rate of the publications describing it.
Accordingly, the visibility of scientific results (because of its impact on citation rates) is important not only for the authors but also for the institutions they represent, since an institution's scientific progress and prestige are likewise assessed on the quality of its scientific output (mainly scientific publications published in the institution's name).
Ensuring the Visibility of Research Results
The key means to increase the visibility of scientific results:
- Researcher Digital Identifiers (ORCID, Researcher ID, Scopus Author ID)
- Dissemination of research findings through an open-access publishing model
- Audience-focused dissemination of research findings (selecting the most appropriate publication in terms of disciplinary coverage and specificity). For information on how to choose a publication, click here >>>
- Use of informal dissemination sources to publicise research results (social networks, personal accounts, blogs, and others)
- Developing academic relations and active cooperation
Practical advice for implementing these means:
- Create your researcher’s ID and profiles (make sure that you have at least an ORCID profile)
- Try to include your researcher ID in all your academic activities, not only when publishing papers, but also, for example, when applying for research funding.
- Include your researcher ID in your personal profile and on social networks so that your name is linked to your published research, whichever search engine is used.
- Use the same spelling of your last name throughout your academic career.
- Use the standardised name of your institution and avoid using abbreviations of your institution's name, as misattribution of a publication to an institution can lead to misattribution to an author (and vice versa).
- Check your publication information in international databases and contact the database provider if you find errors in your name or contact information.
- Provide accurate contact information so that other authors can contact you for further information about the article or for collaboration opportunities.
- Seek out research and publication opportunities yourself, in collaboration with established researchers who are already well known in the academic community.
- Participate in conferences and scientific events. This participation will help you network and collaborate.
- Create a Google Scholar author profile.
- Be active in the academic virtual space, engage in discussions, and raise relevant questions and ideas. This will show that you are actively engaged in scholarly activities and will help you to make your competence and experience known.
- Provide links to the persistent identifiers (DOIs, URNs) of your scientific outputs on your social networking account (see the metadata-lookup sketch after this list).
- When writing an article, mention the main (key) concepts/themes of the research in the abstract. Search engines and citation trackers index your abstract, so repeating keywords increases the chances that your article will be found and cited.
- Publish your articles in highly reputable and popular publications representing your target audience.
- Deposit your unpublished research in open-access repositories.
- Take advantage of social bookmarking using Mendeley, Zotero, and CiteULike tools.
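Persistent identifiers do more than decorate a profile: they make your outputs machine-resolvable. As a minimal sketch (the Crossref REST API is a real public service; the example DOI and the exact fields printed are illustrative assumptions), publication metadata can be looked up by DOI like this:

```python
import json
import urllib.request

def fetch_doi_metadata(doi: str) -> dict:
    """Fetch public metadata for a DOI from the Crossref REST API."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    return payload["message"]  # Crossref wraps the record in a "message" object

# Hypothetical example DOI; replace it with one of your own publications.
record = fetch_doi_metadata("10.1000/xyz123")
print(record.get("title"))                    # publication title(s)
print(record.get("is-referenced-by-count"))   # citations as counted by Crossref
```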
Author identifiers help solve problems of author ambiguity and incorrect attribution of publications to an author (or institution), which may arise due to:
- Changes in the author's last name during a scientific career (e.g., after marriage)
- High frequency/popularity of author's last name
- The complexity of the author’s name, leading to several possible spelling variations (e.g., one or more middle names and/or surname prefixes)
ORCID
ORCID (Open Researcher and Contributor ID) is an open, free-of-charge, unique, and persistent researcher ID dedicated to distinguishing a researcher's activities and results from those of other researchers who carry out similar research and/or have similar names.
The purpose of ORCID is to enable transparent and trustworthy connections between researchers, their contributions (publications, datasets, participation in projects, received research funding, peer-review activities, etc.), and their affiliations throughout their careers and engagement in research, scholarship, and innovation activities.
ORCID – International Standard
ORCID is internationally recognized. This tool, established as an independent, non-profit initiative, is used by:
- research institutions (e.g., CERN, ETH Zurich);
- research funders;
- publishers (e.g., Springer Science+Business Media, Elsevier, and Wiley);
- research societies and organizations (e.g., ACM, IEEE);
- databases (e.g., Web of Science, Scopus).
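Because each ORCID iD is a standardised 16-character identifier whose final character is a checksum (computed with the ISO 7064 MOD 11-2 algorithm described in ORCID's documentation), its structural validity can be checked programmatically. A minimal sketch:

```python
def orcid_checksum_is_valid(orcid: str) -> bool:
    """Validate the final check character of an ORCID iD (ISO 7064 MOD 11-2).

    Accepts the usual hyphenated form, e.g. "0000-0002-1825-0097".
    """
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    total = 0
    for ch in digits[:-1]:  # the first 15 characters must be digits
        if not ch.isdigit():
            return False
        total = (total + int(ch)) * 2
    remainder = total % 11
    result = (12 - remainder) % 11
    expected = "X" if result == 10 else str(result)
    return digits[-1] == expected

# Example iD from ORCID's own documentation: structurally valid.
assert orcid_checksum_is_valid("0000-0002-1825-0097")
```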
ORCID benefits for the researcher:
- Increases the visibility and findability of the researcher’s publications and research data.
- ORCID is a unique personal identifier. It therefore makes it possible to identify a particular person even in cases of identical names, surname changes, spelling errors, and/or different spelling variants.
- Links a researcher's career achievements with research outputs (publications, research data, software, etc.).
- ORCID protects personal and private data.
- Facilitates communication with funders, associations, publishers, repositories, and registration to conferences or other scientific events.
- Facilitates the management of researchers' records within the university, faculty, department, institute, or research group.
- ORCID is a global system. All information about new publications can be automatically placed in the ORCID profile using the update function.
ORCID profile creation:
- Sign up: free and fast registration (sign-up link).
- Login: fill in the ORCID profile, and link it to other IDs (such as your Scopus Author ID or ResearcherID).
- Make a list of your published research outputs (manually or by importing publication data from other sources, such as Google Scholar, Web of Science, Scopus, or Crossref).
Privacy and data protection are the core principles of ORCID. The visibility of all information can be controlled following the provisions of the ORCID privacy policy. When signed in, every person can control which profile information and publications are visible to people or groups of people. Only the name and ORCID ID will be made public.
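Once a profile exists, its public portion is machine-readable through the public ORCID API. The sketch below assumes the documented v3.0 endpoint at pub.orcid.org and uses ORCID's well-known documentation example iD; only data the researcher has marked as public is returned:

```python
import json
import urllib.request

def fetch_public_orcid_record(orcid_id: str) -> dict:
    """Fetch the public record for an ORCID iD from the public ORCID API."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/record"
    request = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Example iD from ORCID's documentation.
record = fetch_public_orcid_record("0000-0002-1825-0097")
# Only information the researcher has marked as public appears in the record.
print(record["orcid-identifier"]["path"])
```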
Other researcher IDs
ResearcherID is a unique researcher identifier (and profile) created in the Web of Science (WoS) database environment that allows a researcher to manage their WoS-indexed publications, track citations and the h-index, and avoid author misidentification. ResearcherID can be linked to an ORCID profile.
A ResearcherID profile can be created on the registration page. If an ORCID profile has already been created, it is possible to move information from one profile to another, view all variations of the author's name, create a common list of all publications, and so on.
Scopus Author Identifier (Author ID) – In the Scopus database, each author of an indexed publication is assigned a unique identifier (Author ID) and an associated individual profile. As with ResearcherID, Scopus Author ID profile data can be linked to an ORCID profile.
More information about Scopus Author Identifiers can be found here >>>
Research Evaluation Method – Scientometrics
Scientometrics is the application of mathematical and statistical methods to the quantitative assessment of scientific research.
Bibliometrics is a branch of scientometrics that focuses on the quantitative assessment of scientific publications and their citation data.
Bibliometric analysis is the main method used to assess and/or compare the quality of scientific output of individual researchers, institutions, and faculties.
For members of the academic community, bibliometric analyses are important and are performed for several main purposes:
- Assessing the impact of research results when applying for a new job, promotion, salary supplement, or research funding;
- Evaluating other researchers for collaboration purposes;
- Choosing a journal for publishing research results;
- Selecting high-quality scientific information for studies and graduation works;
- Assessing the relevance of literature sources for subscription;
- Compiling university rankings, for which bibliometric data are also analyzed.
The results of bibliometric analysis are not only applied within academia, but also in a wide range of other areas, such as funding allocation, education policy, curriculum planning, technological and innovation development, economic development, and social issues.
Bibliometric data sources
The main data for bibliometric analyses are scientific publications and citations. A wide range of sources is currently available to the scientific community, but the most suitable for bibliometric analyses are bibliographic databases (DB), which contain not the publications themselves but their abstracts and metadata.
Although the number and variety of bibliographic databases have grown considerably over the last decade, the best known and most widely used are the Web of Science (WoS) (Clarivate Analytics) and Scopus (Elsevier) databases (DB).
Both of these DBs are:
- Multidisciplinary: they cover different scientific fields and types of scientific publications.
- Selective: only scientific publications published in the highest-quality journals are included (indexed). Publications are selected for indexing by specially appointed committees using strict selection criteria.
- Commercial: full access to the indexed content and analysis tools is available only to members of the DB-subscribing institutional community. VILNIUS TECH subscribes to both DBs, but access is available only from VILNIUS TECH host computers, or outside the University through a VPN service. Note that the Scopus DB is partially open and provides author and publication information to the public (free of charge).
Detailed information on the WoS and Scopus databases can be found in the Scientific Communication section on the VILNIUS TECH Library website.
Google Scholar (GS) is a publicly available academic literature search engine. It is the most widely used source of academic literature, but the primary sources of GS data and citation data are not fully reliable. By creating a GS account, you can increase the visibility of your publications, monitor the citation rate of your publications, and see who has cited your article.
For subject-specific data analyses, it may be more convenient to use data from specialized DBs (e.g., PubMed, EconList, ERIC, PsycINFO, etc.).
Scientific Output Assessment Tools – Scientometric Indicators
The main tools for evaluating scientific output are scientometric indicators: quantitative estimates of the quality of scientific output (publications) based on mathematical formulae and/or specially designed algorithms.
They are determined using data on publications and citations on the assumption that if a scientific paper has been cited, this is evidence of the significance/impact of the research and/or its results on scientific progress.
Source quality (impact) indicators are the most widely used in the evaluation of scientific results. These indicators are designed to assess the quality of publication venues (mostly journals). However, they are not indicators of a researcher's performance, nor do they indicate the quality of an individual article published in a journal: an article published in a journal with a high impact indicator is not necessarily of better quality than an article published in a journal with a lower one.
The quality indicators for publications are available in the main bibliographic databases Web of Science (WoS) (Clarivate Analytics) and Scopus (Elsevier).
Both databases provide different scientometric indicators for assessing the quality of a publication, but in terms of their calculation principles and intended use, they can be seen as equivalent types of indicators.
For detailed information on the key publication quality indicators provided by Scopus and WoS, their calculation principles, similarities, and differences, see here >>>
The h-index (Hirsch index) was developed to assess the productivity of researchers and the quality of their output, but it can also be used to assess institutions, sources, or any other set of publications. This indicator is called hybrid because it measures both quality (citations) and quantity (number of publications).
h-index = the largest number h such that h publications (of an individual, institution, or other evaluated entity) have each been cited at least h times.
h-index values for authors are presented in all major bibliographic DBs, such as Google Scholar, Scopus and Web of Science.
Important! Different DBs provide different numbers of citations and h-index values, as these indicators are determined only from the data indexed in that DB.
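As a minimal illustration of the definition above (a sketch in Python; the citation counts are made up and not tied to any particular database), the h-index can be computed from a list of per-publication citation counts:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h publications have at least h citations each."""
    # Sort citation counts in descending order, then find the last position
    # where the count is still >= its 1-based rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Made-up citation counts for six publications:
print(h_index([12, 9, 5, 3, 2, 1]))  # -> 3 (three papers cited at least 3 times)
```

Running the same function on citation counts exported from different DBs will generally give different h-index values, which is exactly the caveat noted above.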
More information about the different metrics is provided in the Metrics Toolkit.
Scientometric indicators were developed to allow quantitative measurement of quality. The most widely used are source (journal) citation indicators (also called impact indicators), which were designed to assess the overall quality of a source and to support more informed decisions on journal subscription or indexing. However, the use of journal impact indicators has spread far beyond this intended purpose: today the quality of individual publications and/or researchers is also judged by the value of these indicators, which is not good practice, because:
- These indicators estimate the average level of quality of the scientific output published in the journal (or other periodical sources), i.e., individual publications published in the same source may differ significantly in their quality (both fundamentally and by citation levels) and therefore cannot be judged based on the overall impact of the journal;
- In addition to the chosen source in which the publication is published, the quality and/or citation of an individual publication is influenced by several factors, such as research discipline, specificity of the subject, popularity and relevance of the research, country of implementation and/or publication, language, and even co-authors. However, many of these aspects are not considered in the calculation of the most widely used journal indicators, again showing their inadequacy in assessing the quality of individual publications.
A new approach to the use of bibliometric indicators in recruitment, promotion, funding, and other researcher- and research-evaluation processes is now emerging in the scientific community. Evaluation of scientific results should be done responsibly, fairly, and appropriately, and a review of the principles of scientific evaluation has been proposed to achieve these objectives. Scholarly communication is no longer limited to traditional dissemination channels (publications), and the measures used to assess research impact do not sufficiently reflect reality: scientometric (citation) indicators can only reflect the quality/impact of research within academia and do not assess its societal impact, that is, the relevance of the research to the wider public.
The following should be considered when using science metrics:
- A high number of citations does not necessarily indicate high publication quality. For example, authors may cite an article by criticising it or refuting its conclusions or results;
- In some disciplines (e.g., medicine), articles are traditionally cited more often than in others (e.g., humanities);
- Researchers at different stages of their careers should not be directly compared. Experienced researchers tend to have higher indicator values than young researchers;
- The indicator values may vary depending on the database used for the evaluation of the results.
The main principles for the proper use of scientometric indicators in practice are set out in the San Francisco Declaration on Research Assessment (DORA).
The main principles of DORA:
- Eliminate the use of journal-based metrics in the assessment of research and researchers;
- Assess researchers and research using evaluation systems designed for that purpose;
- Take advantage of online technologies to improve scholarly communication and the evaluation of scholarly contributions.
Alternative Publication Quality Assessment Indicators (Altmetrics)
With the emergence of new forms of scholarly communication and the growing number and variety of electronic publications, alternative indicators for evaluating scholarly output based on usage and popularity data (e.g., number of views, number of downloads, hashtags, mentions on social networks, blogs, videos, comments, number of shares, etc.) have started to be developed and applied.
Altmetric indicators are quantified indicators that complement traditional scientometric evaluation by measuring not only the scientific but also the societal relevance and impact of research.
Altmetric data are derived from a wider range of sources for the dissemination of information that is used not only by academia but also by the general public. This has the added advantage that alternative data accumulate much more quickly than traditional citation rates, allowing for faster insight and assessment of trends in research impact. Providers of alternative evaluation indicators collect data from open sources.
Good to know: Alternative indicators provide a wide range of information, but they lack a precise definition (a standardised definition is currently being developed by the NISO Alternative Assessment Metrics (Altmetrics) Initiative). Accordingly, these indicators are not standardised, and different suppliers collect different types of data, making it difficult to compare altmetric datasets. As alternative indicators are only beginning to gain acceptance in academia, they are usually available only for recently published papers, so you may not find them for older papers. In addition, altmetric indicators are usually available only for publications that have a DOI.
The main providers of altmetric data:
Altmetric is one of the best-known altmetric data platforms, covering a wide range of different types of scientific output data on usage and popularity. It uses a unique symbol to visualise the impact of research results. It can be added to your article (as a badge), data set, or other scientific output.
ImpactStory Profiles – by creating a profile on this website, you can track mentions of your research on Twitter and in blog posts.
Paperbuzz – On this website, you can enter the DOI of a publication to see how many times it has been mentioned online. The information is only available for publications published since 2017.
Plum Analytics is a tool for capturing and mapping data for the use of scholarly output, owned by Elsevier and integrated into its platform. Plum Analytics collects data not only on scholarly publications, but also on datasets, open access publications, presentations, blogs, and other types of scholarly output.
The Plum Analytics range of indicators consists of five categories:
- Citations
- Usage (number of clicks/views)
- Captures (number of downloads, inclusion of publications in readers' digital libraries (Mendeley))
- Mentions (on news sites, blogs, Wikipedia)
- Social media (likes, shares)
Snowball Metrics is an initiative developed by university researchers around the world to set common standards for science assessment. These metrics are not linked to any single data provider or tool.
Altmetric data monitoring tools:
- Altmetric bookmarklet is a free tool that, once installed in your browser, shows altmetric indicators (looked up by DOI) for any scientific output; a sketch of such a DOI-based lookup follows this list.
- Figshare is an online digital repository for research that displays Altmetric badges for published data. Users of this repository can access all data from online comments, shares, and related discussions.
- ImpactStory profile – allows you to easily track the impact of your research by linking the profile to your online published research results.
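As a hedged sketch of such a DOI-based lookup: Altmetric has offered a free public details endpoint at api.altmetric.com; the response field shown ("score") is an assumption based on its documented JSON output, and the example DOI is hypothetical.

```python
import json
import urllib.error
import urllib.request

def fetch_altmetric_details(doi: str) -> dict | None:
    """Query Altmetric's public details endpoint for a DOI.

    Returns the JSON record, or None if the DOI has no altmetric data
    (the endpoint answers 404 in that case).
    """
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    try:
        with urllib.request.urlopen(url) as response:
            return json.load(response)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return None
        raise

# Hypothetical DOI; replace it with one of your own publications.
details = fetch_altmetric_details("10.1000/xyz123")
if details:
    print(details.get("score"))  # Altmetric attention score, if present
```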
Altmetric indicators are also available in the VILNIUS TECH Virtual Library. These indicators are displayed at the bottom of the publication entry.
For more information on the various indicators, visit the Metrics Toolkit.
VILNIUS TECH Library staff will assist you with any questions related to bibliometric indicators and provide necessary consultations:
- By e-mail: publikacijos@vilniustech.lt
- By phone: (8 5) 274 4903
- By visiting the Scientific Information Department (SRC, room 109)