Scientific Research Results Visibility and Assessment

For every researcher it is important that their research results are easy to find, visible to other researchers and readers, and receive as much attention as possible, as these factors have a significant impact on research quality assessments. Accordingly, the visibility of research results (through its relation to received citations) is important not only for authors but also for the institutions they represent, since institutional scientific progress is assessed by the quality of the institution's scientific output (mainly by the quality of scientific publications affiliated with the institution).

Ensuring the Visibility of Research Results

  • Measures to increase the visibility of research results
  • Author identifiers
Measures to increase the visibility of research results
Essential measures to increase the visibility of research results:
  1. Researcher identifiers (ORCID, ResearcherID, Scopus Author ID)
  2. Dissemination of research results via the open access publishing model
  3. Audience-focused dissemination of research results (choosing the most appropriate publishing source in terms of discipline coverage and thematic specificity). Information on how to choose a source for publishing can be found here >>>
  4. Using unofficial information resources to publicize research results (social networks, personal accounts, blogs, etc.)
  5. Developing academic relations and active cooperation
Practical advice for implementing these measures:
  • Create your researcher IDs and profiles (at least an ORCID profile)
  • Try to include your researcher IDs in all academic activities, not only when publishing articles, but also, for example, when applying for research funding.
  • Include your researcher IDs in your personal profiles and social space so that your name is associated with your published research, no matter what search system is being used.
  • Use the same spelling of your last name throughout your academic career.
  • Use a standardized name for your institution: avoid abbreviations of the institution's name, as incorrect attribution of a publication to an institution can also lead to incorrect attribution to the author.
  • Check the information about your publications in international databases and contact the service provider if you notice any errors in your name or contact information.
  • Provide accurate contact information so that other authors can contact you for additional information about the article or for collaboration opportunities.
  • Look for opportunities to conduct research and publish articles in collaboration with recognized and already well-known researchers in the academic community.
  • Attend conferences and other scientific events. This participation will help you to make beneficial connections for future collaboration.
  • Create a Google Scholar author profile.
  • Join academic social networks, e.g. ResearchGate, Academia.edu, LinkedIn. Link the profiles of these networks to each other.
  • Be active in the academic virtual space: get involved in discussions, raise relevant questions and ideas. This will show that you are actively involved in research and will help to publicize your competencies and experience.
  • Provide persistent identifiers (DOI, URN) for your research output in your social networking accounts.
  • When writing an article, mention the main (key) research concepts/topics in the summary. Search engines and citation tracking tools index your article's summary, so more frequent repetition of keywords increases the chances of your article being found, read and, consequently, cited.
  • Publish articles in reputable and popular sources that represent your target audience.
  • Host your unpublished research in open access repositories.
  • Take advantage of social bookmarks by using Mendeley, Zotero, and CiteULike tools.
Author identifiers
Author identifiers help solve problems with author disambiguation and incorrect attribution of publications to an author (or institution), which may occur due to:
  • Changes in the author's last name during a scientific career (e.g. after marriage)
  • High frequency/popularity of the author's last name
  • The complexity of the author's name, leading to a number of possible spelling variations (e.g. one or more middle names and/or surname prefixes)
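
These disambiguation problems can be illustrated with a toy sketch. The normalization rules and names below are illustrative assumptions only, not any database's actual matching algorithm:

```python
import unicodedata

def normalize_author_name(name: str) -> str:
    """Reduce a 'Surname, Given' name to a crude comparison key:
    strip diacritics, lowercase, keep surname plus first initial."""
    # Decompose accented characters and drop the combining marks
    ascii_name = (unicodedata.normalize("NFKD", name)
                  .encode("ascii", "ignore").decode("ascii"))
    surname, _, given = ascii_name.partition(",")
    initial = given.strip()[:1].lower() if given.strip() else ""
    return f"{surname.strip().lower()},{initial}"

# Two spelling variants of the same (hypothetical) author
# collapse to the same key:
print(normalize_author_name("Müller, Katharina"))  # muller,k
print(normalize_author_name("Muller, K."))         # muller,k
```

A key like this catches only the simplest variants; real disambiguation in databases also weighs affiliations, co-authors, and subject areas, which is exactly why persistent author identifiers are preferable.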

ORCID



ORCID (Open Researcher and Contributor ID) is an open, free-of-charge, unique, and persistent researcher identifier designed to distinguish a researcher's activities and results from those of other researchers who carry out similar research and/or have similar names.

The purpose of ORCID is to enable transparent and trustworthy connections between researchers, their contributions (publications, datasets, participation in projects, received research funding, peer-review activities, etc.), and their affiliations throughout their careers and engagement in research, scholarship, and innovation activities.

ORCID – International Standard
ORCID is internationally recognized. This tool, established by an independent non-profit initiative, is used by:
  • research institutions (e.g., CERN, ETH Zurich);
  • research funders;
  • publishers (e.g., Springer Science+Business Media, Elsevier, Wiley);
  • research societies and organizations (e.g., ACM, IEEE);
  • databases (e.g., Web of Science, Scopus).
Accordingly, it is recommended that all researchers create an ORCID profile and use their ORCID identifier in all scientific activities they are engaging in.

ORCID benefits for the researcher:
  • Increases the visibility and findability of researcher’s publications and research data.
  • ORCID is a unique person identifier. It therefore allows a particular person to be identified even in cases of identical researcher names, surname changes, spelling errors, and/or different spelling variants.
  • Links a researcher's career achievements with research outputs (publications, research data, software, etc.).
  • ORCID protects personal and private data.
  • Facilitates communication with funders, associations, publishers, repositories, and registration to conferences or other scientific events.
  • Facilitates staff management within a university, faculty, department, institute, or research group.
  • ORCID is a global system. All information about new publications can be automatically placed in the ORCID profile using the update function.

ORCID profile creation:
  1. Sign up:  free and fast registration (sign-up link).
  2. Login:  fill in the ORCID profile, link it to other IDs (such as your Scopus Author ID or ResearcherID).
  3. Make a list of your published research outputs (manually or by importing publication data from other sources (Google Scholar, Web of Science, Scopus, CrossRef, etc.)).
 
Privacy and data protection are the core principles of ORCID. The visibility of all information can be controlled in accordance with the provisions of the ORCID privacy policy. When signed in, every person can control which profile information and publications are visible to people or groups of people. Only the name and ORCID ID will be made public.
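For illustration, the structure of an ORCID iD can itself be checked programmatically: an iD is 16 characters, and its final character is an ISO 7064 MOD 11-2 check digit computed from the first 15. A minimal sketch, using the sample iD published by ORCID:

```python
def orcid_check_digit(base_digits: str) -> str:
    """ISO 7064 MOD 11-2 check character for the first 15 digits
    of an ORCID iD ('X' represents the value 10)."""
    total = 0
    for d in base_digits:
        total = (total + int(d)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Validate a 0000-0000-0000-0000 style ORCID iD."""
    chars = orcid.replace("-", "")
    if len(chars) != 16 or not chars[:15].isdigit():
        return False
    return orcid_check_digit(chars[:15]) == chars[15].upper()

# ORCID's published sample iD passes; a corrupted digit fails:
print(is_valid_orcid("0000-0002-1825-0097"))  # True
print(is_valid_orcid("0000-0002-1825-0098"))  # False
```

Such a check only confirms that a string is well-formed; whether the iD actually belongs to a given researcher must be verified against the ORCID registry itself.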

Other researcher IDs

ResearcherID is a unique researcher identifier (and profile) created in the Web of Science (WoS) environment that allows a researcher to manage their WoS-indexed publications, track citations and the h-index, and avoid author misidentification. ResearcherID can be linked to an ORCID profile.

ResearcherID profile can be created at the registration page. If an ORCID profile has already been created, it is possible to move information from one profile to another, view all variations of the author's name, create a common list of all publications, and so on.

Scopus Author Identifier (Author ID) – in the Scopus database, each author of an indexed publication is assigned a unique identifier (Author ID) and an associated individual profile. As with ResearcherID, Scopus Author ID profile data can be linked to an ORCID profile.

More information about Scopus Author Identifiers can be found here >>>
 

Research Evaluation Method – Scientometrics

Scientometrics – the application of mathematical and statistical methods to the quantitative evaluation of research quality.

Bibliometrics – a branch of scientometrics focused on the quantitative evaluation of scientific publications and their citation data.

Bibliometric analysis – the main method used to assess and/or compare the quality of the scientific output of individual researchers, institutions, or faculties.

For members of the academic community, bibliometric analyses are important and are performed for several main purposes:
  • Assessing the impact of research results when applying for a new job, promotion, salary supplement, or research funding;
  • Evaluating other researchers for collaboration purposes;
  • Choosing a journal for publishing research results;
  • Selecting high-quality scientific information for studies and graduation works;
  • Assessing the relevance of literature sources for subscription;
  • Compiling university rankings.

The results of bibliometric analysis are applied not only within academia: various other decisions are also based on the estimated quality of scientific production, for example the distribution of funding, education policy, study programme planning, technological and innovation development, and economic and social issues.
 
Bibliometric data sources

The main data for bibliometric analyses are scientific publications and their citations. The scientific community now has access to a wide range of sources providing these data, but the most suitable for bibliometric analysis are bibliographic databases (DBs), which contain not the publications themselves but their abstracts and metadata.

Although over the past decade the number and variety of bibliographic DBs has expanded significantly, the best known and most widely used are Web of Science (WoS) (Clarivate Analytics) and Scopus (Elsevier) databases (DB).

Both of these DBs are:
  • multidisciplinary – they contain data on scientific publications of various disciplines and types;
  • selective – only scientific publications published in the highest-quality sources are included (indexed); sources are selected for indexing against strict criteria by commissions set up specifically for this purpose;
  • commercial – full access to indexed content and analysis tools is available only to members of communities subscribing to the DB. VILNIUS TECH subscribes to both of these DBs, but access is possible only from the university's internal network; from an external network they can be reached only via the VPN service. Note that the Scopus DB is partially open and provides information on authors and indexed sources publicly (free of charge).

More information about WoS and Scopus databases is available in the Scientific Communication section on VILNIUS TECH Library website.

Google Scholar (GS) is a publicly available search engine for academic literature. It is the most comprehensive source of academic literature, but its primary data sources and citation data are not completely reliable. By creating a GS account, you can increase the visibility of your publications, track their citations, and view citation sources.

For subject-specific data analyses, it may be more convenient to use data from specialized DBs (e.g., PubMed, EconLit, ERIC, PsycINFO).
 

Scientific Output Assessment Tools – Scientometric Indicators

The main tools for evaluating scientific output are scientometric indicators. These are quantitative estimates of the quality of scientific output (publications) calculated on the basis of mathematical formulas and/or specially designed algorithms.
They are estimated by the data of the publications and their citations, based on the assumption that if the research work has been cited, it proves the significance/impact of the described research and/or its results on scientific progress.
 
  • Scientometric indicators
  • Scientometric indicators usage
Scientometric indicators
Source quality (impact) indicators are the most widely used in the evaluation of scientific results. These indicators are designed to assess the quality of sources (usually journals); they are therefore not indicators of a researcher's performance. Nor do they indicate the quality of an individual article published in the journal: an article published in a journal with a high impact value is not necessarily better than an article published in a journal with a lower value.

Source/journal impact indicators are presented in the main bibliographic DBs – Web of Science (WoS) (Clarivate Analytics) and Scopus (Elsevier).
Both databases provide different Scientometric indicators for evaluating the quality of a source, but they can be considered as equivalents of the same type of indicators in respect of their calculation principle and purpose.


More information about the main journal impact indicators provided by Scopus and WoS, the principles of their calculation, similarities and differences, is provided here >>>

h-index (Hirsch Index) has been developed to assess the productivity of researchers and the quality of their output, but can also be used to assess institutions, sources or any other sets of publications. This indicator is called hybrid because it measures both quality (citations) and quantity (number of publications).
h-index = the largest number h such that h publications (of an individual, institution, or other evaluated entity) have each been cited at least h times.

h-index values for authors are presented in all major bibliographic DBs, such as Google Scholar, Scopus and Web of Science.

Important! Different DBs give different citation counts and h-index values, as these indicators are calculated only from the data indexed in the given DB.
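The definition above can be sketched in a few lines (a toy illustration, not any database's implementation):

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h publications have
    at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank       # the rank-th paper still has >= rank citations
        else:
            break
    return h

# Five papers cited 10, 8, 5, 4 and 3 times: four papers have
# at least 4 citations, but not five with at least 5, so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Running the same function on the citation counts reported by two different DBs for the same author will generally give two different values, which is exactly the caveat noted above.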

More information about the different metrics is provided in the Metrics Toolkit.
 
Scientometric indicators usage
Scientometric indicators have been developed to enable quantitative measurement of quality. The most widely used are source (journal) citation (also called impact) indicators, which were developed to assess the overall quality of a source and to aid more informed decisions on journal subscription or indexing. However, the use of journal impact indicators has spread far beyond their intended purpose: the quality of individual publications and/or researchers is now also judged by the values of these indicators, which is not good practice, because:
  • these indicators estimate the average quality of the scientific output published in the journal (or other periodical source); individual publications in the same source may differ significantly in quality (both fundamentally and in citation levels) and therefore cannot be judged by the journal's overall impact;
  • besides the source in which a publication appears, the quality and/or citation of an individual publication is influenced by many factors, such as research discipline, subject specificity, popularity and relevance of the research, country of implementation and/or publication, language, and even co-authors. Many of these aspects are not taken into account in the calculation of the most widely used journal indicators, which again shows their inadequacy for assessing individual publications.
A new view of the use of bibliometric indicators in recruitment, promotion, funding and other evaluation processes is currently emerging within the scientific community. The evaluation of scientific results should be responsible, fair and appropriate; to achieve this, a review of current research evaluation principles and practices is suggested. The dissemination of scientific output is no longer limited to traditional channels (publications), so the applied impact assessment measures do not adequately reflect the real situation: scientometric (citation) indicators reflect only the scientific quality/impact of research and do not assess its social impact on the general public.

The following should be taken into account when using science metrics:
  • A high number of citations does not necessarily indicate high publication quality. For example, authors may cite an article by criticizing it or refuting its conclusions or results;
  • In some disciplines (e.g. medicine), articles are traditionally cited more often than in others (e.g. humanities);
  • Researchers at different stages of their careers should not be directly compared: experienced researchers tend to have higher indicator values than early-career researchers;
  • Indicator values may vary depending on the databases used for result evaluation.
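
The discipline effect noted above is why field-normalized indicators exist: a publication's citations are compared with the typical citation level of its own field rather than in absolute terms. A toy sketch with invented field averages (illustrative only, not the formula used by WoS or Scopus):

```python
# Invented, illustrative average citation counts per field.
FIELD_AVERAGE = {"medicine": 25.0, "humanities": 3.0}

def normalized_impact(citations: int, field: str) -> float:
    """Citations relative to the field's average; 1.0 means
    'cited exactly as much as a typical paper in this field'."""
    return citations / FIELD_AVERAGE[field]

# The same raw count of 15 citations is below average in medicine
# but far above average in the humanities:
print(normalized_impact(15, "medicine"))    # 0.6
print(normalized_impact(15, "humanities"))  # 5.0
```

This is why comparing raw citation counts across disciplines (or career stages) is misleading: the reference level differs, not necessarily the quality.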

The main principles of the proper use of scientometric indicators in practice are set out in:
DORA – the San Francisco Declaration on Research Assessment.

The main principles of DORA:
  • Eliminate journal-based metrics from assessments of research and researchers;
  • Assess research and researchers using evaluation systems designed specifically for that purpose;
  • Use new technologies to improve scientific communication and the evaluation of scientific output.
The Leiden Manifesto for research metrics. 
 

Alternative Publication Quality Assessment Indicators (Altmetrics)

With the emergence of new forms of scientific communication and the growing number and variety of electronic publications, the development and application of alternative indicators for the evaluation of scientific results, based on publication usage and popularity data (e.g., number of views, number of downloads, tags and mentions on social networks, blogs, videos, comments, number of shares, etc.), has begun.

Altmetric indicators are first and foremost qualitative indicators that complement traditional research evaluation (scientometric) indicators, as they allow the assessment of not only the scientific but also the social significance and impact of research.

Altmetric data are derived from a wider range of information dissemination sources, used not only by the academic community but also by the general public. Another advantage is that alternative data accumulate much faster than the data used to calculate traditional citation indicators, thus indicating research impact trends more quickly. Alternative assessment indicator providers collect data from open sources.

Useful to know: alternative indicators provide a lot of information, but they do not yet have a precise definition (the NISO Alternative Assessment Metrics (Altmetrics) Initiative is currently developing a standardized one). Accordingly, these indicators are not standardized, so different providers collect different types of data, making it difficult to compare altmetric data sets. Since alternative indicators have only recently begun to gain recognition in academia, they are usually available only for recently published works and may not be provided for earlier ones. Also, altmetric indicators are usually provided only for publications that have a digital object identifier (DOI).

The main providers of altmetric data:

Altmetric is one of the best known altmetric data platforms, covering a wide range of data on the usage and popularity of various types of scientific output. A unique symbol is used to visualize the impact of research results, which can be added (as a badge) to your article, dataset, or other publication.
ImpactStory Profiles – by creating a profile on this page you can follow Twitter mentions and blog posts about your research.

Paperbuzz – by entering the publication’s DOI on this webpage, you can see how many times the publication has been mentioned online. Information is available for publications published since 2017. 

Plum Analytics – is an altmetric data collection and visualization tool owned by Elsevier. Plum Analytics collects data on scientific publications (both published by traditional and Open Access models), datasets, presentations, blogs, and other types of scientific output.
Plum Analytics indicators range consists of five categories:
  • Citations
  • Usage (views)
  • Captures (number of downloads, inclusion of publications in readers' electronic libraries (Mendeley))
  • Mentions (on news sites, blogs, Wikipedia)
  • Social media (likes, shares)


Snowball Metrics is an initiative developed by researchers from universities around the world to set common standards for science assessments. These metrics are not related to any data provider or tool.

Altmetric data monitoring tools:
  • Altmetric bookmarklet – a free browser tool that finds the altmetric indicators of any scientific output (by DOI) (requires installation).
  • Figshare – an online digital repository for academic research that displays Altmetric badges for stored research outputs, allowing users to access a collated record of all of the comments, shares and discussion relating to an item. 
  • ImpactStory profile – lets you easily track the impact of your research by linking the profile to your online published research results.
Useful to know: Altmetric data are provided for all publications in Dimensions DB, and for publications published in MDPI journals.
Altmetric indicators are also available in the VILNIUS TECH Virtual Library. These indicators are displayed at the bottom of the publication entry.

For more information on the various types of metrics, visit the Metrics Toolkit.



VILNIUS TECH Library staff will assist you with any questions related to bibliometric indicators and provide necessary consultations:
  • By e-mail: publikacijos@vilniustech.lt
  • By phone: (8 5) 274 4903
  • By visiting the Scientific Information Department (SRC, room 109)

 