Science, Technologies, Innovations № 4(28) 2023, pp. 93–99

http://doi.org/10.35668/2520-6524-2023-4-09

Balanchuk I. S. — Head of the Department, Ukrainian Institute of Scientific and Technical Expertise and Information, 180, Antonovycha Str., Kyiv, Ukraine, 03150; +38 (044) 521-09-81; slavira218@gmail.com; ORCID: 0000-0002-5179-7350

Mykhalchenkova O. Ye. — Senior Researcher, Ukrainian Institute of Scientific and Technical Expertise and Information, 180, Antonovycha Str., Kyiv, Ukraine, 03150; +38 (044) 521-09-81; alenasimchuk5566@gmail.com; ORCID: 0000-0001-7784-9668

CRITERIA AND SYSTEMS FOR INDIVIDUAL ASSESSMENT OF A SCIENTIST: WORLD EXPERIENCE

Abstract. The scientific research method of bibliometrics has been used for many years to evaluate and analyze specific areas of research; however, a framework for assessing the results of the scientific work of individual researchers by bibliometric methods has not yet been clearly formulated. This work aims to identify and develop the key indicators of a scientist’s individual performance and the factors that shape that performance.
Today, the most common and most convenient ways to measure the effectiveness of an individual researcher’s work are the number of publications and the number of citations. However, these indicators do not fully reflect the results achieved by researchers in distant fields of knowledge, such as physics and linguistics; a minimal illustration of such indicators is sketched after the abstract.
The presented research develops practical recommendations for selecting and targeting individual indicators and criteria according to their areas of application (the branches of a scientist’s activity). The findings can serve as a basis for further research on this topic, as well as for the development of government methodologies for the individual assessment of scientists in procedures such as competitions and scholarship awards.
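As an illustration of the publication- and citation-based indicators discussed above, the sketch below (not part of the article’s methodology) computes two widely used individual metrics on invented data: the h-index and a simple field-normalized citation score, which divides a researcher’s mean citations per paper by the field’s mean and thereby partially mitigates the cross-field comparison problem noted above. The citation counts and field baselines are hypothetical assumptions made only for this example.

```python
# Illustrative sketch only: citation counts and field baselines are
# hypothetical; this is not data or methodology from the article.

def h_index(citations):
    """h-index: the largest h such that the researcher has h papers
    with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def field_normalized_score(citations, field_mean_citations_per_paper):
    """Mean citations per paper divided by the field's mean citations
    per paper; values above 1 indicate above-average impact for that field."""
    if not citations or field_mean_citations_per_paper <= 0:
        return 0.0
    return (sum(citations) / len(citations)) / field_mean_citations_per_paper

# Hypothetical example: the same citation record looks very different
# against a high-citation baseline and a low-citation baseline.
papers = [25, 18, 12, 7, 4, 1, 0]            # citations per paper (invented)
print(h_index(papers))                       # -> 4
print(field_normalized_score(papers, 20.0))  # e.g. a physics-like baseline
print(field_normalized_score(papers, 4.0))   # e.g. a humanities-like baseline
```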

Keywords: bibliometrics, effectiveness, scientific activity, methodology, system.

REFERENCES

  1. Li, W., Zhang, S., Zheng, Z., Cranmer, S. J., & Clauset, A. (2022). Untangling the Network Effects of Productivity and Prominence among Scientists. Nature Communications, 13, 4907. Retrieved from: https://www.nature.com/articles/s41467-022-32604-6.
  2. Abramo, G., Cicero, T., & D’Angelo, C. (2013). Individual research performance: A proposal for comparing apples to oranges. Journal of Informetrics, 7(2), 528–. https://doi.org/10.1016/j.joi.2013.01.013.
  3. Smit, J., & Hessels, L. (2021). The Production of Scientific and Societal Value in Research Evaluation: A Review of Societal Impact Assessment Methods. Research Evaluation, 30(3), 323–. https://doi.org/10.1093/reseval/rvab002.
  4. Bornmann, L., & Marx, W. (2013). How to Evaluate Individual Researchers Working in the Natural and Life Sciences Meaningfully? A Proposal of Methods Based on Percentiles of Citations. Scientometrics. Retrieved from: https://arxiv.org/ftp/arxiv/papers/1302/1302.3697.pdf. http://dx.doi.org/10.1007/s11192-013-1161-y.
  5. Andrashko, Y. (2017). Evaluation Methods of the Results of Scientific Research Activity of Scientists Based on the Analysis of Publication Citations. Eastern-European Journal of Enterprise Technologies, 3(87), 4–. Retrieved from: https://www.researchgate.net/publication/328902671_Evaluation_methods_of_the_results_of_scientific_research_activity_of_scientists_based_on_the_analysis_of_publication_citations.
  6. Ahangar, H., Siamian, H., & Yaminfirooz, M. (2014). Evaluation of the Scientific Outputs of Researchers with Similar H Index: A Critical Approach. Acta Informatica Medica, 22(4), 255–. Retrieved from: https://www.researchgate.net/publication/268282437_Evaluation_of_the_Scientific_Outputs_of_Researchers_with_Similar_H_Index_a_Critical_Approach.
  7. Government Performance and Results Act of 1993. (1993). Office of Management and Budget. Retrieved from: https://georgewbush-whitehouse.archives.gov/omb/mgmt-gpra/gplaw2m.html.
  8. Fortunato, S., Bergstrom, C., Börner, K., Evans, J., Helbing, D., Milojević, S., et al. (2018). Science of Science. Science, 359(6379). https://www.science.org/doi/10.1126/science.aao0185.
  9. REF 2021: Quality ratings hit new high in expanded assessment. (2022). Retrieved from: https://www.timeshighereducation.com/news/ref-2021-research-excellence-framework-results-announced.
  10. Research Excellence Framework (REF). (2021). University of Nottingham. Retrieved from: https://www.nottingham.ac.uk/english/research/ref-rankings.aspx.
  11. Petersen, A., Wang, F., & Stanley, H. (2010). Methods for Measuring the Citations and Productivity of Scientists across Time and Discipline. Physical Review E, 81, 036114. Retrieved from: https://www.researchgate.net/publication/43020887_Methods_for_measuring_the_citations_and_productivity_of_scientists_across_time_and_discipline. http://dx.doi.org/10.1103/physreve.81.036114.
  12. Sahel, J. (2011). Quality Versus Quantity: Assessing Individual Research Performance. Science Translational Medicine, 3(84), 84cm13. https://doi.org/10.1126/scitranslmed.3002249.