SIKORSKY.Library

High-Level Project Summary

We introduce the possibility of creating a dynamic rating based on the differentiation of evaluators. We have introduced a rating coefficient for evaluating a study that takes into account the reviewers' score, the anti-plagiarism score, the g- and i-indexes, Bill's list, and the Kardashian (k) coefficient. We developed a prototype of a global rating system that contributes to the structuring, accessibility, and popularization of open science. The system has internal formulas, metrics, and entities for calculating the ratings of three parties: the author of the article, the reviewer, and the institution the reviewer represents. The resulting chain of dependencies can respond quickly to changes.

Detailed Project Description

After conducting a comprehensive study of the problem, we created the following system:

A universal rating for research that makes it possible to adapt and structure a large volume of studies.

The point is that each work should have its own assessment. To achieve maximum objectivity, we studied the factors that are taken into account when evaluating a study. The main factors we took into account when creating the coefficient are:


  • rating from reviewers;
  • anti-plagiarism score;
  • i-index;
  • g-index;
  • k-index;
  • Bill's list.
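The factors above can be folded into a single coefficient. The sketch below is a hypothetical illustration: the weights, the normalization to [0, 1], and the halving penalty for Bill's list are our assumptions, not the project's actual formula.

```python
# Hypothetical sketch of combining the listed factors into one rating
# coefficient. All inputs are assumed to be pre-normalized to [0, 1];
# the weights are illustrative assumptions, not the project's real values.

def rating_coefficient(reviewer_score, anti_plagiarism, i_index, g_index,
                       k_index, on_bills_list, weights=None):
    """Return a combined score in [0, 1] from normalized factor scores."""
    if weights is None:
        # Assumed weights; the real system would tune these empirically.
        weights = {"review": 0.4, "plagiarism": 0.3,
                   "i": 0.1, "g": 0.1, "k": 0.1}
    score = (weights["review"] * reviewer_score
             + weights["plagiarism"] * anti_plagiarism
             + weights["i"] * i_index
             + weights["g"] * g_index
             + weights["k"] * k_index)
    # Presence on Bill's list restricts, but does not erase, the score.
    return score * 0.5 if on_bills_list else score
```

For example, a work with a perfect score on every factor would rate 1.0, and 0.5 if its author is currently on Bill's list.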

We have also developed algorithms that reduce the subjectivity of the reviewers' overall assessment. The mechanism introduces a rating system among authors, reviewers, and the institutions those reviewers and authors represent. If a reviewer deliberately gives a biased assessment, other reviewers, through a quick survey, help the system determine how fair that assessment was in their opinion. As soon as bias is identified, the algorithm automatically lowers the rating of the reviewer and of their institution. The assessment is therefore formed not only from specific metrics but also from public evaluation that depends on a set of factors.
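The survey-and-penalty step described above can be sketched as follows. The threshold, penalty sizes, and vote format are assumptions for illustration; the real algorithm's parameters are not specified here.

```python
# Hypothetical sketch of the peer-survey bias check: if most surveyed
# reviewers vote an assessment unfair, the rating of the reviewer and
# (more mildly) of their institution is lowered. Threshold and penalty
# values are illustrative assumptions.

def adjust_reviewer_rating(reviewer_rating, institution_rating, survey_votes,
                           fairness_threshold=0.5, penalty=0.1):
    """Lower ratings when a quick survey flags an assessment as biased.

    survey_votes: list of booleans, True meaning "this assessment was fair".
    Returns the (possibly reduced) reviewer and institution ratings.
    """
    fair_share = sum(survey_votes) / len(survey_votes)
    if fair_share < fairness_threshold:
        reviewer_rating = max(0.0, reviewer_rating - penalty)
        # The institution is penalized less than the reviewer personally.
        institution_rating = max(0.0, institution_rating - penalty / 2)
    return reviewer_rating, institution_rating
```

A majority vote of "unfair" lowers both ratings; a majority of "fair" leaves them untouched.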


Such a system creates a sound basis for fair research evaluation and acts as a catalyst for a fair rating and search system for scientific works.


Why is it relevant:

Unfortunately, modern rankings of research libraries are not objective enough. Citations are often arranged artificially (with the help of colleagues at the same institution), view counts are inflated by bots, good reviews are simply bought, and access to scientific research can be very hard to obtain. All of this creates an unhealthy atmosphere and encourages degradation. Our system therefore has several internal ratings and control methods that suppress these factors. For example, to prevent deliberate under-scoring, we introduced "Bill's list": a list of unscrupulous researchers and reviewers. If a person's ID is in this database, the system will not allow them to influence the evaluation of a work significantly, but they retain the opportunity to reform and leave the list.
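The Bill's list mechanism, restricting influence without removing it and allowing rehabilitation, could look like the following sketch. The IDs, the reduced weight, and the probation length are hypothetical.

```python
# Hypothetical sketch of "Bill's list": flagged reviewers keep a small,
# non-zero influence on evaluations and can leave the list after a run of
# fair reviews. All IDs, weights, and thresholds are assumptions.

BILLS_LIST = {"rev-1042", "rev-2251"}  # IDs of flagged reviewers

def review_weight(reviewer_id, base_weight=1.0, restricted_weight=0.2):
    """Restrict, but do not zero out, the influence of flagged reviewers."""
    return restricted_weight if reviewer_id in BILLS_LIST else base_weight

def rehabilitate(reviewer_id, fair_reviews_in_a_row, required=10):
    """Remove a reviewer from the list after enough consecutive fair reviews."""
    if reviewer_id in BILLS_LIST and fair_reviews_in_a_row >= required:
        BILLS_LIST.discard(reviewer_id)
```

Keeping the restricted weight above zero reflects the design goal stated above: the list limits influence but leaves the door open for correction.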

In addition, the system includes a smart search that selects resources for scientists according to their stated preferences. We have also designed a rating by ordinary readers, so that the system learns which works suit a particular request. The interface lets a user read comments from ordinary readers (not reviewers) and understand whether a work is of interest before reading it, which speeds up the search for information for the layperson.
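One minimal way such a search could combine query relevance with the work's rating and the lay-reader score is sketched below; the field names and the 0.7/0.3 blend are assumptions, not the project's actual ranking formula.

```python
# Hypothetical sketch of the smart search: rank works by keyword overlap
# with the query, weighted by the work's rating and its lay-reader score.
# Field names and the blend weights are illustrative assumptions.

def smart_search(query, works):
    """Return works matching the query, best-scored first."""
    terms = set(query.lower().split())

    def score(work):
        # Count how many query terms appear in the title.
        overlap = len(terms & set(work["title"].lower().split()))
        # Blend expert rating with the ordinary readers' score.
        return overlap * (0.7 * work["rating"] + 0.3 * work["lay_score"])

    return sorted((w for w in works if score(w) > 0), key=score, reverse=True)
```

Works with no term overlap are filtered out, so a query like "open science" surfaces only relevant, well-rated studies.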

The programming language of the experimental program is Python.

For further implementation, we propose the Laravel framework and the PHP language for the server side of the program, the Vue.js framework for the client side, and the MySQL database management system for working with the database.

To get acquainted with our project in more detail, we suggest reading a brief manual that we wrote for you (in additional materials for the project).

It is fair to say that the main advantage of our method is flexibility. We deliberately avoided a percentage method based on reference keywords, since our system is more advanced in this regard: moderation is carried out not by an individual but by a group of several independent methods based on the opinion of the scientific community. This creates an innovative approach, since to our knowledge no one has applied such a method of analysis before. Our experimental code, after successful early-stage tests, demonstrated that the method can be used in practice.

Key point: we have developed a new approach that is based not on the number of evaluators of an article but on the professionalism of each evaluator. That is, we introduce a dynamic rating based on the differentiation of evaluators, so even with a relatively small number of reviews we obtain high estimation accuracy. We analyzed the largest resources working on topics similar to ours and can confidently state that none of them uses the approach we offer.
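Weighting each review by the reviewer's own rating, rather than simply averaging review counts, can be sketched in a few lines. The data shape is a hypothetical assumption.

```python
# Hypothetical sketch of the dynamic rating: an article's score is the
# average of its review scores weighted by each reviewer's rating, so a
# few reviews from highly rated reviewers outweigh many weak ones.
# The dict keys are illustrative assumptions.

def weighted_article_score(reviews):
    """Return the reviewer-rating-weighted mean of review scores."""
    total_weight = sum(r["reviewer_rating"] for r in reviews)
    if total_weight == 0:
        return 0.0  # no credible reviews yet
    return sum(r["score"] * r["reviewer_rating"]
               for r in reviews) / total_weight
```

With this weighting, a single review from a reviewer rated 1.0 fully determines the score when the only other review comes from a reviewer rated 0.0, which is the "professionalism over head count" idea described above.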

Space Agency Data

NASA’s Open-Source Science Initiative

We have processed the sample codes that were provided and improved our test code based on them.


NASA’s Transform to Open Science (TOPS) initiative

This site helped us refine our goals and understand what the final product should be.


From Open Data to Open Science

This resource was used to update our information about the problems to be solved.


Data Chat: Kaylin Bugbee | Earthdata

Thanks to this site, we understood what open science is.


NASA’s EARTHDATA Site

On this site, we analyzed how information can be presented and how it should be structured.


EO Dashboard

We also used this site to define criteria for categorizing studies.


ESA and Open Science


This site, like the one above, helped us clarify the concept of "open science".


CSA Open Science Action Plan

This site, like the one above, helped us clarify the concept of "open science".

With a plan for the implementation of "open science" in hand, we could see which areas would be most relevant in the near future.

Hackathon Journey

Our team came to the hackathon in an unusual way. After the hackathon started, team leader Yaroslav Burlakov personally called each participant and explained why they should take part. This is how our team was formed. Why this topic?

Unfortunately, in modern realities we observe instability in how research is systematized. Imagine a situation where you have to pay to get a good review written. This is exactly what happened to one of our team members.

He needed to publish his research in an international journal, and for this he needed the editor of that journal to write a review.

The conditions were: pay $100 and you get a good review with a high rating; pay $50 and the article appears in the journal, but with an average rating.

If you do not pay, there is no review, and therefore no publication. This means a genuinely strong scientific study that could have helped someone in its field was deliberately held back. That injustice led us to this topic, and we developed this project to remove it. On site we had a wonderful mentor, Anna Viktorovna, and simply the best organizer, Ivan Zagorulko. Vanya helped us from start to finish. It was incredible work on the part of the organizers, and our whole team wants to say a huge thank you: you are the best.

References

  1. https://www.sciencedirect.com/science/article/pii/0304387894000034
  2. https://link.springer.com/chapter/10.1007/3-540-08755-9_9
  3. https://www.tandfonline.com/doi/abs/10.1080/07366987309450078?journalCode=uedp20
  4. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5370619/
  5. https://openscience.in.ua/
  6. https://uk.wikipedia.org/wiki/%D0%92%D1%96%D0%B4%D0%BA%D1%80%D0%B8%D1%82%D0%B0_%D0%BD%D0%B0%D1%83%D0%BA%D0%B0
  7. https://www.unesco.org/en/natural-sciences/open-science

Tags

#NASA, #open_science, #data, #reviewers, #anti_plagiarism_score, #i-index, #g-index, #k-index, #Bills_list, #PHP, #Python, #MySQL, #science, #logic, #World, #research, #new, #website, #2022