High-Level Project Summary
Odyssey was initiated by six individuals from Colombo, Sri Lanka, as a new platform to measure the effectiveness of ANY open science activity. The Odyssey metric is based on three categories: Feedback based (50%), Author based (20%), and Activity based (30%). Every open science activity is categorized upon submission by branch of science, area of research, subcategory, and activity type. We strongly believe Odyssey will give users a new kind of experience in terms of innovation, effectiveness, and accuracy. Odyssey will be a game changer in open science: the ultimate “social media” of open science.
Link to Final Project
Link to Project "Demo"
Detailed Project Description
What is an open science activity?
An open science activity is an activity whose data is shared openly with the community. The main purpose of sharing the data is to spread knowledge in the fields relevant to the activity. Open science is an initiative created to make research and innovation projects more effective and of higher quality.
What is the Odyssey metric?
The Odyssey metric is a score that evaluates the effectiveness of any given open science project regardless of its outcome. Conceptually, the metric is a weighted average of multiple marking criteria. The index value spans from 0 to 100: the most effective activities are granted higher scores and the rest are measured accordingly. The index value changes over time, so whenever an index value is quoted, the date it was accessed should be quoted alongside it. To showcase the metric, we are building the Odyssey metric platform, a single place for open science project needs where we score activities, provide insights, gather feedback, and offer advanced searches.
Why Odyssey index?
In our research toward the challenge of measuring open science, we were unable to find any platform or metric that evaluates an arbitrary activity on a numbered scale. The Odyssey Index (OI) therefore gives the author/publisher/organizer an easily understandable scoring system for any open science activity. With the Odyssey Index, we aim to assess open science activities and encourage the science community to rely on the assessed activities as sources of information. The main aim of the Odyssey Index is not to create competition but to evaluate each open science activity by its effectiveness in the community.
A to Z on Odyssey index
To calculate the Odyssey Index across many kinds of activities, we need a clear, common submission format. The submission is where the calculations that score an activity happen, so this submission area must be a common ground for every activity.
At first glance, a fully detailed form must be completed in order to submit any activity. The form is designed to accommodate any kind of activity within the scope of open science. The following attributes are to be filled in and are explained below.
- Category - Since we evaluate open science activities from many subject areas, we give the author/publisher the chance to narrow the scope by selecting a category for their project. Biology, Physics, and Chemistry are a few examples.
- Subcategory - To narrow the scope further, each activity also gets a subcategory. The available subcategories depend on the chosen category. For example, if you choose Biology as the main category, you can pick from Zoology, Neurobiology, Marine Biology, and many more subject topics.
- Activity - Here the author/publisher selects the type that best describes their project, again to narrow the scope of the relevant fields. A research paper is one example. The author/publisher chooses the activity type available under the chosen subcategory.
All three selections exist to build the best possible questionnaire for evaluating the project: the narrower the scope, the more precisely the metric can probe the details of the activity. A questionnaire is generated from the category, subcategory, and activity selection, and it is used by the author, the activity participants, and viewers to provide the necessary feedback on the activity.
- Topic - The title of the activity is typed here. Since every activity is submitted in the same format, this is the first item likely to catch a viewer's eye.
- Abstract - As the name suggests, the author/publisher is required to write a short abstract of the activity. This high-level summary will eventually be used to evaluate the activity with a machine learning model; for now it gives viewers a summary of the project.
- Description - Here the author/publisher provides all the information on the activity; the more detailed, the better. It expands in full on what the abstract summarizes.
- Attachment - If the relevant activity contains any materials that may need viewers' attention, those can be attached here as well. For example, if we consider a research paper, the details on the paper can go on the description and the paper itself can be attached here.
- Ideal participant - Here the author/publisher describes the ideal participant for the activity. This is used to evaluate feedback and to power an advanced search function for future authors/publishers. The ideal participant is the kind of person the author/publisher mainly aims to reach. For example, the author of a research paper can define their ideal participant based on the research area or its outcome; age range, hobbies, gender, and similar attributes can all be used in the description. The advanced search feature is explained later in this document.
- Participant count - How many participants were involved in the activity. This number feeds into the marking as a percentile of the feedback received. Depending on the activity, this field is optional.
- References - The sources the author/publisher drew on when developing their ideas. Depending on the activity, this field is optional.
- Citations - Citation information used within the activity. Depending on the activity, this field is also optional.
- Co-author(s) - If the activity has more than one author, the others can be listed here. Additional co-authors can also be added later if the author decides that a certain viewer has given considerable insight into the activity (explained further below).
- Associated with - Any association of the activity can be mentioned here, showing who the activity was organized by or for. Depending on the activity, this field is optional.
- Self-evaluation - Here the author/publisher is required to rate their own work using a questionnaire created by analyzing the category, subcategory, and activity they chose.
The above-explained fields are used to analyze the activity and to showcase it to viewers for feedback.
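A minimal sketch of the submission record these fields describe, as a Python dataclass. All field and class names here are illustrative assumptions, not the platform's actual schema; the optional fields mirror the "not required depending on the activity" notes above.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ActivitySubmission:
    """One submission-form record (hypothetical names, for illustration)."""
    category: str                      # e.g. "Biology"
    subcategory: str                   # e.g. "Marine Biology"
    activity: str                      # e.g. "Research paper"
    topic: str                         # title shown to viewers
    abstract: str                      # < 100 words per the guidelines
    description: str                   # full details, expands on the abstract
    self_evaluation: dict              # questionnaire answers: question id -> score
    attachments: list = field(default_factory=list)
    ideal_participant: Optional[str] = None
    participant_count: Optional[int] = None   # optional depending on activity
    references: list = field(default_factory=list)
    citations: list = field(default_factory=list)
    co_authors: list = field(default_factory=list)
    associated_with: Optional[str] = None
```

The required positional fields enforce that every activity arrives with the same minimum information, which is the "common ground" the scoring described above depends on.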
Once published, an activity is not editable. If the author/publisher wishes to update a submitted activity, the update becomes a new version of the activity. Even as the activity goes through updates, the full version history remains available to the general public.
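The versioning rule above can be sketched as an append-only history: published versions are never mutated, and every version stays readable. This is an illustrative sketch, not the platform's implementation; the class and method names are assumptions.

```python
class Activity:
    """Append-only version history for a published activity (sketch)."""

    def __init__(self, content: str):
        self._versions = [content]          # version 1, immutable once stored

    def publish_update(self, new_content: str) -> int:
        """Updates never edit in place; they append a new version."""
        self._versions.append(new_content)
        return len(self._versions)          # the new version number

    @property
    def latest(self) -> str:
        """Scoring attaches to the latest version."""
        return self._versions[-1]

    @property
    def history(self) -> tuple:
        """Read-only view: all versions stay available to the public."""
        return tuple(self._versions)
```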
When an activity is submitted, a discussion panel and a comment section are opened for it. The discussion section can be used to offer suggestions or provide global feedback on the activity (the term global feedback is explained later). The discussion section resembles a comment section, but the information provided there is taken into consideration for scoring. Comments in the discussion section can only be answered by the author/publisher of the activity, so the whole discussion section is effectively a conversation between the author/publisher and the users who gave feedback. If the author/publisher decides that a certain individual provided valuable feedback and comments, that individual can be added as a co-author. Discussion section activity is taken into consideration when recalculating the Odyssey Index value. If a new version of the activity is published, the discussion page of the previous version is closed and a new one is opened.
The comment section is used to recommend the activity to other users on the metric platform, and those recommendations are the only way the index draws value from the comment section. All users on the platform can comment here, and it serves as a place of discussion among viewers about the activity.
The Odyssey Index value is computed per activity and changes continuously over time as more feedback arrives and as the activity is compared with other available projects. The latest version of an activity is where the Odyssey Index reports its effectiveness value.
After any activity is submitted, a period of 48 hours passes before it is scored by the index. That time is given to gather local feedback (the term local feedback is also explained later).
Metrics used within…
Feedback based
First, we must understand what feedback is and what an event's goal is. In any event, lecture, activity, task, or piece of research, the final goal is to convey a clear idea to the audience. Consider a lecture on a specific topic: the lecturer's sole purpose is to give a clear idea of that topic. Likewise, in any event or activity, the organizer's sole purpose is to convey a clear idea of the activity, and they use various methods to accomplish this. In the end, feedback is a reliable way to ensure the receiver has received the message and interpreted it as the sender intended.
In our index we use two types of feedback to ensure the message was received and interpreted as intended. Feedback is one of the main factors we use to calculate an open science activity's efficiency. The two main types of feedback in our calculation are local feedback and global feedback. We discuss local feedback first.
Local feedback
In local feedback, our only concern is the people who participated in the event or activity. We present a questionnaire to participants and collect their feedback. Each questionnaire is unique, generated from the author's earlier selection of topic and subtopic. For example, if the author selects Life Sciences as the topic and Anatomy as the subtopic, the participants' questionnaire is tailored to life sciences and anatomy; if the author selects Earth Sciences and Meteorology, the questionnaire is tailored to earth sciences and meteorology.
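The topic/subtopic-to-questionnaire mapping can be sketched as a simple lookup. The table contents and function name below are illustrative assumptions; the source only specifies that each (topic, subtopic) pair gets its own questionnaire.

```python
# Hypothetical lookup table: each (topic, subtopic) pair has its own
# participant questionnaire, as described for local feedback.
QUESTIONNAIRES = {
    ("Life Sciences", "Anatomy"): [
        "Was the anatomical content presented clearly?",
        "Did the activity improve your understanding of the topic?",
    ],
    ("Earth Sciences", "Meteorology"): [
        "Were the meteorological methods explained clearly?",
        "Did the activity improve your understanding of the topic?",
    ],
}

def questionnaire_for(topic: str, subtopic: str) -> list:
    """Return the feedback questionnaire for the author's selection.

    An empty list signals an unknown pair, which on the platform would be
    routed to the OI help panel rather than answered automatically.
    """
    return QUESTIONNAIRES.get((topic, subtopic), [])
```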
Global Feedback
In global feedback, we are concerned with everyone who did not participate in the event or activity. The process is the same as for local feedback: the author selects an activity and adds a topic and subtopic, and the questionnaire is generated from them. Each questionnaire is unique to the activity, topic, and subtopic.
Self-evaluation
In everyday terms, self-evaluation means looking at oneself. Likewise, in the Odyssey Index, self-evaluation is the process of assessing your own open science activity. The author fills out a questionnaire about how they organized the event or activity. Like the global and local feedback questionnaires, this questionnaire is unique and always depends on the activity, topic, and subtopic.
Author based
Who is an author? In the Odyssey Index, anyone who submits an open science activity, project, or event is considered an author. Every author can submit activities and can also offer their point of view or guidance through comments and feedback on other authors' activities. These comments and feedback carry a weight in the Odyssey Index that depends on the commenting author's highest education qualification, prior experience, verification status, and follow count. The self-evaluation carries a weight determined by the same four factors. Our target is to verify every single author in the index; this verification validates the data present in the Odyssey Index and increases the accuracy of our results.
Education qualification
Here our main concern is the author's highest educational level: college, high school, graduate, or above. Points are given according to that level and are factored into the self-evaluation as well as into comments and feedback given to other authors. This helps ensure that the most reliable data enters the index and makes the results more accurate.
Prior Experience
Experience means knowledge or skill in a particular activity, gained by doing that activity over time. In the Odyssey Index, prior experience means the open science activities and projects the author has previously submitted to the index. The more accurate activities a person submits, the higher their prior-experience value grows. More experience tends to mean more effective open science activity, and since our target is measuring effectiveness, prior experience is a very important factor in the Odyssey Index.
Follow count
In the Odyssey Index, every user can follow an author, and authors can follow fellow authors as well. Users who follow an author see that author's open science projects on their Odyssey Index wall. The author earns value for each follow, provided the follow arrives through one of the author's open science activity referral links. This follow count is one of the main factors in the Odyssey Index.
Activity based
There are three main factors in activity-based evaluation: cite score, recommendations, and favorites. We also consider the abstract and the ideal participant when assigning marks. The abstract contains a high-level summary of the event, and the ideal participant is the most suitable participant or participant group.
Cite score
Cite score measures both the productivity and the citation impact of the article. For this we use an existing, world-recognized metric, the H-Index, to capture the author's productivity and the citation impact of their articles. The H-Index is used only for this cite score, so the existing index contributes only a small part of the new Odyssey Index.
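The H-Index the paragraph refers to is the standard Hirsch definition: the largest h such that the author has h articles with at least h citations each. A straightforward computation:

```python
def h_index(citations: list) -> int:
    """Standard H-Index: the largest h such that h articles each have
    at least h citations. `citations` is one count per article."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:      # this article still supports an h of `rank`
            h = rank
        else:
            break              # sorted descending, so no larger h is possible
    return h
```

For example, an author with articles cited [10, 8, 5, 4, 3] times has an H-Index of 4: four articles have at least 4 citations each, but not five articles with at least 5.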
Recommendations
The Odyssey Index currently runs on a web-based platform. On that platform, users can recommend a specific open science activity to another user or author by tagging them in the discussion section; authors cannot tag users in their own activity. When a user tags another person in an open science activity, marks are assigned to that activity. If the same user is tagged multiple times, marks are assigned only once. The size of the mark depends on the tagged user and that user's status in the Odyssey Index.
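A sketch of the two rules above: self-tags by the author are ignored, each tagged user is counted at most once, and the mark per tag scales with the tagged user's status. The function name, tuple shape, and default weight are illustrative assumptions.

```python
def recommendation_marks(tags, author, user_status):
    """Marks an activity earns from discussion-section tags (sketch).

    tags:        list of (tagger, tagged_user) pairs
    author:      the activity's author, who cannot boost their own activity
    user_status: dict mapping user -> status weight in the index
                 (users missing from the dict get a default weight of 1.0)
    """
    seen = set()
    total = 0.0
    for tagger, tagged in tags:
        if tagger == author:      # authors cannot tag in their own activity
            continue
        if tagged in seen:        # a given user is counted only once
            continue
        seen.add(tagged)
        total += user_status.get(tagged, 1.0)
    return total
```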
Favorites
In the Odyssey Index, users can mark favorite authors, favorite activities, and projects. Once a user marks an activity as a favorite, they can easily access it from their favorites list. When an activity is added to a user's favorites, that activity receives marks in the Odyssey Index calculation. As with recommendations, the size of the mark depends on the user who favorited the activity and on that user's status in the Odyssey Index.
For the Odyssey metric, overall marks are calculated out of 100, based on three subcategories:
- Feedback Based
- Activity Based
- Author Based
The Feedback Based category mainly targets feedback from three different parties and is weighted 50% of the overall marks. It combines feedback from the global community who read or watch the open science activity, feedback from the local community who were present at the activity, and the author's self-evaluation. Among the three, local feedback is weighted most heavily at 50% of the category; global feedback takes 30% and self-evaluation 20%. Self-evaluation is weighted lowest to reduce human-factor errors in the value.
The Author based category covers 20% of the whole index value. To cover that 20%, the Odyssey metric uses the author/publisher's highest educational background and their experience in the relevant fields of study or work; those two factors cover 90% of the category, and the remaining 10% is covered by the author/publisher's follow count.
The Activity based category covers 30% of the index value. It consists of 75% from cite score and impact factor evaluation, 10% from recommendations, 10% from users saving the activity as a favorite, and the remaining 5% from user upvotes and downvotes on the activity.
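The weighting scheme above can be written out as one formula. This is a direct transcription of the stated percentages, assuming every sub-score is already normalized to 0-100; the dictionary keys are illustrative, and the education and experience factors are combined into one input since the source gives only their joint 90% weight.

```python
def odyssey_index(scores: dict) -> float:
    """Combine 0-100 sub-scores using the stated Odyssey weights.

    Feedback 50% of the total (local 50 / global 30 / self 20 within it),
    Author 20% (education + experience 90 / follow count 10 within it),
    Activity 30% (cite score 75 / recommendations 10 / favorites 10 / votes 5).
    """
    feedback = (0.50 * scores["local_feedback"]
                + 0.30 * scores["global_feedback"]
                + 0.20 * scores["self_evaluation"])
    author = (0.90 * scores["education_and_experience"]
              + 0.10 * scores["follow_count"])
    activity = (0.75 * scores["cite_score"]
                + 0.10 * scores["recommendations"]
                + 0.10 * scores["favorites"]
                + 0.05 * scores["votes"])
    return 0.50 * feedback + 0.20 * author + 0.30 * activity
```

Because each category's inner weights sum to 1 and the outer weights sum to 1, an activity scoring 100 on every criterion lands at 100, matching the 0-100 range the index defines.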
How to use/Guidelines
Activity, category and sub-category
First and most important, select the most accurate activity type among the available activities; everything else depends on that decision, so choose carefully.
In the submission form, activity, category, and subcategory are provided as lists. If your activity, category, or subcategory cannot be found in these lists, please inform our OI help panel; they will go through your request and get back to you within 48 hours.
Abstract
In the abstract, write a specific and highly detailed summary of your event or activity, keeping the word count under one hundred.
Feedback evaluation
Local feedback is only for the participants of the event. After the activity submission form is submitted, participants have 48 hours to submit their local feedback.
Global feedback is open to anyone other than the event participants and the activity authors. Global feedback has no time frame.
In every feedback evaluation (local feedback, global feedback, or self-evaluation), be honest and provide accurate values.
Comment & discussion
In the comment and discussion sections, users must behave appropriately and respect other users at all times.
Submitting activity submission form
Authors have only one chance to submit an activity; every later submission appears as an update (a new version) of the original. So be very careful with the activity submission.
Screenshots of some pages of Odyssey are attached below for your reference.
This image above shows the page where an article can be submitted to Odyssey.
Sign up page to create an account in Odyssey.
Space Agency Data
https://science.nasa.gov/open-science-overview
https://science.nasa.gov/open-science/transform-to-open-science
https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2020EA001562
https://www.esa.int/About_Us/Digital_Agenda/Open_Science
https://www.asc-csa.gc.ca/eng/publications/open-science-action-plan-2021-2024.asp
Hackathon Journey
As a team and as individuals we really enjoyed our journey in NSAC 2022. We learnt a lot of things we didn’t have any idea of before in terms of science and development.
As a team, in the initial stage we went through all the challenges in the competition and listed down our 5 most favourite challenges based on our interests and capabilities. Finally we chose “Measuring Open Science” as our challenge.
In our research we hardly found any good metric-based evaluation platform for open science. We first collected all the content needed for development and then started building. As non-experts in development, we faced challenges and setbacks, but we never gave up; we fought back and resolved them thanks to the internet and our own determination.
We would like to thank NASA and organizers of NSAC 2022 for all the experience and support throughout.
References
https://en.wikipedia.org/wiki/H-index
https://scholar.google.com/#d=gs_hdr_drw&t=1664225721452
https://en.wikipedia.org/wiki/Multiple-criteria_decision_analysis
https://www.scopus.com/sources
https://scholarometer.indiana.edu/#api
https://libguides.lib.uct.ac.za/tracking_your_academic_footprint/h-index
https://en.wikipedia.org/wiki/Author-level_metrics
https://science.nasa.gov/open-science-overview
https://science.nasa.gov/open-science/transform-to-open-science
https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2020EA001562
https://en.wikipedia.org/wiki/Maximum_likelihood_estimation
https://www.asc-csa.gc.ca/eng/publications/open-science-action-plan-2021-2024.asp
https://www.altmetric.com/about-our-data/our-sources-2/
https://en.wikipedia.org/wiki/Bibliometrics
https://www.lib.ncsu.edu/measuring-research-impact/your-impact
https://plumanalytics.com/learn/about-metrics/
https://ui.adsabs.harvard.edu/
https://plos.org/publish/metrics/
https://journals.plos.org/plosbiology/s/reviewer-guidelines
https://journals.plos.org/plosbiology/s/submission-guidelines
https://www.scopus.com/home.uri
https://www.elsevier.com/?a=69451
https://dashboardpack.com/live-demo-preview/?livedemo=290?utm_source=colorlib&utm_medium=reactlist&utm_campaign=architecthtml
https://www.kaggle.com/
Tags
#MeasuringOpenScience #OpenScience #Odyssey #OdysseyMetric #OdysseyIndex #OI #OM #OS

