Awards & Nominations

Skylab has received the following awards and nominations. Way to go!

Global Nominee

Medical Artificial Intelligence for Astronaut Health Care

High-Level Project Summary

The widespread adoption of computer-based technology has produced a flood of digital medical information, yet medical professionals still struggle to analyze symptoms accurately and detect diseases at an early stage. Astronauts are in an especially risky situation in space because of harsh space radiation, and during a mission they cannot access full facilities for machine testing. Our main objective is to detect these problems as early as possible, before the situation worsens. We built our disease-prediction app using a variety of NASA datasets, APIs, and tools to support astronauts before their conditions deteriorate and to fill in these gaps.

Link to Project "Demo"

Detailed Project Description

Main motivation:


Our solution addresses the health issues astronauts face. During space missions, astronauts need quick health support to prevent illness. The artificial intelligence we built helps detect and diagnose various types of health problems during space travel, assisting astronauts in staying physically well throughout a mission. When an astronaut converses with this AI, it can feel like taking advice from a real-life doctor.


The Basic Model Structure for training and testing:

To build this system we used several deep learning models. Details of each model are given below.


Deep Learning models


ResNet-50

ResNet-50 is a convolutional neural network that is 50 layers deep. You can load a pre-trained version of the network trained on more than a million images from the ImageNet database [1]. The pretrained network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals.
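The defining feature of ResNet architectures such as ResNet-50 is the residual (skip) connection: each block learns a correction F(x) that is added back to its own input. A minimal sketch of that idea in NumPy, with a toy two-layer transform standing in for the block's convolutions (all names here are illustrative, not from our codebase):

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """Toy residual block: output = activation(input + learned correction F(x))."""
    fx = relu(x @ w1) @ w2   # F(x): two toy "layers" standing in for convolutions
    return relu(x + fx)      # skip connection: add the input back before activating

d = 8
x = rng.normal(size=(1, d))
w1 = rng.normal(size=(d, d)) * 0.1
w2 = rng.normal(size=(d, d)) * 0.1

y = residual_block(x, w1, w2)
print(y.shape)  # skip connections require input and output shapes to match
```

Because the block only has to learn a correction to the identity mapping, very deep stacks (50 layers and beyond) remain trainable, which is what the "50" in ResNet-50 refers to.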



Inception-ResNet-v2

Inception-ResNet-v2 is a convolutional neural network that is trained on more than a million images from the ImageNet database [1]. The network is 164 layers deep and can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals.


VGG19

VGG-19 is a convolutional neural network that is 19 layers deep. You can load a pre-trained version of the network trained on more than a million images from the ImageNet database [1]. The pretrained network can classify images into 1000 object categories, such as keyboard, mouse, pencil, and many animals.



These are the three most important models used in our project.
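All three networks are typically used the same way in a project like this: as pretrained ImageNet feature extractors whose weights stay frozen, with a small new classification head trained on the medical data. A hedged sketch of that transfer-learning pattern in NumPy, with a random projection standing in for the frozen backbone (the function and class counts here are illustrative assumptions, not our actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a frozen pretrained backbone (e.g. ResNet-50 minus its top layer);
# in a real pipeline this would be the pretrained network's feature extractor.
def frozen_backbone(images, w_fixed):
    return np.maximum(0.0, images @ w_fixed)  # fixed weights, never updated

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

n_pixels, n_features, n_classes = 64, 16, 3  # toy sizes; real inputs are full images
w_fixed = rng.normal(size=(n_pixels, n_features))  # "pretrained", stays frozen
w_head = np.zeros((n_features, n_classes))         # new head, the only trained part

images = rng.normal(size=(5, n_pixels))
features = frozen_backbone(images, w_fixed)
probs = softmax(features @ w_head)  # class probabilities from the new head
print(probs.shape)
```

Only `w_head` would be updated during training, which is why a pretrained network can be adapted to a new disease-classification task with relatively little data.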


Deep learning NLP model:

We also used several deep learning NLP models to build the conversational artificial intelligence.

The NLP transfer-learning models are BERT, GPT-2, and ELMo.


NLP Model:

GPT2

Generative Pre-trained Transformer 2 (GPT-2) is an open-source artificial intelligence model released by OpenAI in February 2019.


BERT

Bidirectional Encoder Representations from Transformers (BERT) is a transformer-based machine learning technique for natural language processing (NLP) pre-training developed by Google. BERT was created and published in 2018 by Jacob Devlin and his colleagues from Google.



ELMo

Consisting of one forward and one backward language model, ELMo's hidden states have access to both the next word and the previous word. Each hidden layer is a bidirectional LSTM, so the language model can view hidden states from either direction.
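The bidirectional idea described above can be sketched by running a toy recurrence over the sequence in both directions and concatenating the two hidden states at each position. This is a simplified stand-in for ELMo's forward and backward LSTMs (a plain tanh recurrence, with illustrative names):

```python
import numpy as np

rng = np.random.default_rng(1)

def run_rnn(xs, w_in, w_h):
    """Toy recurrence (stand-in for an LSTM): h_t = tanh(x_t W_in + h_{t-1} W_h)."""
    h = np.zeros(w_h.shape[0])
    states = []
    for x in xs:
        h = np.tanh(x @ w_in + h @ w_h)
        states.append(h)
    return np.stack(states)

d_in, d_h, seq_len = 4, 6, 5
xs = rng.normal(size=(seq_len, d_in))
w_in = rng.normal(size=(d_in, d_h))
w_h = rng.normal(size=(d_h, d_h)) * 0.1

fwd = run_rnn(xs, w_in, w_h)              # forward pass: each state has seen prior words
bwd = run_rnn(xs[::-1], w_in, w_h)[::-1]  # backward pass: each state has seen later words
bidirectional = np.concatenate([fwd, bwd], axis=1)  # ELMo-style: both directions per token
print(bidirectional.shape)  # (seq_len, 2 * d_h)
```

Each token's representation thus carries context from both its left and its right, which is what distinguishes ELMo embeddings from a purely left-to-right language model.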


Space Agency Data

We have used NASA GeneLab (https://genelab.nasa.gov/) data, and the NASA GeneLab API is connected to the backend to access those data.

  1. GLDS-344: Real-time quantitative PCR analysis of human cardiovascular progenitor cells flown aboard the International Space Station.
  2. GLDS-410: Dataset for dose- and time-dependent transcriptional response to ionizing radiation exposure.
  3. GLDS-369: Bystander responses to 0.5 Gy of alpha-particles in a human 3-dimensional skin model, 4 h after exposure to ionizing radiation.
  4. GLDS-368: Biological response to a low dose of alpha-particles in a human 3-dimensional skin model, 1 and 16 h after exposure to ionizing radiation.
  5. GLDS-367: Bystander response to 0.5 Gy of alpha-particles in a human 3-dimensional skin model, 16 h after exposure to ionizing radiation.
  6. GLDS-370: Insulin resistance induced by physical inactivity is associated with multiple transcriptional changes in skeletal muscle in young men.

Hackathon Journey

This was our first time participating in the NASA Space Apps Challenge, and it was a beautiful experience. My teammates and I worked hard and spent sleepless nights on the hackathon. We learned a lot of things along the way. Our mentors were very polite and helped us a great deal. We also did many fun activities and took breaks after the hard work. This will always be one of the most beautiful memories of my life.

References

[1] https://humans-in-space.jaxa.jp/en/biz-lab/med-in-space/healthcare/

[2] https://mashable.com/feature/nasa-astronaut-healthcare-congress

[3] https://hbr.org/2017/07/how-nasa-uses-telemedicine-to-care-for-astronauts-in-space

[4] https://www.nasa.gov/johnson/HWHAP/deep-space-healthcare/

[5] https://www.ncbi.nlm.nih.gov/books/NBK223779/

Tags

#Artificial intelligence #Artemis #Space Exploration