Researcher's best friend, Flash.

High-Level Project Summary

We developed a simple, agile web user interface that collects the context of a research query via voice or text. That context feeds an NLP AI that performs an optimized search over the NASA corpus (NTRS). The interface is simple and accessible for the researcher, making the search for the data they need precise and personalized. As a result, much of the time once wasted trying to find the right articles and people is saved. In a future version, the interface will also be able to connect researchers with similar interests, improving information exchange. As a consequence, scientific discoveries will come faster, helping science evolve at an even quicker and more precise pace.

Link to Project "Demo"

Detailed Project Description

Our project is a web interface that uses Natural Language Processing (NLP) AI to improve the accessibility, speed, and accuracy of research results. Using text or voice input to capture a research context (the research intention), it applies an NLP layer to extract useful information and uses it as parameters to collect data from the NASA API. We hope our application saves time for every researcher and improves the quality of the results found; on a global scale, we expect it to further accelerate the pace of scientific development. In future versions, we also hope to recognize patterns in users' recent activity and areas of expertise and connect them, for a potential positive impact on overall research development.
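As a minimal sketch of the NLP layer described above (the function name and the small stopword list here are illustrative, not the project's actual module, which uses nltk), the captured context can be reduced to search keywords like this:

```python
# Minimal sketch: reduce a spoken/typed research context to search keywords.
# A production NLP layer (e.g. with nltk) would add tokenization models,
# stemming, and a full stopword corpus; this illustrative version uses
# a tiny hand-picked stopword set.

import re

# Illustrative stopword list; the real module would use a full NLP stopword set.
STOPWORDS = {
    "i", "a", "an", "the", "on", "in", "of", "for", "to", "and",
    "about", "am", "is", "are", "looking", "papers", "research",
}

def extract_keywords(context: str) -> list[str]:
    """Lowercase and tokenize the context, then drop stopwords."""
    tokens = re.findall(r"[a-z0-9-]+", context.lower())
    return [t for t in tokens if t not in STOPWORDS]

keywords = extract_keywords("I am looking for papers about Mars rover soil samples")
print(keywords)  # ['mars', 'rover', 'soil', 'samples']
```

The surviving keywords are what the interface would then pass on as query parameters to the NASA API.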

We used the following applications:

  • vscode.com
  • heroku.com
  • github.com
  • mongodb.com
  • pypi.org/project/pip/
  • www.python.org/
  • fastapi (0.85.0)
  • fpdf (1.7.2)
  • googletrans (3.1.0a0)
  • Jinja2 (3.1.2)
  • pip (22.2.2)
  • pydantic (1.10.2)
  • pymongo (4.2.0)
  • PyPDF2 (2.11.0)
  • requests (2.28.1)
  • starlette (0.20.4)
  • uvicorn (0.18.3)
  • Werkzeug (2.2.2)
  • matplotlib (3.6.0)
  • nltk (3.5)
  • numpy (1.23.3)

Space Agency Data

https://ntrs.nasa.gov/api/openapi/


We used it by integrating the citations/search endpoint, filling its query parameters with the output of our NLP module's processing of the recorded voice input.
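A sketch of that integration, assuming the public NTRS search endpoint at https://ntrs.nasa.gov/api/citations/search with a `q` query parameter (the `page.size` parameter name and the helper function are illustrative); the sketch only builds the request URL rather than sending it:

```python
# Sketch: turn NLP-extracted keywords into a citations/search request for NTRS.
# The endpoint and `q` parameter follow the NTRS API; `page.size` and the
# function name are illustrative assumptions.

from urllib.parse import urlencode

NTRS_SEARCH_URL = "https://ntrs.nasa.gov/api/citations/search"

def build_search_url(keywords: list[str], page_size: int = 10) -> str:
    """Join keywords from the NLP layer into a citations/search URL."""
    params = {"q": " ".join(keywords), "page.size": page_size}
    return f"{NTRS_SEARCH_URL}?{urlencode(params)}"

url = build_search_url(["mars", "rover", "soil"])
print(url)
# https://ntrs.nasa.gov/api/citations/search?q=mars+rover+soil&page.size=10
```

In the project itself, the request would be performed with the requests library (listed above), e.g. `requests.get(NTRS_SEARCH_URL, params=params)`, and the JSON response rendered back to the researcher.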

Hackathon Journey

As a team, we would describe the Space Apps hackathon experience as one of the best intensive learning projects we have ever participated in. We learned that we can look at a problem, even when uncertain of how to solve it, and figure it out. Because of our specific challenge, all of us also learned more about how AI and Natural Language Processing (NLP) work. We looked at the problem of researchers' time wasted on inefficient ways of finding information, and the lack of connection between similar scientists across the world. We realized that if we could make scientific research even a little faster, we would accelerate scientific discovery to a much faster pace, bringing the future closer. So we chose to break the work apart, sort the problems we would face, and delegate each one to a team member according to their competence. As the project developed, we exchanged information and tasks, and in this way solved the other issues that appeared along the way. We would like to give our special thanks to the whole local team and support staff, including teachers and the other teams that helped us along this journey.

References

  • vscode.com
  • github.com
  • www.heroku.com
  • code.visualstudio.com
  • mongodb.com
  • canvas.com
  • powtoon.com
  • google.com
  • pypi.org/project/pip/
  • www.alura.com.br/artigos/guia-nlp-conceitos-tecnicas
  • www.python.org/
  • fastapi (0.85.0)
  • fpdf (1.7.2)
  • googletrans (3.1.0a0)
  • Jinja2 (3.1.2)
  • pip (22.2.2)
  • pydantic (1.10.2)
  • pymongo (4.2.0)
  • PyPDF2 (2.11.0)
  • requests (2.28.1)
  • starlette (0.20.4)
  • uvicorn (0.18.3)
  • Werkzeug (2.2.2)
  • matplotlib (3.6.0)
  • nltk (3.5)
  • numpy (1.23.3)
  • https://www.itproportal.com/features/study-reveals-how-much-time-is-wasted-on-unsuccessful-or-repeated-data-tasks/
  • https://www.remove.bg/pt-br
  • https://github.com/LefrancoisCronapp/bitbull

Tags

#research, #AI, #software, #accessibility, #efficiency, #connection, #data, #people