High-Level Project Summary
Imagine being able to visualize information in 3D through your mobile phone camera. This is an Augmented Reality visualizer that lets users retrieve information such as papers, images, and videos related to an object captured with their smartphone. The app first identifies the object using Artificial Intelligence image recognition; it then searches NASA resources and presents the related information on the smartphone screen. For example: if the user captures an image of a spacecraft, the app shows more details about it, such as videos or technical information about that object.
Link to Final Project
Link to Project "Demo"
Detailed Project Description
Visualiz-AR
As the Public-AR team, we decided to propose a solution for the challenge "Can AI Preserve Our Science Legacy?" through the "Visualiz-AR" project.
· "Information is power." — Hobbes, Leviathan.
· "Information is knowledge." — Francis Bacon.
· "Information must be dynamic and 3D-formattable." — Public-AR team.
Although we humans have already deprecated some information sources, replacing books with websites and online content, we still use digital devices that show us information without taking advantage of emerging technologies such as Artificial Intelligence and Augmented Reality combined (except, of course, in Hollywood science-fiction movies). Many recent frameworks now allow us to develop solutions using these emerging technologies, and they need only a smartphone to visualize information as video, audio, images, or 3D objects.
We are developing a 3D Augmented Reality visor that allows users to search and visualize information based on what the smartphone camera is 'watching', or even to display information based on the user's location.
Just imagine you are inside a NASA campus and want information about the organization, about a place, or about an object you are seeing at that moment. You can take out your smartphone, open the web page (yes, we are using JS web technologies), and visualize 3D content related to your interest using only your phone camera (and an internet connection, of course).
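The AR side of this flow can be sketched with AR.js on top of A-Frame, which is how the library's documentation demonstrates marker-based scenes. This is a minimal sketch: the "Hiro" preset marker and the placeholder box stand in for the real content the search pipeline would return.

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- A-Frame plus the AR.js build for A-Frame, as in the AR.js docs -->
    <script src="https://aframe.io/releases/1.3.0/aframe.min.js"></script>
    <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  </head>
  <body style="margin: 0; overflow: hidden;">
    <!-- `embedded arjs` turns the scene into a camera-backed AR view -->
    <a-scene embedded arjs>
      <!-- When the camera sees the Hiro marker, render 3D content on it.
           In Visualiz-AR this placeholder box would be replaced by the
           videos/images/documents found for the recognized object. -->
      <a-marker preset="hiro">
        <a-box position="0 0.5 0" material="color: yellow;"></a-box>
      </a-marker>
      <a-entity camera></a-entity>
    </a-scene>
  </body>
</html>
```

Because everything runs in the browser, this works on any smartphone with a camera, regardless of its operating system.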
The application connects to AI and AR systems to provide the requested information about science or other topics of interest, using emerging technologies that let us pioneer this way of presenting information. We also believe this way of presenting information can reduce print-media pollution by removing the need for paper formats.
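The AI step can be sketched against the Google Cloud Vision REST API's `images:annotate` endpoint with the `LABEL_DETECTION` feature, which returns text labels (e.g. "spacecraft") for a captured image. The `apiKey` and the base64 image content are placeholders the app would supply.

```javascript
// Build the request body for the Cloud Vision `images:annotate` endpoint.
// `base64Image` is the captured photo encoded as base64 (no data: prefix).
function buildVisionRequest(base64Image) {
  return {
    requests: [
      {
        image: { content: base64Image },
        features: [{ type: "LABEL_DETECTION", maxResults: 5 }],
      },
    ],
  };
}

// Send the image to Cloud Vision and return the detected labels.
// `apiKey` is a placeholder for a real Google Cloud API key.
async function detectLabels(base64Image, apiKey) {
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${apiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildVisionRequest(base64Image)),
    }
  );
  const data = await res.json();
  // Each label annotation carries a `description` and a confidence `score`.
  return data.responses[0].labelAnnotations.map((a) => a.description);
}
```

The top label can then be used as the search term for the NASA resources described below.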
Space Agency Data
We decided to use data from https://ntrs.nasa.gov/ to display in the web app in image, video, and PDF formats. However, the app is not limited to that source; the information could be provided by any government or public institution.
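Querying NTRS from the web app could look like the sketch below. Note that the `/api/citations/search` endpoint and its `q` parameter are assumptions based on the public NTRS search site, not verified API documentation; the real integration should be checked against whatever interface NTRS exposes.

```javascript
// Build a search URL for the NASA Technical Reports Server.
// NOTE: the endpoint and `q` parameter are assumed, not documented here.
function buildNtrsSearchUrl(query) {
  const params = new URLSearchParams({ q: query });
  return `https://ntrs.nasa.gov/api/citations/search?${params}`;
}

// Fetch matching citations (papers, PDFs, media) for a recognized label.
async function searchNtrs(query) {
  const res = await fetch(buildNtrsSearchUrl(query));
  return res.json();
}
```

The same pattern generalizes to other public data sources: swap the base URL and query parameters, keep the label-to-search flow identical.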
Hackathon Journey
The hackathon encouraged us to build a proof of concept (POC) of our project on JS and web technologies, which gives it functionality on every smartphone regardless of the OS it runs. We also asked ourselves questions about the business model and viability of the project, and wrote them down in a document for future reference. We dove into cloud-services documentation and implementations that can empower us to reach our goals of designing and running our project.
References
-Google Cloud Platform (Vision API) https://cloud.google.com/vision/docs/
-AR.js https://ar-js-org.github.io/AR.js-Docs/
-Firebase https://firebase.google.com/docs/
Tags
#AI, #AR, #ML, #ComputerVision, #Firebase, #GoogleCloudServices

