Openpale

High-Level Project Summary

Openpale is an online platform that unifies data solutions for tackling climate change in one clean, user-friendly interface. Openpale is also powered by end-to-end, real-time deep learning models that leverage that data to provide meaningful and essential insights on our environment, such as water-shortage detection, forest-fire prediction and alerts, and various other models. Our platform also offers personalized action items to its users, as well as valuable key indicators on the viability of their home city, their home country, or even their neighborhood, by giving the percentage of CO2 and other greenhouse gas emissions.

Link to Project "Demo"

Detailed Project Description

# Project Architecture


We've built a general AI engine that can run inference on multiple state-of-the-art deep learning models and provide real-time results to our users. We're using Elasticsearch as our DBMS and AWS services for various management needs in our application. We've also relied on Docker to create manageable container images that simplify deployment, and we've integrated GitLab CI for this purpose as well (our code repository is hosted on GitLab).
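
As an illustration of how inference results feed the rest of the stack, here is a minimal sketch of indexing a model prediction into Elasticsearch with the official Python client; the host, index name, and document fields are hypothetical placeholders, not our production configuration.

```python
# Minimal sketch: storing a model prediction in Elasticsearch.
# Host, index name, and document fields are illustrative placeholders.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # local dev instance

prediction = {
    "model": "forest_fire_detector",        # hypothetical model name
    "region": "tunis_north",                # hypothetical region identifier
    "fire_risk_score": 0.87,                # model output in [0, 1]
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

# Index the document so the web frontend can query it later.
es.index(index="climate-predictions", document=prediction)
```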


Technologies and frameworks we used:

  • Python / JavaScript / TypeScript / React 18
  • TensorFlow / Keras / ONNX Runtime
  • Semantic segmentation / multi-class classification / regression models
  • Docker / Docker Compose
  • Triton Inference Server / TensorRT & TVM neural network optimizers (see the client sketch after this list)
  • AWS S3 / AWS EC2 & ECR
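
To show how the frontend-facing services can query the AI engine, the sketch below uses Triton's HTTP client; the server URL, model name, and tensor names are assumptions chosen for illustration, not our deployed configuration.

```python
# Minimal sketch: querying a model served by Triton Inference Server over HTTP.
# URL, model name, and tensor names below are illustrative assumptions.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Dummy 256x256 RGB tile standing in for a satellite image patch.
image = np.random.rand(1, 256, 256, 3).astype(np.float32)

inputs = [httpclient.InferInput("input_1", list(image.shape), "FP32")]
inputs[0].set_data_from_numpy(image)

outputs = [httpclient.InferRequestedOutput("segmentation_mask")]

# Run inference on a hypothetical segmentation model and read back the mask.
result = client.infer(model_name="water_shortage_segmenter",
                      inputs=inputs, outputs=outputs)
mask = result.as_numpy("segmentation_mask")
print(mask.shape)
```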


# Why Openpale?

  • GREEN AI MODELS: Did you know that training a single large deep learning model can emit around 626,000 pounds of CO2? And that's just for training. Our AI models were optimized through several neural network optimization and quantization techniques (see the sketch after this list) and use up to 80% fewer GPU resources than unoptimized models, which cuts down our CO2 footprint.
  • FAST AI MODELS: Thanks to these optimizations, we've accelerated our models by a factor of 7x (on the same machine), which allows us to give our users real-time results on demand.
  • ALL THE DATA, ONE PLATFORM: We plan to provide key indicators, vital signs, and valuable insights on climate change, the health of our planet, and each user's local environment in one platform, and to raise awareness about the global as well as the individual impact of our actions.
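
As one concrete example of the kind of optimization we mean, here is a minimal sketch of post-training quantization with TensorFlow Lite; the tiny Keras model is a stand-in, and the resource and speed figures above come from our full pipeline (which also uses TensorRT/TVM), not from this snippet.

```python
# Minimal sketch: post-training quantization of a Keras model with TensorFlow Lite.
# The model below is a placeholder standing in for one of our classifiers.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable default weight quantization
tflite_model = converter.convert()

# Save the smaller, quantized model for deployment.
with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)
```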


# Our platform


We hope to empower individuals to take climate-friendly actions by providing them with the insights and key figures they need.

We also hope that, through our machine learning models and easy-to-use UI, our users will gain more knowledge of the dangers and risks that climate change poses, and help protect the future of our planet.

Space Agency Data

https://api.nasa.gov/

https://www.esa.int/Enabling_Support/Operations/Ground_Systems_Engineering/SLE_API

https://developers.arcgis.com/rest/

https://svs.gsfc.nasa.gov/
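
For reference, here is a minimal sketch of pulling satellite imagery from api.nasa.gov with the public DEMO_KEY; the endpoint, coordinates, and parameters are illustrative and should be checked against the api.nasa.gov documentation, as the exact data feeds we use may differ.

```python
# Minimal sketch: fetching a Landsat image tile from NASA's Earth imagery API.
# Coordinates, date, and DEMO_KEY are illustrative; see https://api.nasa.gov/ for details.
import requests

params = {
    "lon": 10.18,           # hypothetical longitude
    "lat": 36.80,           # hypothetical latitude
    "date": "2021-06-01",   # acquisition date
    "dim": 0.1,             # tile width/height in degrees
    "api_key": "DEMO_KEY",  # public demo key, rate limited
}

response = requests.get("https://api.nasa.gov/planetary/earth/imagery", params=params)
response.raise_for_status()

with open("tile.png", "wb") as f:
    f.write(response.content)
```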

Hackathon Journey

Our journey has been an exciting one. We've managed to grow as a team and to build new skills in areas far from our specialties and careers, but which we came to see as necessary for our growth as individuals. There were times when the obstacles seemed too big and the difficulties too hard, but we took one small action, one small choice, at a time throughout our journey.

There were many technical challenges, but our mentors helped us overcome them. Working remotely and collaborating on the same project also posed some issues, and we learned how to use Git version control as well as various other technologies such as Docker. There was also the challenge of creating eco-friendly AI, as the carbon footprint of training and running inference on AI models can be comparable to that of a car over the course of a month. That's why we worked on several optimization techniques that allow our models to consume less GPU memory, reduce inference time, and run in parallel on the same machine, which greatly reduces our energy consumption.

Hardware limitations were also a big hurdle, but thanks to the various offers provided by NASA Space Apps partners, we managed to obtain virtual machines to train our deep learning models before the deadline.

References

https://arxiv.org/abs/1505.04597

https://arxiv.org/abs/1512.03385

https://github.com/triton-inference-server/server

https://github.com/tensorflow/tensorflow

https://mlco2.github.io/impact/

https://arxiv.org/abs/1906.06821

https://github.com/NVIDIA/TensorRT

https://github.com/facebook/react

Tags

#AI #DATA #Climate #Globalwarming #firedetection #watershortage #objectdetection #classification #application