SolART

High-Level Project Summary

SolART is a web application that turns solar weather data from NASA into synth tones and moving artwork. By feeding real-time data into an AI artwork generator (Lucid Sonic Dreams), SolART creates beautiful visualizations that change in response to proton flux caused by solar activity. SolART further sonifies the data by representing solar activity as pitch.

Detailed Project Description

SolART allows for the creative visualization and sonification of real-time solar flare data. The app turns this non-visual data into an engaging audiovisual feed, giving users an experience of scientific phenomena through the lens of AI-generated art.

Our main language is Python, with some R and JavaScript. In particular, much inspiration was drawn from one of the packages we used, Lucid Sonic Dreams, which was originally created to make beautiful videos that morph generated AI artwork to the audio waveform of a song.

We were inspired by this - and other audio visualizers - to instead feed in real-time solar flare data from the Parker Solar Probe. Essentially, the proton flux at a given moment is read into the GAN, altering the weights of the neural net and creating artwork that morphs in concert with the current solar weather. We also provide options for different artwork styles, all drawn from Justin Pinkney's collection of pre-trained art GANs.
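As a rough illustration of this idea (a minimal sketch, not our finished pipeline), the snippet below converts a short, made-up proton flux series into a WAV file whose loudness tracks the flux; a waveform like this can then stand in for the "song" input that Lucid Sonic Dreams normally reads. The flux values, carrier frequency, and file name are placeholders.

    # Sketch: turn a proton flux time series into an audio waveform.
    # The flux readings below are illustrative placeholders, not real data.
    import numpy as np
    from scipy.io import wavfile

    flux = np.array([120.0, 135.5, 150.2, 400.8, 220.1])  # hypothetical readings
    sample_rate = 44100
    seconds_per_reading = 2  # stretch each reading over a short time window

    # Normalize flux to [0, 1] so it can drive the amplitude of a carrier tone.
    norm = (flux - flux.min()) / (flux.max() - flux.min() + 1e-9)

    samples = []
    for level in norm:
        t = np.linspace(0, seconds_per_reading,
                        sample_rate * seconds_per_reading, endpoint=False)
        samples.append(level * np.sin(2 * np.pi * 220 * t))  # louder when flux is higher
    waveform = np.concatenate(samples).astype(np.float32)

    # The resulting WAV can then be handed to the visualizer in place of a song.
    wavfile.write("solar_flux.wav", sample_rate, waveform)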

We added a further toggleable audio option that sonifies the data using PyAudio, in which pitch represents the current proton flux.
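The sketch below shows the sonification idea in simplified form: map a proton flux reading onto an audible pitch and play a short sine tone with PyAudio. The flux range and the frequency mapping are illustrative assumptions rather than our calibrated values.

    # Sketch: flux-to-pitch sonification with PyAudio.
    import numpy as np
    import pyaudio

    def flux_to_frequency(flux, flux_min=1.0, flux_max=1000.0):
        """Linearly map a flux reading onto roughly 200-1200 Hz (assumed range)."""
        clipped = min(max(flux, flux_min), flux_max)
        return 200.0 + 1000.0 * (clipped - flux_min) / (flux_max - flux_min)

    def play_tone(frequency, duration=0.5, sample_rate=44100):
        t = np.linspace(0, duration, int(sample_rate * duration), endpoint=False)
        tone = (0.3 * np.sin(2 * np.pi * frequency * t)).astype(np.float32)
        p = pyaudio.PyAudio()
        stream = p.open(format=pyaudio.paFloat32, channels=1,
                        rate=sample_rate, output=True)
        stream.write(tone.tobytes())
        stream.stop_stream()
        stream.close()
        p.terminate()

    # Example: a higher flux reading produces a higher pitch.
    play_tone(flux_to_frequency(350.0))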

While we were unable to complete the app within the allotted time, our team had a lot of fun brainstorming and beginning to build it. We will continue developing it even though the Hackathon is drawing to a close!

Space Agency Data

We utilized data from NASA's DONKI API and the Parker Solar Probe Science Gateway.


The data was read into a package that traditionally takes an audio waveform and changes the weights of a generative adversarial network to create moving images that morph to the music. Taking this approach, we instead fed live solar flare data into the package to create art that morphs to the solar weather. We also fed this data into PyAudio to create a sonic "visualization" of the data through changes in pitch.
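As a minimal sketch of how the solar flare feed can be queried, the snippet below requests recent flare events from the DONKI FLR endpoint with the requests library. The date range and DEMO_KEY are placeholders, and the response field names reflect our reading of the API and may differ.

    # Sketch: fetch recent solar flare events from NASA's DONKI API.
    import requests

    params = {
        "startDate": "2021-10-01",   # placeholder date range
        "endDate": "2021-10-03",
        "api_key": "DEMO_KEY",       # replace with a personal api.nasa.gov key
    }
    response = requests.get("https://api.nasa.gov/DONKI/FLR",
                            params=params, timeout=30)
    response.raise_for_status()

    for flare in response.json():
        # Each record describes one flare event; timing and class were our focus.
        print(flare.get("beginTime"), flare.get("peakTime"), flare.get("classType"))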

Hackathon Journey

How would you describe your Space Apps experience?

Our team has a diverse background (from bioinformatics and data science, to machine learning, to front-end development) and varying levels of coding experience (from members with only a few months of experience to a member with a Master's degree in AI). As such, Space Apps allowed us to learn a lot of new things from one another that may previously have felt intimidating or "out of the wheelhouse" for our respective skillsets. Sketching out project ideas, designing the app, and generally learning more about space was an excellent exercise in creatively taking an idea from conception to completion while strengthening our ability to delegate work and learn collaboratively.

What did you learn?

We learned a lot about working with real-time data, which many of us were unfamiliar with. We learned new and exciting packages to use in both our work and personal projects, as well as interesting ways to display data. We also got to learn a lot of really cool things about space weather and solar flares!

What inspired your team to choose this challenge?

We were inspired by the fact that the challenge was rather open-ended and left a ton of room for creativity! It gave each of us space to contribute to a unique facet of the whole and to build up new skills at the same time.

What was your approach to developing this project?

We approached the project by first meeting up and sketching out an idea for a final product, divvying up our respective roles, and collaboratively tackling each aspect. Some individuals researched solar flares and the data itself, others worked on interacting with real-time data from the API, and others learned to use the sonification and GAN visualization packages. With each of these pieces in place, we first created a Shiny app skeleton to display static data and then began working with the API data (before running short on time).

How did your team resolve setbacks and challenges?

We all did our best to help one another troubleshoot, taking turns running down issues and searching out errors for each other when they cropped up. Importantly, we made sure to take breaks when needed and did not place undue pressure on one another.

References

Lucid Sonic Dreams - Mikael Alafriz (https://github.com/mikaelalafriz/lucid-sonic-dreams)

Pre-trained GANs - Justin Pinkney (https://github.com/justinpinkney/awesome-pretrained-stylegan2)

NVlab StyleGAN2 - Tero Karras, Samuli Laine, Miika Aittala, Janne Hellsten, Jaakko Lehtinen, Timo Aila (https://arxiv.org/abs/1912.04958)

PyAudio - jleb (https://github.com/jleb/pyaudio)

Parker Solar Probe data - (https://www.nasa.gov/content/goddard/parker-solar-probe)

DONKI NASA API - (https://api.nasa.gov/)

Tags

#art #visualization #sonification #spaceweather #solarflare