Awards & Nominations

Team Artemis has received the following awards and nominations. Way to go!

Global Nominee

SPACETUNE: Unifying through Sound

High-Level Project Summary

Spacetune allows all individuals to interact with space through sound. It delivers a clean, three-step process to personalize real music derived from the night sky, turning personal stories and memories into custom songs. People around the world have limited exposure to NASA's work in space exploration. With Spacetune, they gain a meaningful way to connect with space through sonification and synthesis of space images and music, giving wings to the imagination of communities of all backgrounds and celebrating the bridge between art and STEM.

"The most beautiful thing we can experience is the mysterious. It is the source of all art and science." - Albert Einstein

Link to Project "Demo"

Detailed Project Description

WHAT DOES IT DO

Our solution is a web application which allows individuals and communities to interact with the James Webb Space Telescope and Hubble Space Telescope Mission data through music. Spacetune commemorates life's most precious moments through song with cosmic images of deep space. Rather than utilizing science and engineering as the primary educational tool for engagement, our application provides a unique musical journey that allows individuals across all cultural, socioeconomic, and ethnic backgrounds to write and share their personal stories.


Figma: https://www.figma.com/community/file/1158531594923427326


HOW DOES IT WORK

The individual needs to sign up with their email to take part in the STEAM (science, technology, engineering, arts, and mathematics) Spacetune Community. Once the user joins and creates an online profile, they follow a simple three-step process to engage with the music of the universe:


1) CHOOSE AN IMAGE: Filter by date to select an image of the night sky or upload your own.


2) CHOOSE A SONG: Upload a song file or have the song chosen for you based on emotional impact or occasion. You can even upload a recording of yourself singing or playing an instrument so that you can jam out with the stars.



3) CREATE SOMETHING NEW: Combine an audio translation of your cosmic image with your song and share it with your community. 



Our algorithm converts the image into sound so that it is musically harmonious with the chosen song in both key (pitch) and tempo (timing). The user can then remix the image sonification with their chosen song in a variety of ways. For example, the user can change the relative volume of each audio track in order to decide when the image sonification will be best heard during the song.
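As a rough illustration of that remix step, here is a minimal sketch using pydub (one of the tools listed under TOOLS below). It assumes the image sonification has already been rendered to a WAV file; the file names, the 6 dB attenuation, and the 30-second offset are illustrative placeholders, not the project's actual parameters.

```python
from pydub import AudioSegment  # listed under TOOLS; requires ffmpeg for mp3 I/O

# Hypothetical file names: the song the user chose and the rendered sonification.
song = AudioSegment.from_file("chosen_song.mp3")
sonification = AudioSegment.from_file("image_sonification.wav")

# Relative volume: attenuate the sonification by 6 dB so it sits under the song.
sonification = sonification - 6

# Decide when the sonification is heard: overlay it starting 30 s into the song.
remix = song.overlay(sonification, position=30_000)  # position is in milliseconds

remix.export("spacetune_remix.mp3", format="mp3")
```

Changing the dB offset and the overlay position is the "relative volume" and placement control described above.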


BENEFITS FOR NASA

Our solution allows users to integrate NASA's groundbreaking imagery into music with personal and cultural relevance. The emotional resonance of music can form a deep connection between space imagery and the personal lives and experiences of a diverse audience, increasing engagement with, awareness of, and the perceived meaning of NASA's data services.

Space Agency Data

Our team developed a web crawler to scrape high-resolution images and associated information from the Webb First Images (source: James Webb Space Telescope Mission database) and the Barbara A. Mikulski Archive for Space Telescopes (MAST) (Hubble Space Telescope Mission database). Specific parameters and criteria were established to filter and select images as source data. Examples of criteria include high-resolution (HR) images, clarity of the picture, and color contrast of the object in view, all of which help the algorithm run effectively. The image is converted into sound using a musical inverse-spectrogram technique in which the image is scanned from left to right. Brightness is mapped to volume, and vertical position is mapped to musical frequencies that are consonant with the chosen song. This sonification can then be inverted to recover the key features of the image.
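A minimal sketch of that inverse-spectrogram mapping, built from librosa, PIL, and NumPy on the tools list, could look like the following. The soundfile writer, the C major scale standing in for "frequencies consonant with the chosen song", and all constants and file names are assumptions for illustration, not the team's exact implementation.

```python
import numpy as np
from PIL import Image
import librosa
import soundfile as sf  # assumption: stand-in for the project's WAV writer

SR = 22050
N_FFT = 2048        # spectrogram height is N_FFT // 2 + 1 frequency bins
HOP = 512

# MIDI pitches of a C major scale over several octaves; in the real pipeline
# these would be the pitches consonant with the detected key of the song.
scale_midi = [m for m in range(36, 97) if m % 12 in (0, 2, 4, 5, 7, 9, 11)]
scale_hz = librosa.midi_to_hz(np.array(scale_midi))

# Load the space image as grayscale; width becomes the number of spectrogram
# frames, height becomes the number of pitches in the scale.
img = Image.open("webb_image.png").convert("L").resize((400, len(scale_hz)))
pixels = np.asarray(img, dtype=np.float32) / 255.0
pixels = pixels[::-1, :]  # flip so "up" in the image means a higher pitch

# Build a sparse magnitude spectrogram: brightness -> magnitude (volume),
# vertical position -> one of the consonant frequencies.
spec = np.zeros((N_FFT // 2 + 1, pixels.shape[1]), dtype=np.float32)
freq_bins = np.round(scale_hz * N_FFT / SR).astype(int)
for row, bin_idx in enumerate(freq_bins):
    spec[bin_idx, :] = np.maximum(spec[bin_idx, :], pixels[row, :])

# Invert the magnitude spectrogram to audio (Griffin-Lim phase estimation),
# scanning the image left to right in time.
audio = librosa.griffinlim(spec, n_iter=32, hop_length=HOP, win_length=N_FFT)
sf.write("image_sonification.wav", audio, SR)
```

Each image column becomes one spectrogram frame, so the left-to-right scan becomes the time axis, and Griffin-Lim estimates the phase needed to turn the magnitude spectrogram back into sound.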

Hackathon Journey

We have a dedicated team. Two of our members drove in from Michigan and Philadelphia to be present this weekend for the DC Space Apps Competition. Our hackathon journey was a rollercoaster ride and we are grateful for the time we were able to spend together coding, learning, and collaborating.


This is the third consecutive year that Team Artemis has participated in the NASA Space Apps Challenge. Our team was created by Hubble Financial, a startup launched at NASA Space Apps with a company mission to democratize data. We chose this challenge mainly because of its creative and artistic premise. While many projects tend to use technology simply to execute a concept, we believe that imagination and creativity are fundamental to product development and new projects.


One of the biggest challenges our team faced in this hackathon was determining how to convert the image into music. We had to decide between continuous and discrete mappings of image information into sound. We chose a continuous mapping so that the result would be easier to combine with other music: there are no clear note onsets, so there is less potential for rhythmic clashes. This limits the choice of musical sound to synthetic tones or instruments capable of sustaining notes, but it ensures that a harmonious combination of the sonification and song can be achieved. In the future, a different algorithm could be implemented to convert image data into discrete events, but adequate constraints would need to be imposed; one such constraint is sketched below.
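For example, one plausible constraint for a future discrete mapping (not something the team implemented) is to snap note onsets to the chosen song's beat grid so the events cannot clash rhythmically. A hedged sketch, assuming the brightness matrix from the sonification step has been saved to a hypothetical image_brightness.npy and using librosa's beat tracker:

```python
import numpy as np
import librosa

SR = 22050

# Track the song's beats so discrete note onsets can be quantized to them.
y, _ = librosa.load("chosen_song.mp3", sr=SR)
tempo, beat_frames = librosa.beat.beat_track(y=y, sr=SR)
beat_times = librosa.frames_to_time(beat_frames, sr=SR)

# 'pixels' is the (pitches x frames) brightness matrix from the sonification step.
pixels = np.load("image_brightness.npy")
column_times = np.linspace(0, beat_times[-1], pixels.shape[1])

events = []
for col, t in enumerate(column_times):
    row = int(np.argmax(pixels[:, col]))                      # brightest pixel -> pitch index
    onset = beat_times[np.argmin(np.abs(beat_times - t))]     # snap onset to nearest beat
    events.append((onset, row, float(pixels[row, col])))      # (time, pitch index, velocity)
```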


Our solution evolved out of genuine curiosity and excitement. How many more people would be moved by the wonders of deep space if there were a tool that let them engage emotionally with mission data? The question is not meant to be answered so much as to provoke creativity, disseminated through music and culture.


Special thanks to Dr. Matt Russo for joining the team and for his research on sonification -- he was an absolute inspiration to the project as a whole. Dr. Matt Russo's website: https://www.astromattrusso.com/. Ad Astra.

References

DATA AND RESOURCES:


Image Resources:


WebbTelescope.org

A developing gallery of images featuring astronomical observations and informative science content around the Webb telescope (JWST) mission.


Webb Space Telescope data: https://webbtelescope.org/resource-gallery/images


The Mikulski Archive for Space Telescopes (MAST) Portal lets astronomers search space telescope data, spectra, images and publications. Missions include Hubble, Kepler, GALEX, IUE, FUSE and more with a focus on the optical, ultraviolet, and near-infrared parts of the spectrum. https://www.mast.stsci.edu


Jackmcarthur/musical-key-finder

A Python project that uses several standard libraries to determine the key a song (an .mp3) is in, e.g. F major or C# minor, with annotations and some examples.


Barbara A. Mikulski Archive for Space Telescopes (MAST) dataset

https://mast.stsci.edu/portal/Mashup/Clients/Mast/Portal.html


TOOLS


  • Jupyter Notebook
  • Python
  • MySQL
  • Figma
  • Wix
  • Canva
  • Musical-key-finder package: https://github.com/jackmcarthur/musical-key-finder
  • WAV file writer from Think DSP: Digital Signal Processing in Python, 1st Edition, by Allen B. Downey
  • Other Python libraries: librosa, PIL, pandas, numpy, audiolazy, pydub, urllib


RESEARCH

The following websites were useful in determining which sonification algorithm would be best suited for our purposes and how to implement it: 


  • SYSTEM Sounds: https://www.system-sounds.com
  • Chandra’s Universe of Sound: https://chandra.si.edu/sound/
  • NASA’s Explore: From Space to Sound: https://www.nasa.gov/content/explore-from-space-to-sound
  • Data Sonification Archive: https://sonification.design

Tags

#TeamArtemis #HubbleFinancial #Hubble #Webb #Sonification #STEAM #Inclusion