Take MIDI or audio files and create a Frequency Fingerprint of the music (analysis; music theory) - statistics such as the average pitch, the most common pitch, and so on.
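As a rough illustration of the kind of statistics a Frequency Fingerprint gathers, here is a minimal Python sketch (not the project's actual code). It assumes the third-party pretty_midi library and a placeholder input filename; the Colab Notebooks linked below may use different tools and compute more statistics.

```python
# Minimal sketch: compute two of the statistics described above --
# average pitch and most common pitch -- over every note in a MIDI file.
# Assumes `pretty_midi` is installed (pip install pretty_midi).
from collections import Counter

import pretty_midi


def frequency_fingerprint(midi_path: str) -> dict:
    """Return simple pitch statistics for all notes in a MIDI file."""
    pm = pretty_midi.PrettyMIDI(midi_path)

    # Collect every note's MIDI pitch number (0-127), skipping drum tracks.
    pitches = [note.pitch
               for instrument in pm.instruments
               if not instrument.is_drum
               for note in instrument.notes]
    if not pitches:
        return {}

    counts = Counter(pitches)
    return {
        "average_pitch": sum(pitches) / len(pitches),     # mean MIDI number
        "most_common_pitch": counts.most_common(1)[0][0],
        "pitch_histogram": dict(counts),                   # counts per pitch
    }


if __name__ == "__main__":
    # "prelude.mid" is a placeholder filename for illustration only.
    print(frequency_fingerprint("prelude.mid"))
```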
The code is implemented in Python in Google Colab Notebooks, which you can view and run here:
- MIDI, runnable release version 2.0: https://colab.research.google.com/drive/1oNAScVK1Ap40fQN4WsAikZ2_kuU8RD1M?usp=sharing
- Audio File, most current version (demo only for now): https://colab.research.google.com/drive/1ai3X9QErw4OvJ1RHZs2jn6AZ-0RVh8yc?usp=sharing
- From those share links, you can run the Notebook and process files to create Frequency Fingerprints. However, you will need to log in to a Google account before running any code.
- You can also make a copy of the Notebook and use/edit it to your heart's content.
Sample Results
- Chopin Preludes in 3 Seconds - the Frequency Fingerprint of each Prelude shown & heard in rapid succession (YouTube)
- Sample real-time calculation of a Frequency Fingerprint from an audio file (YouTube)
Wiki
The wiki has articles with more information about the idea of Frequency Fingerprints, how to run the software, and example analyses of music using the Fingerprints: