Analyze Key & Mood (Music Theory)
Homecrate can analyze any track’s audio to detect its musical key, scale, mood, tempo, and chord palette — all computed on-device by the Tonic audio analysis framework. No internet connection is required.
Running the Analysis
Scroll down on the Now Playing screen to reveal the Track Info Panel. The “Analyze Key & Mood” card appears near the bottom.
[SCREENSHOT: "Analyze Key & Mood" card with Music icon]
Tap it to start. Analysis typically takes 5–20 seconds depending on track length and device speed.
[SCREENSHOT: Loading state — "Analysing audio..."]
The Analysis Result
[SCREENSHOT: Music Theory result card — key, mood, BPM, relative key grid, diatonic chords, heard-in-track chords]
The result card shows:
Key Grid
Four data cells in a 2×2 grid:
| Cell | What it shows |
|---|---|
| KEY | Detected key and scale, e.g. A minor, C major |
| MOOD | A derived mood label, e.g. “melancholic, reflective” or “uplifting, joyful” |
| BPM | Estimated tempo in beats per minute |
| RELATIVE | The relative major or minor key (e.g. C major is the relative major of A minor) |
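The relative-key relationship in the table above is simple pitch arithmetic: a minor key's relative major sits three semitones above its tonic, and a major key's relative minor sits three semitones below. A minimal sketch (illustrative only — not Homecrate's actual code; note names here use sharps only):

```python
# Illustrative sketch of the relative-key relationship, not the app's code.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def relative_key(tonic: str, scale: str) -> str:
    """Relative major of a minor key, or relative minor of a major key."""
    i = NOTES.index(tonic)
    if scale == "minor":
        # Three semitones up: A minor -> C major
        return f"{NOTES[(i + 3) % 12]} major"
    # Three semitones down: C major -> A minor
    return f"{NOTES[(i - 3) % 12]} minor"

print(relative_key("A", "minor"))  # C major
print(relative_key("C", "major"))  # A minor
```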
Diatonic Chords
The seven chords naturally occurring in the detected key, shown as pill badges. These are the chords the track “should” use based on its key — useful for songwriting or jamming along.
[SCREENSHOT: Diatonic chord pills row]
Heard in Track
The most frequently detected chords in the actual audio, shown as a separate row of pills. These may differ from the diatonic set if the track uses borrowed chords, modal mixture, or chromatic passages.
Key Confidence
A small percentage note at the bottom shows how confident the key detection is. Below 60% suggests the track may have an ambiguous key — common with modal music, jazz, or very chromatic arrangements.
How Mood Is Derived
Mood is computed from audio signals in a specific priority order. BPM is the least reliable signal (the detector can return double or half the real tempo) so it’s used last. The primary signal is onset rate — how many distinct note attacks or hits occur per second — which correctly handles edge cases like slow-tempo heavy metal and fast-tempo ambient music.
The full signal hierarchy, from most to least reliable:
1. Onset rate — note attacks or drum hits per second. A tremolo-picked guitar or busy strumming pattern scores high regardless of what the BPM detector says. Thresholds:
- 8+ onsets/sec → very active (fast strumming, blast beats, tremolo picking)
- 4–8 onsets/sec → active (rock rhythm, up-tempo pop)
- 2–4 onsets/sec → moderate
- Under 2 onsets/sec → sparse (slow ballad, ambient, finger-picking)
2. Chord character — diminished and augmented chords signal tension and instability. Seventh chords signal emotional complexity.
3. Key mode + minor chord saturation — a major-key song that’s heavy with minor chords still reads as emotionally dark.
4. RMS level and bass dominance — overall loudness and low-frequency weight add to the energy score.
5. BPM — used only as a tiebreaker when onset rate and other signals land in a borderline zone.
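The onset-rate thresholds above can be sketched as a simple classifier. This is a hedged illustration of the documented tiers, using the numbers from the list; the function name and tier labels are assumptions, not Homecrate's API:

```python
# Illustrative sketch of the documented onset-rate tiers (not the app's code).
def energy_tier(onsets_per_sec: float) -> str:
    if onsets_per_sec >= 8:
        return "very active"  # fast strumming, blast beats, tremolo picking
    if onsets_per_sec >= 4:
        return "active"       # rock rhythm, up-tempo pop
    if onsets_per_sec >= 2:
        return "moderate"
    return "sparse"           # slow ballad, ambient, finger-picking

# Slow-tempo heavy metal: the BPM detector may say "slow",
# but tremolo picking still produces a high onset rate.
print(energy_tier(9.5))  # very active
print(energy_tier(1.2))  # sparse
```

Because onset rate is evaluated first, the later signals (chord character, mode, loudness, BPM) only refine borderline cases rather than override a clear onset-rate reading.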
The resulting mood label combines a darkness/brightness dimension with an energy tier, producing descriptions like:
| Dark + High energy | Dark + Low energy | Bright + High energy | Bright + Low energy |
|---|---|---|---|
| aggressive, energetic | melancholic, reflective | euphoric, triumphant | peaceful, serene |
| driving, intense | dark, introspective | uplifting, joyful | bittersweet, complex |
| tense, brooding | pensive, emotional | warm, confident | |
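Conceptually, the label is a lookup keyed on the two dimensions. A minimal sketch, assuming a dark/bright flag and a high/low energy tier (the dictionary below covers only the first table row; all names are illustrative):

```python
# Illustrative only: pick a mood label from darkness + energy tier,
# mirroring the table above. Names and structure are assumptions.
MOOD_LABELS = {
    ("dark", "high"):   "aggressive, energetic",
    ("dark", "low"):    "melancholic, reflective",
    ("bright", "high"): "euphoric, triumphant",
    ("bright", "low"):  "peaceful, serene",
}

def mood_label(darkness: str, energy: str) -> str:
    return MOOD_LABELS[(darkness, energy)]

print(mood_label("dark", "low"))  # melancholic, reflective
```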
Re-analyzing
Tap Re-analyse in the top-right corner of the result card to rerun the analysis. This is useful after an app update that improves the analyzer.
Where the Data Goes
Analysis results are saved to your local database and associated with the track permanently. Once stored, the data is immediately available to:
- Library AI — for mood-based playlist generation via the `searchTracksByMood` tool
- Ask AI about a track — when the AI calls `getTrackMusicTheory` to answer theory questions
- Track Info Panel — displays key, mood, BPM, and chords for the current track
The analysis data stays on your device and is never uploaded or shared.
Analyzing Your Whole Library
You can run music theory analysis on every unanalyzed track in your library at once using the Analyze Library tool in Library AI. See Analyze Your Library.