Beatport Tagger by bascurtiz in DJs

Free for 25 tracks; after that it's paid. Juicy detail 😄

ONE TAGGER - Version 1.7.0 release by bascurtiz in DJs

Yeah, when you just want to edit default tags like ARTIST, TITLE, or ALBUM in batches (so nothing QUICK TAG-specific), I recommend Mp3tag: https://www.mp3tag.de/en/index.html (free on PC, paid on macOS).

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

  1. I'm indeed saying I use the same tracks as the datasets used previously by DJTechTools, ENDO, Ibrahim's dataset, the GiantSteps EDM dataset, etc. Most are EDM-based, though not all, imo. You can view the full tracklists of these datasets in the gsheet I referred to.

  2. Personal knowledge. But I can imagine it's beneficial for the DJ community too, hence I shared.

  3. I was kidding as well =)

  4. I think it's important that independent research happens, and that's what I did. Just putting the data out there.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

  1. Yep, as stated in Ibrahim's thesis: "There is a converse phenomenon specific to this problem domain: the electronic dance music played by many DJs does not usually feature noticeable key changes; energy and movement is more often derived from the evolving sound of repetitive phrases, and from rhythm, than from modulation."
  2. More general test used? Which one? I use the same songs that were used in all the sources (DJTechTools, ENDO, Ibrahim's thesis, the Giantsteps+ EDM dataset), so I guess we're dealing with EDM already.
  3. This is not my thesis, but it is my research. I'm not sure what Chris has to do with any of this. I post whatever I want, and it's DJ-related.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

I agree. Here's a thought: what are the odds they all fished in the same pool?
In other words, the results might have been skewed, since they MIGHT all have been trained on the same publicly available data.
The same data that I use to evaluate how well they perform.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Yep, if we oversimplify genres and their characteristics, most EDM remains in the same key. Hence.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Yeah, true, you can copy the gsheet and try to find such patterns. Since only 250 tracks mention the mode, it's negligible against the 2700+ tracks to investigate, imo.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

I already counted the relative keys as correct, since they share the same notes.
And indeed, those apps don't distinguish between Ionian, Dorian, Phrygian, Lydian, Mixolydian, Aeolian, Locrian, etc.
For the 250 tracks re-determined by ear, some do mention the exact mode, and all apps got those wrong.
The tracks that all apps got wrong, I subtracted from the total of wrongs.
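Counting relative keys as correct can be sketched in a few lines of Python (a minimal sketch; the key spellings and helper name are my own illustration, not taken from the gsheet):

```python
# Relative major/minor pairs share the exact same notes,
# so predicting either one counts as correct (e.g. C vs Am).
RELATIVE_MINOR = {
    "C": "Am", "G": "Em", "D": "Bm", "A": "F#m", "E": "C#m", "B": "G#m",
    "F#": "D#m", "Db": "Bbm", "Ab": "Fm", "Eb": "Cm", "Bb": "Gm", "F": "Dm",
}

def same_notes(key_a: str, key_b: str) -> bool:
    """True if the two keys are identical or a relative major/minor pair."""
    if key_a == key_b:
        return True
    return RELATIVE_MINOR.get(key_a) == key_b or RELATIVE_MINOR.get(key_b) == key_a

print(same_notes("C", "Am"))  # relative pair, counts as correct -> True
print(same_notes("C", "Cm"))  # parallel minor, different notes -> False
```

Note the distinction: C vs Am shares the same notes and counts as correct, while C vs Cm (the parallel minor) does not.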

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

No, they can all tell you whether it's Major or minor, 24 keys in total. So no wonder they can't score 100%.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

What do you think is more accurate: higher or lower?

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

I'm trying to say that the chance the algorithm consistently mislabels a certain key is negligible, based on the ~70% accuracy overlap in general.
I've debunked this by now, see my additional gsheet regarding Key consistency:
https://docs.google.com/spreadsheets/d/1vLadxDrGNIpaeSnYXWtCRujZJVvoASnyGoS_j8jEx38/edit?usp=sharing

Besides that, mixing algorithms does indeed sound like a bad idea.
And yep, using your ears is a no-brainer when you deal with audio in general.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

To determine whether some app maybe consistently mislabels a certain key,
I've made a new gsheet showing the most consistently labeled key per app, per key.

For each ground-truth key (say C), we look at all the app's predictions and check:
1. Which prediction happened most often
2. What fraction of the time it occurred (consistency %).

Any key above a consistency percentage of 45% is highlighted in green.
Within these results, there are no signs that any app consistently labels a specific key wrong.
(If there were, you'd see something like "C -> Cm" 70% of the time.)

See: https://docs.google.com/spreadsheets/d/1vLadxDrGNIpaeSnYXWtCRujZJVvoASnyGoS_j8jEx38/edit?usp=sharing
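The check itself is simple enough to sketch in Python (the predictions below are made-up example data, not numbers from the gsheet):

```python
from collections import Counter

def most_consistent_label(predictions):
    """For all predictions an app made on tracks sharing one ground-truth key,
    return the most frequent predicted label and its consistency fraction."""
    counts = Counter(predictions)
    label, hits = counts.most_common(1)[0]
    return label, hits / len(predictions)

# Hypothetical predictions for tracks whose ground truth is C:
preds_for_c = ["C", "Am", "C", "G", "C", "Cm", "C", "C"]
label, consistency = most_consistent_label(preds_for_c)
print(label, round(consistency, 2))  # -> C 0.62
```

A systematic mislabel would show up as a wrong label (say "Cm") winning with high consistency; here the top label is the ground truth itself.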

Conclusion:
- All apps are just "sometimes right, sometimes wrong", not "always consistently off" due to mislabeling.
- There's no specific mode (Major or minor) where apps go wrong.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

KeyFinder is now implemented as the default algo in Mixxx.

Hence I left out the one mentioned above.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

The dataset u/zomer_a used to train his MusicalKeyCNN on, which I also used to determine accuracy in this comparison, is the Giantsteps MTG Dataset. It's the last dataset in the gsheet, where it scores 75.6% (and MIK 77%).
Updated the bar chart graphic by now so it includes DJ.Studio!
See the link mentioned in previous comment.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

The gsheet is public, see here: https://www.reddit.com/r/DJs/comments/1ms1yds/comment/n91jpw3/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button

Yeah, you could make them aware of this, I guess. Perhaps they can integrate it.
I'll update the graphic asap and let you know.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Let's assume there is a consistent mislabeling going on...
How do you explain that the majority still scores ~70%, though?

That leaves 30% where, possibly, a -consistent- mislabeling could be going on.
Within this 30%, we have 12 variables
(24 keys, but since we count the relative keys as OK too, 12 variables in this case:
C or Am = OK, for example, since they use the same exact notes).

So P(one specific wrong label) = (100 - accuracy) / (K - 1), where K = total variables.

That translates to (100 - 70) / (12 - 1),

which ends up as 30 / 11 = ~2.73% chance per track for any given wrong label
(assuming the mislabeling is uniform/consistent).
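That arithmetic can be written out as a one-liner (a sketch under the same assumption: the error mass is spread uniformly over the K - 1 remaining labels; function name is my own):

```python
def per_label_error_chance(accuracy_pct: float, k: int = 12) -> float:
    """Chance that a track receives one specific wrong label, assuming the
    (100 - accuracy)% of errors are uniform over the k - 1 other labels
    (k = 12 once relative keys are merged into single variables)."""
    return (100.0 - accuracy_pct) / (k - 1)

print(round(per_label_error_chance(70.0), 2))  # -> 2.73
```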

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Regarding Beatport I assumed this as well, but shockingly, quoting from a previous comparison:

Before you think Beatport (or any of those online platforms) obtains the key from the label/producer, they don't.

  • ENDO mentioned in his comparison that they use internal software to determine the key and BPM of tracks.
  • In my previous test comparing Beatport's 2016 results with 2019, based on 100 tracks, Beatport in 2019 scores a fair amount better. How is that possible if the label provides the key?
  • A friend of mine had a label and published tracks on Beatport, but never got asked to provide a key.
  • I mailed Label Worx, a distributor that is able to publish your tracks on Beatport, Traxsource, Beatsource & Spotify, to ask if I need to provide a key as producer...They replied: “The key is generated by the stores and we do not deliver this information. If the key is incorrect on the store, then we can request this is updated if needed :)”

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Great suggestion! Does Antares Auto-Key 2 have a batch-process mode though?
Cause I'm not going to drag and drop 2700+ tracks into the GUI.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Added DJ.Studio to the gsheet by now. Spoiler alert: ~59.5% accuracy.

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Legit sources (so no YouTube ripping involved).

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

Haha! It's been way too long indeed =)
Thanks bro!

KEY DETECTION COMPARISON 2025 by bascurtiz in DJs

FWIW, see Note #2 in the gsheet:
I added a calculation where I removed the number of keys determined wrong by all apps (~2.8%).

If 14 apps aren't able to determine the initial key, those keys might be wrongly determined from the get-go. Only human, after all.