Poll: How many songs do you have?

Viewing 2 posts - 21 through 22 (of 22 total)
  • #3501
    nooneuno
    Guest

    By “burnt to a crisp”, I mean that MediaMonkey bogs down after about 60K tracks or so. By “bogs down”, I mean it takes multiple seconds per keypress when searching for tracks, artists, or what-have-you. It is the same issue with iTunes. No system I know of can handle the large amount of metadata generated by having so many tracks.

    For now, I have split my library into 5 chunks with mt-daapd, and that gives acceptable performance in iTunes.
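    In practice that means running five mt-daapd instances, each pointed at its own slice of the library, with its own cache directory and its own port. A rough sketch of one instance’s config (classic 0.2.x-style mt-daapd.conf; the paths, port, and server name here are just placeholders, and option names can differ between versions, so check your own mt-daapd.conf):

        # /etc/mt-daapd/chunk1.conf -- one of five instances, each with its
        # own music directory, cache directory, and DAAP port (placeholders)
        servername chunk1
        port 3689
        mp3_dir /srv/music/chunk1
        db_dir /var/cache/mt-daapd/chunk1
        admin_pw changeme
        runas nobody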

    I *do* really like MediaMonkey for tagging, etc., and I continue to keep it around.

    #3502
    wolfzell
    Participant

    @nooneuno wrote:

    By “burnt to a crisp”, I mean that MediaMonkey bogs down after about 60K tracks or so. By “bogs down”, I mean it takes multiple seconds per keypress when searching for tracks, artists, or what-have-you.

    Thanks for explaining. I am not a native English speaker, so sometimes I have to ask.

    I don’t know if it is possible at all to find anything faster than MediaMonkey for that purpose. MM’s database part is based on MS Access, and for complex hierarchical data I don’t know of any database system that is faster when used properly, even with far more than 60k entries.

    Actually, I find the speed with my number of tracks already amazing. MM does a substring (“within”) search in real time, while you are typing the search keywords, with delays of around one second before the result table appears. And with that amount of data, a substring search plus building the result tables simply takes some time.
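    Just to illustrate what such a search has to do (a minimal sketch using SQLite with made-up table and column names, not MediaMonkey’s actual Access database or schema): a “contains” filter like LIKE '%term%' cannot use an ordinary index, so every keystroke forces a scan over all rows, and the time grows with the library size.

        # Sketch: why an as-you-type substring search slows down as the library grows.
        # SQLite is used purely as an illustration; MediaMonkey's real engine and
        # schema (Access/Jet) are different, and all names below are invented.
        import sqlite3
        import time

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE songs (id INTEGER PRIMARY KEY, artist TEXT, title TEXT)")

        # Fake library: 60,000 rows of generated metadata.
        rows = [(i, f"Artist {i % 500}", f"Track number {i}") for i in range(60_000)]
        conn.executemany("INSERT INTO songs VALUES (?, ?, ?)", rows)
        conn.commit()

        start = time.perf_counter()
        hits = conn.execute(
            "SELECT COUNT(*) FROM songs WHERE artist LIKE ? OR title LIKE ?",
            ("%42%", "%42%"),  # substring match anywhere in the field -> full table scan
        ).fetchone()[0]
        elapsed = time.perf_counter() - start

        # An as-you-type search re-runs a query like this on every keystroke,
        # so the per-row scan cost is paid again and again.
        print(f"{hits} matches in {elapsed * 1000:.1f} ms over 60,000 rows")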

    I mean, for a really bad example of how to do it, have a look at MusicMatch. With 4,000 tracks, search times go up to minutes, almost regardless of system performance! So a few seconds with 60k tracks on a 1.8 GHz machine with only 512 MB RAM is rather good in my eyes.

    So I guess the best you can do is boost your system’s processor speed and memory to match the amount of data you are processing.

    I just looked up your system configuration. 512 MB RAM? I would start upgrading there and add another 512 MB; I guess that will help a lot.

    My 12,500 tracks produce a database about 20 MB in size, i.e. roughly 1.6 KB per track, so 60k tracks should come to around 100 MB of data. Add the memory needed to actually process that data and you might very well end up with much more. So you had better make absolutely sure you have enough RAM for the Access database structure to be cached in memory completely. Windows will still swap, but I believe performance will be a lot better with more RAM in your case.
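    The same back-of-envelope estimate as a quick script, using only the figures from my own library (20 MB for 12,500 tracks) as inputs; everything else is plain arithmetic:

        # Back-of-envelope: extrapolate database size from my library's figures.
        MY_TRACKS = 12_500
        MY_DB_MB = 20.0

        per_track_kb = MY_DB_MB * 1024 / MY_TRACKS    # ~1.6 KB per track
        projected_mb = per_track_kb * 60_000 / 1024   # ~96 MB for 60k tracks

        print(f"~{per_track_kb:.1f} KB per track")
        print(f"~{projected_mb:.0f} MB database for 60,000 tracks")
        # With 512 MB total RAM, Windows, the player, and a ~100 MB database
        # cache leave little headroom; another 512 MB gives room to keep the
        # whole database structure cached.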

    bye
    Wolfgang
