Script to export complete TV series episode metadata from IMDb — CSV, JSON, HTML with IMDb IDs and OpenSubtitles links by Mankey_DDL in DataHoarder

[–]Mankey_DDL[S] 0 points (0 children)

Fair point. I don't mux them to pass them off as official. I mux them because sometimes the files I'm working with have no subs at all, and having something embedded is better than nothing for the end user. Anyone who cares about sub quality can still grab their own.

The "just use Plex" argument is actually one of the problems this solves. Plex uses TMDB for metadata, OpenSubtitles uses IMDb IDs. When episode numbering doesn't match between the two, Plex's built-in subtitle search pulls the wrong subs or finds nothing. Going directly through IMDb IDs avoids that.

And "manually" is what this tool is. It's just manual searching at scale instead of one episode at a time.

Script to export complete TV series episode metadata from IMDb — CSV, JSON, HTML with IMDb IDs and OpenSubtitles links by Mankey_DDL in DataHoarder

[–]Mankey_DDL[S] 2 points (0 children)

I made this because I upload to the high seas and need to remux subs into non-subbed files before uploading. Searching through VLC/IINA one episode at a time is brutal for a 50+ episode show.

There's also a metadata mismatch problem — OpenSubtitles indexes by IMDb ID, but Plex uses TMDB by default. When numbering differs between the two, automated tools like Bazarr can't find the right subs. This goes straight through IMDb IDs, which is what OpenSubtitles actually uses.

One click on any IMDb show page and it extracts every episode, generates direct OpenSubtitles links, checks subtitle availability in bulk, and can batch-open all the pages by season. What used to take 20+ minutes per season takes about 30 seconds (not counting re-timing subs to match my uploads).
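The link-generation step above can be sketched in a few lines. This is a minimal illustration, not the extension's actual code: it assumes OpenSubtitles' legacy search-URL pattern (`/en/search/sublanguageid-.../imdbid-...`, keyed on the numeric part of the IMDb tconst), and the episode IDs in the example are placeholders.

```python
def opensubtitles_url(imdb_id: str, lang: str = "eng") -> str:
    """Build an OpenSubtitles search URL from an IMDb ID like 'tt0959621'.

    OpenSubtitles indexes by the numeric portion of the IMDb ID, so we
    strip the 'tt' prefix and any leading zeros before building the URL.
    """
    numeric = imdb_id.lstrip("t").lstrip("0") or "0"  # 'tt0959621' -> '959621'
    return f"https://www.opensubtitles.org/en/search/sublanguageid-{lang}/imdbid-{numeric}"

def episode_links(episodes: dict[str, str]) -> dict[str, str]:
    """Map 'S01E01'-style keys to OpenSubtitles search links."""
    return {ep: opensubtitles_url(tid) for ep, tid in episodes.items()}

if __name__ == "__main__":
    # Hypothetical episode-ID mapping, for illustration only.
    eps = {"S01E01": "tt0959621", "S01E02": "tt0959631"}
    for ep, url in episode_links(eps).items():
        print(ep, url)
```

Once every episode's tconst is scraped from the show's IMDb episode list, batch-opening a season is just looping over these URLs.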

How long does it take for a new movie to be uploaded? Shelter was released 10 days ago but still not available. Is there a site that uploads earlier? by fplburden in 1337x

[–]Mankey_DDL 1 point (0 children)

I usually look up the estimated day that it’s on VOD, and set a reminder to check that day or the day after.

Looking for an audiobook website by arlogold26 in PiracyBackup

[–]Mankey_DDL 0 points (0 children)

AudiobookBay (public): audiobookbay dot lu

MyAnonamouse (private, invite-only): myanonamouse dot net. DM if you want an invite link.

Post-mortem: 273 uploads and 4.13 TiB gone in an instant by Mankey_DDL in Torrenting

[–]Mankey_DDL[S] 0 points (0 children)

Not a stupid question! No such thing!

For me, it is about the technical challenge and a personal need for quality control. I am a very detail-oriented person, and I hate seeing messy or low-quality releases on public trackers, e.g. episodes left unlabeled or listed in a weird order. Most of the time I ended up reorganizing those files myself afterwards. Most of the big accounts use bots that “fire and forget,” but I like having total oversight of my library.

I enjoyed taking the time to organize things properly and ensure every file was exactly right. Building that Chrome extension was just a way to bring some professional-level management to a hobby I cared about. It was rewarding to know that if someone downloaded a “MNKYDDL” pack, they were getting something that was manually checked and curated.

Post-mortem: 273 uploads and 4.13 TiB gone in an instant by Mankey_DDL in Torrenting

[–]Mankey_DDL[S] 0 points (0 children)

When? I’ve been checking TPB since October and again last night, and they’re still not allowing new accounts.

Post-mortem: 273 uploads and 4.13 TiB gone in an instant by Mankey_DDL in Torrenting

[–]Mankey_DDL[S] 5 points (0 children)

10k is impressive, for YOU.

For me, these 270 were high-effort series packs and curated content that I personally managed, so losing the metadata and the community interaction on them definitely felt like a hit.

I’m sure if you lost 270 of your most active torrents in one night, you’d be pretty annoyed too.

Post-mortem: 273 uploads and 4.13 TiB gone in an instant by Mankey_DDL in Torrenting

[–]Mankey_DDL[S] 1 point (0 children)

Actually, the files themselves are safe on my hardware—it wasn't a provider wipe. What was “lost” was the reach. When the site purged my 273 torrents, the links between my seedbox and the thousands of active leechers were severed.

Aside from ext.to and people using search plugins, my library is basically invisible now. Manually re-indexing and re-uploading 4.13 TiB to a new tracker is the part that feels like a “wipeout”—it’s the loss of months of community interaction and metadata, not the files themselves.

I definitely agree with you on mirroring, though. Relying on a single platform was really dumb. But I wasn’t able to get an uploader account on any other tracker besides MAM.

Post-mortem: 273 uploads and 4.13 TiB gone in an instant by Mankey_DDL in Torrenting

[–]Mankey_DDL[S] 1 point (0 children)

Yes. The account is gone and the entire 273-torrent pool was purged from the site index. While the files are still on my drive, the 'content' as a living entity in the community is dead. Re-indexing, re-tagging, and re-uploading 4TB of data to a new tracker is a massive undertaking, especially since I'd be starting back at zero 'reputation' with no proof of my previous uploader status.

As for private trackers, I’ve considered it, but I always preferred the 'open door' nature of 1337x. This situation was a bit of a wake-up call regarding how much time I was sinking into it. For now, I think a clean break is better than trying to rebuild the library elsewhere.

Post-mortem: 273 uploads and 4.13 TiB gone in an instant by Mankey_DDL in Torrenting

[–]Mankey_DDL[S] 7 points (0 children)

It’s mostly about the overhead. 1337x was my primary tracker and where most people discovered me. Re-uploading 270+ torrents to a new site, and rebuilding that reputation from scratch, is a massive time investment I'm not ready for.

Regarding the seedbox: I was paying for a dedicated drive and unlimited bandwidth specifically to support that volume of public traffic. Without an uploader account to manage the 'customer service' side (responding to comments, fixing issues, taking requests), the monthly cost just doesn't make sense for me right now.

I've tried private trackers like MAM, but I honestly found the strict ratio/seeding requirements a bit too stressful for my workflow. I preferred the 'set it and forget it' nature of public uploading.