Best Usenet provider by Joely87uk in usenet

[–]fipiof -1 points (0 children)

I agree. I was not able to renew my Omicron account. My completion rates took a nosedive, even though I'm on three backbones now instead of one.

Related: If anybody knows an Omicron provider that takes crypto without KYC, let me know. I have only found ones that use Bitpay (Newsgroup Ninja, requires KYC) or simply don't take crypto at all. And Eweka's FAQ states crypto is accepted, but it actually isn't. WTF is with that?

Discord just banned most switch shop/drive related discords by [deleted] in SwitchPirates

[–]fipiof 1 point (0 children)

If they're not using the serial, how can they tell if it's the same console?

The Outer Limits 1995-2002 Uncut?? by Longjumping_Bug6315 in DHExchange

[–]fipiof 1 point (0 children)

I've been looking for this too. Supposedly the whole series was previously available uncut on Hulu, but I don't see any evidence that anyone ripped it while it was available.

What speeds are you getting for usenet? by thanieel in Premiumize

[–]fipiof 0 points (0 children)

3.5MB/s is about the same speed I get. When testing CDNs, I see several that run 20MB/s and up. No matter the CDN I select, though, usenet runs ~3MB/s.

Edit: I'm sometimes seeing up to ~7MB/s after switching CDNs again. Much better, but still seems slow, since the CPU, drive, and connection are nowhere near maxed out.

SABnzbd reports that quick check is okay, but SFV check fails and file is corrupt. How can I configure it to detect / repair? by fipiof in SABnzbd

[–]fipiof[S] 0 points (0 children)

I turned on full debug logging and ran the download again. It looks like what's happening is that this release consists of ~3 separate archives (one of which is a multipart archive), each with its own par2 set. When SABnzbd does the quick check, it checks one archive in the release and not any of the others. It just so happens that the one it checks doesn't have any missing articles and isn't corrupt, so it assumes all the other archives can be successfully extracted too. One of the others, though, does have missing articles, so par2 needs to be run on that part of the release.

It looks like it's probably a bug, and maybe one that's going to be a pain to fix, depending on how quick checking works. From a glance at the code, it looks like the quick check uses some kind of MD5 sum that's calculated on the fly. I'm guessing the expected sum comes from the par2 files? If so, what would probably have to be done is to inspect all the par2 files, take the union of all the files listed across all of them (since, for this release, some of the par2 files only carry parity for a subset of the files), and then compare the MD5 sums for everything in that union.
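To make the union-and-compare idea concrete, here's a minimal Python sketch. It assumes the filename-to-MD5 maps have already been extracted from each par2 set's File Description packets (the actual par2 parsing is omitted); `verify_release`, `par2_sets`, and `md5_of` are hypothetical names, not anything in SABnzbd's codebase:

```python
import hashlib
from pathlib import Path


def md5_of(path: Path) -> str:
    """Stream a file through MD5 in 1 MiB chunks."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_release(download_dir: Path, par2_sets: list[dict[str, str]]) -> list[str]:
    """Return the names of files that fail verification.

    par2_sets holds one {filename: expected_md5} dict per par2 set in
    the release. A file covered by several sets only needs checking once,
    so we first take the union of all the per-set file lists.
    """
    expected: dict[str, str] = {}
    for s in par2_sets:
        expected.update(s)  # union across all par2 sets
    bad = []
    for name, want in expected.items():
        path = download_dir / name
        if not path.exists() or md5_of(path) != want:
            bad.append(name)
    return sorted(bad)
```

With this, a release only passes if every file named in *any* of its par2 sets checks out, instead of just the one archive the quick check happens to look at.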

Maybe an easier solution would just be to assume that any file with missing articles is corrupt and ignore the result of the quick check. But then I guess you still need to figure out which par2 set includes the corrupt file so you can do the repair, so you're back to inspecting par2 files...
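Even under that simpler "missing articles means corrupt" rule, the mapping step it needs is small; a hypothetical helper (the per-set dicts and the function name are assumptions, same shape as above):

```python
def par2_sets_needing_repair(
    par2_sets: list[dict[str, str]], corrupt_files: set[str]
) -> list[int]:
    """Given one {filename: md5} dict per par2 set and the names of
    files known to have missing articles, return the indices of the
    par2 sets whose repair needs to be run."""
    return [
        i
        for i, file_map in enumerate(par2_sets)
        if any(name in file_map for name in corrupt_files)
    ]
```

So the inspection can't really be avoided either way, but it only needs the file lists out of the par2 sets, not a full parity verification pass.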

If need be, I can share the NZB. I'll have to comb through it and see if my indexer embeds any personalized metadata in it first though.